
Compilation and compilers

A compiler is a program that reads a program written in one language (the source language) and translates it into an equivalent program in another language (the target language).




[Figure: Compiler view]


Compilers are sometimes classified as single-pass, multi-pass, load-and-go, debugging, or optimising, depending on how they have been constructed.

Bootstrap compiler: For bootstrapping, a compiler is characterised by three languages: the source language "S" that it compiles, the target language "T" that it generates code for, and the implementation language "I" that it is written in. We represent them using a T-diagram.

Phases of a compiler:
Conceptually, a compiler operates in phases, each of which transforms the source program from one representation to another.
The first three phases form the bulk of the analysis portion of a compiler.
Lexical analysis:
In a compiler, linear analysis is called lexical analysis or scanning. For example, during lexical analysis the characters in the assignment statement
               position := initial + rate * 60
would be grouped into the following tokens:
The identifier (position)
The assignment symbol (:=)
The identifier (initial)
The plus sign (+)
The identifier (rate)
The multiplication sign (*)
The number (60)
The blanks separating the characters of these tokens would normally be eliminated during lexical analysis.
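The grouping described above can be sketched with a small regular-expression scanner. This is a minimal illustration, not a production lexer; the token names (ID, ASSIGN, and so on) are assumptions chosen for this example:

```python
import re

# Token patterns, tried in order. Blanks are matched by SKIP and then
# discarded, mirroring how lexical analysis eliminates them.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),        # the number (60)
    ("ASSIGN", r":="),         # the assignment symbol
    ("ID",     r"[A-Za-z_]\w*"),  # identifiers (position, initial, rate)
    ("PLUS",   r"\+"),
    ("TIMES",  r"\*"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(text):
    tokens = []
    for m in MASTER.finditer(text):
        if m.lastgroup != "SKIP":
            tokens.append((m.lastgroup, m.group()))
    return tokens

print(tokenize("position := initial + rate * 60"))
# → [('ID', 'position'), ('ASSIGN', ':='), ('ID', 'initial'),
#    ('PLUS', '+'), ('ID', 'rate'), ('TIMES', '*'), ('NUMBER', '60')]
```

Each tuple pairs a token class with its lexeme, which is exactly the grouping listed above.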
Intermediate code generation:
After syntax and semantic analysis, some compilers generate an explicit intermediate representation of the source program. This intermediate representation should have two important properties: it should be easy to produce and easy to translate into the target program.
The intermediate representation can have a variety of forms.
We consider an intermediate form called "three-address code", which is like the assembly language for a machine in which every memory location can act like a register. Three-address code consists of a sequence of instructions, each of which has at most three operands. The source program might appear in three-address code as:
temp1 := inttoreal(60)
temp2 := id3 * temp1
temp3 := id2 + temp2
id1 := temp3
This intermediate form has several properties. First, each three-address instruction has at most one operator in addition to the assignment. Thus, when generating these instructions, the compiler has to decide on the order in which operations are to be done; here the multiplication precedes the addition of the source program. Second, the compiler must generate a temporary name to hold the value computed by each instruction. Third, some three-address instructions have fewer than three operands, for example the first and last instructions above.
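The properties above can be sketched with a tiny generator that hands out temporary names and emits one instruction at a time. This is an illustrative sketch, assuming the names id1, id2, id3 (standing for position, initial, rate) and the conversion inttoreal from the example above:

```python
# Sketch of a three-address-code emitter for: position := initial + rate * 60
class TACGenerator:
    def __init__(self):
        self.count = 0
        self.code = []

    def new_temp(self):
        # The compiler generates a fresh temporary to hold each result.
        self.count += 1
        return f"temp{self.count}"

    def emit(self, target, expr):
        # Each instruction has at most one operator besides the assignment.
        self.code.append(f"{target} := {expr}")

gen = TACGenerator()
t1 = gen.new_temp()
gen.emit(t1, "inttoreal(60)")   # convert the integer constant to real
t2 = gen.new_temp()
gen.emit(t2, f"id3 * {t1}")     # multiplication is done first
t3 = gen.new_temp()
gen.emit(t3, f"id2 + {t2}")     # then the addition
gen.emit("id1", t3)             # finally the assignment to position
print("\n".join(gen.code))
```

Running the sketch prints exactly the four-instruction sequence shown above, including the two instructions (the first and last) with fewer than three operands.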


