
COMPILER CONSTRUCTION TOOLS:

The compiler writer, like any programmer, can profitably use software tools such as
debuggers, version managers, profilers, and so on.

Compiler construction tools are

• Parser generators

• Scanner generators

• Syntax-directed translations engines

• Automatic code generators

• Dataflow engines

PARSER GENERATORS:

These produce syntax analysers, normally from input that is based on a context-free grammar (CFG).

E.g.: PIC, EQM
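To illustrate what a parser generator produces, here is a hand-written sketch of the kind of recursive-descent syntax analyser such a tool derives from a CFG. The grammar and function names are illustrative, not from any particular tool:

```python
# CFG (illustrative):
#   expr -> term (('+' | '-') term)*
#   term -> NUMBER
def parse_expr(tokens, pos=0):
    value, pos = parse_term(tokens, pos)
    while pos < len(tokens) and tokens[pos] in ('+', '-'):
        op = tokens[pos]
        rhs, pos = parse_term(tokens, pos + 1)
        value = value + rhs if op == '+' else value - rhs
    return value, pos

def parse_term(tokens, pos):
    tok = tokens[pos]
    if not tok.isdigit():
        raise SyntaxError(f"expected NUMBER, got {tok!r}")
    return int(tok), pos + 1

result, _ = parse_expr(['3', '+', '4', '-', '2'])
print(result)  # 5
```

A generator such as Yacc automates exactly this step: it reads the grammar and emits the parsing routines.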

SCANNER GENERATOR:

These automatically generate lexical analysers, normally from a specification based on regular
expressions.
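A minimal sketch of the idea, using Python's `re` module (the token names and patterns below are assumptions for illustration): a scanner generator turns a table of regular expressions like this into a lexical analyser.

```python
import re

# Illustrative token specification: (token name, regular expression).
TOKEN_SPEC = [
    ('NUMBER', r'\d+'),
    ('ID',     r'[A-Za-z_]\w*'),
    ('OP',     r'[+\-*/=]'),
    ('SKIP',   r'[ \t]+'),      # blanks are recognised but discarded
]
MASTER_RE = re.compile('|'.join(f'(?P<{n}>{p})' for n, p in TOKEN_SPEC))

def scan(text):
    for m in MASTER_RE.finditer(text):
        if m.lastgroup != 'SKIP':
            yield (m.lastgroup, m.group())

print(list(scan('x = 42')))
# [('ID', 'x'), ('OP', '='), ('NUMBER', '42')]
```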

SYNTAX-DIRECTED TRANSLATION ENGINES:

These produce intermediate code in three-address format, normally from input that is based
on the parse tree.
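The idea can be sketched as a walk over the parse tree that emits one three-address instruction per interior node (the tree encoding and temporary-naming scheme here are assumptions for illustration):

```python
temp_count = 0

def new_temp():
    """Return a fresh temporary name t1, t2, ..."""
    global temp_count
    temp_count += 1
    return f't{temp_count}'

def gen(node, code):
    """node is a leaf (variable name) or a tuple (op, left, right)."""
    if isinstance(node, str):
        return node
    op, left, right = node
    l = gen(left, code)
    r = gen(right, code)
    t = new_temp()
    code.append(f'{t} = {l} {op} {r}')
    return t

code = []
gen(('+', 'a', ('*', 'b', 'c')), code)   # tree for a + b * c
print('\n'.join(code))
# t1 = b * c
# t2 = a + t1
```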

AUTOMATIC CODE GENERATOR:

• It takes a collection of rules that define the translation of each operation of the
intermediate language into the machine language for the target machine.

• The input specification for these systems may contain:

1. A description of the lexical and syntactic structure of the source language.

2. A description of what output is to be generated for each source-language
construct.

3. A description of the target machine.
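Such a rule collection can be sketched as a table mapping each intermediate-language operation to a target-instruction sequence (the IR format, register names, and instruction mnemonics here are hypothetical; a real tool derives the table from a machine description):

```python
# Illustrative rule table: one entry per intermediate-language operation.
RULES = {
    'add': lambda d, a, b: [f'MOV {d}, {a}', f'ADD {d}, {b}'],
    'sub': lambda d, a, b: [f'MOV {d}, {a}', f'SUB {d}, {b}'],
}

def generate(ir):
    """ir: list of (op, dest, operand1, operand2) tuples."""
    out = []
    for op, dst, a, b in ir:
        out.extend(RULES[op](dst, a, b))
    return out

for line in generate([('add', 'R1', 'x', 'y'), ('sub', 'R2', 'R1', 'z')]):
    print(line)
```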

DATAFLOW ENGINES:
Much of the information needed to perform good code optimization involves “dataflow
analysis”, the gathering of information about how values are transmitted from one part of a
program to each other part.
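One classic dataflow analysis, live-variable analysis, can be sketched as a backward pass over straight-line three-address code (the per-instruction use/def encoding below is an assumption for illustration):

```python
def liveness(instrs):
    """instrs: list of (defined_var_or_None, set_of_used_vars),
    in program order. Returns the set of variables live *before*
    each instruction, computed by a backward sweep."""
    live = set()
    result = []
    for defined, used in reversed(instrs):
        if defined is not None:
            live.discard(defined)   # a definition kills the variable
        live |= used                # uses make variables live
        result.append(set(live))
    return list(reversed(result))

# t1 = b * c ; a = t1 + d ; return a
code = [('t1', {'b', 'c'}), ('a', {'t1', 'd'}), (None, {'a'})]
print(liveness(code))
```

Extending this to programs with branches requires iterating the same transfer function over a control-flow graph until the sets stop changing.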

These systems have often been referred to as:

• Compiler- compilers.

• Compiler-generators

• Translator-writing systems

ROLE OF LEXICAL ANALYSER:

To read the input characters and produce as output a sequence of tokens that the parser uses for
syntax analysis.

                             tokens
   source program --> [ lexical analyser ] ---------------> [ parser ]
                             ^       <--- get next token ----
                             |                               |
                             +------> [ symbol table ] <-----+

• On receiving a "get next token" command from the parser, the lexical analyser reads input
characters until it can identify the next token.

• Its secondary tasks are:

1. One task is stripping out comments and whitespace, in the form of blank, tab,
and newline characters, from the source program.

2. Another task is correlating error messages from the compiler with the source
program.

• Two phases
1. Scanning

2. Lexical analysis

FUNCTIONS:

1. It produces the stream of tokens.

2. It eliminates blanks and comments.

3. It generates a symbol table, which stores information about identifiers and constants
encountered in the input.

4. It keeps track of line numbers.

5. It reports the errors encountered while interpreting the tokens.

The scanner is responsible for doing simple tasks, while the lexical analyser proper
does the more complex operations.
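The functions listed above can be sketched in one small lexical analyser (token names, patterns, and the symbol-table layout are illustrative assumptions): it produces a token stream, skips blanks and comments, records identifiers in a symbol table, tracks line numbers, and reports bad characters.

```python
import re

TOKEN_RE = re.compile(r'''
    (?P<COMMENT>\#[^\n]*) | (?P<WS>[ \t]+) | (?P<NL>\n)
  | (?P<NUMBER>\d+) | (?P<ID>[A-Za-z_]\w*) | (?P<OP>[+\-*/=])
''', re.VERBOSE)

def tokenize(text):
    symbol_table, tokens, line, pos = {}, [], 1, 0
    while pos < len(text):
        m = TOKEN_RE.match(text, pos)
        if m is None:                     # error reporting with line number
            raise SyntaxError(f'line {line}: bad character {text[pos]!r}')
        kind = m.lastgroup
        if kind == 'NL':
            line += 1                     # line-number tracking
        elif kind not in ('WS', 'COMMENT'):   # blanks/comments eliminated
            if kind == 'ID':
                symbol_table.setdefault(m.group(), {'line': line})
            tokens.append((kind, m.group(), line))
        pos = m.end()
    return tokens, symbol_table

tokens, table = tokenize('x = 1  # init\ny = x + 2\n')
```

After this call, `tokens` holds eight (kind, lexeme, line) triples and `table` records `x` (line 1) and `y` (line 2).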

ISSUES IN LEXICAL ANALYSIS:

There are several reasons for separating the analysis phase of compiling into lexical
analysis and parsing.

• Simpler design.

• Compiler efficiency is improved.

• Compiler portability is enhanced.

TOKEN:

It is a sequence of characters that can be treated as a single logical entity. Typical tokens are:

1. Identifiers

2. Keywords

3. Operators

4. Special symbols

5. Constants

PATTERN:
A set of strings in the input for which the same token is produced as output. This set of
strings is described by a rule called a pattern associated with the token.

LEXEME:

It is a sequence of characters in the source program that is matched by the pattern for a token.
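The three definitions above fit together as follows (the identifier pattern below is an illustrative choice): the token is the name `ID`, the pattern is the rule describing which strings belong to it, and each matched string in the source is a lexeme.

```python
import re

# Pattern for the token ID: a letter or underscore followed by
# letters, digits, or underscores.
pattern = re.compile(r'[A-Za-z_]\w*')

source = 'count = count + 1'
lexemes = pattern.findall(source)
print(lexemes)  # ['count', 'count']
```

Here both occurrences of `count` are lexemes of the single token `ID`.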
