
1

Languages and Compilers(SProg og Oversættere)

Lecture 2

Bent Thomsen

Department of Computer Science

Aalborg University

With acknowledgement to Norm Hutchinson whose slides this lecture is based on.

2

Today’s lecture

• Three topics

– Treating Compilers and Interpreters as black-boxes

• Tombstone- or T- diagrams

– A first look inside the black-box

• Your guided tour

– Some Language Design Issues

3

Terminology

[Diagram] A translator takes as input a source program and produces as output an object program.

– the source program is expressed in the source language
– the translator itself is expressed in the implementation language
– the object program is expressed in the target language

Q: Which programming languages play a role in this picture?

A: All of them!

4

Tombstone Diagrams

What are they?

– diagrams consisting of a set of “puzzle pieces” we can use to reason about language processors and programs

– different kinds of pieces

– combination rules (not all diagrams are “well formed”)

The pieces:

– a program P implemented in language L
– a translator from S to T, implemented in language L
– a machine M, implemented in hardware
– an interpreter for language M, implemented in language L
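The combination rules on the next slide can be sketched as a small Java toy model (illustrative only; the class and method names are our own, not part of the tombstone notation):

```java
import java.util.Objects;

// Toy model of tombstone pieces: a program implemented in some language,
// a translator from S to T implemented in some language, and machines
// that execute only programs expressed in their own machine code.
public class Tombstones {
    record Program(String name, String lang) {}
    record Translator(String from, String to, String lang) {}

    // A machine M can run a program only if the program is expressed in M's code.
    static boolean canRun(Program p, String machine) {
        return Objects.equals(p.lang(), machine);
    }

    // Feeding a program to a translator is well formed only if the program's
    // language matches the translator's source language; the result is the
    // same program, now expressed in the target language.
    static Program translate(Program p, Translator t) {
        if (!p.lang().equals(t.from()))
            throw new IllegalArgumentException("ill-formed diagram");
        return new Program(p.name(), t.to());
    }

    public static void main(String[] args) {
        Program tetris = new Program("Tetris", "C");
        Translator cc = new Translator("C", "x86", "x86");
        Program exe = translate(tetris, cc);
        System.out.println(canRun(exe, "x86"));    // the translated program runs on x86
        System.out.println(canRun(tetris, "x86")); // the C source does not
    }
}
```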

5

Tombstone diagrams: Combination rules

[Diagram] Examples of well-formed (“OK!”) and ill-formed (“WRONG!”) combinations: a program or translator can run on machine M only if it is expressed in M’s machine code, and a program written in S can be fed only to a translator whose source language is S; mismatched combinations are not well formed.

6

Compilation

Example: Compilation of C programs on an x86 machine

[Diagram] Tetris, written in C, is given to a C -> x86 compiler (expressed in x86 code and running on the x86 machine); the output is Tetris in x86 code, which then runs directly on the x86 machine.

8

Cross compilation

Example: A C “cross compiler” from x86 to PPC

[Diagram] Tetris, written in C, is given to a C -> PPC compiler expressed in x86 code and running on the x86 machine; the output is Tetris in PPC code.

A cross compiler is a compiler which runs on one machine (the host machine) but emits code for another machine (the target machine).

Host ≠ Target

Q: Are cross compilers useful? Why would/could we use them?

[Diagram] The PPC version of Tetris is downloaded to the PPC machine and runs there.

9

Two Stage Compilation

Example: compiling a Java Tetris program to x86 code in two stages.

[Diagram] Stage 1: a Java->JVM compiler (in x86 code, running on x86) translates Tetris from Java into JVM code.

A two-stage translator is a composition of two translators. The output of the first translator is provided as input to the second translator.
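This composition can be seen as ordinary function composition; a hedged Java sketch (the `JVM[...]` and `x86[...]` string encodings are purely illustrative markers for “code in that language”):

```java
import java.util.function.Function;

// A two-stage translator as function composition.
public class TwoStage {
    // Stage 1 ("Java -> JVM") and stage 2 ("JVM -> x86"), modelled as
    // functions from code in one language to code in the next.
    static Function<String, String> javaToJvm = src -> "JVM[" + src + "]";
    static Function<String, String> jvmToX86  = jvm -> "x86[" + jvm + "]";

    // The composition is itself a Java -> x86 translator.
    static Function<String, String> javaToX86 = javaToJvm.andThen(jvmToX86);

    public static void main(String[] args) {
        System.out.println(javaToX86.apply("Tetris.java"));
        // x86[JVM[Tetris.java]]
    }
}
```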

[Diagram] Stage 2: a JVM->x86 compiler (in x86 code, running on x86) translates the JVM code into x86 code.

10

Compiling a Compiler

Observation: A compiler is a program! Therefore it can be provided as input to a language processor.

Example: compiling a compiler.

[Diagram] A Java->x86 compiler written in C is given as input to a C -> x86 compiler (in x86 code, running on the x86 machine); the result is a Java->x86 compiler expressed in x86 code.

11

Interpreters

An interpreter is a language processor implemented in software, i.e. as a program.

Terminology: abstract (or virtual) machine versus real machine

Example: The Java Virtual Machine

[Diagram] A JVM interpreter, expressed in x86 code, runs on the x86 machine and executes the Tetris program expressed in JVM code.

Q: Why are abstract machines useful?

12

Interpreters

Q: Why are abstract machines useful?

1) Abstract machines provide better platform independence

[Diagram] The same Tetris program in JVM code runs unchanged on an x86 machine (on a JVM interpreter in x86 code) and on a PPC machine (on a JVM interpreter in PPC code).

13

Interpreters

Q: Why are abstract machines useful?

2) Abstract machines are useful for testing and debugging.

Example: Testing the “Ultima” processor using hardware emulation

[Diagram] An Ultima emulator, expressed in x86 code, runs on the x86 machine; a program P in Ultima code running on the emulator behaves the same as P running on a real Ultima machine.

Functional equivalence

Note: we don’t have to implement the Ultima emulator in x86; we can write it in a high-level language and compile it.

14

Interpreters versus Compilers

Q: What are the tradeoffs between compilation and interpretation?

Compilers typically offer more advantages when
– programs are deployed in a production setting
– programs are “repetitive”
– the instructions of the programming language are complex

Interpreters typically are a better choice when
– we are in a development/testing/debugging stage
– programs are run once and then discarded
– the instructions of the language are simple
– the execution speed is overshadowed by other factors

• e.g. on a web server where communications costs are much higher than execution speed

15

Interpretive Compilers

Why? A tradeoff between fast(er) compilation and reasonable runtime performance.

How? Use an “intermediate language”
• more high-level than machine code => easier to compile to
• more low-level than source language => easy to implement as an interpreter

Example: A “Java Development Kit” for machine M

[Diagram] The kit consists of a Java->JVM compiler expressed in M code and a JVM interpreter expressed in M code.

16

Interpretive Compilers

Example: Here is how we use our “Java Development Kit” to run a Java program P

[Diagram] First the Java->JVM compiler (in M code, running on M) translates P from Java into JVM code (the javac step). Then the JVM interpreter (in M code) runs P’s JVM code on M (the java step).

17

Portable Compilers

Example: Two different “Java Development Kits”

[Diagram] Kit 1: a Java->JVM compiler expressed in M code, plus a JVM interpreter expressed in M code. Kit 2: a Java->JVM compiler expressed in JVM code, plus a JVM interpreter expressed in M code.

Q: Which one is “more portable”?

18

Portable Compilers

In the previous example we have seen that portability is not an “all or nothing” kind of deal.

It is useful to talk about a “degree of portability” as the percentage of code that needs to be re-written when moving to a different machine.

In practice 100% portability is as good as impossible.

19

Example: a “portable” compiler kit

Portable Compiler Kit:
– a Java->JVM compiler written in Java
– a JVM interpreter written in Java
– the Java->JVM compiler expressed in JVM code

Q: Suppose we want to run this kit on some machine M. How could we go about realizing that goal? (with the least amount of effort)

20

Example: a “portable” compiler kit

Q: Suppose we want to run this kit on some machine M. How could we go about realizing that goal? (with the least amount of effort)

[Diagram] With the least effort: reimplement the JVM interpreter (written in Java) in C, then compile it with a C->M compiler on M. The result is a JVM interpreter expressed in M code.

21

Example: a “portable” compiler kit

This is what we have now: the kit, plus a JVM interpreter expressed in M code.

Now, how do we run our Tetris program?

[Diagram] Compile Tetris with the Java->JVM compiler (in JVM code) running on the JVM interpreter on M; this yields Tetris in JVM code, which we then run on the same JVM interpreter on M.

22

Bootstrapping

Remember our “portable compiler kit”: the Java->JVM compiler in Java, the JVM interpreter in Java, and the Java->JVM compiler in JVM code.

We haven’t used this yet: the Java->JVM compiler written in Java. Same language!

Q: What can we do with a compiler written in itself? Is that useful at all?

23

Bootstrapping

Java->JVM

Java

Same language!

Q: What can we do with a compiler written in itself? Is that useful at all?

• By implementing the compiler in (a subset of) its own language, we become less dependent on the target platform => more portable implementation.

• But… a “chicken and egg problem”? How do we get around that?
=> BOOTSTRAPPING: requires some work to make the first “egg”.

There are many possible variations on how to bootstrap a compiler written in its own language.

24

Bootstrapping an Interpretive Compiler to Generate M code

Our “portable compiler kit”: the Java->JVM compiler in Java, the JVM interpreter in Java, the Java->JVM compiler in JVM code, and the JVM interpreter in M code obtained earlier.

Goal: we want to get a “completely native” Java compiler on machine M: a Java->M compiler expressed in M code, so that any Java program P can be translated to M code and run directly on M.

25

Bootstrapping an Interpretive Compiler to Generate M code (first approach)

Step 1: implement a Java->M compiler in Java, by rewriting the Java->JVM compiler (which is written in Java).

Step 2: compile it, using the Java->JVM compiler in JVM code running on the JVM interpreter on M. The result is the Java->M compiler expressed in JVM code.

Step 3: use this to compile again.

26

Bootstrapping an Interpretive Compiler to Generate M code (first approach)

Step 3: “Self compile” the Java (in Java) compiler: run the Java->M compiler (in JVM code, on the JVM interpreter on M) on the Java->M compiler written in Java. The result, a Java->M compiler expressed in M code, is our desired compiler!

Step 4: use this to compile the P program: P, written in Java, is translated into P in M code.

27

Bootstrapping an Interpretive Compiler to Generate M code (second approach)

Idea: we will build a two-stage Java->M compiler: a Java->JVM compiler in M code followed by a JVM->M compiler in M code, so that a program P in Java is translated first into JVM code and then into M code.

We will make the JVM->M stage by implementing a JVM->M compiler in Java and compiling it with the Java->JVM compiler.

28

Bootstrapping an Interpretive Compiler to Generate M code (second approach)

Step 1: implement a JVM->M compiler in Java.

Step 2: compile it, using the Java->JVM compiler in JVM code running on the JVM interpreter on M. The result is the JVM->M compiler expressed in JVM code.

Step 3: compile this again.

29

Bootstrapping an Interpretive Compiler to Generate M code (second approach)

Step 3: “Self compile” the JVM (in JVM) compiler: run the JVM->M compiler (in JVM code, on the JVM interpreter on M) on itself. The result, a JVM->M compiler expressed in M code, is the second stage of our compiler!

Step 4: use this to compile the Java compiler.

30

Bootstrapping an Interpretive Compiler to Generate M code

Step 4: compile the Java->JVM compiler into machine code: run the JVM->M compiler (in M code) on the Java->JVM compiler in JVM code. The result, a Java->JVM compiler expressed in M code, is the first stage of our compiler!

We are DONE! A Java program P is translated into JVM code by the first stage, into M code by the second stage, and then runs directly on M.

31

Full Bootstrap

A full bootstrap is necessary when we are building a new compiler from scratch.

Example:We want to implement an Ada compiler for machine M. We don’t currently have access to any Ada compiler (not on M, nor on any other machine).

Idea: Ada is very large, so we will implement the compiler in a subset of Ada and bootstrap it from a subset-of-Ada compiler written in another language (e.g. C).

[Diagram] v1: an Ada-S->M compiler written in C.

Step 1: build a compiler for Ada-S in another language.

32

Full Bootstrap

Step 1a: build a compiler (v1) for Ada-S in another language, e.g. C.

Step 1b: compile the v1 compiler on M, using a C->M compiler. The result is an Ada-S->M compiler (v1) expressed in M code.

This compiler can be used for bootstrapping on machine M, but we do not want to rely on it permanently!

33

Full Bootstrap

Step 2a: implement v2 of the Ada-S compiler in Ada-S itself.

Step 2b: compile the v2 compiler with the v1 compiler. The result is an Ada-S->M compiler (v2) expressed in M code.

We are now no longer dependent on the availability of a C compiler!

Q: Is it hard to rewrite the compiler in Ada-S?

34

Full Bootstrap

Step 3a: build a full Ada compiler (v3) in Ada-S.

Step 3b: compile it with the v2 compiler. The result is an Ada->M compiler (v3) expressed in M code.

From this point on we can maintain the compiler in Ada. Subsequent versions v4,v5,... of the compiler can be written in Ada and each compiled with the previous version.

35

Half Bootstrap

We discussed full bootstrap which is required when we have no access to a compiler for our language at all.

Q: What if we have access to a compiler for our language on a different machine HM but want to develop one for TM ?

We have: an Ada->HM compiler expressed in HM code.

We want: an Ada->TM compiler expressed in TM code.

Idea: We can use cross compilation from HM to TM to bootstrap the TM compiler.

36

Half Bootstrap

Idea: We can use cross compilation from HM to TM to bootstrap the TM compiler.

Step 1: Implement an Ada->TM compiler in Ada.

Step 2: Compile it on HM, using the Ada->HM compiler. The result is an Ada->TM compiler expressed in HM code: a cross compiler, running on HM but emitting TM code.

37

Half Bootstrap

Step 3: Cross compile our TM compiler: run the cross compiler (on HM) on the Ada->TM compiler written in Ada. The result is an Ada->TM compiler expressed in TM code. DONE!

From now on we can develop subsequent versions of the compiler completely on TM.

38

Bootstrapping to Improve Efficiency

The efficiency of programs and compilers:

Efficiency of programs:
– memory usage
– runtime

Efficiency of compilers:
– efficiency of the compiler itself
– efficiency of the emitted code

Idea: We start from a simple compiler (generating inefficient code) and develop more sophisticated versions of it. We can then use bootstrapping to improve performance of the compiler.

39

Bootstrapping to Improve Efficiency

We have: an Ada->M(slow) compiler written in Ada, and the same compiler expressed in slow M code.

We implement: an Ada->M(fast) compiler, written in Ada.

Step 1: compile the new compiler using the old Ada->M(slow) compiler (in slow M code). The result is an Ada->M(fast) compiler expressed in slow M code: a slow compiler that emits fast code.

Step 2: compile the Ada->M(fast) compiler (in Ada) once more, this time with the compiler produced in step 1. The result is an Ada->M(fast) compiler expressed in fast M code: a fast compiler that emits fast code!

40

Conclusion

• To write a good compiler you may be writing several simpler ones first

• You have to think about the source language, the target language and the implementation language.

• Strategies for implementing a compiler
1. Write it in machine code
2. Write it in a lower-level language and compile it using an existing compiler
3. Write it in the same language that it compiles and bootstrap

• The work of a compiler writer is never finished, there is always version 1.x and version 2.0 and …

41

Compilation

So far we have treated language processors (including compilers) as “black boxes”

Now we take a first look "inside the box": how compilers are built.

And we take a look at the different “phases” and their relationships

42

The “Phases” of a Compiler

Syntax Analysis

Contextual Analysis

Code Generation

Source Program

Abstract Syntax Tree

Decorated Abstract Syntax Tree

Object Code

Error Reports

Error Reports

43

Different Phases of a Compiler

The different phases can be seen as different transformation steps to transform source code into object code.

The different phases correspond roughly to the different parts of the language specification:

• Syntax analysis <-> Syntax
• Contextual analysis <-> Contextual constraints
• Code generation <-> Semantics

44

Mini Triangle

Mini Triangle is a very simple Pascal-like programming language.

An example program:

! This is a comment.
let const m ~ 7;
    var n
in begin
   n := 2 * m * m;
   putint(n)
end

Declarations

Command

Expression

45

Syntax of Mini Triangle

Program ::= single-Command

single-Command ::= V-name := Expression
                 | Identifier ( Expression )
                 | if Expression then single-Command else single-Command
                 | while Expression do single-Command
                 | let Declaration in single-Command
                 | begin Command end

Command ::= single-Command
          | Command ; single-Command

46

Syntax of Mini Triangle (continued)

Expression ::= primary-Expression
             | Expression Operator primary-Expression

primary-Expression ::= Integer-Literal
                     | V-name
                     | Operator primary-Expression
                     | ( Expression )

V-name ::= Identifier

Identifier ::= Letter | Identifier Letter | Identifier Digit

Integer-Literal ::= Digit | Integer-Literal Digit

Operator ::= + | - | * | / | < | > | =

47

Syntax of Mini Triangle (continued)

Declaration ::= single-Declaration
              | Declaration ; single-Declaration

single-Declaration ::= const Identifier ~ Expression
                     | var Identifier : Type-denoter

Type-denoter ::= Identifier

Comment ::= ! CommentLine eol

CommentLine ::= Graphic CommentLine

Graphic ::= any printable character or space
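A minimal sketch of how the Expression part of this grammar could be parsed by recursive descent in Java (our own illustration, not the actual Triangle parser; the left-recursive Expression rule is implemented as a loop, and the output is a parenthesised string rather than a tree):

```java
// Recursive-descent parser for:
//   Expression ::= primary-Expression | Expression Operator primary-Expression
//   primary-Expression ::= Integer-Literal | V-name
//                        | Operator primary-Expression | ( Expression )
public class ExprParser {
    private final String in;
    private int pos = 0;

    ExprParser(String in) { this.in = in.replace(" ", ""); }

    private char peek() { return pos < in.length() ? in.charAt(pos) : '\0'; }
    private boolean isOp(char c) { return "+-*/<>=".indexOf(c) >= 0; }

    // Expression ::= primary-Expression (Operator primary-Expression)*
    String parseExpression() {
        String left = parsePrimary();
        while (isOp(peek())) {
            char op = in.charAt(pos++);
            String right = parsePrimary();
            left = "(" + left + op + right + ")";
        }
        return left;
    }

    String parsePrimary() {
        char c = peek();
        if (c == '(') {
            pos++;                         // consume '('
            String e = parseExpression();
            pos++;                         // consume ')'
            return e;
        }
        if (isOp(c)) {                     // unary operator application
            pos++;
            return "(" + c + parsePrimary() + ")";
        }
        StringBuilder sb = new StringBuilder();
        while (Character.isLetterOrDigit(peek())) sb.append(in.charAt(pos++));
        return sb.toString();              // Integer-Literal or V-name
    }

    public static void main(String[] args) {
        System.out.println(new ExprParser("d + 10 * n").parseExpression());
        // ((d+10)*n) : the grammar has no precedence, so all binary
        // operators combine left-to-right
    }
}
```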

48

Syntax Trees

A syntax tree is an ordered labeled tree such that:

a) terminal nodes (leaf nodes) are labeled by terminal symbols

b) non-terminal nodes (internal nodes) are labeled by non-terminal symbols

c) each non-terminal node labeled N has children X1, X2, ..., Xn (in this order) such that N ::= X1 X2 ... Xn is a production.

49

Syntax Trees

Example:

[Figure] The syntax tree for d + 10 * d: the leaves are the terminals Ident d, Op +, Int-Lit 10, Op * and Ident d; the internal nodes are labeled Expression, primary-Expression and V-name; the root applies the production

Expression ::= Expression Op primary-Exp

50

Concrete and Abstract Syntax

The previous grammar specified the concrete syntax of Mini Triangle.

The concrete syntax is important for the programmer who needs to know exactly how to write syntactically well-formed programs.

The abstract syntax omits irrelevant syntactic details and only specifies the essential structure of programs.

Example: different concrete syntaxes for an assignment:

v := e
(set! v e)
e -> v
v = e

51

Concrete Syntax of Commands

single-Command ::= V-name := Expression
                 | Identifier ( Expression )
                 | if Expression then single-Command else single-Command
                 | while Expression do single-Command
                 | let Declaration in single-Command
                 | begin Command end

Command ::= single-Command
          | Command ; single-Command

52

Abstract Syntax of Commands

Command ::= V-name := Expression                       AssignCmd
          | Identifier ( Expression )                  CallCmd
          | if Expression then Command else Command    IfCmd
          | while Expression do Command                WhileCmd
          | let Declaration in Command                 LetCmd
          | Command ; Command                          SequentialCmd

53

Concrete Syntax of Expressions (recap)

Expression ::= primary-Expression
             | Expression Operator primary-Expression

primary-Expression ::= Integer-Literal
                     | V-name
                     | Operator primary-Expression
                     | ( Expression )

V-name ::= Identifier

54

Abstract Syntax of Expressions

Expression ::= Integer-Literal            IntegerExp
             | V-name                     VnameExp
             | Operator Expression        UnaryExp
             | Expression Op Expression   BinaryExp

V-name ::= Identifier                     SimpleVName
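This abstract syntax maps naturally onto Java classes; a sketch (the sealed-interface/record encoding is our own choice for illustration, not the actual Triangle implementation):

```java
// The abstract syntax of expressions and assignment as Java classes.
public class Ast {
    sealed interface Expression permits IntegerExp, VnameExp, UnaryExp, BinaryExp {}
    record IntegerExp(int value) implements Expression {}
    record VnameExp(String name) implements Expression {}   // a SimpleVName, folded in
    record UnaryExp(String op, Expression operand) implements Expression {}
    record BinaryExp(Expression left, String op, Expression right) implements Expression {}
    record AssignCmd(String vname, Expression rhs) {}

    public static void main(String[] args) {
        // The AST for  d := d + 10 * n : operators associate left-to-right,
        // so the expression tree is (d + 10) * n.
        AssignCmd cmd = new AssignCmd("d",
            new BinaryExp(
                new BinaryExp(new VnameExp("d"), "+", new IntegerExp(10)),
                "*", new VnameExp("n")));
        System.out.println(cmd);
    }
}
```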

55

Abstract Syntax Trees

Abstract Syntax Tree for: d:=d+10*n

[Figure] The root is an AssignmentCmd; its V-name is a SimpleVName for Ident d; its expression is a BinaryExpression with Op *, whose left operand is a BinaryExpression (a VNameExp for d, Op +, an IntegerExp for Int-Lit 10) and whose right operand is a VNameExp (SimpleVName for Ident n).

56

1) Syntax Analysis

Syntax Analysis

Source Program

Abstract Syntax Tree

Error Reports

Note: Not all compilers construct an explicit representation of an AST (e.g. in a “single pass compiler” there is generally no need to construct an AST).

57

Example Program

We now look at each of the three different phases in a little more detail. We look at each of the steps in transforming an example Mini Triangle program into TAM code.

! This program is useless except for
! illustration
let var n: integer;
    var c: char
in begin
   c := '&';
   n := n+1
end

58

1) Syntax Analysis -> AST

[Figure] The AST: a Program whose body is a LetCommand. Its declaration is a SequentialDeclaration of two VarDecls: n with SimpleT Integer and c with SimpleT Char. Its body is a SequentialCommand of two AssignCommands: c := '&' (a SimpleV for c and a Char.Expr for '&') and n := n+1 (a SimpleV for n, and a BinaryExpr with a VNameExp for n, Op +, and an Int.Expr for 1).

59

2) Contextual Analysis -> Decorated AST

Contextual Analysis

Decorated Abstract Syntax Tree

Error Reports

Abstract Syntax Tree

Contextual analysis:

• Scope checking: verify that all applied occurrences of identifiers are declared

• Type checking: verify that all operations in the program are used according to their type rules.

Annotate the AST:
• applied identifier occurrences => link to their declaration
• expressions => their type
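A hedged Java sketch of both checks (the error messages follow the slides; the class, the type representation and the string-based expressions are our own illustration):

```java
import java.util.Map;

// Scope checking via a declaration table, and type checking of an assignment.
public class CheckerSketch {
    enum Type { INT, CHAR }

    // Type an expression: literals have fixed types; an applied occurrence
    // of an identifier is looked up in the declarations (scope checking).
    static Type typeOf(String expr, Map<String, Type> decls) {
        if (expr.matches("\\d+")) return Type.INT;     // integer literal
        if (expr.startsWith("'")) return Type.CHAR;    // character literal
        Type t = decls.get(expr);
        if (t == null)
            throw new RuntimeException("***SCOPE ERROR: undeclared variable " + expr);
        return t;
    }

    // Type rule for an assign command: both sides must have the same type.
    static void checkAssign(String v, String e, Map<String, Type> decls) {
        if (typeOf(v, decls) != typeOf(e, decls))
            throw new RuntimeException("***TYPE ERROR (incompatible types in assigncommand)");
    }

    public static void main(String[] args) {
        Map<String, Type> decls = Map.of("n", Type.INT, "c", Type.CHAR);
        checkAssign("c", "'&'", decls);                 // well typed
        try { checkAssign("c", "1", decls); }           // char := int
        catch (RuntimeException err) { System.out.println(err.getMessage()); }
        try { typeOf("foo", decls); }                   // no declaration
        catch (RuntimeException err) { System.out.println(err.getMessage()); }
    }
}
```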

60

2) Contextual Analysis -> Decorated AST

[Figure] The same AST, now decorated: each applied identifier occurrence is linked to its declaration, and expressions carry their types. The Char.Expr '&' and the SimpleV for c are annotated :char, while the VNameExp for n, the Int.Expr 1 and the BinaryExpr for n+1 are annotated :int.

61

Contextual Analysis

Finds scope and type errors.

Example 1: an AssignCommand whose left-hand side is typed :char and whose right-hand side is typed :int gives
***TYPE ERROR (incompatible types in assigncommand)

Example 2: an applied occurrence of Ident foo (a SimpleV) for which no declaration is found ("foo not found") gives
***SCOPE ERROR: undeclared variable foo

62

3) Code Generation

• Assumes that the program has been thoroughly checked and is well formed (scope & type rules)

• Takes into account semantics of the source language as well as the target language.

• Transforms source program into target code.

Code Generation

Decorated Abstract Syntax Tree

Object Code

63

3) Code Generation

let var n: integer;
    var c: char
in begin
   c := '&';
   n := n+1
end

is translated into TAM code:

PUSH 2
LOADL 38
STORE 1[SB]
LOAD 0
LOADL 1
CALL add
STORE 0[SB]
POP 2
HALT

[Figure] The VarDecl for n (Ident n, SimpleT Integer) is decorated with address = 0[SB].
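A sketch of how a code generator might emit those instructions for n := n+1, assuming the checker assigned n the address 0[SB] (illustrative only; not the actual Triangle Encoder):

```java
import java.util.ArrayList;
import java.util.List;

// Emit TAM-like instructions for  n := n + 1 .
public class EncoderSketch {
    static List<String> encodeIncrement(int addr) {
        List<String> code = new ArrayList<>();
        code.add("LOAD " + addr);            // push the value of n
        code.add("LOADL 1");                 // push the literal 1
        code.add("CALL add");                // add the two topmost values
        code.add("STORE " + addr + "[SB]");  // store the result back into n
        return code;
    }

    public static void main(String[] args) {
        encodeIncrement(0).forEach(System.out::println);
    }
}
```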

64

Compiler Passes

• A pass is a complete traversal of the source program, or a complete traversal of some internal representation of the source program.

• A pass can correspond to a “phase” but it does not have to!

• Sometimes a single “pass” corresponds to several phases that are interleaved in time.

• What and how many passes a compiler does over the source program is an important design decision.

65

Single Pass Compiler

Dependency diagram of a typical Single Pass Compiler:

[Diagram] The Compiler Driver calls the Syntactic Analyzer, which in turn calls the Contextual Analyzer and the Code Generator.

A single pass compiler makes a single pass over the source text, parsing, analyzing and generating code all at once.

66

Multi Pass Compiler

Dependency diagram of a typical Multi Pass Compiler:

[Diagram] The Compiler Driver calls, in turn, the Syntactic Analyzer (input: Source Text, output: AST), the Contextual Analyzer (input: AST, output: Decorated AST) and the Code Generator (input: Decorated AST, output: Object Code).

A multi pass compiler makes several passes over the program. The output of a preceding phase is stored in a data structure and used by subsequent phases.

67

The Mini Triangle Compiler Driver

public class Compiler {
    public static void compileProgram(...) {
        Parser parser = new Parser(...);
        Checker checker = new Checker(...);
        Encoder generator = new Encoder(...);

        Program theAST = parser.parse();
        checker.check(theAST);
        generator.encode(theAST);
    }

    public static void main(String[] args) {
        ... compileProgram(...) ...
    }
}

68

Compiler Design Issues

                        Single Pass                   Multi Pass
Speed                   better                        worse
Memory                  better for large programs     (potentially) better for small programs
Modularity              worse                         better
Flexibility             worse                         better
“Global” optimization   impossible                    possible
Source Language         single pass compilers are not possible for many programming languages

69

Language Issues

Example Pascal:

Pascal was explicitly designed to be easy to implement with a single pass compiler:
– Every identifier must be declared before it is first used.

var n: integer;

procedure inc;
begin
   n := n+1
end

procedure inc;
begin
   n := n+1          { Undeclared Variable! }
end;

var n: integer;

70

Language Issues

Example Pascal:– Every identifier must be declared before it is used.

– How to handle mutual recursion then?

procedure ping(x: integer)
begin
   ... pong(x-1); ...
end;

procedure pong(x: integer)
begin
   ... ping(x); ...
end;

71

Language Issues

Example Pascal:– Every identifier must be declared before it is used.

– How to handle mutual recursion then?

procedure pong(x: integer); forward;

procedure ping(x: integer)
begin
   ... pong(x-1); ...
end;

procedure pong(x: integer)
begin
   ... ping(x); ...
end;

OK!

72

Language Issues

Example Java:
– identifiers do not have to be declared before they are used;
– thus a Java compiler needs at least two passes.

class Example {

   void inc() { n = n + 1; }

   int n;

   void use() { n = 0; inc(); }

}

73

Scope of Variable

• Range of the program that can reference that variable (i.e. access the corresponding data object by the variable’s name)

• Variable is local to program or block if it is declared there

• Variable is non-local to program unit if it is visible there but not declared there

74

Static vs. Dynamic Scope

• Under static, sometimes called lexical, scope, sub1 will always reference the x defined in big

• Under dynamic scope, the x it references depends on the dynamic state of execution

procedure big;
   var x: integer;

   procedure sub1;
   begin {sub1}
      ... x ...
   end; {sub1}

   procedure sub2;
      var x: integer;
   begin {sub2}
      ...
      sub1;
      ...
   end; {sub2}

begin {big}
   ...
   sub1;
   sub2;
   ...
end; {big}

75

Static Scoping

• Scope computed at compile time, based on program text

• To determine the name of a used variable we must find statement declaring variable

• Subprograms and blocks generate hierarchy of scopes

– Subprogram or block that declares current subprogram or contains current block is its static parent

• General procedure to find declaration:
– First see if variable is local; if yes, done

– If non-local to current subprogram or block recursively search static parent until declaration is found

– If no declaration is found this way, undeclared variable error detected
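The lookup procedure above can be sketched in Java as a chain of scopes (illustrative; the Scope class and string "values" are our own):

```java
import java.util.HashMap;
import java.util.Map;

// Static scoping: search the local scope, then the static parent chain.
public class StaticScope {
    static class Scope {
        final Scope parent;                        // static parent (null at top)
        final Map<String, String> decls = new HashMap<>();
        Scope(Scope parent) { this.parent = parent; }

        String lookup(String name) {
            if (decls.containsKey(name)) return decls.get(name); // local? done
            if (parent != null) return parent.lookup(name);      // search parent
            throw new RuntimeException("undeclared variable " + name);
        }
    }

    public static void main(String[] args) {
        Scope big = new Scope(null);
        big.decls.put("x", "big's x");
        Scope sub1 = new Scope(big);               // sub1 declares no x
        System.out.println(sub1.lookup("x"));      // found in the static parent
    }
}
```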

76

Example

program main;
var x : integer;

   procedure sub1;
   var x : integer;
   begin { sub1 }
      … x …
   end; { sub1 }

begin { main }
   … x …
end; { main }

77

Dynamic Scope

• Now generally thought to have been a mistake
• Main example of use: original versions of LISP

– Scheme uses static scope

– Perl allows variables to be declared to have dynamic scope

• Determined by the calling sequence of program units, not static layout

• Name bound to corresponding variable most recently declared among still active subprograms and blocks

78

Example

program main;
var x : integer;

   procedure sub1;
   begin { sub1 }
      … x …
   end; { sub1 }

   procedure sub2;
   var x : integer;
   begin { sub2 }
      … call sub1 …
   end; { sub2 }

begin { main }
   … call sub2 …
end; { main }
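The contrast can be sketched in Java (illustrative): dynamic lookup walks the call stack instead of the static parent chain, so sub1's x is sub2's x here, whereas static scope would give main's x:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;

// Dynamic scoping: bind a name to the variable most recently declared
// among the still-active subprograms on the call stack.
public class DynamicScope {
    // The call stack holds each active unit's declarations, most recent first.
    static String dynamicLookup(Deque<Map<String, String>> stack, String name) {
        for (Map<String, String> frame : stack)
            if (frame.containsKey(name)) return frame.get(name);
        throw new RuntimeException("unbound " + name);
    }

    public static void main(String[] args) {
        Deque<Map<String, String>> stack = new ArrayDeque<>();
        stack.push(Map.of("x", "main's x"));   // program main starts
        stack.push(Map.of("x", "sub2's x"));   // main calls sub2
        stack.push(Map.of());                  // sub2 calls sub1 (declares no x)
        System.out.println(dynamicLookup(stack, "x"));
        // sub2's x  (under static scope the answer would be main's x)
    }
}
```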

79

Binding

• Binding: an association between an attribute and its entity

• Binding Time: when does it happen?

• … and, when can it happen?

80

Binding of Data Objects and Variables

• Attributes of data objects and variables have different binding times

• If a binding is made before run time and remains fixed through execution, it is called static

• If the binding first occurs or can change during execution, it is called dynamic

81

Binding Time

Static
• Language definition time
• Language implementation time
• Program writing time
• Compile time
• Link time
• Load time

Dynamic
• Run time
   – at the start of execution (program)
   – on entry to a subprogram or block
   – when the expression is evaluated
   – when the data is accessed

82

X = X + 10

• Set of types for variable X
• Type of variable X
• Set of possible values for variable X
• Value of variable X
• Scope of X
   – lexical or dynamic scope
• Representation of the constant 10
   – value (10)
   – value representation (1010 in binary)
      • big-endian vs. little-endian
   – type (int)
   – storage (4 bytes)
      • stack or global allocation
• Properties of the operator +
   – overloaded or not

83

Little- vs. Big-Endians

• Big-endian

– A computer architecture in which, within a given multi-byte numeric representation, the most significant byte has the lowest address (the word is stored `big-end-first').

– Motorola and Sun processors

• Little-endian

– a computer architecture in which, within a given 16- or 32-bit word, bytes at lower addresses have lower significance (the word is stored `little-end-first').

– Intel processors

from The Jargon Dictionary - http://info.astrian.net/jargon

84

Binding Times summary

• Language definition time: – language syntax and semantics, scope discipline

• Language implementation time: – interpreter versus compiler, – aspects left flexible in definition, – set of available libraries

• Compile time: – some initial data layout, internal data structures

• Link time (load time): – binding of values to identifiers across program modules

• Run time (execution time): – actual values assigned to non-constant identifiers

The programming language designer and the compiler implementer have to make decisions about binding times.

85

Summary of today’s lecture

• Three topics

– Treating Compilers and Interpreters as black-boxes

• Tombstone- or T- diagrams

– A first look inside the black-box

• Your guided tour

– Some Language Design Issues