Programming Language Concepts Flashcards
What is aliasing?
When 2 variables point to the same memory location
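A minimal Python sketch of aliasing: two names bound to the same object, so a mutation through one is visible through the other.

```python
# Two names bound to the same list object: mutating through one
# alias is visible through the other.
xs = [1, 2, 3]
ys = xs          # ys aliases xs (no copy is made)
ys.append(4)
print(xs)        # [1, 2, 3, 4] — the change is visible via xs
print(xs is ys)  # True — same object identity
```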
Static vs dynamic execution time
Static: before execution - compile time
Dynamic: during execution - runtime
What is the extent of a variable?
The time between allocation & deallocation
What is the scope of a variable?
The part of the code in which the variable can be referenced
Lexical vs dynamic scope
Lexical: scope determined by where the variable is defined in the source text (the nesting of blocks/functions)
Dynamic: scope determined by sequence of function calls
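Python is lexically scoped, which makes the contrast easy to demonstrate; the comments mark what dynamic scope would have done instead.

```python
# Python resolves free variables lexically: where the function is
# DEFINED, not where it is CALLED.
x = "global"

def make_reader():
    x = "lexical"        # binding at the definition site
    def read():
        return x         # resolved lexically
    return read

def caller():
    x = "dynamic"        # under dynamic scope, read() would find this
    return make_reader()()

print(caller())  # "lexical" — the call-site binding is ignored
```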
Syntax vs semantics
Syntax: the structure of statements
Semantics: the meaning and behavior of syntactically correct statements and expressions
Concrete vs abstract syntax tree
Concrete: includes all elements explicitly as they appear in the source (a lot of unnecessary nodes)
Abstract: omits unnecessary syntactic details for simplicity & clarity
How can a grammar be ambiguous?
If a string has more than one valid derivation (different parse trees)
How can ambiguity be removed from a grammar?
Use precedence: higher precedence operators (*) appear lower in the tree (than +)
Use left associativity: 2+3+4 = (2+3)+4, encoded with a left-recursive rule whose right operand is the higher-precedence non-terminal (<expr> ::= <expr> + <mexpr>)
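The layered-grammar idea can be sketched as a tiny recursive-descent evaluator (token handling is simplified; the left recursion is replaced by a loop, which gives left associativity):

```python
# expr ::= expr '+' term | term    (low precedence, left-assoc)
# term ::= term '*' num  | num     (high precedence, left-assoc)
def evaluate(tokens):
    pos = 0
    def peek():
        return tokens[pos] if pos < len(tokens) else None
    def term():
        nonlocal pos
        value = int(tokens[pos]); pos += 1
        while peek() == '*':                 # loop = left recursion
            pos += 1
            value *= int(tokens[pos]); pos += 1
        return value
    def expr():
        nonlocal pos
        value = term()                       # '*' handled one layer down
        while peek() == '+':
            pos += 1
            value += term()
        return value
    return expr()

print(evaluate(['2', '+', '3', '*', '4']))  # 14: '*' binds tighter than '+'
```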
What is a lexeme?
A pattern of characters in the input string that are considered meaningful e.g. “while” is a lexeme identified as the while command token
How are lexemes identified?
With a scanner
e.g. for input “…123,456…” & pattern [1-9]+, the lexemes are “123” & “456”; if the pattern were [1-9]+,[1-9]+, the single lexeme “123,456”
What is maximal munch?
Where the scanner consumes the longest possible sequence of characters that form a valid token
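A small sketch of maximal munch using Python regexes (token names are illustrative). Python's alternation takes the first matching alternative, so listing the longer operator first gives the longest-match behaviour:

```python
import re

# Maximal munch: ">=" is scanned as one token, not ">" then "=".
# The longer alternative is listed first so it wins.
TOKEN = re.compile(r'>=|>|=|[0-9]+')

def scan(text):
    return TOKEN.findall(text)

print(scan("1>=2"))   # ['1', '>=', '2'] — not ['1', '>', '=', '2']
print(scan("123"))    # ['123'] — greedy [0-9]+ takes all three digits
```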
What are lexer generators?
A software tool that:
1) Takes input describing lexemes and the tokens they produce
2) Generates code for you that performs lexical analysis
Top-down vs bottom-up approach to parsing
Top-down: builds parse tree from root down (uses LL parsers)
Bottom-up: builds parse tree from leaves up (uses LR parsers)
What are LL and LR parsers?
LL: reads the input from Left to right and constructs a Leftmost derivation
LR: reads the input from Left to right and constructs a Rightmost derivation in reverse
Left recursive vs right recursive grammars
LRG: A -> Aa | B
RRG: A -> aA | B
What is shift-reduce?
A bottom-up parsing technique where the parser shifts input symbols onto a stack until a rule can be applied to reduce the stack’s top elements to a non-terminal, gradually building up a parse tree from the input
What is a shift-reduce conflict?
When the parser could validly either shift the next input symbol onto the stack (keep reading 1+2+3 after seeing 1+2) or reduce the stack’s top elements (1+2+3 -> 3+3, reducing 1+2)
What 2 ways can shift-reduce conflicts be avoided?
Layering non-terminals: defining separate non-terminal symbols for different levels of operator precedence (higher-level non-terminals defined in terms of lower-level non-terminals)
Specify precedence: use directives such as %left (%left ‘+’ ‘-‘ so “1+2-3” -> (1+2)-3), %right, and %nonassoc (“1>2>3” is disallowed)
What is a parser generator?
A software that, given input describing rules of grammar and what to perform when each rule is used, will generate code for you that performs parsing
Static typing vs dynamic typing
Static: performs type checking at compile time; run time type values are approximated
Dynamic: performs type checking at run time; exact types used
Strong vs weak typing
Strong: when operation invoked, arguments provided are of the correct type
Weak: wrong types may be passed to a function and function chooses how to behave (e.g. coerces data to be required type)
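Python illustrates the strong side of this distinction: a wrong-typed operand raises an error rather than being silently coerced.

```python
# Strong typing: no implicit coercion between str and int, so the
# operation fails loudly instead of guessing.
try:
    result = "1" + 1
except TypeError:
    result = "type error"

print(result)  # "type error"
```

A weakly typed language (e.g. with implicit coercion) might instead produce "11" or 2 here.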
What is type checking?
A phase of compilation in statically typed languages that verifies the program obeys the type rules
Done on the AST of the program
What is the inversion lemma?
Where we infer types of subprograms from the type of the whole program by reading type rules from bottom to top
What are structured types?
Types that make the distinction between data’s shape & size e.g. a pair of 2 ints vs 2 ints in an array
What are the 3 approaches for subtyping?
Types as sets: type T is a subtype of U if the set [[T]] is a subset of [[U]]
Structural subtyping: T is a subtype of U if every operation that can be performed on U can be performed on T
Nominal subtyping: declare types as subtypes of others
Covariance vs contravariance
Covariance: If S is a subtype of T, then C<S> is considered a subtype of C<T> | if T <: U then T x V <: U x V
Contravariance: If S is a subtype of T, then C<T> is considered a subtype of C<S> | if T <: U then Foo U <: Foo T
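Function types show both variances at once: contravariant in the argument, covariant in the result. A sketch in Python (the Animal/Cat names are illustrative):

```python
# A handler that accepts ANY Animal can safely be used where a
# Cat-specific handler is expected: Animal -> str <: Cat -> str,
# because the argument position is contravariant.
class Animal: ...
class Cat(Animal): ...

def describe(a: Animal) -> str:       # accepts every Animal
    return type(a).__name__

def apply_to_cat(handler) -> str:     # expects a (Cat -> str) handler
    return handler(Cat())

print(apply_to_cat(describe))  # "Cat" — the broader handler works fine
```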
What is the subtyping situation of arrays?
When reading arrays, the array type should be covariant; when writing arrays, it should be contravariant
Since arrays support both reads and writes, arrays are INVARIANT
What is the Curry-Howard correspondence?
States that types are logical propositions & programs are constructive proofs
Function types T → U in the lambda calculus correspond to implications T ⇒ U in natural deduction
What are the 3 approaches to formal semantics?
Denotational semantics: programs are mapped to some point in a mathematical structure that represents the values that the program calculates e.g. [[if (0<1) then 0 else 1]] = 0
Operational semantics: define relations between programs and the values they produce e.g. if (0<1) then 0 else 1 -> if (true) then 0 else 1 -> 0
Axiomatic semantics: the meaning of a program is the set of properties you can prove about it in a formal logic (e.g. Hoare logic)
What is big step operational semantics?
Specifies the execution of programs in terms of a relation between the initial program state and its final result, typically expressed as a set of inference rules
What is small step operational semantics?
Breaks down the execution of programs into atomic computational steps, typically expressed as a relation between successive program states
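A minimal small-step evaluator for addition expressions, as a sketch: each call to step performs exactly one atomic reduction, mirroring the successive-states relation.

```python
# An expression is either an int (a value) or a tuple ('+', e1, e2).
def step(e):
    op, left, right = e
    if isinstance(left, int) and isinstance(right, int):
        return left + right                # n1 + n2 -> n (one reduction)
    if isinstance(left, int):
        return (op, left, step(right))     # reduce inside the right operand
    return (op, step(left), right)         # reduce inside the left operand

e = ('+', ('+', 1, 2), 4)
while not isinstance(e, int):              # iterate until a value is reached
    e = step(e)                            # ('+', 3, 4), then 7
print(e)  # 7
```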
What is deterministic reduction?
Where the reduction rules for evaluating expressions or executing program steps are unambiguous and lead to a unique result for a given initial state
What are error relations?
Relations that describe when an evaluation reaches an error state e.g. if (n) then E else E’ ↯ when n is not a boolean
What is preservation?
If a well-typed program takes a step of execution, the resulting program state remains well-typed
if ⊢ E:T and E↓V then ⊢ V:T
What is progress?
Ensures that a well-typed program either takes a step of execution or is already in its final state, guaranteeing that a well-typed program will never get “stuck”
How is type safety achieved?
By repeatedly applying progress then preservation
Signal & wait vs signal & continue
Both paradigms of synchronisation
Signal & Wait:
* for synchronous communication
* a thread signals an event to another thread and cannot be scheduled further until an ACK is received
Signal & Continue:
* for asynchronous communication
* a thread signals an event to another thread and continues execution immediately after signalling
What do each of these CCS (Calculus of Communication Systems) operations do: nil, a!P, a?P, τP, P||P, P\a, rec X
- nil - termination
- a!P - sends signal on channel a and continues execution as P
- a?P - receives signal on channel a and continues execution as P
- τP - internal computation step
- P||P - represents 2 concurrent processes
- P\a - no communication on channel a originating inside of P can be seen
- rec X - recursive behaviour e.g. rec X . a!b?X repeatedly sends a signal on a and receives a signal on b
What are the 2 types of nondeterminism?
Internal: the machine chooses
External: the environment chooses e.g. vending machine & user can be thought of as a concurrent system
What is a trace?
A sequence of observations from some state (including ε)
What is trace equivalence?
When 2 states have the same set of all possible traces
What does it mean (in an LTS) for state y to simulate state x?
When x does an action to become x’, y can do the same action, becoming some y’ that again simulates x’
What is similarity?
For an LTS L there is a largest simulation on L
Denoted ≤
What is simulation equivalence?
States x & y are simulation equivalent if x≤y and y≤x (each system can simulate the behaviour of the other)
Also means x and y are trace equivalent
What is bisimulation?
A simulation that goes both ways - the relation & its reverse are both simulations
Denoted x~y if there’s a bisimulation containing (x,y)
What is bisimilarity?
The largest bisimulation on an LTS
Denoted ~
What is a context switch?
The operation of switching control from one thread to another
What is a race condition and its 2 necessities?
The situation in which 2+ threads access shared memory concurrently, at least one of them writing, so the outcome depends on scheduling
Aliasing: the same location in shared memory must be accessible from multiple threads
Mutability: data on the heap can be altered
Fine-grained vs coarse-grained concurrency
Fine-grained: systems with many locks each protecting a small block of code, or lock-free approaches using CAS
Coarse-grained: systems with fewer locks over large blocks of code; protected by mutex lock
Why must a balance between fine-grained & coarse-grained concurrency be found?
Too specific (fine) -> overhead increases (computation cycles for waiting, acquiring, releasing locks)
Too general (coarse) -> contention increases (threads serialise behind large critical sections, losing parallelism)
What is intrinsic locking?
1) When a thread wants to access shared resource, it calls lock() on mutex object
2) If mutex isn’t currently locked, it grants access
3) Once thread is done, it calls unlock() on the mutex object
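The three steps above can be sketched with Python's mutex, where `with lock` performs the lock()/unlock() calls on entry and exit:

```python
import threading

lock = threading.Lock()
counter = 0

def worker():
    global counter
    for _ in range(10_000):
        with lock:        # step 1/2: lock(), blocking if already held
            counter += 1  # critical section on shared state
                          # step 3: unlock() when the block exits

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000 — no lost updates
```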
What are the 3 advantages of message passing concurrency?
- Don’t have to worry about critical regions & mutex locks
- Less risk of deadlock as not necessarily holding locks
- Avoid problems with cache visibility as we know which thread is the intended recipient of the message
What are the 2 disadvantages of message passing concurrency?
- Slow
- Hard to ensure global consensus
Coroutines vs threads
Both provide concurrency; only threads can run in parallel
Coroutines designed for cooperative multi-tasking style; threads designed for pre-emptive multi-tasking
How do futures & promises work?
1) Construct a promise object
2) Get the (invalid but usable) future object from the promise
3) Move the promise to another thread/function
4) When the function completes, a return value/exception is placed in the promise and the future becomes available
5) Call get() on the future object to retrieve the data
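The five steps can be sketched with Python's `concurrent.futures.Future`, whose `set_result()` plays the promise role and `result()` plays the role of get() (Python fuses promise and future into one object, so this is an approximation):

```python
import threading
from concurrent.futures import Future

future = Future()                 # steps 1-2: created, not yet resolved

def producer(f):
    f.set_result(42)              # step 4: fulfil the promise side

# step 3: hand the promise end to another thread
threading.Thread(target=producer, args=(future,)).start()

print(future.result())            # step 5: blocks until ready, then 42
```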
What 2 ways can performance be measured in a concurrent system?
Throughput: tasks per unit time
Responsiveness: time until first response from a task
What is Amdahl’s Law?
The overall performance improvement gained by optimizing a single part of a system is limited by the fraction of time that the improved part is actually used
Amdahl’s Law equation
Let F be the fraction of the task that can be parallelised
Let P be the no. processors available
speedup <= 1 / ((1-F) + F/P)
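Plugging numbers into the bound makes the limit concrete:

```python
# Amdahl's Law: F = parallelisable fraction, P = number of processors.
def speedup(F, P):
    return 1 / ((1 - F) + F / P)

# 95% parallel work: 10 processors give well under 10x...
print(round(speedup(0.95, 10), 2))   # ~6.9

# ...and even infinitely many processors cap out at 1/(1-F) = 20x.
print(1 / (1 - 0.95))                # 20.0
```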
What is CAS (Compare-And-Set)?
An atomic operation
Has semantics: CAS (V,A,B)
* V - the memory reference
* A - the old value stored in reference
* B - the new value to store in reference
If V contains A, B is written to V and CAS returns true
If V doesn’t contain A, nothing is written and CAS returns false
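Real CAS is a single hardware instruction; this Python sketch only simulates its semantics, using an internal lock to make the compare and the write one atomic action:

```python
import threading

class AtomicCell:
    """Simulated CAS cell: compare-and-set happens atomically."""
    def __init__(self, value):
        self._value = value
        self._lock = threading.Lock()

    def cas(self, expected, new):
        with self._lock:              # compare + write as one step
            if self._value == expected:
                self._value = new
                return True           # value matched, new value stored
            return False              # mismatch, nothing written

cell = AtomicCell(5)
print(cell.cas(5, 9))   # True  — 5 was stored, 9 written
print(cell.cas(5, 7))   # False — value is now 9, nothing written
```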
What is exponential backoff?
Where the duration threads wait before retrying doubles after each failed attempt
This reduces contention
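A sketch of the doubling schedule (in practice the actual sleep is usually a random value up to each bound, and the bound is capped):

```python
# Exponential backoff schedule: the wait bound doubles after every
# failed attempt, up to a cap.
def backoff_schedule(attempts, base=1, cap=64):
    waits, wait = [], base
    for _ in range(attempts):
        waits.append(wait)        # bound for this retry's wait
        wait = min(wait * 2, cap) # double it, but never past the cap
    return waits

print(backoff_schedule(5))  # [1, 2, 4, 8, 16]
```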
What are elimination arrays?
If one thread is trying to push a value onto a stack while another is trying to pop, instead of putting the value on the stack, hand it directly to the popping thread
Reduces contention at the stack top