Programming Language Concepts Flashcards

1
Q

What is aliasing?

A

When 2 variables point to the same memory location
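
A minimal C++ sketch of aliasing (variable names are illustrative): two pointers refer to the same int, so a write through one is visible through the other.

  #include <iostream>

  int main() {
      int x = 1;
      int* p = &x;   // p aliases x
      int* q = &x;   // q aliases x as well
      *p = 42;       // write through one alias...
      std::cout << *q << "\n";   // ...is visible through the other: prints 42
  }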

2
Q

Static vs dynamic execution time

A

Static: before execution - compile time
Dynamic: during execution - runtime

3
Q

What is the extent of a variable?

A

The time between allocation & deallocation

4
Q

What is the scope of a variable?

A

The part of the code in which the variable can be referenced

5
Q

Lexical vs dynamic scope

A

Lexical: scope determined by where the variable is declared in the source code
Dynamic: scope determined by sequence of function calls

6
Q

Syntax vs semantics

A

Syntax: the structure of statements
Semantics: the meaning and behavior of syntactically correct statements and expressions

7
Q

Concrete vs abstract syntax tree

A

Concrete: includes all elements explicitly as they appear in the source (a lot of unnecessary nodes)
Abstract: omits unnecessary syntactic details for simplicity & clarity

8
Q

How can a grammar be ambiguous?

A

If a string has more than one valid derivation (different parse trees)

9
Q

How can ambiguity be removed from a grammar?

A

Use precedence: higher-precedence operators (*) appear lower in the tree (than +)
Use associativity: 2+3+4 = (2+3)+4 (left-associative), enforced by making the rule left-recursive (<expr> ::= <expr> + <mexpr>)
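
For example, a small unambiguous grammar built this way (using the <expr>/<mexpr> names from the rule above, plus an assumed <atom> level):

  <expr>  ::= <expr> + <mexpr> | <mexpr>
  <mexpr> ::= <mexpr> * <atom> | <atom>
  <atom>  ::= number | ( <expr> )

Here * ends up lower in the tree than +, and both operators are left-associative because their rules are left-recursive.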

10
Q

What is a lexeme?

A

A pattern of characters in the input string that are considered meaningful e.g. “while” is a lexeme identified as the while command token

11
Q

How are lexemes identified?

A

With a scanner
e.g. for input “…123,456…” and pattern [1-9]+, the lexemes are “123” and “456”; if the pattern were [1-9]+,[1-9]+, the single lexeme would be “123,456”
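
A small C++ sketch of this behaviour (illustrative, not a real scanner) using std::regex to pull out every lexeme matching [1-9]+:

  #include <iostream>
  #include <regex>
  #include <string>

  int main() {
      std::string input = "...123,456...";
      std::regex pattern("[1-9]+");
      auto begin = std::sregex_iterator(input.begin(), input.end(), pattern);
      auto end = std::sregex_iterator();
      for (auto it = begin; it != end; ++it)
          std::cout << it->str() << "\n";   // prints 123 then 456
  }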

12
Q

What is maximal munch?

A

Where the scanner consumes the longest possible sequence of characters that form a valid token
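e.g. for the input “>=”, a maximal-munch scanner emits a single >= token rather than > followed by =, and “123” is one integer token rather than three digit tokens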

13
Q

What are lexer generators?

A

A software tool that:
1) Takes input describing the lexemes and what tokens they produce
2) Generates code for you that performs lexical analysis

14
Q

Top-down vs bottom-up approach to parsing

A

Top-down: builds parse tree from root down (uses LL parsers)
Bottom-up: builds parse tree from leaves up (uses LR parsers)

15
Q

What are LL and LR parsers?

A

LL: reads the input from left to right and produces a leftmost derivation
LR: reads the input from left to right and produces a rightmost derivation (in reverse)

16
Q

Left recursive vs right recursive grammars

A

LRG: A -> Aa | B
RRG: A -> aA | B

17
Q

What is shift-reduce?

A

A bottom-up parsing technique where the parser shifts input symbols onto a stack until a rule can be applied to reduce the stack’s top elements to a non-terminal, gradually building up a parse tree from the input

18
Q

What is a shift-reduce conflict?

A

When the parser can either shift the next input symbol (keep reading 1+2+3 as-is) or reduce what it has seen so far (1+2+3 -> 3+3), and the grammar doesn’t determine which

19
Q

What 2 ways can shift-reduce conflicts be avoided?

A

Layering non-terminals: defining separate non-terminal symbols for different levels of operator precedence (higher-level non-terminals defined in terms of lower-level non-terminals)
Specify precedence: use directives such as %left (%left ‘+’ ‘-‘ so “1+2-3” -> (1+2)-3), %right, and %nonassoc (“1>2>3” is disallowed)

20
Q

What is a parser generator?

A

A software that, given input describing rules of grammar and what to perform when each rule is used, will generate code for you that performs parsing

21
Q

Static typing vs dynamic typing

A

Static: performs type checking at compile time; run time type values are approximated
Dynamic: performs type checking at run time; exact types used

22
Q

Strong vs weak typing

A

Strong: when operation invoked, arguments provided are of the correct type
Weak: wrong types may be passed to a function and function chooses how to behave (e.g. coerces data to be required type)

23
Q

What is type checking?

A

A phase of compilation for statically typed languages
Performed on the AST of the program
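
A minimal sketch of the idea in C++ (a hypothetical AST with int/bool literals and addition; not the course’s checker): the checker walks the tree, computing a type for each node and rejecting ill-typed children.

  #include <memory>
  #include <stdexcept>

  enum class Type { Int, Bool };

  struct Expr {
      virtual ~Expr() = default;
      virtual Type check() const = 0;   // type checking is a traversal of the AST
  };

  struct IntLit : Expr { Type check() const override { return Type::Int; } };
  struct BoolLit : Expr { Type check() const override { return Type::Bool; } };

  struct Add : Expr {
      std::unique_ptr<Expr> lhs, rhs;
      Type check() const override {
          if (lhs->check() != Type::Int || rhs->check() != Type::Int)
              throw std::runtime_error("'+' expects two ints");
          return Type::Int;
      }
  };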

24
Q

What is the inversion lemma?

A

Where we infer types of subprograms from the type of the whole program by reading type rules from bottom to top

25
Q

What are structured types?

A

Types that distinguish data by its shape as well as its size e.g. a pair of 2 ints vs 2 ints in an array

26
Q

What are the 3 approaches for subtyping?

A

Types as sets: type T is a subtype of U if the set [[T]] is a subset of [[U]]
Structural subtyping: T is a subtype of U if every operation that can be performed on U can be performed on T
Nominal subtyping: declare types as subtypes of others

27
Q

Covariance vs contravariance

A

Covariance: If S is a subtype of T, then C<S> is considered a subtype of C<T> | if T <: U then T x V <: U x V
Contravariance: If S is a subtype of T, then C<T> is considered a subtype of C<S> | if T <: U then Foo U <: Foo T
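
Two places the idea surfaces in C++ (a sketch; class names are hypothetical): covariant return types on virtual overrides, and contravariant acceptance of callable parameters.

  #include <functional>

  struct Animal { virtual ~Animal() = default; };
  struct Cat : Animal {};

  struct AnimalShelter {
      virtual Animal* adopt() { return nullptr; }
  };
  struct CatShelter : AnimalShelter {
      Cat* adopt() override { return nullptr; }   // covariant return: Cat* where Animal* was promised
  };

  void feedAnyAnimal(Animal&) {}

  int main() {
      // Contravariance: a handler for any Animal can stand in where a Cat handler is expected
      std::function<void(Cat&)> feedCat = feedAnyAnimal;
      Cat c;
      feedCat(c);
  }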

28
Q

What is the subtyping situation of arrays?

A

Non-structural
When reading arrays, array type should be covariant; when writing arrays, array type should be contravariant
Arrays are INVARIANT

29
Q

What is the Curry-Howard correspondence?

A

The idea that types are logical propositions & programs are constructive proofs
Function types T → U in the lambda calculus correspond to implications T => U in natural deduction

30
Q

What are the 3 approaches to formal semantics?

A

Denotational semantics: programs are mapped to some point in a mathematical structure that represents the values that the program calculates e.g. [[if (0<1) then 0 else 1]] = 0

Operational semantics: define relations between programs and the values they produce e.g. if (0<1) then 0 else 1 -> if (true) then 0 else 1 -> 0

Axiomatic semantics: the meaning of a program is the properties you can prove of it using a formal logic

31
Q

What is big step operational semantics?

A

Specifies the execution of programs in terms of a relation between the initial program state and its final result, typically expressed as a set of inference rules

32
Q

What is small step operational semantics?

A

Breaks down the execution of programs into atomic computational steps, typically expressed as a relation between successive program states

33
Q

What is deterministic reduction?

A

Where the reduction rules for evaluating expressions or executing program steps are unambiguous and lead to a unique result for a given initial state

34
Q

What are error relations?

A

Relations that describe when an evaluation reaches an error state e.g. if (n) then E else E’↯

35
Q

What is preservation?

A

If a well-typed program takes a step of execution, the resulting program state remains well-typed
if ⊢ E:T and E↓V then ⊢ V:T

36
Q

What is progress?

A

Ensures that a well-typed program either takes a step of execution or is already in its final state, guaranteeing that a well-typed program will never get “stuck”

37
Q

How is type safety achieved?

A

By repeatedly applying progress then preservation

38
Q

Signal & wait vs signal & continue

A

Both paradigms of synchronisation

Signal & Wait:
* for synchronous communication
* a thread signals an event to another thread and cannot be scheduled further until an ACK is received

Signal & Continue:
* for asynchronous communication
* a thread signals an event to another thread and continues execution immediately after signalling

39
Q

What do each of these CCS (Calculus of Communication Systems) operations do: nil, a!P, a?P, τP, P||P, P\a, rec X

A
  • nil - termination
  • a!P - sends signal on channel a and continues execution as P
  • a?P - receives signal on channel a and continues execution as P
  • τP - internal computation step
  • P||P - represents 2 concurrent processes
  • P\a - no communication on channel a originating inside of P can be seen
  • rec X - recursive behaviour e.g. rec X . a!b?X repeatedly sends a signal on a and receives a signal on b
40
Q

What are the 2 types of nondeterminism?

A

Internal: the machine chooses
External: the environment chooses e.g. vending machine & user can be thought of as a concurrent system

41
Q

What is a trace?

A

A sequence of observations from some state (including ε)

42
Q

What is trace equivalence?

A

When 2 states have the same set of all possible traces

43
Q

What does it mean (in an LTS) for state y to simulate state x?

A

When x does an action and becomes x’, y can do the same action becoming y’ in the process
y’ can do everything x’ can do

44
Q

What is similarity?

A

For an LTS L, similarity is the largest simulation on L (such a largest simulation always exists)
Denoted ≤

45
Q

What is simulation equivalence?

A

States x & y are simulation equivalent if x≤y and y≤x (each system can simulate the behaviour of the other)
Also means x and y are trace equivalent

46
Q

What is bisimulation?

A

A simulation that goes both ways - the relation & its reverse are both simulations
Denoted x~y if there’s a bisimulation containing (x,y)

47
Q

What is bisimilarity?

A

The largest bisimulation on an LTS
Denoted ~

48
Q

What is context switch?

A

The operation of switching control from one thread to another

49
Q

What is a race condition and its 2 necessities?

A

The situation in which 2+ threads simultaneously access shared memory and at least one of them writes

Aliasing: the same location in shared memory must be accessible from multiple threads
Mutability: data on the heap can be altered
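
A classic C++ sketch of the problem (illustrative): two threads increment a shared counter with no synchronisation, so the final value is unpredictable.

  #include <iostream>
  #include <thread>

  int counter = 0;   // shared, mutable, aliased by both threads

  void work() {
      for (int i = 0; i < 100000; ++i)
          ++counter;   // read-modify-write is not atomic, so updates can be lost
  }

  int main() {
      std::thread t1(work), t2(work);
      t1.join();
      t2.join();
      std::cout << counter << "\n";   // frequently less than 200000
  }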

50
Q

Fine-grained vs coarse-grained concurrency

A

Fine-grained: systems with more locking of small blocks of code; usually protected by lock-free approaches using CAS
Coarse-grained: systems with fewer locks over large blocks of code; protected by mutex lock

51
Q

Why must a balance between fine-grained & coarse-grained concurrency be found?

A

Too specific (fine) -> overhead increases (computation cycles for waiting, acquiring, releasing locks)
Too general (coarse) -> deadlock/contention risk increases

52
Q

What is intrinsic locking?

A

1) When a thread wants to access shared resource, it calls lock() on mutex object
2) If mutex isn’t currently locked, it grants access
3) Once thread is done, it calls unlock() on the mutex object
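
A minimal C++ sketch of these steps using std::mutex (lock()/unlock() written out to mirror the steps; in practice std::lock_guard is preferred):

  #include <mutex>

  int shared_value = 0;
  std::mutex m;

  void increment() {
      m.lock();         // 1) request access; blocks while another thread holds the mutex
      ++shared_value;   // 2) access granted: use the shared resource
      m.unlock();       // 3) done: release the mutex
  }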

53
Q

What are the 3 advantages of message passing concurrency?

A
  • Don’t have to worry about critical regions & mutex locks
  • Less risk of deadlock as not necessarily holding locks
  • Avoid problems with cache visibility as we know which thread is the intended recipient of the message
54
Q

What are the 2 disadvantages of message passing concurrency?

A
  • Slow
  • Hard to ensure global consensus
55
Q

Coroutines vs threads

A

Both are concurrent; only threads are parallel
Coroutines designed for cooperative multi-tasking style; threads designed for pre-emptive multi-tasking

56
Q

How do futures & promises work?

A

1) Construct a promise object
2) Get the (not-yet-ready) future object from the promise
3) Move the promise to another thread/function
4) When the function completes, a return value/exception is placed in the promise and the future becomes available
5) Call get() on the future object to retrieve the data
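
These steps map directly onto C++’s std::promise/std::future (a sketch):

  #include <future>
  #include <iostream>
  #include <thread>

  int main() {
      std::promise<int> p;                  // 1) construct a promise
      std::future<int> f = p.get_future();  // 2) get the future from the promise
      std::thread worker(                   // 3) move the promise to another thread
          [](std::promise<int> pr) { pr.set_value(42); },   // 4) result placed in the promise
          std::move(p));
      std::cout << f.get() << "\n";         // 5) get() retrieves the data (blocks until ready)
      worker.join();
  }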

57
Q

What 2 ways can performance be measured in a concurrent system?

A

Throughput: tasks per unit time
Responsiveness: time until first response from a task

58
Q

What is Amdahl’s Law?

A

The overall performance improvement gained by optimizing a single part of a system is limited by the fraction of time that the improved part is actually used

59
Q

Amdahl’s Law equation

A

Let F be the fraction of the task that can be parallelised
Let P be the no. processors available
speedup <= 1 / ((1-F) + F/P)
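
Worked example: if F = 0.9 (90% of the task parallelises) and P = 4, then speedup <= 1 / ((1-0.9) + 0.9/4) = 1 / 0.325 ≈ 3.08, well short of the ideal 4x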

60
Q

What is CAS (Compare-And-Set)

A

An atomic operation
Has semantics: CAS (V,A,B)
* V - the memory reference
* A - the old value stored in reference
* B - the new value to store in reference
If V contains A, B is written to V and returns true
If V doesn’t contain A, nothing is written to V and returns false
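
In C++ these semantics are available through std::atomic’s compare_exchange operations; a sketch of a lock-free increment built on CAS:

  #include <atomic>

  std::atomic<int> v{0};

  void increment() {
      int old_value = v.load();
      // CAS(v, old_value, old_value + 1): retry until no other thread interfered
      while (!v.compare_exchange_strong(old_value, old_value + 1)) {
          // on failure, compare_exchange_strong reloads old_value with v's current value
      }
  }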

61
Q

What is exponential backoff?

A

Where the duration a thread waits before retrying doubles after each failed access
This reduces contention
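
A sketch of the pattern in C++ (the flag and durations are illustrative):

  #include <atomic>
  #include <chrono>
  #include <thread>

  std::atomic<bool> busy{false};

  void acquire_with_backoff() {
      auto delay = std::chrono::microseconds(1);
      bool expected = false;
      // try to claim the flag; after each failure, wait twice as long before retrying
      while (!busy.compare_exchange_strong(expected, true)) {
          std::this_thread::sleep_for(delay);
          delay *= 2;         // exponential backoff reduces contention
          expected = false;   // compare_exchange overwrote expected with the current value
      }
  }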

62
Q

What are elimination arrays?

A

If a thread is trying to push a value onto a stack and another thread is trying to pop a value from the stack, instead of putting the value on the stack, hand the value directly to the thread that’s trying to pop
Reduces contention at the stack top