ECM2418 Computer Languages and Representation Flashcards
Haskell inbuilt functions to use
filter
foldr
map
even
odd
(++)
sel
not
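A minimal sketch of these in use (the binding names are illustrative, not from the course):

squares = map (^2) [1,2,3,4]        -- [1,4,9,16]
evens   = filter even [1,2,3,4]     -- [2,4]
total   = foldr (+) 0 [1,2,3,4]     -- 10
joined  = [1,2] ++ [3,4]            -- [1,2,3,4]
flipped = not (odd 3)               -- False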
polymorphic type of a predicate (conditional function) such as even or odd in haskell
(a -> Bool)
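For example (isSmall is an invented name), any function of this shape can be passed to filter:

isSmall :: Integer -> Bool
isSmall n = n < 10
-- filter isSmall [1, 20, 3]  evaluates to [1, 3]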
polymorphism syntax haskell
function :: a -> ... (lowercase type variables such as a stand for any type)
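A small sketch of polymorphic signatures (identity and swapPair are invented names):

identity :: a -> a             -- a stands for any type
identity x = x

swapPair :: (a, b) -> (b, a)   -- two independent type variables
swapPair (x, y) = (y, x)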
list split haskell
(x:xs)
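A minimal example using the pattern (mysum is an invented name):

mysum :: [Int] -> Int
mysum []     = 0             -- base case: the empty list
mysum (x:xs) = x + mysum xs  -- x is the head, xs is the rest of the list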
list split Prolog
[A|AS]
Prolog predicates to remember
reverse
between
findall
permutation
(EXAMPLE) what’s stopping this predicate from working
alpha(XS).
it’s missing base cases:
alpha([]).
alpha([_]).
Functional programming
- Programs are represented by expressions
- Computation is implemented by reduction
- The foundation is the lambda calculus
- Functions are akin to mathematical functions
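A tiny illustration of computation by reduction, assuming an invented definition double; one possible reduction sequence is shown in the comments:

double :: Int -> Int
double n = n + n

-- double (3 + 4)
--   = (3 + 4) + (3 + 4)   -- unfold the definition of double
--   = 7 + 7               -- reduce the addition
--   = 14                  -- the expression has reached a value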
Logic Programming
- Programs are represented by clauses
- Computation is represented by proof
- The foundation is first-order logic
Finite State Machines
Simple idea occurring throughout computer science
- Programs are represented by state machines
- Computation is represented by transitions
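A minimal Haskell sketch of a state machine (the turnstile example and all names are invented, not from the course):

data State = Locked | Unlocked deriving (Show, Eq)
data Input = Coin | Push       deriving (Show, Eq)

-- the transition function: current state and input determine the next state
step :: State -> Input -> State
step Locked   Coin = Unlocked
step Locked   Push = Locked
step Unlocked Push = Locked
step Unlocked Coin = Unlocked

-- running the machine is folding the transition function over a list of inputs
run :: State -> [Input] -> State
run = foldl step
-- run Locked [Coin, Push, Push]  evaluates to Locked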
tuples in haskell
(a1,a2,a3)
can have different types in one tuple
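For example (illustrative values):

person :: (String, Int, Bool)
person = ("Ada", 36, True)   -- three components of three different types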
Recursive definitions in haskell
A recursive definition calls itself on a smaller input and contains a base case that halts the recursion so the function can return a value
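A small sketch with the base case marked (factorial is an illustrative example, not from the course):

factorial :: Integer -> Integer
factorial 0 = 1                       -- base case: halts the recursion
factorial n = n * factorial (n - 1)   -- recursive case: calls itself on a smaller value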
Guards in Haskell
|
a boolean expression that must be true for the equation to apply
acts similarly to if and else
guard keywords
where
otherwise
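A brief sketch combining guards, otherwise and where (grade is an invented example):

grade :: Int -> String
grade mark
  | mark >= pass = "pass"    -- the guard applies only when the Boolean is True
  | otherwise    = "fail"    -- otherwise is a catch-all guard (it is just True)
  where
    pass = 40                -- where introduces local definitions used by the guards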
generate-and-select design pattern
the generator constructs a large number of values that might be solutions to a problem
the selector keeps only those generated values that actually are solutions to the problem, filtering out the rest
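A minimal Haskell sketch of generate-and-select (all names are invented): the generator enumerates candidate triples and the selector keeps the Pythagorean ones.

-- generator: every candidate triple with components up to n
candidates :: Int -> [(Int, Int, Int)]
candidates n = [ (a, b, c) | a <- [1..n], b <- [a..n], c <- [b..n] ]

-- selector: keeps only the triples that really are solutions
isSolution :: (Int, Int, Int) -> Bool
isSolution (a, b, c) = a*a + b*b == c*c

pythagorean :: Int -> [(Int, Int, Int)]
pythagorean n = filter isSolution (candidates n)
-- pythagorean 15  evaluates to [(3,4,5),(5,12,13),(6,8,10),(9,12,15)]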
explanations of the productivity paradox
1) Uneven/concentrated gains
2) Implementation lags
3) Mismeasurement
uneven/concentrated gains
One explanation of the productivity paradox is an uneven/concentrated distribution of gains - the gains sit in a few productive firms and sectors with limited weight in the overall economy
Implementation lags
Another explanation of the productivity paradox is implementation lags - it takes considerable time for new technologies to achieve critical mass, or for the necessary complementary technologies to appear. There are gains, but they take a long time to arrive
Mismeasurement
adopting new technologies can lead workers to move from more productive adopting sectors to less productive ones, and so to negligible aggregate productivity growth
Behind the scenes, Artificial Neural Networks (ANNs) work as follows:
1) The core ANN is trained on text from the Internet to respond to a prompt with a list of the most probable next words after the prompt
2) The core ANN is tweaked by scoring its responses to sample queries. A second ANN is trained on these scores to predict the score most likely to be assigned to a response
3) The second ANN is used in reinforcement learning to adjust the weights in the core ANN so that its outputs are even more likely to satisfy humans
4) Sometimes, data from user reactions to LLM responses is fed back to fine-tune the model for still better results
Among the concerns when using Large Language Models are:
- copyright
- education
- code quality
- code security
- not an expert
Copyright and LLMs
although the code produced is a melding of code from many sources, there may still be some licensing problems
Education and LLMs
good solutions to many assignments can be generated directly, reducing their challenge and worth
code quality and LLMs
There is no guarantee of code quality when using a large language model, and a code review will need to be conducted before code is deployed
Code security and LLMs
There is no guarantee of code security when using an LLM, and a security audit will need to be conducted before code is deployed
LLMs are not experts
an LLM does not know what it does not know, and will confabulate if necessary, mixing high-quality text with low-quality rubbish
List comprehensions
[ expression | generator, condition ]
used to test and transform a list
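For example (squaresOfEvens is an invented name):

squaresOfEvens :: [Int]
squaresOfEvens = [ x * x | x <- [1..10], even x ]   -- [4,16,36,64,100]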
conditionals in haskell
case _ of _ ->
or
if _ then _ else _
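A small sketch of both forms (describe and describe' are invented names):

describe :: Int -> String
describe n = if n < 0 then "negative" else "non-negative"

describe' :: Int -> String
describe' n = case compare n 0 of
  LT -> "negative"
  _  -> "non-negative"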
code copying is
error-prone
inefficient