Week 2 Flashcards
Explain function approximation
Finding a pattern from examples so we can make good guesses for new data. For example, after learning that bigger houses tend to cost more, we can create a rule to predict prices of houses we haven't seen yet.
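A minimal sketch of that idea, with invented house sizes and prices: fit a straight line to the examples, then use it to guess the price of an unseen house.

```python
# Function approximation in miniature: learn a rule (a line) from examples,
# then predict for new inputs. All numbers are made up for illustration.

sizes = [50, 80, 100, 120, 150]      # square metres
prices = [150, 210, 260, 300, 360]   # thousands

n = len(sizes)
mean_x = sum(sizes) / n
mean_y = sum(prices) / n

# Least-squares slope and intercept for price ≈ a * size + b
a = sum((x - mean_x) * (y - mean_y) for x, y in zip(sizes, prices)) / \
    sum((x - mean_x) ** 2 for x in sizes)
b = mean_y - a * mean_x

print(f"Predicted price for a 110 m² house: {a * 110 + b:.0f}k")
```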
What is a “feature” in a decision tree?
Something we know about the options or examples.
When choosing a restaurant, for example, the features might be the type of food, the price category, etc.
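A toy sketch (restaurant names, features, and thresholds are all invented): each example is described by its features, and each question in a decision tree looks at one feature at a time.

```python
# One example (a restaurant) described by its features
restaurant = {"food_type": "italian", "price_category": "cheap", "distance_km": 1.2}

def choose(r):
    # A tiny hand-written decision tree: every branch tests one feature
    if r["price_category"] == "cheap":
        if r["distance_km"] < 2:
            return "go"
        return "maybe"
    return "skip"

print(choose(restaurant))  # -> "go"
```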
What are weights in AI neural networks, and how are these weights determined?
Weights determine how strongly each input influences a neuron's output; they are adjusted during supervised learning, so the network gradually learns to recognize patterns.
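A minimal sketch of what a weight does inside one artificial neuron: each input is multiplied by its weight, the results are summed, and the sum goes through an activation function. The input and weight values here are invented.

```python
import math

inputs = [0.5, 0.8, 0.1]
weights = [0.9, -0.4, 0.2]   # these are what training adjusts
bias = 0.1

z = sum(w * x for w, x in zip(weights, inputs)) + bias
output = 1 / (1 + math.exp(-z))   # sigmoid activation
print(output)
```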
Explain backpropagation in artificial neural networks
Process to improve accuracy: if the network’s guess is wrong, we adjust the weights backward, layer by layer, to reduce future mistakes.
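A bare-bones sketch of that adjustment on a tiny network with one hidden neuron and one output neuron (sigmoid activations, squared error). The input, target, weights, and learning rate are arbitrary choices for illustration; a real network has many neurons per layer.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

x, target = 0.5, 1.0     # one training example
w1, w2 = 0.4, -0.6       # initial weights
lr = 0.5                 # learning rate

for step in range(100):
    # Forward pass: input -> hidden -> output (the network's "guess")
    h = sigmoid(w1 * x)
    y = sigmoid(w2 * h)

    # Backward pass: push the error back through the layers (chain rule)
    d_y = (y - target) * y * (1 - y)   # error signal at the output neuron
    d_h = d_y * w2 * h * (1 - h)       # error signal at the hidden neuron

    # Adjust each weight a small step in the direction that reduces the error
    w2 -= lr * d_y * h
    w1 -= lr * d_h * x

print(round(sigmoid(w2 * sigmoid(w1 * x)), 3))  # prediction moves toward 1.0
```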
Why are AI neural networks effective?
1) They can understand complex patterns by organizing information in a hierarchy
2) They can handle huge amounts of data
3) They can sometimes outperform experts (e.g., diagnosing skin conditions)
Explain universal approximation theorem
With the right structure and enough neurons, a neural network can learn almost any continuous function (pattern).
However, finding the right setup (topology) and training it correctly is key to making it work well.
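A rough numerical illustration of that idea: a single hidden layer of tanh units, trained with plain gradient descent, learns to approximate sin(x) on a small interval. The layer size, learning rate, and iteration count are arbitrary choices for this sketch; more neurons and more training generally give a closer fit.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X)                      # the "unknown" function to approximate

hidden = 30
W1 = rng.normal(0, 1, (1, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(0, 1, (hidden, 1))
b2 = np.zeros(1)
lr = 0.1

for _ in range(5000):
    # Forward pass through the single hidden layer
    H = np.tanh(X @ W1 + b1)
    pred = H @ W2 + b2

    # Gradient descent on the mean squared error
    err = pred - y
    dW2 = H.T @ err / len(X)
    db2 = err.mean(axis=0)
    dH = err @ W2.T * (1 - H ** 2)
    dW1 = X.T @ dH / len(X)
    db1 = dH.mean(axis=0)

    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print("mean absolute error:", float(np.abs(pred - y).mean()))
```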
How do you make predictions with k-NN for classification and regression?
1) Classification: take the most common label among the k nearest neighbors (= plurality vote), e.g. 3 votes for dog, 4 for cat → it is a cat
2) Regression: take the mean of the closest neighbors' values (= arithmetic mean), e.g. 200, 250 and 300 → (200 + 250 + 300) / 3 = 250 (see the sketch after this list)
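A minimal k-NN sketch on invented one-dimensional data (k = 3): classification takes the plurality vote of the nearest labels, regression takes the arithmetic mean of the nearest values.

```python
def knn(train, query, k=3):
    # Return the labels/values of the k training points closest to the query
    neighbours = sorted(train, key=lambda item: abs(item[0] - query))[:k]
    return [label for _, label in neighbours]

# Classification: plurality vote among the 3 nearest labels
animals = [(1.0, "dog"), (2.0, "cat"), (2.5, "cat"), (5.0, "dog"), (2.2, "cat")]
votes = knn(animals, query=2.1)
print(max(set(votes), key=votes.count))   # -> "cat"

# Regression: arithmetic mean of the 3 nearest values
prices = [(1.0, 200), (2.0, 250), (2.5, 300), (9.0, 900)]
values = knn(prices, query=2.1)
print(sum(values) / len(values))          # -> (200 + 250 + 300) / 3 = 250.0
```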
What are three goals of unsupervised learning?
1) Understand underlying structure
2) Reduce the dimensionality of a data set (describe, summarize, simplify it using less data)
3) Organize similar data into groups or clusters (see the clustering sketch after this list)
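A toy sketch of the third goal: a hand-rolled k-means run on invented one-dimensional points, splitting them into 2 clusters without any labels.

```python
points = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]
centres = [points[0], points[3]]   # arbitrary starting centres

for _ in range(10):
    # Assign each point to its nearest centre
    clusters = [[], []]
    for p in points:
        nearest = min(range(2), key=lambda i: abs(p - centres[i]))
        clusters[nearest].append(p)
    # Move each centre to the mean of its assigned points
    centres = [sum(c) / len(c) for c in clusters]

print(clusters)   # -> [[1.0, 1.2, 0.8], [8.0, 8.3, 7.9]]
print(centres)    # -> [1.0, 8.07] (roughly)
```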