AI Flashcards

1
Q

What is Artificial Intelligence?

A

The creation of intelligent machines that work and react like humans.

2
Q

What is Data Science?

A

The extraction of insights and knowledge from data using techniques from mathematics and computer science (for example, AI).

3
Q

What is Data Mining?

A

Data Mining is the process of discovering patterns, correlations, and anomalies within large sets of data to predict outcomes using statistical and computational techniques.

4
Q

What is Machine Learning?

A

ML is a subset of AI that involves training algorithms to make predictions or decisions without being explicitly programmed to perform the task.

5
Q

What is Deep Learning?

A

Deep Learning is a subset of ML that uses neural networks with many layers (deep networks) to learn increasingly abstract representations of the data, enabling advanced pattern recognition and decision-making.

6
Q

What is Strong AI?

A

A theoretical form of AI that possesses the ability to understand, learn, and apply knowledge across a wide range of tasks at a level equal to or surpassing a human.

7
Q

What is Supervised Learning?

A

Supervised Learning is a type of machine learning where the model is trained on a labelled dataset, meaning each training example is paired with an output label. The model learns to make predictions or decisions based on this input-output mapping.

8
Q

What is Unsupervised Learning?

A

Unsupervised Learning is a type of machine learning where the model is trained on an unlabelled dataset, with no output labels provided. The model learns to discover patterns, structure, or groupings in the data on its own, as in clustering or dimensionality reduction.
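
As a concrete illustration (my own sketch, not part of the original card), a minimal clustering example with scikit-learn's KMeans, assuming NumPy and scikit-learn are installed:

```python
# Minimal unsupervised-learning sketch: k-means clustering on unlabelled 2-D points.
# The toy data is invented for illustration; no labels are given to the model.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two blobs of points, unlabelled.
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(model.labels_[:5])        # cluster assignments discovered from the data alone
print(model.cluster_centers_)   # learned cluster centres
```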

9
Q

What is Classification?

A

Classification is a type of supervised learning where the model learns to categorize data into predefined classes or labels, such as spam vs. not spam or different types of animals.
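
A minimal classification sketch (my own toy example, assuming scikit-learn is available), echoing the spam vs. not-spam idea:

```python
# Minimal classification sketch: logistic regression on a tiny labelled dataset.
# Feature values and labels are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

X = np.array([[0.1], [0.4], [0.35], [0.8], [0.9], [0.75]])  # a single "spam score" feature
y = np.array([0, 0, 0, 1, 1, 1])                            # predefined classes: 0 = not spam, 1 = spam

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.2], [0.85]]))   # expected: [0 1]
print(clf.predict_proba([[0.85]]))    # class probabilities
```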

10
Q

What is Regression?

A

Regression is a type of supervised learning where the model learns to predict a continuous numerical value based on input data, such as predicting house prices or stock prices.
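
A minimal regression sketch under the same assumptions (the "house price" numbers are invented for illustration):

```python
# Minimal regression sketch: predicting a continuous value (a made-up house price)
# from a single feature (floor area).
import numpy as np
from sklearn.linear_model import LinearRegression

X = np.array([[50.0], [80.0], [100.0], [120.0]])   # floor area in m^2
y = np.array([150.0, 240.0, 310.0, 360.0])         # price in thousands

reg = LinearRegression().fit(X, y)
print(reg.predict([[90.0]]))   # continuous prediction for an unseen input
```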

11
Q

How do Machine and Deep Learning implement feature extraction differently?

A

In Machine Learning, feature extraction is often a manual process where domain experts select and engineer relevant features from raw data to improve model performance. In Deep Learning, feature extraction is automated, as neural networks learn to identify and extract features directly from raw data through multiple layers of abstraction, which is particularly useful for complex data like images and audio.
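
As a hedged illustration of the manual side of this contrast (my own example, assuming NumPy): hand-crafted summary features are computed from a raw signal before being passed to a classical model, whereas a deep network would consume the raw signal directly.

```python
# Manual feature engineering, as typically done for classical ML models.
# The raw signal and the chosen features are illustrative assumptions.
import numpy as np

raw_signal = np.sin(np.linspace(0, 10, 500)) + 0.1 * np.random.default_rng(0).normal(size=500)

# Domain-expert-chosen features: mean, standard deviation, peak value, energy.
features = np.array([
    raw_signal.mean(),
    raw_signal.std(),
    raw_signal.max(),
    np.sum(raw_signal ** 2),
])
print(features)  # this 4-value vector, not the 500 raw samples, becomes the model's input
# A deep network (e.g. a 1-D CNN) would instead take all 500 raw samples and
# learn such features in its early layers.
```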

12
Q

Binary cross-entropy

A

A loss function commonly used in binary classification, where p is the predicted probability and t the target label (0 or 1): -t·log(p) - (1-t)·log(1-p).
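
A minimal NumPy version of this formula (my own sketch; the eps clipping is an added numerical safeguard):

```python
# Binary cross-entropy as defined on the card, written in NumPy.
import numpy as np

def binary_cross_entropy(t, p, eps=1e-12):
    """t: target in {0, 1}; p: predicted probability. eps avoids log(0)."""
    p = np.clip(p, eps, 1 - eps)
    return -(t * np.log(p) + (1 - t) * np.log(1 - p))

print(binary_cross_entropy(1, 0.9))   # small loss: confident, correct prediction
print(binary_cross_entropy(1, 0.1))   # large loss: confident, wrong prediction
```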

13
Q

Categorical cross-entropy:

A

A loss function used in multiclass classification, also known as multiclass logarithmic loss. For a one-hot target t and predicted class probabilities p, the loss is -Σᵢ tᵢ·log(pᵢ), summed over the classes.
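
A matching NumPy sketch (my own, with the same eps safeguard as the binary case):

```python
# Categorical cross-entropy for a one-hot target, written in NumPy.
import numpy as np

def categorical_cross_entropy(t, p, eps=1e-12):
    """t: one-hot target vector; p: predicted probability vector over the classes."""
    p = np.clip(p, eps, 1.0)
    return -np.sum(t * np.log(p))

t = np.array([0, 1, 0])          # true class is class 1
p = np.array([0.1, 0.7, 0.2])    # predicted class probabilities
print(categorical_cross_entropy(t, p))   # -log(0.7) ~ 0.357
```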

14
Q

What is backpropagation?

A

Backpropagation is an algorithm used to train neural networks by propagating the error backwards from the output layer to the input layer. It calculates the gradient of the loss function with respect to each weight and updates the weights to minimize the loss.
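
A teaching-sized sketch of backpropagation written by hand in NumPy (the toy data, network shapes, and learning rate are my own assumptions, not from the card):

```python
# Hand-written backpropagation for a tiny 1-hidden-layer network
# (sigmoid units, binary cross-entropy loss). Not an efficient implementation.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))                        # 8 toy samples, 2 features
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)      # input -> hidden
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)      # hidden -> output
sigmoid = lambda z: 1 / (1 + np.exp(-z))
lr = 1.0

for step in range(200):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error from the output layer towards the input.
    # For sigmoid + binary cross-entropy, dLoss/dz_out simplifies to (p - y).
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * h * (1 - h)               # chain rule through the hidden layer
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Update every weight in the direction that reduces the loss (gradient descent).
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

print(((p > 0.5) == y).mean())   # training accuracy after a few hundred updates
```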

15
Q

What is gradient descent?

A

Gradient Descent is an optimization algorithm used to minimize a loss function by iteratively adjusting the model’s parameters (weights) in the direction of the steepest descent of the gradient, reducing the error in predictions.
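
A minimal sketch of the update rule on a one-parameter loss L(w) = (w - 3)^2 (my own example):

```python
# Gradient descent on a simple 1-D loss whose gradient is 2(w - 3).
# The update rule is: w <- w - learning_rate * gradient.
w = 0.0
learning_rate = 0.1

for step in range(50):
    grad = 2 * (w - 3)              # gradient of the loss at the current w
    w = w - learning_rate * grad    # step in the direction of steepest descent

print(w)   # converges towards the minimum at w = 3
```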

16
Q

What is Stochastic gradient descent?

A

Stochastic Gradient Descent (SGD) is an optimization algorithm used to minimize the loss function in machine learning models. Unlike traditional gradient descent, which uses the entire dataset to compute the gradient, SGD updates the model’s parameters using only a single randomly chosen data point (or a small batch) at each iteration. This makes SGD faster and more efficient for large datasets, though it introduces more noise in the updates.
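
A hedged mini-batch SGD sketch for linear regression in NumPy (data and hyperparameters are invented for illustration):

```python
# Stochastic (mini-batch) gradient descent for 1-D linear regression.
# Each update uses a small random batch rather than the full dataset.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 1))
y = 4.0 * X[:, 0] + 1.0 + 0.1 * rng.normal(size=1000)   # true weight 4, bias 1

w, b, lr, batch_size = 0.0, 0.0, 0.05, 32
for step in range(500):
    idx = rng.choice(len(X), size=batch_size, replace=False)  # random mini-batch
    xb, yb = X[idx, 0], y[idx]
    err = (w * xb + b) - yb
    # Gradients estimated from the batch only: noisier but much cheaper per step.
    w -= lr * 2 * np.mean(err * xb)
    b -= lr * 2 * np.mean(err)

print(w, b)   # approaches (4.0, 1.0)
```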

17
Q

Epochs

A

Epochs determine how many times the learning algorithm will work through the entire training dataset. More epochs can improve model performance up to a point but may lead to overfitting if too many are used.

18
Q

Batch Size

A

Batch Size is the number of training examples used for each gradient update; it affects the stability and speed of training. Smaller batches lead to noisier gradient estimates but can help the model generalize better. Larger batches provide more stable and accurate gradient estimates but require more memory and can be computationally expensive.

19
Q

Learning Rate

A

The Learning Rate is the step size used when updating the model’s parameters; it influences how quickly or slowly a model learns. An appropriate learning rate is crucial: if it’s too high, the model might overshoot the optimal solution, while if it’s too low, training can be very slow and might get stuck in local minima.
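
The three hyperparameters on this card and the two previous ones usually appear together in a training loop; a skeleton sketch (the `compute_gradients` helper and the data are placeholders I am assuming for illustration):

```python
# Skeleton training loop showing where epochs, batch size, and learning rate plug in.
import numpy as np

def compute_gradients(params, x_batch, y_batch):
    # Placeholder: in a real model this returns dLoss/dparams for the batch.
    return np.zeros_like(params)

X = np.zeros((1000, 10)); y = np.zeros(1000)
params = np.zeros(10)

num_epochs = 10       # how many full passes over the training set
batch_size = 32       # examples per gradient estimate (stability/memory trade-off)
learning_rate = 0.01  # step size of each update (too high overshoots, too low crawls)

for epoch in range(num_epochs):
    order = np.random.permutation(len(X))             # reshuffle each epoch
    for start in range(0, len(X), batch_size):
        batch = order[start:start + batch_size]
        grads = compute_gradients(params, X[batch], y[batch])
        params -= learning_rate * grads                # gradient-descent update
```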

20
Q

What is Overfitting?

A

Overfitting occurs when a machine learning model learns to capture the noise or random fluctuations in the training data rather than the underlying patterns. As a result, the model performs well on the training data but poorly on new, unseen data.
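
A small NumPy illustration (my own, with invented noise levels): a high-degree polynomial fits noisy training points almost perfectly but tends to generalize worse than a simple line.

```python
# Overfitting illustration: compare a degree-1 and a degree-9 polynomial fit
# to 10 noisy points drawn from an underlying straight line y = 2x.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = 2 * x_train + rng.normal(0, 0.2, 10)
x_test = np.linspace(0.05, 0.95, 10)
y_test = 2 * x_test + rng.normal(0, 0.2, 10)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(degree, round(train_err, 4), round(test_err, 4))
# The degree-9 fit has near-zero training error but typically a larger test error:
# it has captured the noise rather than the underlying pattern.
```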

21
Q

What is Dropout?

A

A technique used to prevent overfitting: during each training iteration a random subset of neurons is temporarily deactivated, which encourages the network to learn more generalizable features.
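
A minimal NumPy sketch of inverted dropout as applied during training (my own example; real frameworks provide this as a built-in layer):

```python
# Inverted dropout: randomly zero a fraction of activations during training and
# rescale the rest so the expected activation is unchanged.
import numpy as np

def dropout(activations, rate, rng):
    """rate: probability that a neuron's output is temporarily set to zero."""
    keep_mask = rng.random(activations.shape) >= rate
    return activations * keep_mask / (1.0 - rate)

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))           # activations of one hidden layer for 4 samples
print(dropout(h, rate=0.5, rng=rng))  # roughly half the values are zeroed each call
# At inference time dropout is switched off and the full network is used.
```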

22
Q

What are activation functions?

A

Activation functions are mathematical functions used in neural networks to introduce non-linearity into the model, enabling it to learn complex patterns. They determine the output of a neuron based on its input and can affect the network’s ability to converge and generalize.

23
Q

Non-linear and linear activation functions:

A

Non-linear activation functions solve the problems caused by linear activation functions: their derivative depends on the input, so backpropagation can compute meaningful gradients, and they allow multiple layers of neurons to be stacked (a stack of purely linear layers collapses into a single linear transformation). Linear activation functions are, however, still used in the output layer of some networks, for example in regression.

24
Q

Sigmoid

A

f(x) = 1 / (1 + e^(-x))

Output is between 0 and 1.
Can suffer from the vanishing gradient problem.
Used in output layers for binary classification.
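
The same function in NumPy (my own sketch), showing the bounded output and hinting at the vanishing-gradient issue:

```python
# The sigmoid from the card, applied elementwise to a few inputs.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-5.0, 0.0, 5.0])))   # ~[0.0067, 0.5, 0.9933], always in (0, 1)
# For large |x| the derivative sigmoid(x) * (1 - sigmoid(x)) is close to zero,
# which is the source of the vanishing-gradient problem mentioned above.
```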