LESSON 2/3 - Mathematical background Flashcards

1
Q

What is the relationship between a 1D vector and a hyperplane in a 2D vector space?

A

In a 2D vector space, a hyperplane is a 1D object (a line, e.g. the span of a single vector): a hyperplane always has exactly one dimension fewer than the space that contains it.
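
For example (not stated on the card itself), in a 2D space with coordinates (x1, x2) a hyperplane is the line of points satisfying w1*x1 + w2*x2 = b for fixed weights w1, w2 and offset b.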

2
Q

Why is graph theory important in describing systems with interacting elements?

A

Graph theory provides a mathematical language for describing systems with interacting elements, such as neural networks in this course, where each element represents a neuron, and links between elements indicate connections and information exchange.
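
As a minimal sketch of this idea (the network and its connections below are made up purely for illustration), a small directed graph can be stored as an adjacency matrix whose entry (i, j) is 1 when neuron i connects to neuron j:

    import numpy as np

    # Hypothetical 3-neuron network: neuron 0 feeds neurons 1 and 2, neuron 1 feeds neuron 2
    adjacency = np.array([
        [0, 1, 1],
        [0, 0, 1],
        [0, 0, 0],
    ])

    out_degree = adjacency.sum(axis=1)  # number of outgoing connections per neuron
    print(out_degree)                   # [2 1 0]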

3
Q

How is graphical formalism used in neuroscience to describe the brain?

A

In neuroscience, the brain or specific areas are often described as graphs. This graphical representation helps depict connectivity at both the whole brain level and local circuit levels.

4
Q

Why is probability theory powerful for describing stochastic systems like the brain?

A

Probability theory is powerful for stochastic systems because complex systems, like the brain, may not be fully observable. Additionally, approximations, errors in data, and intrinsic stochasticity make probabilistic approaches valuable.

5
Q

What is the significance of conditional independence in graphical representation?

A

Conditional independence allows exclusion of variables that do not directly influence each other, simplifying graphical representations. In neural networks, this concept helps create more organized graphs by excluding irrelevant connections.
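
For reference (the formal statement is not spelled out on this card): variables A and B are conditionally independent given C when P(A, B | C) = P(A | C) * P(B | C), i.e. once C is known, also knowing B adds no further information about A, so no direct edge between A and B is needed in the graph.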

6
Q

How are random variables like uniform distribution different from normal distribution?

A

A uniform distribution assigns the same probability to every value in its range, so no single outcome is more likely than any other, whereas a normal distribution has a peak at its mean, making values near the mean more probable and allowing predictions based on the mean.
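
A small numerical illustration (using NumPy; the ranges and parameters are arbitrary choices, not taken from the lesson):

    import numpy as np

    rng = np.random.default_rng(0)
    uniform_samples = rng.uniform(-3.0, 3.0, size=100_000)  # every value in [-3, 3] equally likely
    normal_samples = rng.normal(0.0, 1.0, size=100_000)     # values near the mean 0 are most likely

    # Fraction of samples falling within one unit of the mean:
    print((np.abs(uniform_samples) < 1).mean())  # about 1/3
    print((np.abs(normal_samples) < 1).mean())   # about 0.68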

7
Q

Explain the Bayesian perspective in probability theory.

A

The Bayesian perspective focuses on establishing a degree of belief that can change over time based on observations. It involves updating opinions and assigning probabilities to hypotheses given evidence.
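
The updating rule behind this perspective is Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E), where P(H) is the prior degree of belief in hypothesis H and P(H | E) is the posterior belief after observing the evidence E.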

8
Q

Why is linear algebra relevant in neural networks?

A

Although neural networks are typically non-linear systems, they are built on linear algebra: inputs are represented as vectors and projected, via weight matrices, into a richer (often higher-dimensional) space in which classes can be separated by hyperplanes.
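
A minimal sketch of this linear-algebra step (the dimensions and values below are invented purely for illustration):

    import numpy as np

    x = np.array([0.5, -1.2])             # input vector with 2 features
    W = np.array([[1.0, -0.5],
                  [0.3,  0.8],
                  [-0.7, 0.2]])           # weight matrix projecting the 2D input into 3D
    b = np.array([0.1, 0.0, -0.2])        # bias vector

    h = W @ x + b                         # the linear (matrix-vector) part of a layer
    y = np.tanh(h)                        # a non-linearity applied after the linear projection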

9
Q

How does calculus relate to neural networks?

A

Calculus describes how functions change, which is essential for neural networks: a network transforms its input variables through its connections and weights, and calculus models how the output of that transformation changes as the inputs and weights change.
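
As a small generic illustration of "how a function changes" (not the course's own network), the derivative of a squared error with respect to a single weight can be written with the chain rule and checked numerically:

    def f(w, x=2.0, target=1.0):
        # squared error of a one-weight "network" w * x against a target value
        return (w * x - target) ** 2

    def dfdw(w, x=2.0, target=1.0):
        # analytic derivative obtained with the chain rule
        return 2 * (w * x - target) * x

    w, eps = 0.0, 1e-6
    numeric = (f(w + eps) - f(w - eps)) / (2 * eps)  # numerical estimate of the slope
    print(dfdw(w), numeric)                          # both approximately -4.0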

10
Q

What defines an algorithm, and how is it different from a logarithm?

A

An algorithm is a procedure, a finite sequence of operations, that produces a well-determined result in a finite amount of time. It should not be confused with a logarithm, which is simply a mathematical function (the inverse of exponentiation) that grows non-linearly; despite the similar-sounding names, the two concepts are unrelated.
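
A standard textbook example of an algorithm in this sense (not one taken from the lesson) is Euclid's procedure for the greatest common divisor, a finite sequence of steps that always terminates with a well-determined result:

    def gcd(a, b):
        # repeatedly replace (a, b) by (b, a mod b); this terminates in finitely many steps
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(48, 18))  # 6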

11
Q

How does graph theory represent neural networks in artificial intelligence?

A

In artificial intelligence, neural networks are represented using graph theory, where each element corresponds to a neuron, and connections between elements signify information exchange and connectivity.

12
Q

Why is probability theory valuable in cases where a system cannot be fully observed, such as the brain?

A

Probability theory is valuable in situations like studying the brain, where complete observation is often impossible due to the complexity of the system. It allows for making probabilistic guesses and handling uncertainties.

13
Q

What is the role of conditional independence in simplifying graphical representations in neural networks?

A

Conditional independence helps simplify neural network graphs by excluding connections that do not directly influence each other, leading to more organized and interpretable representations.

14
Q

How does the concept of attractors apply in the context of neural networks?

A

In neural networks, attractors are patterns or states to which the system tends to converge over time. They play a role in stability and memory recall in the network’s behavior.
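
A toy sketch of convergence to an attractor (a generic one-variable update, not a specific network from the lesson): repeatedly applying the same update drives different starting states toward the same fixed point.

    import math

    def update(x):
        # an arbitrary smooth update with slope < 1, so it has a single stable fixed point
        return 0.8 * math.tanh(x) + 0.2

    for x0 in (-2.0, 0.5, 3.0):           # several different initial states
        x = x0
        for _ in range(50):
            x = update(x)
        print(x0, "->", round(x, 4))      # every run settles near the same value (about 0.67)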

15
Q

What does the term “Black Swan” refer to in a Bayesian perspective?

A

In a Bayesian perspective, a “Black Swan” refers to an unexpected observation that forces a change in beliefs or probabilities; the classic example is the belief that “all swans are white,” which a single sighting of a black swan overturns. It emphasizes that beliefs must adapt to new evidence.

16
Q

How is calculus utilized in the context of building neural networks?

A

In the construction of neural networks, calculus is employed to understand and model how functions change. This is particularly relevant when dealing with input variables, connections, and the impact of weights on network transformations.

17
Q

What distinguishes random variables like exponential distribution from totally random variables like uniform distribution?

A

Both describe random variables, but an exponential distribution, like a normal distribution, makes some values more probable than others (smaller values are the most likely), so its outcomes are partly predictable. A uniform distribution, in contrast, gives every value in its range the same probability, so no outcome is more likely than any other.

18
Q

Define an algorithm and differentiate it from a logarithm.

A

An algorithm is a step-by-step procedure that yields a well-determined result in a finite amount of time. It should not be confused with a logarithm, which is simply a mathematical function (the inverse of exponentiation) that grows non-linearly; the similar-sounding names are the only thing the two have in common.