sigmoid function Flashcards

1
Q

sigmoid function

A

The sigmoid function, also known as the logistic function, is a type of activation function used in machine learning algorithms, especially in neural networks. Its mathematical formula is:
S(x) = 1 / (1 + e^(-x))

Here, ‘e’ is the base of the natural logarithm and ‘x’ is the input to the function.
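
A minimal sketch of this formula in Python (assuming NumPy; the name `sigmoid` and the sample inputs are just illustrative):

    import numpy as np

    def sigmoid(x):
        """Logistic sigmoid: S(x) = 1 / (1 + e^(-x))."""
        return 1.0 / (1.0 + np.exp(-x))

    print(sigmoid(0.0))   # 0.5
    print(sigmoid(2.0))   # ~0.88
    print(sigmoid(-2.0))  # ~0.12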

2
Q
Value Range
A

The sigmoid function outputs a value between 0 and 1. This property makes it particularly useful for models where the output needs to be a probability, such as in binary classification problems.
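
A small check of this property (Python/NumPy assumed; the inputs are arbitrary illustrative values):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Even extreme inputs land strictly inside (0, 1),
    # which is why the output can be read as a probability.
    for x in [-100.0, -5.0, 0.0, 5.0, 100.0]:
        print(x, sigmoid(x))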

3
Q
S-Shaped Curve
A

The sigmoid function forms an ‘S’-shaped curve when plotted on a graph. It approaches 0 as x goes to negative infinity, passes through 0.5 at x = 0, and approaches 1 as x goes to positive infinity.
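
A short plotting sketch of that curve (assuming matplotlib and NumPy are available):

    import numpy as np
    import matplotlib.pyplot as plt

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    x = np.linspace(-8, 8, 200)
    plt.plot(x, sigmoid(x))
    plt.axhline(0.5, linestyle="--")  # the curve crosses 0.5 at x = 0
    plt.xlabel("x")
    plt.ylabel("S(x)")
    plt.title("Sigmoid: an S-shaped curve rising from ~0 to ~1")
    plt.show()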

4
Q
Non-linear
A

The sigmoid function is non-linear. Without a non-linear activation, a stack of layers would collapse into a single linear transformation; the non-linearity lets each neuron's output vary non-linearly with its inputs and weights, which allows neural networks to learn complex relationships from the error gradients during the backpropagation process.
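
One way to see the non-linearity (illustrative values, Python/NumPy assumed): a linear function f would satisfy f(a + b) = f(a) + f(b), and the sigmoid clearly does not.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    a, b = 1.0, 2.0
    print(sigmoid(a + b))           # ~0.95
    print(sigmoid(a) + sigmoid(b))  # ~1.61, so sigmoid(a + b) != sigmoid(a) + sigmoid(b)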

5
Q
Gradient/Vanishing Gradient
A

The gradient of the sigmoid function is highest when the input is 0 (where it equals 0.25) and is very close to 0 when the input is far from 0, in either the positive or negative direction. This leads to the vanishing gradient problem: the gradients become so small that the weights and biases of the network hardly change during the learning process.
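
The derivative can be written in terms of the function itself, S'(x) = S(x) * (1 - S(x)), which peaks at 0.25 at x = 0. A small sketch (Python/NumPy assumed) showing how quickly it shrinks:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def sigmoid_grad(x):
        # S'(x) = S(x) * (1 - S(x))
        s = sigmoid(x)
        return s * (1.0 - s)

    for x in [0.0, 2.0, 5.0, 10.0]:
        print(x, sigmoid_grad(x))
    # 0 -> 0.25 (the maximum); 10 -> ~4.5e-05, i.e. the gradient has nearly vanished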

6
Q
Binary Classification
A

The sigmoid function is widely used in binary classification problems. It can take any real-valued number and map it into a value between 0 and 1, making it suitable for transforming the output of a linear model into a probability.
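
A sketch of that use (Python/NumPy assumed; the weights, bias, and feature values below are made up for illustration):

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    w = np.array([0.8, -1.5, 0.3])   # hypothetical weights of a linear model
    b = 0.1                          # hypothetical bias
    x = np.array([1.2, 0.4, 2.0])    # hypothetical feature vector

    z = np.dot(w, x) + b     # raw linear score: any real number
    p = sigmoid(z)           # squashed into (0, 1), read as P(y = 1)
    label = int(p >= 0.5)    # threshold at 0.5 for the predicted class

    print(z, p, label)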

7
Q
Drawbacks
A

As mentioned, the sigmoid function can lead to the vanishing gradient problem, slowing down training. Its output is also not zero-centered (it is always positive), which can make the optimization process more challenging.
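
A quick sketch of the "not zero-centered" point (Python/NumPy assumed): even for inputs centered at 0, sigmoid outputs are always positive and average around 0.5, unlike a zero-centered activation such as tanh.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)  # inputs centered at 0

    print(sigmoid(x).min(), sigmoid(x).mean())  # all positive, mean near 0.5
    print(np.tanh(x).mean())                    # tanh output mean stays near 0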
