WK 8 - Neural Networks 1 (AI, The Brain and Neural Computing) Flashcards
Brain Structure
- Contains around 100 billion neurons
- Neurons communicate through synapses – effectively a configurable chemical junction between neurons
Neuron Parts
Dendritic tree: receives signals
Cell body: processes signals
Axon: transmits signals
Neuronal Function
A neuron receives electrical activity from other neurons along its dendrites (inputs)
The axon (effectively the output of the neuron) produces a pulse based on the strength of the incoming pulses
This is then passed to other neurons connected to this one
Synapse
A chemical junction that can be modified, and is therefore thought to be where learning takes place
The synapse can release more neurotransmitter to enhance the coupling between cells
Artificial Neuron
When two neurons fire together, the connection between them is strengthened
This firing activity is one of the fundamental operations necessary for learning and memory
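A minimal sketch of this "fire together, strengthen the connection" idea, assuming an illustrative Hebbian-style update in which the weight grows in proportion to the product of the two neurons' activities (the values used are made up):

```python
# Illustrative Hebbian-style update: when both neurons are active together,
# the weight (coupling strength) between them is increased.
eta = 0.1            # illustrative rate constant
pre_activity = 1.0   # activity of the sending neuron
post_activity = 1.0  # activity of the receiving neuron
weight = 0.5         # current connection strength

weight += eta * pre_activity * post_activity
print(weight)  # 0.6 -- the connection strengthens because both fired together
```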
Rosenblatt’s Perceptron Architecture
The Perceptron consists of a single layer of artificial neurons or “perceptrons.”
Each perceptron takes a set of input features and produces a binary output (0 or 1).
Rosenblatt’s Perceptron Training Data:
The training data for the Perceptron algorithm consists of labeled examples, where each example is represented by a set of input features and a corresponding target class label (0 or 1).
Rosenblatt’s Perceptron How It Works:
The system learns by adjusting the weights of its connections during training.
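A minimal sketch of this learning process, assuming a single artificial neuron with a step activation and the classic error-driven weight update, trained here on the linearly separable AND function (the data, starting weights, and epoch count are illustrative):

```python
# Minimal perceptron: weighted connections plus a step activation,
# trained by nudging the weights whenever the output is wrong.
def step(x):
    return 1 if x >= 0 else 0

# Labeled examples for the AND function: (input features, target class 0/1).
examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

weights = [0.0, 0.0]
bias = 0.0

for epoch in range(10):
    for inputs, target in examples:
        # Forward pass: weighted sum of the inputs plus the bias, then the step.
        output = step(sum(w * x for w, x in zip(weights, inputs)) + bias)
        # Learning: adjust each weighted connection in proportion to the error.
        error = target - output
        weights = [w + error * x for w, x in zip(weights, inputs)]
        bias += error

print(weights, bias)  # a set of weights that classifies AND correctly
```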
Problems with Rosenblatt’s Perceptron
- The Perceptron algorithm can only learn and classify linearly separable data.
- Binary classification: the Perceptron algorithm is designed for binary classification tasks, where it assigns instances to one of two classes.
- The Perceptron could not correctly solve the XOR function.
Connectionism
Add a further layer of neurons to the network, creating a Multi-Layer Perceptron, to resolve the XOR issue of the single-layer Perceptron.
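A sketch of why the extra layer helps: the hand-set two-layer network below combines an OR-like hidden unit with an AND-like hidden unit to compute XOR (the weights are illustrative and fixed rather than learned):

```python
# A multi-layer perceptron for XOR: something no single-layer perceptron can do.
def step(x):
    return 1 if x >= 0 else 0

def xor_mlp(x1, x2):
    h1 = step(x1 + x2 - 0.5)        # hidden unit 1: fires if at least one input is on (OR-like)
    h2 = step(x1 + x2 - 1.5)        # hidden unit 2: fires only if both inputs are on (AND-like)
    return step(h1 - 2 * h2 - 0.5)  # output: h1 AND NOT h2, which is exactly XOR

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, xor_mlp(a, b))  # prints 0, 1, 1, 0
```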
Neural computing
Neural Computing is based on artificial neural networks (ANNs) that consist of interconnected nodes (neurons) and learn from data through training algorithms. ANNs are inspired by the structure and functioning of biological neural networks in the brain.
Traditional AI:
Traditional AI encompasses various techniques such as symbolic logic, rule-based systems, expert systems, and search algorithms. It focuses on explicit representation of knowledge and logical reasoning.
Artificial Neural Networks (ANNs) learning :
- Supervised Learning
- Unsupervised Learning
Supervised Learning
A machine learning approach where:
- Artificial neural networks are trained using labeled input-output pairs
- The network then corrects itself based on that output, adjusting its internal parameters (weights and biases) during training to minimize the difference between predicted outputs and actual outputs
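A minimal sketch of this loop, assuming a single linear neuron trained by gradient descent on made-up labeled pairs (the data, learning rate, and epoch count are illustrative):

```python
# Supervised learning: labeled (input, target) pairs drive parameter updates
# that shrink the gap between the predicted output and the known answer.
data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # illustrative pairs following target = 2*x + 1

w, b = 0.0, 0.0   # internal parameters (weight and bias) to be adjusted
lr = 0.1          # learning rate

for epoch in range(200):
    for x, target in data:
        pred = w * x + b         # predicted output
        error = pred - target    # difference from the desired output
        w -= lr * error * x      # adjust the weight to reduce the error
        b -= lr * error          # adjust the bias to reduce the error

print(round(w, 2), round(b, 2))  # converges to about w=2, b=1
```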
Unsupervised Learning
- The network organises itself according to patterns in the data
- No external ‘desired output’ is provided
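A minimal sketch of self-organisation without labels, assuming a simple winner-take-all rule in which the closer of two prototype units moves toward each data point (the data and learning rate are illustrative):

```python
# Unsupervised learning: no desired output is given, yet the two prototype
# units drift toward the two clusters that exist in the data.
points = [0.9, 1.1, 1.0, 4.9, 5.1, 5.0]   # illustrative 1-D data with two clusters
prototypes = [0.0, 6.0]                    # starting positions of two units
lr = 0.2

for epoch in range(20):
    for x in points:
        # The unit closest to the input "wins" and moves a little toward it.
        winner = min(range(len(prototypes)), key=lambda i: abs(prototypes[i] - x))
        prototypes[winner] += lr * (x - prototypes[winner])

print([round(p, 1) for p in prototypes])  # roughly [1.0, 5.0] -- the cluster centres
```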
Perceptron
Consists of a set of weighted connections, the neuron (incorporating the activation function), and the output axon
Modified Versions of Perceptron Learning
The learning can be slowed down by multiplying the weight update by a decimal term between 0 and 1 (a learning rate).
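A tiny self-contained illustration, assuming a learning rate of 0.1 applied to a single weighted connection:

```python
# Modified update: the error-driven correction is scaled by a decimal term
# (learning rate) between 0 and 1, so each step changes the weight less.
learning_rate = 0.1    # illustrative value between 0 and 1
weight, x = 0.5, 1.0   # one weighted connection and its input
error = 1 - 0          # target minus output for this example

weight += learning_rate * error * x
print(weight)  # 0.6 rather than 1.5 -- the same correction, taken in a smaller step
```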