Chapter 6 Flashcards

1
Q

activation function

A

alias of transfer function

2
Q

artificial neural networks (ANN)

A

Computer technology that attempts to build computers that operate like a human brain. The machines possess simultaneous memory storage and work with ambiguous information. Sometimes called, simply, a neural network.

3
Q

backpropagation

A

The best-known learning algorithm in neural computing where the learning is done by comparing computed outputs to desired outputs of training cases.
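
A minimal numpy sketch of the idea with a single sigmoid neuron and one training case (the inputs, target, starting weights, and learning rate are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

x = np.array([0.5, 0.2])      # inputs of one training case
desired = 1.0                 # desired output
w = np.array([0.1, -0.3])     # current connection weights
learning_rate = 0.5

for epoch in range(100):
    computed = sigmoid(np.dot(w, x))                    # forward pass: computed output
    error = computed - desired                          # compare computed vs. desired output
    gradient = error * computed * (1 - computed) * x    # propagate the error back to the weights
    w -= learning_rate * gradient                       # adjust weights to reduce the error

print(w, sigmoid(np.dot(w, x)))   # computed output is now much closer to the desired output
```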

4
Q

black-box syndrome

A

ANNs are typically known as black boxes: capable of solving complex problems but unable to explain how they arrive at their results.

The lack of transparency in ANNs, that is, not knowing how the model does what it does.

5
Q

Caffe

A

This is an open-source deep learning framework developed at UC Berkeley and Berkeley AI Research.

6
Q

cognitive analytics

A

A term that refers to cognitive computing–branded technology platforms, such as IBM Watson, that specialize in processing and analyzing large, unstructured data sets.

Cognitive analytics systems can use machine learning to adapt to different contexts with minimal human supervision.

7
Q

cognitive computing

A

The application of knowledge derived from cognitive science in order to simulate the human thought process so that computers can exhibit or support decision-making and problem-solving capabilities.

8
Q

cognitive search

A

A new generation of search method that uses artificial intelligence (e.g., advanced indexing, NLP, and machine learning) to return results that are much more relevant to the user.

9
Q

connection weight

A

The weight associated with each link in a neural network model. Neural network learning algorithms adjust these connection weights during training.

10
Q

constant error carousel (CEC)

A

(a.k.a. state unit) One of four additional layers (built on recurrent NNs) in a history-aware ANN architecture such as LSTM (long short-term memory); it is responsible for integrating and interacting with the other three (the input gate, forget gate, and output gate).

11
Q

convolution function

A

This is a parameter-sharing method that addresses the computational inefficiency of defining and training the very large number of weight parameters in a CNN.

Useful when the number of weight parameters required for a CNN would otherwise be prohibitively large (long processing time).
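
A minimal numpy sketch of the parameter-sharing idea (the image and kernel values are illustrative assumptions; a real CNN learns the kernel weights from data):

```python
import numpy as np

def convolve2d(image, kernel):
    """Slide one small kernel over the whole image; the same few
    weights are reused at every position (parameter sharing)."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.random.rand(6, 6)    # toy 6x6 "image"
kernel = np.random.rand(3, 3)   # only 9 shared weights, not one weight per input position
print(convolve2d(image, kernel).shape)   # (4, 4) feature map
```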

12
Q

convolutional neural network (CNN)

A

(CNNs) These are among the most popular deep learning methods. CNNs are in essence a variation of the deep MLP-type neural network architecture, initially designed for computer vision applications (e.g., image processing, video processing, text recognition) but also applicable to nonimage data sets.

13
Q

deep belief network (DBN)

A

(DBN) A type of deep neural network belonging to the large class of models called generative models.

14
Q

deep learning

A

The newest and perhaps the most popular member of the artificial intelligence and machine learning family, deep learning has a goal similar to those of the other machine learning methods that came before it: to mimic the thought process of humans by using mathematical algorithms to learn from data (both the representation of the variables and their interrelationships).

15
Q

deep neural network

A

An artificial neural network with multiple hidden layers between its input and output layers.

Deep belief networks (DBNs), an unsupervised deep learning method used to pretrain network parameters, are part of the class of deep neural networks called generative models; the primary application of DBNs today is to improve classification models by pretraining.

16
Q

DeepQA

A

A massively parallel, text mining–focused, probabilistic evidence–based computational architecture developed by IBM, this is the system behind Watson.

17
Q

Google Lens

A

An app that uses deep learning (artificial neural network) algorithms to deliver information about the nearby objects in images captured by users.

18
Q

GoogLeNet

A

(a.k.a. Inception) A deep convolutional network architecture designed by Google researchers; it was the winning architecture at ILSVRC 2014.

19
Q

Google Neural Machine Translator (GNMT)

A

A machine translation (language) system that uses an LSTM network.

20
Q

graphics processing unit (GPU)

A

It is the part of a computer that normally processes/renders graphical outputs; nowadays, it is also used for efficient processing of deep learning algorithms.

21
Q

hidden layer

A

The middle layer of an artificial neural network that has three or more layers and takes input from the previous layer and converts those inputs into outputs for further processing.

22
Q

IBM Watson

A

It is an extraordinary computer system - a novel combination of advanced hardware, software, and machine-learning algorithms - designed to answer questions posed in natural human language.

23
Q

ImageNet

A

This is an ongoing research project that provides researchers with a large database of images, each linked to a set of synonym words (known as synset) from WordNet (a word hierarchy database).

24
Q

Keras

A

An open-source neural network library written in Python that functions as a high-level application programming interface (API) and is able to run on top of various deep learning frameworks including Theano and TensorFlow.

25
Q

long short-term memory (LSTM)

A

A variation of recurrent neural networks; LSTMs are known as the most effective sequence modeling technique and are the foundation of many practical applications.
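
A minimal Keras sketch of an LSTM-based sequence model (the sequence length, feature count, layer size, and random data are illustrative assumptions):

```python
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(20, 8)),                    # sequences of 20 time steps, 8 features per step
    keras.layers.LSTM(32),                         # LSTM layer keeps a memory across the sequence
    keras.layers.Dense(1, activation="sigmoid"),   # e.g., classify the whole sequence
])
model.compile(optimizer="adam", loss="binary_crossentropy")

X = np.random.rand(100, 20, 8)
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=2, verbose=0)
```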

26
Q

machine learning

A

Teaching computers to learn from examples and large amounts of data.

27
Q

Microsoft Skype Translator

A

A machine translation (language) system that uses an LSTM network.

28
Q

multilayer perceptron (MLP)

A

Consists of at least three layers: an input layer, one or more hidden layers, and an output layer. Uses backpropagation (supervised learning) for training.
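
A minimal Keras sketch of an MLP (the layer sizes and random training data are illustrative assumptions):

```python
import numpy as np
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(10,)),                      # input layer: 10 features
    keras.layers.Dense(8, activation="relu"),      # hidden layer
    keras.layers.Dense(1, activation="sigmoid"),   # output layer
])
# Training adjusts the weights via backpropagation of the error (supervised learning).
model.compile(optimizer="sgd", loss="binary_crossentropy")

X = np.random.rand(100, 10)
y = np.random.randint(0, 2, size=(100, 1))
model.fit(X, y, epochs=5, verbose=0)
```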

29
Q

MYCIN

A

In the early 1970s, several researchers at Stanford University developed a computer system, MYCIN, to identify bacteria causing severe infections, such as bacteremia and meningitis, and to recommend antibiotics with the dosage adjusted for the specifics of an individual patient.

30
Q

network structure

A

Each ANN is composed of a collection of neurons that are grouped into layers. Typical ANN structures include an input layer, one or more hidden layers and an output layer. Network structure refers to the organization of these layers.

31
Q

neural network

A

alias of artificial neural network

32
Q

neuron

A

A cell (i.e., processing element) of a biological or artificial neural network.

33
Q

overfitting

A

A central concern in the training of any type of machine-learning model is overfitting. It happens when the trained model fits the training data set very closely but performs poorly on external (unseen) data sets.

A large group of strategies known as regularization strategies is designed to prevent models from overfitting by making changes to, or defining constraints on, the model parameters or the performance function.
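
A minimal Keras sketch of common regularization strategies (an L2 weight penalty, dropout, and early stopping); the layer sizes, penalty strength, and random data are illustrative assumptions:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import regularizers

model = keras.Sequential([
    keras.Input(shape=(10,)),
    keras.layers.Dense(16, activation="relu",
                       kernel_regularizer=regularizers.l2(0.01)),  # constrain weight magnitudes
    keras.layers.Dropout(0.5),                                     # randomly drop units during training
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

# Stop training when performance on held-out data stops improving.
early_stop = keras.callbacks.EarlyStopping(monitor="val_loss", patience=3)

X = np.random.rand(200, 10)
y = np.random.randint(0, 2, size=(200, 1))
model.fit(X, y, validation_split=0.2, epochs=20, callbacks=[early_stop], verbose=0)
```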

34
Q

perceptron

A

An early neural network structure that uses no hidden layer.
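
A minimal numpy sketch of the classic perceptron learning rule (the AND-gate data and learning rate are illustrative assumptions):

```python
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])   # inputs
y = np.array([0, 0, 0, 1])                        # targets (logical AND)
w = np.zeros(2)
b = 0.0
lr = 0.1

for epoch in range(20):
    for xi, target in zip(X, y):
        prediction = 1 if np.dot(w, xi) + b > 0 else 0   # no hidden layer: inputs map straight to output
        w += lr * (target - prediction) * xi              # adjust weights only when the prediction is wrong
        b += lr * (target - prediction)

print([1 if np.dot(w, xi) + b > 0 else 0 for xi in X])   # [0, 0, 0, 1]
```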

35
Q

performance function

A

Usually, the performance function is nothing but a measure of error (i.e., the difference between the actual output and the target) across all inputs to the network. A.k.a. cost or loss function.
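
A minimal sketch using mean squared error as the performance (cost/loss) function (the output and target values are illustrative):

```python
import numpy as np

actual_outputs = np.array([0.9, 0.2, 0.7])
target_outputs = np.array([1.0, 0.0, 1.0])
mse = np.mean((actual_outputs - target_outputs) ** 2)   # average squared error across all cases
print(mse)   # approximately 0.047
```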

36
Q

pooling

A

In CNN, it refers to the process of consolidating the elements in the input matrix in order to produce a smaller output matrix, while maintaining the important features.
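
A minimal numpy sketch of max pooling (the 4x4 input and 2x2 pool size are illustrative assumptions):

```python
import numpy as np

def max_pool(matrix, size=2):
    """Consolidate each size x size block of the input into its maximum,
    producing a smaller output matrix that keeps the strongest features."""
    h, w = matrix.shape
    return matrix.reshape(h // size, size, w // size, size).max(axis=(1, 3))

x = np.arange(16).reshape(4, 4)
print(max_pool(x))   # 2x2 output: [[ 5  7] [13 15]]
```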

37
Q

processing element (PE)

A

A neuron in a neural network.

38
Q

recurrent neural network (RNN)

A

The type of neural network that has memory and can apply that memory to determine future outputs.

39
Q

representation learning

A

A type of machine learning in which the emphasis is on learning and discovering features/variables by the system in addition to mapping of those features to the output/target variable.

40
Q

sensitivity analysis

A

A study of the effect of a change in one or more input variables on a proposed solution.

41
Q

stochastic gradient descent (SGD)

A

A popular iterative optimization algorithm used to train neural networks; instead of computing the error gradient over the entire training data set, each update uses the gradient computed on a randomly selected subset (mini-batch) of the training cases, which makes training on large data sets computationally feasible.
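
A minimal numpy sketch of mini-batch SGD on a simple linear model (the data, batch size, and learning rate are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((1000, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.01 * rng.standard_normal(1000)

w = np.zeros(3)
lr, batch_size = 0.1, 32
for step in range(2000):
    idx = rng.integers(0, len(X), batch_size)                     # random mini-batch, not the full data set
    grad = 2 * X[idx].T @ (X[idx] @ w - y[idx]) / batch_size      # gradient of squared error on the batch
    w -= lr * grad                                                # noisy but cheap update step

print(w)   # approaches [2.0, -1.0, 0.5]
```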

42
Q

summation function

A

A mechanism to add all the inputs coming into a particular neuron.
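
A minimal sketch (the inputs and weights are illustrative):

```python
import numpy as np

inputs = np.array([0.5, 0.2, 0.8])     # signals arriving at the neuron
weights = np.array([0.4, -0.1, 0.6])   # connection weights on the incoming links
summation = np.dot(weights, inputs)    # 0.5*0.4 + 0.2*(-0.1) + 0.8*0.6
print(summation)                       # 0.66
```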

43
Q

supervised learning

A

A method of training artificial neural networks in which sample cases are shown to the network as input, and the weights are adjusted to minimize the error in the outputs.

44
Q

TensorFlow

A

A popular open-source deep learning framework originally developed by the Google Brain Group in 2011 as DistBelief, and further developed into TensorFlow in 2015.

45
Q

Theano

A

This was developed by the Deep Learning Group at the University of Montreal in 2007 as a Python library to define, optimize, and evaluate mathematical expressions involving multidimensional arrays (i.e., tensors) on CPU or GPU platforms.

46
Q

threshold value

A

A hurdle value for the output of a neuron to trigger the next level of neurons. If an output value is smaller than the threshold value, it will not be passed to the next level of neurons.
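
A minimal sketch (the threshold and output values are illustrative):

```python
def fire(output_value, threshold=0.5):
    """Pass the output on only if it clears the hurdle value; otherwise suppress it."""
    return output_value if output_value >= threshold else 0.0

print(fire(0.66))   # 0.66 -> passed to the next level of neurons
print(fire(0.30))   # 0.0  -> not passed on
```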

47
Q

Torch

A

An open-source scientific computing framework for implementing machine-learning algorithms using GPUs.

48
Q

transfer function

A

In a neural network, the function that sums and transforms inputs before a neuron fires. It shows the relationship between the internal activation level and the output of a neuron.
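
A minimal sketch using a sigmoid as the transfer (activation) function (the activation levels are illustrative):

```python
import numpy as np

def sigmoid(activation_level):
    """Map the neuron's internal activation level (weighted sum of inputs) to its output."""
    return 1.0 / (1.0 + np.exp(-activation_level))

for level in [-2.0, 0.0, 0.66, 2.0]:
    print(level, sigmoid(level))   # outputs are squashed into the range (0, 1)
```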

49
Q

word embeddings

A

The output from word2vec (word vectors, or word embeddings), used as inputs in many deep learning projects.

50
Q

word2vec

A

A two-layer neural network that takes a large text corpus as input and converts each word in the corpus to a numeric vector of any given size, typically ranging from 100 to 1000.
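
A minimal sketch using the gensim library (the tiny corpus and vector size are illustrative assumptions; vector_size is the gensim 4.x argument name, earlier versions used size):

```python
from gensim.models import Word2Vec

corpus = [["deep", "learning", "uses", "neural", "networks"],
          ["neural", "networks", "learn", "word", "vectors"]]

model = Word2Vec(sentences=corpus, vector_size=100, window=2, min_count=1)
print(model.wv["neural"].shape)            # (100,) numeric vector (word embedding) for the word
print(model.wv.most_similar("neural"))     # words whose vectors are nearby
```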