Deep Learning Flashcards

1
Q

Deep learning

A

Neural network with 3 or more layers

Prone to vanishing/exploding gradients; needs large amounts of training data

2
Q

Long Short-Term Memory (LSTM)

A

Gated memory cells
Allow long-range dependencies to be learned
Good at language modelling
Protein homology detection
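The gated memory cell can be sketched in a few lines of NumPy. This is a toy illustration, not any particular library's implementation; the dimensions, initialisation scale, and random seed are arbitrary choices:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a gated memory cell: the forget gate controls what the
    cell keeps, which is what lets long-range dependencies survive."""
    z = W @ x + U @ h_prev + b                      # pre-activations for all 4 gates
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)    # input/forget/output gates
    g = np.tanh(g)                                  # candidate cell update
    c = f * c_prev + i * g                          # gated memory cell
    h = o * np.tanh(c)                              # exposed hidden state
    return h, c

d, n = 4, 3                                # toy hidden/input sizes
rng = np.random.default_rng(0)
W = rng.standard_normal((4 * d, n)) * 0.1
U = rng.standard_normal((4 * d, d)) * 0.1
b = np.zeros(4 * d)
h, c = np.zeros(d), np.zeros(d)
for x in rng.standard_normal((5, n)):      # run over a short input sequence
    h, c = lstm_step(x, h, c, W, U, b)
```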

3
Q

LSTM uses

A

Protein secondary structure prediction (not better than PSIPRED)
Protein homology detection (one-hot encodings, no BLOSUM, faster than BLAST, multiple alignments, learns a profile via error propagation)

4
Q

Deep Belief nets

A

Trained one layer at a time
Fix the previous weights as each layer is added
Train the next layer
Fine-tune with backpropagation

Protein model quality/stability prediction
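The greedy layer-wise scheme can be sketched as follows. As a simplification, each layer here is trained as a small linear reconstruction layer rather than as an RBM with contrastive divergence, which is what classical deep belief nets actually use; the data, layer sizes, and learning rate are arbitrary:

```python
import numpy as np

def train_layer(X, n_hidden, lr=0.05, steps=300, seed=0):
    """Train one encoder/decoder pair to reconstruct its input
    (a stand-in for RBM training in a real deep belief net)."""
    rng = np.random.default_rng(seed)
    W_e = rng.standard_normal((X.shape[1], n_hidden)) * 0.1
    W_d = rng.standard_normal((n_hidden, X.shape[1])) * 0.1
    for _ in range(steps):
        H = X @ W_e                              # hidden codes
        err = H @ W_d - X                        # reconstruction error
        W_d -= lr * (H.T @ err) / len(X)
        W_e -= lr * (X.T @ (err @ W_d.T)) / len(X)
    return W_e

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 6))
weights = []
for n_hidden in (4, 2):            # add layers one at a time
    W = train_layer(X, n_hidden)   # train the new layer...
    weights.append(W)              # ...then fix its weights
    X = X @ W                      # its output feeds the next layer
# A supervised head plus backpropagation would then fine-tune the whole stack.
```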

5
Q

Convolutional Neural Nets

A

Inspired by the animal visual cortex
Filters
Weight sharing
Translational invariance

Computer Vision (MNIST, medical imagery (classification and segmentation), gene expression)
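Filters and weight sharing are easiest to see in one dimension: the same small kernel is reused at every position, so a shifted input gives a correspondingly shifted response. A minimal sketch, with an invented edge-detecting filter:

```python
import numpy as np

def conv1d(signal, kernel):
    """Slide one shared kernel across the signal: the same weights
    are reused at every position (weight sharing)."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel
                     for i in range(len(signal) - k + 1)])

edge = np.array([-1.0, 1.0])             # tiny edge-detecting filter
x = np.array([0., 0., 1., 1., 1., 0.])
y = conv1d(x, edge)                      # responds at the step edges
# shifting the input shifts the response the same way
y_shift = conv1d(np.roll(x, 1), edge)
```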

6
Q

Autoencoders

A

Trained to reproduce the input
Builds an encoding layer
The lower-dimensional representation can carry a good deal of semantic meaning

Word2vec, prot2vec, DNA2vec
Estimating protein model quality/stability
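Training to reproduce the input through a bottleneck can be sketched with a linear autoencoder on toy data that lies near a line, so a 1-D code reconstructs it well. The data, learning rate, and step count are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.standard_normal(200)
X = np.stack([t, 2 * t], axis=1) + 0.05 * rng.standard_normal((200, 2))
# X is nearly 1-D, so a single code dimension captures most of it.

W_enc = rng.standard_normal((2, 1)) * 0.1   # encoder: 2 -> 1 (bottleneck)
W_dec = rng.standard_normal((1, 2)) * 0.1   # decoder: 1 -> 2
lr = 0.02
for _ in range(800):
    Z = X @ W_enc                  # low-dimensional codes
    err = Z @ W_dec - X            # reconstruction error
    W_dec -= lr * (Z.T @ err) / len(X)
    W_enc -= lr * (X.T @ (err @ W_dec.T)) / len(X)

mse = np.mean((X @ W_enc @ W_dec - X) ** 2)   # small after training
```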

7
Q

Variational autoencoders

A

Reconstructs data from an estimated distribution
Trained via reconstruction loss plus KL divergence

Protein encoding/Protein design
MethylNet
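The two loss terms can be sketched for a diagonal Gaussian encoder. The KL term has a closed form against the standard normal prior, and the reparameterisation trick keeps sampling differentiable; the `mu`/`log_var` values below are invented toy encoder outputs, and the decoder is omitted:

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    """Closed-form KL divergence between the encoder's diagonal Gaussian
    q(z|x) = N(mu, diag(sigma^2)) and the standard normal prior N(0, I)."""
    return 0.5 * np.sum(np.exp(log_var) + mu ** 2 - 1.0 - log_var)

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps keeps the sampling step differentiable
    return mu + np.exp(0.5 * log_var) * rng.standard_normal(mu.shape)

rng = np.random.default_rng(0)
mu = np.array([0.5, -0.2])          # toy encoder outputs for one input
log_var = np.array([0.0, -1.0])
z = reparameterize(mu, log_var, rng)
kl = kl_to_standard_normal(mu, log_var)
# total loss = reconstruction_loss(x, decode(z)) + kl   (decoder omitted)
```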

8
Q

Attention

A

In a sentence, some words are more important than others
Bidirectional LSTMs generate the encoding
Weighted sum of hidden states

Accurate in sentiment analysis and translation tasks
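The weighted sum of hidden states can be sketched as follows. The BiLSTM itself is omitted; `H` stands in for its hidden states, and the learned query vector `w` is random here for illustration:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """H: (T, d) hidden states (e.g. from a BiLSTM); w: (d,) learned query.
    Scores say which time steps matter; the context vector is the
    attention-weighted sum of hidden states."""
    scores = H @ w                  # one relevance score per word
    alpha = softmax(scores)         # attention weights, sum to 1
    context = alpha @ H             # weighted sum of hidden states
    return context, alpha

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))     # 5 words, hidden size 8
w = rng.standard_normal(8)
context, alpha = attention_pool(H, w)
```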

9
Q

Transformer

A

Uses attention (Google Brain)
Self-attention
Encoder stack (N=6 layers)
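Self-attention lets every position attend to every other position of the same sequence. A single-head, unmasked sketch of scaled dot-product attention, with toy sizes and random weights:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention: queries, keys, and values
    all come from the same sequence X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    A = softmax(Q @ K.T / np.sqrt(d_k))   # (T, T) attention weights
    return A @ V, A

rng = np.random.default_rng(0)
T, d = 4, 6                               # toy sequence length and width
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
```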

10
Q

ResNets

A

Builds on convolutional networks
Uses skip connections, so networks can go much deeper

Used for protein structure prediction
Sequence alignment profiles
Showed significant improvement
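A skip connection computes y = x + F(x): the input is carried around the learned transformation. A sketch with near-zero random weights, where each block stays close to the identity and a deep stack still passes the signal through:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """y = x + F(x): the skip connection carries the input around the
    learned transformation, so signal (and gradient) flows through
    even very deep stacks."""
    return x + W2 @ relu(W1 @ x)

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
# With tiny weights each block is near the identity, so 50 stacked
# blocks barely perturb the input.
blocks = [(rng.standard_normal((d, d)) * 1e-3,
           rng.standard_normal((d, d)) * 1e-3) for _ in range(50)]
y = x
for W1, W2 in blocks:
    y = residual_block(y, W1, W2)
```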

11
Q

CASP

A

Critical Assessment of protein Structure Prediction
Competition for the best prediction of protein structures
Evaluated on alpha-carbon (Cα) positions
GDT_TS scores range from 0 to 100

12
Q

Molecular modelling

A

Traditionally used for protein structure prediction

Very computationally intensive

13
Q

Coevolution

A

HMMs and SVMs with multiple sequence alignments (up to CASP 2010)
Sequence alignments give evolutionary information
Look for correlated changes in the protein sequence

Information about potential residue interactions allows the 3D structure to be inferred
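"Correlated changes" can be sketched by scoring pairs of alignment columns with mutual information: columns that mutate together score high, hinting at a 3-D contact. The toy alignment below is invented, and real pipelines add corrections (e.g. for phylogenetic bias) that are omitted here:

```python
import numpy as np
from collections import Counter
from itertools import combinations

def column_mi(col_a, col_b):
    """Mutual information between two alignment columns: high MI means
    the residues change together across the alignment."""
    n = len(col_a)
    pa, pb = Counter(col_a), Counter(col_b)
    pab = Counter(zip(col_a, col_b))
    mi = 0.0
    for (a, b), c in pab.items():
        p = c / n
        mi += p * np.log2(p / ((pa[a] / n) * (pb[b] / n)))
    return mi

# Toy alignment: columns 0 and 2 covary perfectly; column 3 is conserved
msa = ["AKLD",
       "AQLD",
       "GKVD",
       "GQVD",
       "AKLD"]
cols = list(zip(*msa))
scores = {(i, j): column_mi(cols[i], cols[j])
          for i, j in combinations(range(len(cols)), 2)}
```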

14
Q

Rosetta

A

Combination of physical calculations
Monte Carlo assembly
PSI-BLAST alignment

Leading approach up until CASP13

15
Q

AlphaFold

A

CASP13 (2018)
Won the free modelling category

AlphaFold2 builds on a lot of previous research (use of ResNets)
