BIO6 Flashcards
DeepMind’s AlphaFold is known for its breakthrough in which area?
AlphaFold achieved a breakthrough in predicting 3D protein structures.
Which data type often requires AI and big data techniques for analysis in proteomics?
Proteomics generates large, complex, high-dimensional datasets that often require AI and big data techniques for detailed analysis.
How do neural networks help in predicting protein structures?
Neural networks learn sequence patterns to predict protein shapes.
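A minimal, hypothetical PyTorch sketch of this idea (untrained): one-hot encode an amino acid sequence and pass it through a small per-residue classifier, e.g. for secondary structure classes. Real predictors such as AlphaFold are vastly larger, but they likewise learn mappings from sequence patterns.

```python
# Sketch only (untrained): one-hot encode a protein sequence and run a tiny
# per-residue classifier, e.g. predicting helix / sheet / coil classes.
import torch
import torch.nn as nn

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"              # 20 standard residues
AA_INDEX = {aa: i for i, aa in enumerate(AMINO_ACIDS)}

def one_hot(sequence: str) -> torch.Tensor:
    """Encode a sequence as a (length, 20) one-hot tensor."""
    encoding = torch.zeros(len(sequence), len(AMINO_ACIDS))
    for pos, aa in enumerate(sequence):
        encoding[pos, AA_INDEX[aa]] = 1.0
    return encoding

# Tiny per-residue classifier: 20 input features -> 3 structure classes.
model = nn.Sequential(
    nn.Linear(20, 32),
    nn.ReLU(),
    nn.Linear(32, 3),        # helix, sheet, coil
)

seq = "MKTAYIAKQR"
logits = model(one_hot(seq))        # shape: (10, 3), one prediction per residue
print(logits.argmax(dim=1))         # predicted class index per residue (untrained)
```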
Why is big data crucial for protein-protein interaction studies?
Big data enables comprehensive mapping of protein interactions.
Which database is commonly used for retrieving protein structures for AI modeling?
The Protein Data Bank (PDB) is commonly used for retrieving experimentally determined protein structures for AI modeling.
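One common way to fetch and parse a PDB entry, assuming Biopython is installed; the entry ID 1CRN (crambin) is used purely as an example.

```python
# Download a structure from the Protein Data Bank with Biopython
# (pip install biopython); requires network access.
from Bio.PDB import PDBList, MMCIFParser

pdbl = PDBList()
# Downloads the entry in mmCIF format into the current directory.
path = pdbl.retrieve_pdb_file("1CRN", pdir=".", file_format="mmCif")

parser = MMCIFParser(QUIET=True)
structure = parser.get_structure("1CRN", path)

# Count residues as a quick sanity check before feeding data to a model.
n_residues = sum(1 for _ in structure.get_residues())
print(f"Loaded {structure.id} with {n_residues} residues")
```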
What is the main challenge of protein folding that AI aims to tackle?
AI tackles the complexity of predicting protein 3D structures from sequences.
What feature do deep learning models in protein science use for sequence pattern recognition?
Deep learning models learn features from amino acid sequences (and, when available, structural data) to recognize recurring patterns such as motifs and domains.
Describe the relationship between Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL).
AI encompasses ML, which includes DL as a specialized subset.
What is the advantage of transfer learning in protein prediction tasks?
Transfer learning reuses knowledge from pre-trained models, reducing the data and training time needed for new prediction tasks.
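A hedged PyTorch sketch of the transfer-learning recipe: freeze a pre-trained encoder and train only a small new head. The `pretrained_encoder` below is a hypothetical stand-in; in practice it would be loaded from a checkpoint such as a protein language model.

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained model (hypothetical; normally loaded from a checkpoint).
pretrained_encoder = nn.Sequential(nn.Linear(20, 128), nn.ReLU(), nn.Linear(128, 128))

# Freeze the pre-trained weights so only the new head is updated.
for param in pretrained_encoder.parameters():
    param.requires_grad = False

# New head for the downstream task (e.g. binary classification of sequences).
head = nn.Linear(128, 2)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random stand-in data.
features = torch.randn(8, 20)             # batch of 8 sequence feature vectors
labels = torch.randint(0, 2, (8,))
logits = head(pretrained_encoder(features))
loss = loss_fn(logits, labels)
loss.backward()
optimizer.step()
```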
What is a pitfall when training AI models on biased protein databases?
Bias in databases can lead to AI models with limited generalizability.
How do Supervised, Unsupervised, and Reinforcement Learning differ?
Supervised learning uses labeled data, unsupervised learning finds patterns in unlabeled data, and reinforcement learning learns from reward feedback while interacting with an environment.
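A minimal scikit-learn contrast of the first two paradigms on toy data; reinforcement learning is only noted in a comment, since it requires an interacting environment rather than a single fit() call.

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = make_blobs(n_samples=100, centers=2, random_state=0)

# Supervised: the labels y are given to the model.
clf = LogisticRegression().fit(X, y)

# Unsupervised: only X is given; the model finds structure (clusters) on its own.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# (Reinforcement learning would instead learn a policy from rewards over time.)
print(clf.predict(X[:5]), km.labels_[:5])
```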
What describes Neural Networks in Machine Learning?
Neural Networks are layered models that learn patterns from input data by adjusting the weights of their connections.
What is overfitting in machine learning models?
Overfitting occurs when a model performs well on training data but poorly on new, unseen data.
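A small NumPy illustration: a degree-9 polynomial fit to 10 noisy points reaches near-zero training error but a much larger error on fresh points, which is overfitting in miniature.

```python
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, size=10)
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

for degree in (3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    # The degree-9 fit hugs the training points but generalizes worse.
    print(f"degree {degree}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```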
Why is data split into training, validation, and test sets in machine learning?
Splitting data allows tuning on the validation set and fair evaluation on unseen test data, which helps detect and avoid overfitting.
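A typical way to produce the three sets with scikit-learn, using two successive train_test_split calls; the 60/20/20 proportions shown are just a common choice, not a fixed rule.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# First carve off 20% as the final test set.
X_temp, X_test, y_temp, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# Then split the remainder into training (75% of it) and validation (25% of it).
X_train, X_val, y_train, y_val = train_test_split(X_temp, y_temp, test_size=0.25, random_state=0)

print(len(X_train), len(X_val), len(X_test))   # 600 200 200
```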
What role does the ‘attention’ mechanism play in deep learning models like Transformers?
Attention helps models focus on important parts of the input sequence.
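A minimal NumPy sketch of scaled dot-product attention, the core operation behind Transformer attention: each position scores all positions by similarity and takes a softmax-weighted average of their values.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                            # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(5, 8))   # 5 sequence positions, 8-dimensional queries
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 8))

output, attn_weights = scaled_dot_product_attention(Q, K, V)
print(attn_weights.round(2))  # each row sums to 1: how much each position attends to the others
```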