Generative AI Model Fundamentals Flashcards
What is self-attention?
Self-attention is a mechanism used in neural networks, particularly in natural language processing (NLP) models, that allows each element in a sequence (like a word in a sentence) to focus on, or “attend to,” other elements in the same sequence. It learns relationships between words.
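A minimal NumPy sketch of the idea, for illustration only: real models use separate learned query/key/value projection matrices, which are omitted here so the sequence attends to itself directly.

```python
import numpy as np

def self_attention(X):
    # X: (seq_len, d) token embeddings. Simplification: queries, keys,
    # and values are all X itself (no learned projections).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)  # similarity between every pair of positions
    # Softmax each row so attention weights over the sequence sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ X  # each output position is a weighted mix of all positions

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # 3 tokens, 2-dim embeddings
out = self_attention(X)
```

Each row of `out` blends information from every token in the sequence, weighted by similarity.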
What are tokens?
Numerical representations of words or parts of words.
What are embeddings?
Mathematical representations (vectors) that encode the meaning of a token.
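A toy sketch connecting the two cards above: words map to token IDs, and each ID looks up a vector in an embedding table. The vocabulary and vector values here are invented for illustration; real models learn the table during training.

```python
import numpy as np

# Hypothetical tiny vocabulary: word -> token ID.
vocab = {"the": 0, "cat": 1, "sat": 2}

# Embedding table: one row (a learned vector) per token ID.
# Random values stand in for learned ones.
embedding_table = np.random.rand(len(vocab), 4)  # 4-dimensional embeddings

def embed(word):
    token_id = vocab[word]          # tokenization: word -> number
    return embedding_table[token_id]  # embedding: number -> vector

vec = embed("cat")
```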
What is Top P?
The model picks the next word from the smallest set of options that together make up a certain cumulative probability “p” (like 90%), focusing on the most likely words.
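A minimal sketch of the filtering step (often called nucleus sampling); the probabilities are made up for illustration:

```python
import numpy as np

def top_p_filter(probs, p=0.9):
    # Keep the smallest set of tokens whose cumulative probability reaches p,
    # zero out the rest, and renormalize over the kept set.
    order = np.argsort(probs)[::-1]          # most likely first
    cum = np.cumsum(probs[order])
    n_keep = np.searchsorted(cum, p) + 1     # how many tokens cover p
    keep = order[:n_keep]
    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    return filtered / filtered.sum()

probs = np.array([0.5, 0.3, 0.15, 0.05])    # hypothetical next-token distribution
filtered = top_p_filter(probs, p=0.9)        # the 0.05 tail token is dropped
```

The model would then sample the next token from `filtered` instead of the full distribution.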
What is Top K?
The model picks the next word from the top “k” most likely options, limiting choices to the top few possibilities.
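A sketch of the same idea with a fixed cutoff instead of a cumulative one (probabilities invented for illustration):

```python
import numpy as np

def top_k_filter(probs, k=2):
    # Keep only the k highest-probability tokens and renormalize.
    keep = np.argsort(probs)[::-1][:k]
    filtered = np.zeros_like(probs)
    filtered[keep] = probs[keep]
    return filtered / filtered.sum()

probs = np.array([0.5, 0.3, 0.15, 0.05])  # hypothetical next-token distribution
filtered = top_k_filter(probs, k=2)        # only the top 2 tokens survive
```

Unlike Top P, the number of candidates is fixed at k regardless of how probability mass is spread.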
What is Temperature?
The randomness / creativity of the model's output.
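A sketch of how temperature works under the hood, assuming the common implementation of dividing the logits before the softmax (logit values invented for illustration):

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    # Low temperature sharpens the distribution (more deterministic);
    # high temperature flattens it (more random / "creative").
    scaled = np.asarray(logits) / temperature
    exp = np.exp(scaled - scaled.max())  # subtract max for numerical stability
    return exp / exp.sum()

logits = [2.0, 1.0, 0.0]                       # hypothetical raw model scores
cold = softmax_with_temperature(logits, 0.5)   # peaky: favorite dominates
hot = softmax_with_temperature(logits, 2.0)    # flat: choices more even
```

The top token gets a much larger share of probability at low temperature than at high temperature.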
What is the context window?
The maximum number of tokens an LLM can process at once.
What is the Jurassic-2 foundation model good at?
Multilingual text generation (an LLM from AI21 Labs).
What is Stability AI good at?
Image generation
How can you use a foundation model in SageMaker?
By using SageMaker JumpStart.