Computer Science Flashcards
What is a Turing machine, and why is it significant in computational theory?
A Turing machine is a theoretical device that manipulates symbols on a strip of tape according to a finite set of rules. It is significant because it provides a mathematical model of computation, showing that any computation can be carried out by such a machine as long as it can be broken down into simple, mechanical steps, independent of any particular technology
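A minimal sketch in Python of how such a machine operates (the tape contents, states, and rule table here are made up for illustration): a tiny machine that inverts a binary string and halts when it reads the blank symbol.

```python
# Hypothetical example: a tiny Turing machine that inverts a binary string.
# States, symbols, and rules are invented for illustration.
def run_turing_machine(tape):
    # Transition table: (state, symbol) -> (new_symbol, head_move, new_state)
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", "_"): ("_", 0, "halt"),  # blank symbol ends the run
    }
    tape = list(tape) + ["_"]
    head, state = 0, "flip"
    while state != "halt":
        symbol = tape[head]
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return "".join(tape).rstrip("_")

print(run_turing_machine("10110"))  # -> "01001"
```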
What is SHRDLU, and what did it demonstrate in AI research?
SHRDLU was an early natural language processing program developed in the 1970s. It operated in a virtual environment with blocks and demonstrated how AI could understand, process, and respond to language inputs within a defined environment, making progress in symbolic manipulation and language comprehension
What are rule-based systems, and how do they differ from parallel processing systems?
Rule-based systems rely on explicit “if-then” logic to solve problems, making their reasoning process transparent and easy to follow. Parallel processing systems, inspired by neural networks, distribute the work across many simple processing units operating simultaneously, which makes them faster and often more powerful, but their decision-making is less transparent and harder to interpret
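For illustration, a minimal rule-based sketch in Python, assuming invented facts and rules; every derived conclusion can be traced directly back to an explicit if-then rule.

```python
# Hypothetical example: a tiny rule-based system with explicit if-then rules.
# The facts and rules are invented for illustration.
facts = {"has_fever": True, "has_cough": True}

rules = [
    (lambda f: f.get("has_fever") and f.get("has_cough"), "possible_flu"),
    (lambda f: f.get("possible_flu"), "recommend_rest"),
]

# Forward chaining: keep applying rules until no new facts are derived
changed = True
while changed:
    changed = False
    for condition, conclusion in rules:
        if condition(facts) and not facts.get(conclusion):
            facts[conclusion] = True   # each reasoning step is fully transparent
            changed = True

print(facts)  # now includes 'possible_flu' and 'recommend_rest'
```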
How do genetic algorithms work, and how are they used in AI?
Genetic algorithms mimic the process of natural selection, where potential solutions to a problem are treated like individuals in a population. Through selection, crossover, and mutation, these “individuals” evolve over generations toward better solutions. They are used in AI to solve optimization problems by iteratively refining the solution set
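A minimal sketch in Python, assuming the toy “one-max” objective (evolve a bit string with as many 1s as possible); the population size, mutation rate, and generation count are arbitrary choices.

```python
import random

# Hypothetical example: a genetic algorithm for the toy "one-max" problem.
def fitness(individual):
    return sum(individual)

def crossover(a, b):
    point = random.randint(1, len(a) - 1)
    return a[:point] + b[point:]

def mutate(individual, rate=0.01):
    return [1 - bit if random.random() < rate else bit for bit in individual]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for generation in range(50):
    # Selection: keep the fitter half of the population
    population.sort(key=fitness, reverse=True)
    parents = population[:15]
    # Crossover and mutation produce the next generation
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(15)]
    population = parents + children

print(fitness(max(population, key=fitness)))  # typically close to 20 after evolution
```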
What is backpropagation in neural networks?
Backpropagation is a supervised learning algorithm used to train neural networks. It adjusts the weights of the network’s connections by calculating the error at the output and propagating that error backward through the layers, allowing the network to learn by gradually minimizing the error
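A minimal sketch in Python with NumPy, assuming a tiny network trained on XOR; the layer sizes, learning rate, and epoch count are arbitrary choices for illustration.

```python
import numpy as np

# Hypothetical example: backpropagation in a tiny 2-4-1 network learning XOR.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10_000):
    # Forward pass
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)
    # Backward pass: compute the output error and propagate it toward the input
    delta_out = (output - y) * output * (1 - output)
    delta_hidden = (delta_out @ W2.T) * hidden * (1 - hidden)
    # Gradient step on weights and biases
    W2 -= 0.5 * hidden.T @ delta_out
    b2 -= 0.5 * delta_out.sum(axis=0)
    W1 -= 0.5 * X.T @ delta_hidden
    b1 -= 0.5 * delta_hidden.sum(axis=0)

print(np.round(output, 2))  # typically approaches [[0], [1], [1], [0]]
```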
What is gradient descent, and what is the challenge of encountering local minimums?
Gradient descent is an optimization algorithm that minimizes error by repeatedly adjusting model parameters in the direction that reduces the error most. However, it can get stuck in local minima, points where the error is lower than at all surrounding points but not the lowest possible, preventing the algorithm from finding the global minimum
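A minimal sketch in Python, assuming the made-up function f(x) = x^4 - 3x^2 + x, which happens to have both a global and a local minimum; the starting point determines which one plain gradient descent reaches.

```python
# Hypothetical example: gradient descent on f(x) = x**4 - 3*x**2 + x,
# which has a global minimum near x = -1.30 and a local minimum near x = 1.13.
def grad(x):
    return 4 * x**3 - 6 * x + 1  # derivative of f

def descend(x, lr=0.01, steps=1000):
    for _ in range(steps):
        x -= lr * grad(x)  # step against the gradient
    return x

print(descend(-2.0))  # ends near -1.30, the global minimum
print(descend(+2.0))  # ends near +1.13, stuck in the local minimum
```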
What is Hebbian learning, and how does it relate to neural networks?
Hebbian learning is an unsupervised learning principle summarized as “neurons that fire together, wire together.” In neural networks, this idea is used to strengthen the connections between neurons that frequently activate simultaneously, allowing the network to learn statistical patterns in the input data without supervision
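A minimal sketch in Python with NumPy, assuming invented activity patterns; the connection from the input that always fires together with the output neuron grows the fastest.

```python
import numpy as np

# Hypothetical example: a plain Hebbian update, delta_w = eta * pre * post,
# applied to made-up pre- and postsynaptic activity patterns.
rng = np.random.default_rng(1)
eta = 0.1
pre = rng.integers(0, 2, size=(50, 3)).astype(float)   # presynaptic activity
post = pre[:, [0]]                                      # postsynaptic neuron mirrors input 0

weights = np.zeros(3)
for x, y in zip(pre, post):
    weights += eta * y[0] * x   # strengthen connections that are co-active

print(weights)  # the weight from input 0 grows the most: it always fires with the output
```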
What are attractors in dynamical systems, and how do they apply to neural networks?
Attractors in dynamical systems are states toward which a system tends to evolve from a wide range of initial conditions. In neural networks, attractors represent stable patterns of activity that the network converges on, helping the network settle into consistent outputs after processing inputs
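A minimal Hopfield-style sketch in Python with NumPy, assuming a single invented stored pattern; a corrupted input is repeatedly updated until it falls into the attractor.

```python
import numpy as np

# Hypothetical example: a small Hopfield-style network whose stored pattern
# acts as an attractor; a noisy input settles back onto it.
pattern = np.array([1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)                      # Hebbian storage, no self-connections

state = np.array([1, -1, -1, -1, 1, 1])     # corrupted version of the pattern
for _ in range(5):                          # repeated updates pull it to the attractor
    state = np.sign(W @ state)

print(state)          # settles onto the stored pattern
print(pattern)
```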
How do symbol systems contribute to artificial intelligence?
Symbol systems, a traditional approach to AI, rely on formal logic and structured rules to represent knowledge and process information. Because the rules are explicit, the steps in their reasoning can be easily followed, but they lack the adaptability and flexibility of more modern neural network approaches
What is the difference between symbolic AI and neural networks?
Symbolic AI uses predefined rules and logical structures, making it transparent and easily interpretable but limited in scope. Neural networks, by contrast, are inspired by the brain’s structure, using many simple processors (neurons) working in parallel, which enables them to learn from data but often leaves their decisions less transparent
What is the role of search algorithms in AI?
Search algorithms systematically explore a problem’s possible states (often represented as a search tree) to find an optimal or satisfactory solution. They are especially useful in decision-making tasks like playing chess or solving puzzles, where many possible outcomes need to be evaluated
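A minimal breadth-first search sketch in Python, assuming an invented toy state graph; real search spaces for games like chess are vastly larger.

```python
from collections import deque

# Hypothetical example: breadth-first search over a toy state graph.
# The states and moves below are invented for illustration.
moves = {
    "start": ["a", "b"],
    "a": ["c"],
    "b": ["c", "goal"],
    "c": ["goal"],
    "goal": [],
}

def bfs(start, goal):
    frontier = deque([[start]])          # queue of partial paths
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path                  # the first path found is a shortest one
        for nxt in moves[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None

print(bfs("start", "goal"))  # ['start', 'b', 'goal']
```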
How does parallel processing accelerate decision-making in AI?
Parallel processing allows a system to evaluate multiple paths or solutions simultaneously rather than sequentially. This reduces the time needed to reach a decision and is a hallmark of neural networks, which mimic the massively parallel nature of biological brains
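A minimal sketch in Python, assuming a made-up scoring function and a thread pool standing in for parallel hardware; all candidates are evaluated at once rather than one after another.

```python
from concurrent.futures import ThreadPoolExecutor
import time

# Hypothetical example: evaluating several candidate solutions in parallel.
def score(candidate):
    time.sleep(0.1)            # pretend each evaluation is expensive
    return candidate ** 2

candidates = range(8)
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(score, candidates))   # all candidates scored concurrently

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```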
What is the state space concept in AI?
State space refers to the set of all possible states that a system can be in. In AI, it is used to map out potential decisions or actions, where each state is a point in this space, and the goal is to navigate through it to find optimal or desired outcomes
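A minimal sketch in Python, assuming the classic two-jug puzzle (a 4-litre and a 3-litre jug) as the problem; breadth-first exploration enumerates every state reachable from the empty starting state.

```python
from collections import deque

# Hypothetical example: the state space of the two-jug puzzle.
# Each state is a pair (amount in the 4-litre jug, amount in the 3-litre jug).
def successors(a, b):
    return {
        (4, b), (a, 3),                            # fill either jug
        (0, b), (a, 0),                            # empty either jug
        (a - min(a, 3 - b), b + min(a, 3 - b)),    # pour a into b
        (a + min(b, 4 - a), b - min(b, 4 - a)),    # pour b into a
    }

# Explore every state reachable from (0, 0)
seen, frontier = {(0, 0)}, deque([(0, 0)])
while frontier:
    state = frontier.popleft()
    for nxt in successors(*state):
        if nxt not in seen:
            seen.add(nxt)
            frontier.append(nxt)

print(sorted(seen))   # the reachable portion of the state space
```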
What is a local minimum in the context of learning algorithms?
A local minimum is a point where the error or cost is lower than at all immediately surrounding points, but not necessarily the lowest possible. Learning algorithms like gradient descent can get stuck in local minima, preventing them from finding the global minimum, which represents the best possible solution
What is the significance of Monte Carlo simulations in AI?
Monte Carlo simulations use random sampling to approximate solutions to complex problems. In AI, they are often used to estimate probabilities and decision outcomes when exact solutions are computationally expensive or difficult to determine
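A minimal sketch in Python: estimating pi by randomly sampling points in the unit square and counting how many land inside the quarter circle; the sample count is an arbitrary choice.

```python
import random

# Hypothetical example: a Monte Carlo estimate of pi from random sampling.
samples = 100_000
inside = sum(
    1 for _ in range(samples)
    if random.random() ** 2 + random.random() ** 2 <= 1.0
)

print(4 * inside / samples)  # approaches 3.14159... as the sample count grows
```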