Principles of AI Programming (Theory) Flashcards
[HEURISTICS] What does the User-Item Matrix represent in collaborative filtering recommendation systems?
Users’ preferences or ratings for various items
The User-Item Matrix is a fundamental data structure in collaborative filtering systems that represents how each user has rated or interacted with different items. Each row typically represents a user, each column represents an item, and each cell contains the user’s rating or preference for that item.
[HEURISTICS] What popular metric is used to calculate similarity between users in collaborative filtering?
Cosine similarity
Cosine similarity is a popular metric used to measure the similarity between two users based on their rating patterns. It calculates the cosine of the angle between their rating vectors, which effectively measures how similarly they rate items regardless of the magnitude of their ratings.
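A minimal Python sketch of the idea (the toy rating vectors are illustrative, not from any real dataset):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two rating vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    if norm_a == 0 or norm_b == 0:
        return 0.0  # a user with no ratings has no direction
    return dot / (norm_a * norm_b)

# Two users who rate items in the same proportions are maximally similar,
# even though one rates everything twice as high.
u1 = [5, 3, 4]
u2 = [10, 6, 8]   # same direction, larger magnitude
print(cosine_similarity(u1, u2))  # ≈ 1.0
```

Note how magnitude is irrelevant: a harsh grader and a generous grader with the same taste still get a similarity near 1.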
[HEURISTICS] What is the “cold start” problem in recommendation systems?
The difficulty of making recommendations for new users or items with little data
The cold start problem refers to the challenge of providing accurate recommendations for new users who haven’t provided many ratings or new items that haven’t been rated much. Without sufficient data, it’s difficult for collaborative filtering to find meaningful patterns or similarities.
[HEURISTICS] How does the Nearest Neighbor heuristic work when applied to the Travelling Salesman Problem (TSP)?
It constructs a path by repeatedly visiting the nearest unvisited city
The Nearest Neighbor heuristic is a greedy algorithm for solving the TSP. It works by starting at one city and repeatedly visiting the nearest unvisited city until all cities have been visited, then returning to the starting city. While fast and intuitive, it doesn’t guarantee an optimal solution.
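A short Python sketch of the heuristic, assuming cities are given as 2D coordinates (the example coordinates are made up):

```python
import math

def nearest_neighbour_tour(cities, start=0):
    """Greedy TSP tour: always move to the closest unvisited city.
    `cities` is a list of (x, y) coordinates; returns the visiting order."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        current = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(current, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour  # return to tour[0] to close the cycle

cities = [(0, 0), (0, 1), (5, 0), (0, 2)]
print(nearest_neighbour_tour(cities))  # [0, 1, 3, 2]
```

The greedy choice is visible in the example: the distant city 2 is deferred to last, which may or may not be globally optimal.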
[HEURISTICS] How is Euclidean distance used in recommendation systems?
Calculating the similarity between users’ preference profiles
Euclidean distance is used to measure the similarity or dissimilarity between users based on their ratings. A smaller Euclidean distance indicates that two users have similar preferences, which is useful for finding like-minded users whose ratings can inform recommendations.
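As a quick illustration (toy rating vectors, not real data):

```python
import math

def euclidean_distance(a, b):
    """Straight-line distance between two users' rating vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Smaller distance = more similar preferences.
alice = [5, 4, 1]
bob   = [5, 5, 1]
carol = [1, 1, 5]
print(euclidean_distance(alice, bob))    # small: similar tastes
print(euclidean_distance(alice, carol))  # large: opposite tastes
```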
[HEURISTICS] How does user-based collaborative filtering generate recommendations?
By finding similar users and recommending items they rated highly
User-based collaborative filtering works by first identifying users who have similar preferences to the target user (neighbors). It then recommends items that these similar users have rated highly but the target user hasn’t yet rated or interacted with.
[HEURISTICS] What major limitation does the Nearest Neighbor heuristic have when applied to the TSP?
It can produce suboptimal routes by making locally optimal choices
The Nearest Neighbor heuristic is a greedy algorithm that makes the locally optimal choice at each step. However, this doesn’t guarantee a globally optimal solution. It may lead to inefficient routes, especially when the best overall route requires temporarily moving away from nearby cities.
[HEURISTICS] What is Mean Squared Error (MSE) used for in recommendation systems?
To measure the accuracy of predicted ratings
Mean Squared Error (MSE) measures the average squared difference between the predicted ratings and the actual ratings given by users. A lower MSE indicates more accurate predictions, which is crucial for effective recommendation systems.
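In code, the metric is a one-liner (example ratings are invented):

```python
def mse(predicted, actual):
    """Mean squared error between predicted and observed ratings."""
    return sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual)

predicted = [4.0, 3.5, 2.0]
actual    = [5.0, 3.0, 2.0]
print(mse(predicted, actual))  # (1.0 + 0.25 + 0.0) / 3
```

Squaring penalises large errors disproportionately, so one badly wrong prediction hurts the score more than several slightly wrong ones.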
[HEURISTICS] How does item-based collaborative filtering differ from user-based collaborative filtering?
It finds similar items rather than similar users
Item-based collaborative filtering identifies relationships between items based on user ratings. Instead of finding similar users, it finds items similar to those the user has already rated positively, and then recommends those similar items to the user.
[HEURISTICS] Why is distance calculation important in solving the Travelling Salesman Problem?
To analyze the feasibility of travel and calculate total path length
Distance calculation is essential in TSP as it forms the basis for evaluating different possible routes. By calculating distances between cities, the algorithm can determine which routes are shorter and thus more efficient for the salesperson to travel.
[HEURISTICS] What is content-based filtering in recommendation systems?
Recommending items based on item features and user preferences
Content-based filtering makes recommendations by comparing the features of items with the user’s preferences. Unlike collaborative filtering, which uses patterns in user ratings, content-based approaches analyze item attributes (like genre, actors, or keywords for movies) and match them to user preference profiles.
[HEURISTICS] What is a “sparse” User-Item Matrix in collaborative filtering?
A matrix where most cells contain zero values or are empty
A sparse User-Item Matrix contains many empty entries because most users have only rated a small fraction of all available items. This sparsity is a common challenge in recommendation systems, making it difficult to find reliable patterns and similarities.
[HEURISTICS] What key difference distinguishes collaborative filtering from content-based filtering?
Collaborative filtering uses ratings data, while content-based uses item features
The fundamental difference is that collaborative filtering makes recommendations based on user rating patterns and similarities between users or items, without needing to know anything about the items themselves. Content-based filtering, in contrast, relies on item features (e.g., genre, actors, keywords) and user preferences for those features.
[HEURISTICS] What is serendipity in recommendation systems?
The ability to recommend surprising but relevant items users might not have discovered
Serendipity in recommendation systems refers to recommending items that are both unexpected/surprising and relevant to the user. Good recommendation systems balance accuracy (recommending items users will definitely like) with serendipity (helping users discover new items they wouldn’t have found themselves).
[HEURISTICS] What is precision as an evaluation metric in recommendation systems?
The fraction of recommended items that are relevant to the user
Precision is an evaluation metric that measures the proportion of recommended items that are truly relevant to the user. High precision means the system is making accurate recommendations without suggesting many irrelevant items. It’s often used alongside recall (the proportion of relevant items that were successfully recommended).
[CELLULAR AUTOMATA] What is a cellular automaton?
A discrete model of computation based on a grid of cells with simple rules
A cellular automaton is a computational model consisting of a grid of cells, each with a finite number of states. The cells evolve over discrete time steps according to fixed rules based on the states of neighboring cells, creating complex patterns from simple rules.
[CELLULAR AUTOMATA] What does the state “S” typically represent in a disease spread simulation?
Susceptible individuals
In epidemic simulations, cells often follow the SIR model: Susceptible (S), Infected (I), and Recovered (R). “S” represents individuals who are currently healthy but susceptible to catching the disease if exposed to infected neighbors.
[CELLULAR AUTOMATA] What is the “neighborhood” concept in cellular automata?
The set of cells that influence a cell’s next state
The neighborhood in cellular automata defines which surrounding cells affect the future state of a given cell. Common neighborhoods include the von Neumann neighborhood (the four orthogonally adjacent cells) and the Moore neighborhood (all eight surrounding cells).
[CELLULAR AUTOMATA] What is Conway’s Game of Life?
A famous cellular automaton with simple rules that create complex patterns
Conway’s Game of Life is one of the most well-known cellular automata. It follows simple rules: a dead cell with exactly three live neighbors becomes alive (birth); a live cell with two or three live neighbors stays alive (survival); otherwise, cells die or remain dead. These simple rules create remarkably complex and sometimes unpredictable patterns.
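The rules above fit in a few lines of Python; this sketch represents the board as a set of live-cell coordinates on an unbounded grid:

```python
from collections import Counter

def life_step(live):
    """One Game of Life generation. `live` is a set of (x, y) live cells."""
    # Count live neighbours of every cell adjacent to at least one live cell.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 1), (1, 1), (2, 1)}
print(life_step(blinker))  # {(1, 0), (1, 1), (1, 2)}
```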
[CELLULAR AUTOMATA] What does “infection probability” represent in epidemic spread simulations?
The likelihood that an infected cell will transmit the disease to a neighboring susceptible cell
Infection probability is a parameter that determines how likely it is for a susceptible individual (cell) to become infected when exposed to an infected neighbor. This parameter models the transmissibility of the disease and significantly affects how quickly the epidemic spreads in the simulation.
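One plausible way to code this up (the S/I/R states, von Neumann neighbourhood, and parameter values are modelling choices, not the only option):

```python
import random

S, I, R = "S", "I", "R"

def sir_step(grid, p_infect=0.3, p_recover=0.1, rng=random.random):
    """One synchronous update of an SIR cellular automaton.
    Each susceptible cell gets one infection chance per infected neighbour."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == S:
                for nr, nc in ((r-1, c), (r+1, c), (r, c-1), (r, c+1)):
                    if (0 <= nr < rows and 0 <= nc < cols
                            and grid[nr][nc] == I and rng() < p_infect):
                        new[r][c] = I
                        break
            elif grid[r][c] == I and rng() < p_recover:
                new[r][c] = R
    return new

grid = [[S, S, S], [S, I, S], [S, S, S]]
grid = sir_step(grid, p_infect=1.0, p_recover=0.0)  # deterministic spread
```

With `p_infect=1.0` the infection spreads to all four von Neumann neighbours in one step; lowering it slows the epidemic front.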
[CELLULAR AUTOMATA] How might vaccination be modeled as an intervention strategy in epidemic simulations?
By making some cells immune to infection (transitioning directly to recovered state)
Vaccination can be modeled by transitioning some susceptible cells directly to a recovered or immune state, bypassing the infected state. This simulates how vaccination protects individuals from becoming infected even when exposed to the disease.
[CELLULAR AUTOMATA] What state is NOT typically included in the basic SIR model for epidemic spread?
Treated
The standard SIR model includes Susceptible (S), Infected (I), and Recovered (R) states. “Treated” is not part of the basic model, though more complex extensions may add treatment as a separate compartment or process.

[CELLULAR AUTOMATA] What visualization technique works best for showing disease progression in cellular automata models?
A grid with different colors representing different cell states at each time step
Visualizing the grid directly with color-coded cells (e.g., green for susceptible, red for infected, blue for recovered) provides an intuitive way to see how the disease spreads spatially over time in the cellular automaton model.
[CELLULAR AUTOMATA] What additional state is often added to extend the basic SIR model for greater realism?
Exposed
The SEIR model adds an “Exposed” state between Susceptible and Infected to represent individuals who have contracted the disease but are not yet infectious. This better models diseases with an incubation period before symptoms and infectiousness develop.
[CELLULAR AUTOMATA] What key advantage do cellular automata offer for epidemic modeling?
They can model spatial dynamics and local interactions in disease spread
Cellular automata excel at modeling how diseases spread through local interactions in space, which is often crucial for realistic epidemic modeling. They capture neighborhood effects and can show how geographic constraints affect transmission patterns.
[CELLULAR AUTOMATA] What is a von Neumann neighborhood in cellular automata?
The four orthogonally adjacent cells (north, east, south, west)
The von Neumann neighborhood includes only the four cells sharing an edge with the central cell (the cells to the north, east, south, and west). This is one of the common neighborhood definitions used in cellular automata to determine which cells influence a given cell’s next state.
[CELLULAR AUTOMATA] What is a Moore neighborhood in cellular automata?
All eight surrounding cells (including diagonals)
The Moore neighborhood includes all eight cells surrounding the central cell (north, northeast, east, southeast, south, southwest, west, and northwest). This more comprehensive neighborhood definition is often used in cellular automata when diagonal interactions are relevant to the model.
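The two neighbourhood definitions side by side, for a cell at `(x, y)`:

```python
def von_neumann(x, y):
    """The four edge-adjacent cells (N, E, S, W)."""
    return {(x, y - 1), (x + 1, y), (x, y + 1), (x - 1, y)}

def moore(x, y):
    """All eight surrounding cells, diagonals included."""
    return {(x + dx, y + dy)
            for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)}

print(len(von_neumann(0, 0)), len(moore(0, 0)))  # 4 8
```

Every von Neumann neighbour is also a Moore neighbour; the Moore set adds the four diagonals.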
[CELLULAR AUTOMATA] What does R0 (R-naught) represent in epidemic simulations?
The basic reproduction number - how many new infections each infected individual causes
R0 (pronounced “R-naught”) is the basic reproduction number that represents the expected number of secondary infections caused by each infected individual in a completely susceptible population. It’s a key parameter in epidemiology - values greater than 1 indicate that an epidemic will grow.
[CELLULAR AUTOMATA] What does a cell transitioning from “infected” to “recovered” represent in an epidemic model?
An individual recovering from the disease and gaining immunity
This transition represents an infected individual recovering from the disease. In most SIR models, recovered individuals gain immunity and cannot be infected again (though more complex models might include waning immunity where recovered individuals can return to susceptible after some time).
[CELLULAR AUTOMATA] What is a lattice model in cellular automata?
A mathematical framework where cells are arranged in a regular grid pattern
A lattice model in cellular automata refers to the arrangement of cells in a regular, repeating pattern (lattice) such as a square grid or hexagonal grid. This structure provides a spatial framework where each cell has a well-defined neighborhood of adjacent cells with which it can interact.
[GENETIC ALGORITHMS] What inspired the development of genetic algorithms?
Biological evolution and natural selection
Genetic algorithms draw inspiration from Darwin’s theory of evolution and natural selection. They mimic the process of natural selection where the fittest individuals are selected for reproduction to produce offspring for the next generation, gradually improving the solution.
[GENETIC ALGORITHMS] What is the “fitness function” in genetic algorithms?
A function that evaluates how close a solution is to the optimal solution
The fitness function assigns a fitness score to each candidate solution, indicating how close it is to solving the problem optimally. Solutions with higher fitness scores are more likely to be selected for reproduction, driving the evolution toward better solutions.
[GENETIC ALGORITHMS] What does “selection” refer to in genetic algorithms?
The process of selecting which candidate solutions will reproduce
Selection is the process that determines which individuals from the current population will be chosen to reproduce and create offspring for the next generation. Typically, individuals with higher fitness scores have a higher probability of being selected, simulating “survival of the fittest.”
[GENETIC ALGORITHMS] What is crossover in genetic algorithms?
The exchange of genetic material between two parent solutions to create offspring
Crossover (or recombination) is a genetic operation that combines parts of two parent solutions to create one or more offspring solutions. It mimics biological reproduction and allows the algorithm to explore new areas of the solution space by combining features of successful solutions.
[GENETIC ALGORITHMS] What purpose does mutation serve in genetic algorithms?
To introduce random small changes to maintain genetic diversity
Mutation randomly alters small parts of a solution, introducing new genetic material that might not have been present in the initial population. This helps maintain genetic diversity, prevents premature convergence to suboptimal solutions, and allows exploration of new regions of the solution space.
[GENETIC ALGORITHMS] What does a “chromosome” represent in resource allocation using genetic algorithms?
A complete candidate solution representing one possible resource allocation
In genetic algorithms, a chromosome represents one complete candidate solution to the problem. For resource allocation, a chromosome might encode which resources are assigned to which projects, with each “gene” in the chromosome representing a specific resource assignment.
[GENETIC ALGORITHMS] What is a constraint in resource allocation problems?
A restriction on resources that valid solutions must satisfy
Constraints in resource allocation represent limitations on how resources can be used or distributed. For example, constraints might include limited total budget, personnel availability, or time restrictions. Valid solutions must satisfy all constraints to be considered feasible.
[GENETIC ALGORITHMS] What does “elitism” mean in genetic algorithms?
A strategy to ensure the best solutions are preserved between generations
Elitism is a strategy where the best solutions from each generation are guaranteed to survive unchanged into the next generation. This ensures that the best-found solutions aren’t lost due to crossover or mutation, preventing the algorithm from losing ground on solution quality.
[GENETIC ALGORITHMS] What is a generation in genetic algorithms?
A complete iteration of the algorithm, producing a new population
A generation in genetic algorithms represents one complete cycle of selection, crossover, mutation, and replacement, resulting in a new population. The algorithm runs for multiple generations, with each generation ideally producing populations with improved fitness.
[GENETIC ALGORITHMS] How do genetic algorithms typically handle infeasible solutions in resource allocation?
By assigning penalty values to reduce their fitness score
When solutions violate constraints, they are considered infeasible. Rather than discarding them entirely, genetic algorithms often apply penalty values that reduce their fitness score proportionally to the severity of constraint violation. This allows promising but slightly infeasible solutions to still contribute genetic material to future generations.
[GENETIC ALGORITHMS] What might a “gene” represent in genetic algorithms for resource allocation?
A specific resource assignment decision
In genetic algorithms for resource allocation, each gene typically represents a specific decision about resource assignment, such as which employee is assigned to which project, or how much of a budget is allocated to a particular activity. The complete chromosome (collection of genes) represents the entire resource allocation plan.
[GENETIC ALGORITHMS] What does the population size parameter control in genetic algorithms?
How many individual solutions are evaluated in each generation
Population size determines how many individual candidate solutions (chromosomes) exist in each generation of the genetic algorithm. A larger population provides more genetic diversity and exploration of the solution space but requires more computation per generation.
[GENETIC ALGORITHMS] What is mutation rate in genetic algorithms?
The probability of random changes being applied to genes
The mutation rate determines how likely it is for each gene (or bit) in a solution to be randomly altered during the mutation phase. A higher mutation rate increases exploration of the solution space but can disrupt good solutions; a lower rate provides more stability but might lead to premature convergence.
[GENETIC ALGORITHMS] What is roulette wheel selection in genetic algorithms?
A selection method where each solution’s chance of being selected is proportional to its fitness
Roulette wheel selection (also called fitness proportionate selection) is a method where the probability of selecting a solution for reproduction is proportional to its fitness. Conceptually, it’s like having a roulette wheel where better solutions have larger sections, giving them a higher chance of being selected.
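A sketch of the spin: draw a random point on the wheel and walk through the population until the cumulative fitness passes it (assumes non-negative fitness values):

```python
import random

def roulette_select(population, fitnesses, rng=random.uniform):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    pick = rng(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if pick <= running:
            return individual
    return population[-1]  # guard against floating-point round-off
```

With fitnesses [1, 2, 3], the third individual owns half the wheel and is selected about half the time.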
[GENETIC ALGORITHMS] What is tournament selection in genetic algorithms?
A selection method that randomly picks several solutions and selects the best among them
Tournament selection works by randomly selecting a small group of individuals from the population (the “tournament”), and then choosing the best individual from this group for reproduction. This process is repeated until enough parents are selected. It provides selection pressure toward better solutions while maintaining diversity.
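The whole mechanism is a two-liner (tournament size `k=3` is a common default, not a fixed rule):

```python
import random

def tournament_select(population, fitness, k=3, rng=random.sample):
    """Pick k random individuals and return the fittest of them."""
    contestants = rng(population, k)
    return max(contestants, key=fitness)
```

Larger `k` increases selection pressure (the winner is more likely to be a genuinely strong individual); `k=1` degenerates to uniform random selection.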
[GENETIC ALGORITHMS] What is convergence in genetic algorithms?
When the population stops improving significantly and contains similar solutions
Convergence in genetic algorithms occurs when the population has evolved to a point where most individuals are very similar, and further evolution produces minimal improvement in fitness. This indicates the algorithm has settled on a solution (hopefully close to optimal), though it might also represent premature convergence to a suboptimal solution.
[GENETIC ALGORITHMS] What is single-point crossover in genetic algorithms?
A reproduction method that selects a single point in the chromosome where genetic material is exchanged
Single-point crossover is a genetic operation where a single crossover point is selected on both parent chromosomes. All genetic material beyond that point is exchanged between the parents to create two offspring. One offspring gets the first part from parent A and the second part from parent B, while the other offspring gets the inverse.
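In code (shown here with a fixed crossover point; in practice the point is chosen at random):

```python
def single_point_crossover(parent_a, parent_b, point):
    """Swap everything after `point` between the two parents."""
    child1 = parent_a[:point] + parent_b[point:]
    child2 = parent_b[:point] + parent_a[point:]
    return child1, child2

a, b = [1, 1, 1, 1], [0, 0, 0, 0]
print(single_point_crossover(a, b, 2))  # ([1, 1, 0, 0], [0, 0, 1, 1])
```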
[GENETIC ALGORITHMS] What purpose does a repair function serve in genetic algorithms for resource allocation?
To modify infeasible solutions to make them feasible
A repair function in genetic algorithms takes infeasible solutions (those that violate constraints) and attempts to modify them to make them feasible while preserving as much of the original solution as possible. This can be an alternative or complement to penalty functions for handling constraints.
[GENETIC ALGORITHMS] What is multi-objective optimization in resource allocation?
Finding solutions that balance multiple competing objectives simultaneously
Multi-objective optimization addresses problems with multiple, often competing objectives. In resource allocation, these might include maximizing total benefit, minimizing costs, ensuring equitable distribution, and maximizing utilization rates. Rather than a single optimal solution, multi-objective optimization often produces a set of Pareto-optimal solutions representing different trade-offs.
[AGENT-BASED MODELING] What is an agent in agent-based modeling?
An autonomous entity that follows rules and interacts with its environment
In agent-based modeling, agents are autonomous, decision-making entities that follow defined behavioral rules. Each agent can interact with other agents and the environment, make decisions based on its current state and surroundings, and adapt its behavior based on these interactions.
[AGENT-BASED MODELING] What key characteristic defines agent-based modeling?
It models system behavior emerging from interactions of individual agents
A fundamental characteristic of agent-based modeling is emergence—complex system-level behaviors and patterns that emerge from the interactions of individual agents following relatively simple rules, rather than being explicitly programmed into the model.
[AGENT-BASED MODELING] What information might an agent’s “state” include in autonomous vehicle path planning?
The vehicle’s location, velocity, direction, and environmental awareness
An agent’s state typically encompasses all relevant information about the agent at a given time. For an autonomous vehicle, this would include its current position, speed, direction of travel, awareness of nearby obstacles or other vehicles, and possibly its planned route.
[AGENT-BASED MODELING] What pathfinding approach is commonly used in agent-based models for autonomous vehicles?
Using algorithms like A* to find efficient paths while avoiding obstacles
Algorithms like A* are commonly used for path planning because they can efficiently find optimal or near-optimal paths while avoiding obstacles. These algorithms consider both the distance traveled so far and the estimated distance to the goal when evaluating potential paths.
[AGENT-BASED MODELING] How do agents make decisions in traffic flow simulations?
By making independent decisions based on local rules and perception of their environment
In agent-based traffic simulations, vehicle agents typically make independent decisions based on their perception of the local environment (e.g., nearby vehicles, traffic signals, road conditions) and a set of rules (e.g., maintain safe distance, obey speed limits, change lanes when beneficial).
[AGENT-BASED MODELING] What is “emergence” in agent-based modeling?
Complex patterns and behaviors that arise from simple agent rules and interactions
Emergence refers to the complex, system-level patterns and behaviors that weren’t explicitly programmed but arise naturally from the interactions of agents following relatively simple rules. Traffic jams, flocking patterns, and market dynamics are examples of emergent phenomena in agent-based models.
[AGENT-BASED MODELING] What do lane-switching rules control in traffic flow simulations?
When and how vehicle agents decide to change from one lane to another
Lane-switching rules define the conditions under which a vehicle agent will attempt to change lanes and the process it follows to do so. These might consider factors like current speed, desired speed, presence of slower vehicles ahead, available gaps in adjacent lanes, and safety constraints.
[AGENT-BASED MODELING] What advantage does agent-based modeling have over equation-based modeling for traffic simulation?
It can capture heterogeneity and individual decision-making
Agent-based models excel at representing diverse agent types with different behaviors (heterogeneity) and modeling individual decision-making processes. This allows for more realistic representations of traffic, where different drivers have different behaviors, preferences, and goals.
[AGENT-BASED MODELING] What is an “obstacle” in agent-based path planning?
A prohibited location that the agent must avoid
In path planning, obstacles represent areas or objects in the environment that the agent cannot pass through and must navigate around. These could include physical barriers, restricted zones, or other agents occupying space that must be avoided.
[AGENT-BASED MODELING] What is a “grid” in agent-based path planning?
A discretized representation of the environment in which agents navigate
In many agent-based models, the environment is represented as a grid of cells or locations. This discretization simplifies navigation and collision detection, allowing agents to move from cell to cell while following their decision rules within a structured environment.
[AGENT-BASED MODELING] What is the exploration vs. exploitation trade-off in path planning?
Deciding whether to search new areas or follow the best-known path
The exploration vs. exploitation trade-off refers to balancing between exploring unknown areas that might contain better paths (exploration) and following the best path found so far (exploitation). Too much exploration wastes time, while too much exploitation might miss better solutions.
[AGENT-BASED MODELING] What causes emergent congestion in traffic flow simulations?
Complex traffic patterns emerging from individual vehicle behaviors and interactions
Emergent congestion occurs when traffic jams form without any specific programmed cause, but rather as an emergent property of many vehicles following simple rules (like slowing when too close to the vehicle ahead). This mirrors real-world traffic where jams can form without accidents or bottlenecks.
[AGENT-BASED MODELING] What is the A* (A-star) algorithm used for in agent-based models?
An informed search algorithm that finds the least-cost path from start to goal
A* (A-star) is a popular pathfinding algorithm that combines the strengths of Dijkstra’s algorithm (guaranteed to find the shortest path) and greedy best-first search (efficiency through heuristic guidance). It uses a heuristic function to estimate the cost to the goal and prioritizes exploring paths that appear promising.
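A compact (and unoptimised) A* sketch on a 4-connected grid, using Manhattan distance as the heuristic; the grid layout is invented for illustration:

```python
import heapq

def a_star(grid, start, goal):
    """A* on a 4-connected grid; 1 marks an obstacle, 0 is free.
    Uses Manhattan distance as the (admissible) heuristic."""
    def h(p):
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    frontier = [(h(start), 0, start, [start])]  # (f = g + h, g, cell, path)
    seen = set()
    while frontier:
        f, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                heapq.heappush(frontier,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc),
                                path + [(nr, nc)]))
    return None  # no path exists

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
print(a_star(grid, (0, 0), (2, 0)))  # routes around the wall
```

The priority `f = g + h` is exactly the combination the card describes: cost so far plus an optimistic estimate of cost to go.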
[AGENT-BASED MODELING] What is Manhattan distance in the context of path planning?
The sum of horizontal and vertical distances between two points
Manhattan distance (also called taxicab distance) calculates the distance between two points by summing the absolute differences of their coordinates. It’s called Manhattan distance because it’s like navigating in a grid-based city where you can only travel along streets (horizontally and vertically), not diagonally through buildings.
[AGENT-BASED MODELING] What is path smoothing in autonomous vehicle simulations?
A post-processing step to make generated paths more naturally drivable
Path smoothing is a technique applied to paths generated by grid-based or waypoint-based planning algorithms to make them more natural and drivable. It removes unnecessary zigzags and sharp turns, creating smoother curves that a vehicle can follow more efficiently and comfortably.
[AGENT-BASED MODELING] What collision avoidance approach might be used in autonomous vehicle simulations?
Rules that adjust speed and direction based on proximity to obstacles and other vehicles
Agent-based models for autonomous vehicles typically implement collision avoidance through rules that adjust the vehicle’s behavior based on sensing its environment. These might include maintaining safe distances, reducing speed when obstacles are ahead, changing lanes to avoid stationary objects, or following traffic rules.
[AGENT-BASED MODELING] What is a car following model in traffic simulation?
Rules governing how vehicles adjust their speed based on the vehicle ahead
Car following models are sets of rules determining how vehicles adjust their speed in response to the vehicle ahead of them. These models capture driver behaviors like maintaining safe following distances, adjusting speed to match the leading vehicle, and reactions to braking or acceleration of the leading vehicle.
[AGENT-BASED MODELING] What is self-organization in agent-based models?
The process of ordered patterns emerging from local interactions without central control
Self-organization refers to the process where ordered, structured patterns at the system level emerge solely from numerous local interactions among agents, without centralized control or external direction. Examples include traffic patterns, flocking behaviors, and market dynamics that arise from individual agents following simple local rules.
[AGENT-BASED MODELING] What is stigmergy in agent-based models?
Indirect coordination between agents through modifications to their environment
Stigmergy is a mechanism of indirect coordination where agents leave traces in the environment that influence the behavior of other agents. For example, in ant colony optimization algorithms, virtual “ants” leave pheromone trails that guide other ants, allowing coordination without direct communication.
[ADVANCED CONCEPTS] What is a traffic wave or phantom traffic jam?
A congestion pattern that forms and propagates without an obvious cause
Traffic waves or phantom traffic jams are congestion patterns that emerge spontaneously in traffic flow, often due to small variations in driving behavior being amplified. These waves propagate backward through traffic, even though there might be no accident or obstruction causing them—an emergent phenomenon successfully captured by agent-based traffic models.
[ADVANCED CONCEPTS] What is the curse of dimensionality in recommendation systems?
The tendency for distance-based algorithms to become less effective in high-dimensional spaces
The curse of dimensionality refers to various phenomena that arise when analyzing data in high-dimensional spaces. In recommendation systems with many attributes, distance metrics become less meaningful as most points appear almost equidistant from each other, making similarity-based methods less effective unless properly managed.
[ADVANCED CONCEPTS] What is hybrid filtering in recommendation systems?
A combination of collaborative and content-based filtering approaches
Hybrid filtering combines multiple recommendation techniques, typically collaborative filtering and content-based filtering, to leverage the strengths of each approach. This can help overcome limitations of individual methods (like the cold start problem) and provide more robust and accurate recommendations.
[ADVANCED CONCEPTS] What is importance sampling in agent-based models?
A statistical method for efficiently exploring uncertain or rare events
Importance sampling is a statistical technique used in some agent-based models to efficiently simulate rare but significant events. Rather than running many simulations where the rare event seldom occurs, the model is designed to make the event occur more frequently, with results then mathematically adjusted to reflect the true probability.
[ADVANCED CONCEPTS] What is Dynamic Time Warping used for in AI applications?
Measuring similarity between temporal sequences that may vary in speed
Dynamic Time Warping (DTW) is an algorithm for measuring similarity between two temporal sequences which may vary in speed. It can be used in recommendation systems analyzing time series data, or in comparing trajectory patterns in agent-based models, allowing for non-linear alignments of time series.
[ADVANCED CONCEPTS] What is social distancing in epidemic simulations?
An intervention that reduces contact between individuals to lower transmission
Social distancing in epidemic simulations reflects real-world interventions that reduce contact between individuals, thereby lowering disease transmission. It might be modeled by reducing the probability of transmission between neighboring cells or by increasing the distance required for transmission to occur.
[ADVANCED CONCEPTS] What is a greedy algorithm in the context of the TSP?
An algorithm that makes the locally optimal choice at each step
A greedy algorithm makes the choice that looks best at the moment without considering the future. For TSP, the Nearest Neighbor heuristic is a greedy approach because it always selects the closest unvisited city next, regardless of whether this choice leads to a good overall route.
[ADVANCED CONCEPTS] What is the two-opt heuristic used for in the TSP?
A local improvement method that swaps two edges to create a better tour
The two-opt heuristic is a local search technique that tries to improve a TSP tour by removing two edges and reconnecting the resulting path fragments in a different way. If the new tour is shorter, it’s accepted as an improvement. This process continues until no more improvements can be found.
[ADVANCED CONCEPTS] What does “overconstrained” mean in resource allocation problems?
When constraints are so restrictive that no feasible solution exists
A resource allocation problem is overconstrained when the constraints are so strict that it’s impossible to find a solution that satisfies all of them simultaneously. In such cases, the problem formulation might need to be revised, or techniques for finding approximately feasible solutions must be employed.
[ADVANCED CONCEPTS] What is a directed graph in path planning?
A network where connections between locations have direction (one-way vs. two-way)
A directed graph (or digraph) is a network where the connections (edges) between nodes have direction. In path planning, this represents scenarios where travel between locations might be one-way (like one-way streets in traffic modeling) rather than bidirectional.
[ADVANCED CONCEPTS] What visualization technique is commonly used to display intensity values in AI applications?
Heatmaps with color gradients representing data intensity
Heatmaps use color gradients to represent intensity values across a grid. In AI applications, they might visualize epidemic spread intensity, traffic density, resource allocation effectiveness, or recommendation strength patterns. The color intensity (often from cool to warm colors) corresponds to the data values.