Week 10: Particle Swarm Algorithms #2 Flashcards

1
Q

What are some applications of PSO algorithms?

A
  • FPGA placement
  • Neural network training
  • Robot motion planning
  • Clustering
2
Q

How does PSO compare to SA in terms of convergence?

What about for problems with larger dimensionality?

A
  • PSO has faster convergence than SA
  • For problems with larger dimensionality, PSO tends to get stuck in local minima

3
Q

How does PSO compare to SA in terms of initial solution?

What about for computational cost?

A
  • PSO starts from a better solution than SA because it initializes many particles
  • PSO also has a higher computational cost than SA because it must update an entire swarm
4
Q

What is the motivation behind using adaptive PSO in terms of parameters?

A
  • The purpose is to have a user call PSO without having to specify any parameters
  • The adaptive algorithm would tune the parameters according to feedback from the environment
5
Q

In Adaptive PSO, what does a “tribe” refer to?

A
  • A tribe is a group of connected particles
6
Q

In Adaptive PSO:

What do tribes need in between them?

What is the reason for this?

A
  • All the tribes should have some type of connection between them to inform one another of their findings
  • This helps in deciding which of the solutions found by the different tribes is the global minimum
7
Q

In PSO, what does a “good” particle mean?

What does “G” refer to in the context of a tribe?

A
  • A “good” particle is one whose pbest improved in the last iteration; otherwise it is “neutral”
  • “G” is the number of good particles in a tribe
8
Q

In Adaptive PSO, what does each particle memorize?

What is an “excellent” particle known as?

A
  • Each particle memorizes the last two performance variations
  • A particle with both variations as improvements is an excellent particle
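These definitions can be sketched directly in code. A minimal sketch, assuming a boolean encoding of the memorized variations and a function name of our own choosing:

```python
def particle_status(older_improved, latest_improved):
    """Classify a particle from its last two memorized performance
    variations (True = pbest improved in that iteration)."""
    if older_improved and latest_improved:
        return "excellent"   # both memorized variations are improvements
    if latest_improved:
        return "good"        # pbest improved in the last iteration
    return "neutral"         # no improvement in the last iteration
```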
9
Q

In Adaptive PSO, under what condition is a tribe marked as “good” or “bad”?

A

A tribe is marked as “good” depending on the value of “G”:

r = uniform random number between 0 and 1
T = number of particles in the tribe

If (r < G/T): tribe is “good”
otherwise: tribe is “bad”
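The rule translates directly to code. A sketch, with the random source passed in as a parameter (an assumption of ours, made so the rule is testable):

```python
import random

def tribe_is_good(num_good, tribe_size, rng=random):
    """Return True ('good') with probability G/T, else False ('bad')."""
    r = rng.uniform(0.0, 1.0)          # r = uniform random number in [0, 1)
    return r < num_good / tribe_size   # good iff r < G/T
```

A tribe with many good particles (large G/T) is therefore likely, but not guaranteed, to be labeled good.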

10
Q

In Adaptive PSO, what happens at the end of a generation for:

  1. A good tribe
  2. A bad tribe
A
  1. A good tribe deletes its worst particle to conserve the number of performed function evaluations
  2. Each bad tribe generates a new random particle, and all the new particles together form a new tribe
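A minimal sketch of these end-of-generation rules; the tribe representation (dicts with a "good" flag), minimization, and the function name are all illustrative assumptions:

```python
import random

def end_of_generation(tribes, fitness, bounds, rng=random):
    """Per-generation tribe rules (minimization assumed): good tribes
    delete their worst particle, each bad tribe spawns one random
    particle, and all spawned particles form a new tribe."""
    spawned = []
    for tribe in tribes:
        if tribe["good"]:
            # delete the worst (highest-fitness) particle to save evaluations
            worst = max(tribe["particles"], key=fitness)
            tribe["particles"].remove(worst)
        else:
            # generate a fresh random particle within the search bounds
            spawned.append([rng.uniform(lo, hi) for lo, hi in bounds])
    if spawned:
        tribes.append({"good": False, "particles": spawned})
    return tribes
```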
11
Q

In Adaptive PSO, what’s the general idea behind forming new tribes?

A
  • All new particles form a new tribe
  • Each particle gets connected to the tribe that generated it through its best particle
  • The idea is to start with a single particle, and then generate other particles to form other tribes accordingly
12
Q

What happens in Adaptive PSO in terms of tribe creation if:

  1. The algorithm isn’t performing well?
  2. The algorithm is performing very well?
A
  1. Larger and larger tribes will be generated to increase the swarm’s search power
  2. Good tribes will start to occur and will remove their worst particles, shrinking tribe sizes, possibly to complete extinction
13
Q

What are the 3 general approaches for cooperative PSO?

A
  • Concurrent PSO
  • Cooperative PSO
  • Hybrid Cooperative PSO
14
Q

Describe the Concurrent PSO approach

A
  • Two different swarms are updated in parallel, both using different algorithms
  • The swarms exchange their gbest values every pre-determined number of iterations
  • Both swarms track the better gbest
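A minimal sketch of that exchange, treating the two different update algorithms as opaque `step` functions and assuming minimization:

```python
def concurrent_pso(swarm_a, swarm_b, step_a, step_b, fitness, iters, k):
    """Run two swarms with different update algorithms and swap gbest
    every k iterations; both swarms then track the better gbest."""
    gbest_a = min(swarm_a, key=fitness)
    gbest_b = min(swarm_b, key=fitness)
    for it in range(1, iters + 1):
        gbest_a = step_a(swarm_a, gbest_a)   # algorithm 1 updates swarm A
        gbest_b = step_b(swarm_b, gbest_b)   # algorithm 2 updates swarm B
        if it % k == 0:                      # exchange point
            better = min(gbest_a, gbest_b, key=fitness)
            gbest_a = gbest_b = better       # both track the better gbest
    return min(gbest_a, gbest_b, key=fitness)
```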
15
Q

Describe the Cooperative PSO approach (CPSO)

A
  • Have different swarms optimizing different variables of the problem (different dimensions of the solution)
  • The fitness of any particle is determined by its value and the value of the best particles in all other swarms
  • Performs best if the problem variables are independent
  • In order to generate a solution, combine the highest fitness individuals from each swarm
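The fitness evaluation above could look like this sketch, where the function name and the list-based context vector are assumptions:

```python
def cpso_fitness(f, context, dim, value):
    """Score one particle of the sub-swarm for dimension `dim`: plug its
    value into the context vector built from the best particles of all
    other sub-swarms, then evaluate the full solution."""
    trial = list(context)   # best components from all other swarms
    trial[dim] = value      # only this swarm's dimension changes
    return f(trial)
```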
16
Q

In the Hybrid Cooperative PSO approach (CPSO_H), two swarms are updated serially: one swarm uses the normal PSO algorithm, and the other uses CPSO (Cooperative PSO) with sub-swarms.

How does the algorithm work?

A
  • Each swarm is updated for one iteration only
  • When the PSO swarm gets updated, its gbest values are sent to the CPSO swarm
  • The CPSO swarm uses the elements of the received gbest to update random particles of its sub-swarms
  • After CPSO gets updated, it sends its context vector (best solution) back to the PSO swarm
  • The PSO swarm uses the received context vector (best solution) to replace a randomly chosen particle
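The hand-off between the two swarms could be sketched as below; the data layout (PSO particles as lists, one scalar sub-swarm per dimension), the greedy context-vector construction, and minimization are all assumptions, and the actual velocity/position updates are elided:

```python
import random

def cpso_h_step(pso_swarm, sub_swarms, fitness, rng=random):
    """One CPSO_H exchange between the PSO swarm and the CPSO sub-swarms."""
    # PSO -> CPSO: elements of the PSO gbest overwrite random particles
    # of the corresponding sub-swarms
    gbest = min(pso_swarm, key=fitness)
    for d, sub in enumerate(sub_swarms):
        sub[rng.randrange(len(sub))] = gbest[d]
    # CPSO side: build the context vector one dimension at a time,
    # scoring each candidate value inside the current context
    context = list(gbest)
    for d, sub in enumerate(sub_swarms):
        context[d] = min(
            sub, key=lambda v: fitness(context[:d] + [v] + context[d + 1:])
        )
    # CPSO -> PSO: the context vector replaces a random PSO particle
    pso_swarm[rng.randrange(len(pso_swarm))] = list(context)
    return context
```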