THEORY 3 Flashcards

1
Q

Objective functions, design variables and constraints: definitions. What is the Pareto optimal set?

A

An objective function is the quantity to be minimised or maximised; design variables are the parameters the designer is free to change; constraints are conditions (equalities or inequalities) the design variables must satisfy. The Pareto optimal set is the set of feasible solutions that are not dominated by any other feasible solution: no objective can be improved without worsening at least one other objective.

2
Q

Number of solutions of an optimization problem

A

An optimization problem may have no solution (the feasible set is empty, or the objective is unbounded), exactly one solution (e.g. a strictly convex objective on a convex feasible set), or many solutions attaining the same optimal value. In multiobjective problems the solution is in general a whole set (the Pareto set) rather than a single point.

3
Q

Exhaustive methods

A

Exhaustive (enumerative) methods discretise the design domain and evaluate the objective function at every grid point, keeping the best one. They are simple and guarantee the optimum on the grid, but the number of evaluations grows exponentially with the number of design variables (curse of dimensionality), so they are practical only for small problems.

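As an illustration of the mechanism and its cost, a minimal exhaustive grid search can be sketched in a few lines (the quadratic test function and the 21-level grid are assumptions chosen for the example):

```python
import itertools

def grid_search(f, bounds, n=21):
    """Evaluate f at every node of a regular grid and return the best point.
    Finds the optimum on the grid, but costs n ** len(bounds) evaluations."""
    axes = [[lo + i * (hi - lo) / (n - 1) for i in range(n)] for lo, hi in bounds]
    return min(itertools.product(*axes), key=f)

# Illustrative test function: minimum of (x-1)^2 + (y+2)^2 is at (1, -2).
best = grid_search(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2,
                   bounds=[(-5, 5), (-5, 5)], n=21)
```

Already with 10 variables at 21 levels each, the grid would require 21**10 (about 1.7e13) evaluations, which is why exhaustive methods are limited to low-dimensional problems.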
4
Q

Single objective optimization / scalar optimization: mathematical definition

A

min f(x), x ∈ R^n, subject to g_j(x) ≤ 0 (j = 1, …, m) and h_k(x) = 0 (k = 1, …, p), where f: R^n → R is a scalar objective function. Maximisation is recovered by minimising −f(x).

5
Q

Definition of global minimum, local minimum, convexity

A

x* is a global minimum if f(x*) ≤ f(x) for every feasible x; it is a local minimum if this holds only for feasible x in a neighbourhood of x*. A set is convex if the segment joining any two of its points lies entirely in the set; a function f is convex if f(αx + (1−α)y) ≤ αf(x) + (1−α)f(y) for all α ∈ [0, 1]. For a convex function on a convex feasible set, every local minimum is also global.

6
Q

Graph (classification) of the main single-objective optimization methods

A

The methods are usually organised in a classification tree: deterministic methods, split by the derivative information they use into zero-order (grid search, pattern search, simplex), first-order (steepest descent, conjugate gradient) and second-order (Newton, quasi-Newton); and stochastic/heuristic methods (random search, simulated annealing, genetic algorithms).

7
Q

Gradient: definition and physical meaning. Taylor’s expansion

A

The gradient ∇f(x) is the vector of first partial derivatives of f. Physically, it points in the direction of steepest local increase of f, and its magnitude is the rate of increase in that direction (−∇f is the steepest descent direction). Taylor's expansion around x: f(x + d) = f(x) + ∇f(x)ᵀd + ½ dᵀH(x)d + higher-order terms, where H is the Hessian; it is the basis of gradient and Newton methods.

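The link between gradient and Taylor expansion can be checked numerically; the smooth test function below is an arbitrary choice for illustration:

```python
import math

def f(x):                       # arbitrary smooth test function
    return math.sin(x) + x ** 2

def df(x):                      # its analytic derivative (1-D gradient)
    return math.cos(x) + 2 * x

x0, h = 0.7, 1e-4
# First-order Taylor expansion: f(x0 + h) = f(x0) + df(x0)*h + O(h^2)
taylor = f(x0) + df(x0) * h
err = abs(f(x0 + h) - taylor)   # should be on the order of h^2
# A central finite difference recovers the gradient with O(h^2) accuracy
num_grad = (f(x0 + h) - f(x0 - h)) / (2 * h)
```

The truncation error shrinking like h^2 is exactly what the second-order Taylor term predicts.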
8
Q

Optimality conditions (unconstrained)

A

First-order necessary condition: ∇f(x*) = 0 (x* is a stationary point). Second-order necessary condition: the Hessian H(x*) is positive semidefinite. Second-order sufficient condition: ∇f(x*) = 0 with H(x*) positive definite implies a strict local minimum.

9
Q

Optimality conditions (constrained): KKT conditions

A

For min f(x) s.t. g_j(x) ≤ 0 and h_k(x) = 0, the KKT conditions at x* are: stationarity, ∇f + Σ λ_j ∇g_j + Σ μ_k ∇h_k = 0; primal feasibility, g_j(x*) ≤ 0 and h_k(x*) = 0; dual feasibility, λ_j ≥ 0; and complementary slackness, λ_j g_j(x*) = 0. They are necessary for optimality under a constraint qualification, and sufficient for convex problems.

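A hand-derived example makes the four conditions concrete (the problem and the candidate point are chosen for illustration):

```python
# KKT check for: min x1^2 + x2^2   s.t.   g(x) = 1 - x1 - x2 <= 0
# Candidate solution derived by hand: x* = (0.5, 0.5) with multiplier lam* = 1.
x, lam = (0.5, 0.5), 1.0
grad_f = (2 * x[0], 2 * x[1])        # gradient of the objective
grad_g = (-1.0, -1.0)                # gradient of the constraint
g = 1 - x[0] - x[1]                  # constraint value (active: g = 0)

stationarity = all(abs(gf + lam * gg) < 1e-12 for gf, gg in zip(grad_f, grad_g))
kkt_ok = (stationarity            # grad_f + lam * grad_g = 0
          and g <= 1e-12          # primal feasibility
          and lam >= 0            # dual feasibility
          and abs(lam * g) < 1e-12)   # complementary slackness
```

All four conditions hold, confirming (0.5, 0.5) as the constrained minimum of this convex problem.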
10
Q

Nonlinear optimization: is it possible to guarantee a global optimum? Why? List the 2 main heuristic rules on which the algorithms are based. What is the general search procedure for optimization problems?

A

In general no: the algorithms use only local information (function values, gradients), so they can only be guaranteed to converge to a local optimum; only for convex problems is a local optimum also global. The two heuristic rules commonly invoked are: (1) improve the current point at every iteration (descent property) and (2) restart the search from several different initial points (multistart) to raise the chance of finding the global optimum. The general search procedure is iterative: choose a starting point x0; at each step compute a search direction d_k and a step length α_k, update x_{k+1} = x_k + α_k d_k, and stop when a convergence criterion is satisfied.

11
Q

5 properties of a good algorithm

A

Generality (applicable to a wide class of problems), reliability/robustness (convergence from different starting points), accuracy of the computed solution, efficiency (low number of function evaluations and CPU time), and ease of use (few tuning parameters).

12
Q

Grid and random methods, Pattern search and Simplex method

A

Grid methods evaluate the objective on a regular grid of the domain; random methods sample it at (pseudo-)random points; both are zero-order and expensive. Pattern search explores around the current point with a fixed pattern of trial moves, expanding or shrinking the step according to success. The (Nelder–Mead) simplex method moves a simplex of n+1 points by reflecting, expanding or contracting its worst vertex until the simplex collapses onto a minimum. None of these methods requires derivatives.

13
Q

Basic descent methods

A

Descent methods generate x_{k+1} = x_k + α_k d_k, where d_k is a descent direction (∇f(x_k)ᵀd_k < 0) and α_k is chosen by a line search. Main examples: steepest descent (d_k = −∇f, robust but slow), Newton's method (d_k = −H⁻¹∇f, quadratic convergence near the solution), quasi-Newton methods that build an approximation of the Hessian, and the conjugate gradient method.

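A minimal steepest-descent loop with a fixed step length can be sketched as follows (the test function, step size and tolerance are illustrative assumptions):

```python
def steepest_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10000):
    """Iterate x <- x - alpha * grad(x) until the gradient norm is small."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if sum(gi * gi for gi in g) ** 0.5 < tol:   # convergence criterion
            break
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# Minimise f(x, y) = (x - 3)^2 + 2*(y + 1)^2; the gradient is written by hand.
sol = steepest_descent(lambda x: [2 * (x[0] - 3), 4 * (x[1] + 1)], [0.0, 0.0])
```

In practice the fixed step alpha is replaced by a line search; too large a step makes the iteration diverge, too small a step wastes iterations.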
14
Q

Penalty methods

A

Penalty methods turn a constrained problem into a sequence of unconstrained ones by adding to the objective a term that penalises constraint violation, e.g. P(x; μ) = f(x) + μ Σ max(0, g_j(x))². Exterior methods start from infeasible points and drive μ → ∞ so the minimisers approach the feasible optimum from outside; interior (barrier) methods add a term that blows up at the constraint boundary (e.g. −μ Σ log(−g_j)) and keep the iterates strictly feasible.

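The exterior quadratic penalty mechanism can be shown on a one-dimensional problem (the crude golden-section minimiser and the penalty schedule are illustrative choices):

```python
def minimize_1d(f, lo=-10.0, hi=10.0, iters=200):
    """Crude golden-section search for a unimodal 1-D function (helper)."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

# Constrained problem: min x^2  s.t.  x >= 1   (true solution x* = 1).
# Exterior quadratic penalty: P(x; mu) = x^2 + mu * max(0, 1 - x)^2
x = None
for mu in [1, 10, 100, 1000]:          # increasing penalty parameter
    x = minimize_1d(lambda t: t * t + mu * max(0.0, 1 - t) ** 2)
```

Each unconstrained minimiser is slightly infeasible (here x = mu/(1+mu) < 1) and approaches the constrained optimum as mu grows, which is the signature behaviour of exterior penalties.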
15
Q

MOP: mathematical definition of the Pareto optimal solution and meaning. Local vs global Pareto optimal solution.

A

A feasible point x* is Pareto optimal if there is no other feasible x with f_i(x) ≤ f_i(x*) for all objectives and f_j(x) < f_j(x*) for at least one: no objective can be improved without worsening another. It is locally Pareto optimal if this holds only within a neighbourhood of x*; a globally Pareto optimal solution is non-dominated over the whole feasible set.

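The dominance definition translates directly into code; the sample point set below is an assumption for the example (objective vectors to be minimised):

```python
def dominates(a, b):
    """a dominates b (minimisation): no worse in every objective, strictly
    better in at least one."""
    return (all(ai <= bi for ai, bi in zip(a, b))
            and any(ai < bi for ai, bi in zip(a, b)))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

pts = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
front = pareto_front(pts)
```

Here (3, 4) is dominated by (2, 3) and (5, 5) by (1, 5), so only the trade-off points survive.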
16
Q

Pareto optimal necessary condition. Ideal vs Nadir solution

A

Necessary condition (Fritz John type): at a Pareto optimum there exist multipliers w_i ≥ 0, not all zero, such that Σ w_i ∇f_i(x*) = 0 (plus constraint terms); the objective gradients cannot all be improved simultaneously. The ideal (utopia) point collects the separate minima of each objective; the nadir point collects the worst value of each objective over the Pareto set. Together they bound the Pareto front and are used to normalise the objectives.

17
Q

Low discrepancy sequences. Definition of uniformity and discrepancy

A

Low-discrepancy (quasi-random) sequences, e.g. Halton and Sobol sequences, fill the design space more uniformly than pseudo-random points. Uniformity means the fraction of points falling in any subvolume is proportional to its measure; discrepancy quantifies the worst-case deviation between that fraction and the subvolume's measure, so low discrepancy means high uniformity.
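A classic one-dimensional example is the van der Corput sequence (base 2 shown here), obtained by mirroring the digits of the index around the radix point:

```python
def van_der_corput(n, base=2):
    """n-th element of the base-b van der Corput low-discrepancy sequence."""
    q, bk = 0.0, 1.0 / base
    while n > 0:
        n, r = divmod(n, base)   # peel off the least significant digit
        q += r * bk              # mirror it after the radix point
        bk /= base
    return q

seq = [van_der_corput(i) for i in range(1, 9)]   # 0.5, 0.25, 0.75, 0.125, ...
```

Each new point lands in the largest gap left by the previous ones, which is exactly the low-discrepancy property; Halton sequences apply the same construction with a different prime base per dimension.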

18
Q

Feasibility and boundedness

A

A problem is feasible if at least one point satisfies all the constraints (the feasible set is non-empty); otherwise it is infeasible and has no solution. It is bounded if the objective cannot decrease indefinitely over the feasible set; an unbounded problem has no finite optimum. A solution exists only if the problem is both feasible and bounded.

19
Q

Scalarization techniques

A

Scalarization turns a multiobjective problem into one or more single-objective ones: the weighted sum (minimise Σ w_i f_i with w_i ≥ 0), the ε-constraint method (minimise one objective while bounding the others by ε_j), and distance-based/goal methods (minimise the distance from the ideal point). Solving for different weights or bounds traces the Pareto front; the weighted sum, however, cannot reach points on non-convex parts of the front.
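A weighted-sum sweep on a tiny convex bi-objective problem illustrates the technique (the two quadratic objectives are an assumption; the scalar subproblem is solved in closed form):

```python
# Bi-objective problem: f1(x) = x^2, f2(x) = (x - 2)^2, both to be minimised.
# For weight w, the scalar problem min w*f1 + (1 - w)*f2 has the closed-form
# solution x = 2*(1 - w)  (set the derivative 2wx + 2(1-w)(x-2) to zero).
front = []
for k in range(11):
    w = k / 10
    x = 2 * (1 - w)
    front.append((x ** 2, (x - 2) ** 2))   # one Pareto point per weight
```

Sweeping w from 0 to 1 moves along the trade-off curve from the minimiser of f2 alone, (4, 0), to the minimiser of f1 alone, (0, 4); because this problem is convex, every front point is reachable this way.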

20
Q

Lagrange multipliers

A

To minimise f subject to equality constraints h_k(x) = 0, build the Lagrangian L(x, μ) = f(x) + Σ μ_k h_k(x). At a constrained optimum ∇_x L = 0 and h_k = 0: the gradient of the objective is a linear combination of the constraint gradients. Each multiplier μ_k measures the sensitivity of the optimal objective value to a relaxation of constraint k.

21
Q

Fritz John optimality conditions

A

At a local minimum of a constrained problem there exist multipliers (λ0, λ1, …, λm) ≥ 0, not all zero, such that λ0 ∇f(x*) + Σ λ_j ∇g_j(x*) = 0 and λ_j g_j(x*) = 0. They generalise the KKT conditions: KKT is the special case λ0 > 0, which is guaranteed under a constraint qualification.

22
Q

Discrete programming

A

Optimization in which some or all design variables are restricted to discrete sets (integers, catalogue values). The feasible set is non-convex and gradients are not available, so continuous methods do not apply directly; typical approaches are enumeration, branch and bound, rounding of continuous relaxations, and stochastic methods such as genetic algorithms.

23
Q

Genetic algorithms: introduction and explanation. Binary coding representation for discrete, continuous and multiple design variables. Description of the process in detail

A

Genetic algorithms are stochastic, population-based search methods inspired by natural evolution. Each candidate design is coded as a chromosome, classically a binary string: a discrete variable is mapped onto the integers coded by n bits, a continuous variable is discretised over its range into 2^n levels, and several design variables are handled by concatenating their strings. The process: generate a random initial population; evaluate the fitness of each individual; select parents with probability increasing with fitness; recombine them by crossover; apply random bit-flip mutation; replace the population and iterate until a termination condition is met.
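The loop above can be sketched as a toy binary-coded GA maximising x^2 over 5-bit integers (population size, rates, tournament selection and the fitness function are illustrative choices, not the course's reference implementation):

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

def fitness(bits):                       # decode the 5-bit chromosome, maximise x^2
    return int("".join(map(str, bits)), 2) ** 2

def tournament(pop):                     # binary tournament selection
    a, b = random.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):                   # single-point crossover
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def mutate(bits, pm=0.05):               # bit-flip mutation
    return [b ^ 1 if random.random() < pm else b for b in bits]

pop = [[random.randint(0, 1) for _ in range(5)] for _ in range(20)]
best = max(pop, key=fitness)
for _ in range(40):                      # generation loop
    pop = [mutate(crossover(tournament(pop), tournament(pop))) for _ in pop]
    best = max(pop + [best], key=fitness)      # keep the best-ever individual
```

The best-ever individual converges towards the chromosome 11111 (x = 31, fitness 961), showing selection pressure plus crossover and mutation at work on a problem small enough to inspect by hand.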

24
Q

Holland's schema theorem: schema properties and the expected number of individuals matching a schema at a given generation

A

A schema is a template that fixes some bit positions and leaves the others free (e.g. 1**0); its order o(H) is the number of fixed positions and its defining length δ(H) the distance between the outermost fixed positions. Holland's schema theorem bounds the expected count of individuals matching schema H: m(H, t+1) ≥ m(H, t) · f(H)/f̄ · [1 − p_c δ(H)/(l−1) − o(H) p_m]. Short, low-order schemata with above-average fitness therefore receive exponentially increasing trials.

25
Q

Constraints in GAs

A

GAs are intrinsically unconstrained searchers, so constraints are handled indirectly: penalty functions that degrade the fitness of infeasible individuals (the most common choice), rejection or repair of infeasible chromosomes, and codings designed so that every string decodes to a feasible design.

26
Q

Termination conditions for GA and no free lunch theorem

A

Typical termination conditions: a maximum number of generations or function evaluations, stagnation of the best fitness over several generations, or loss of diversity in the population; there is no optimality test that can stop the algorithm with a guarantee. The no-free-lunch theorem states that, averaged over all possible problems, every optimization algorithm performs equally well, so no method (GAs included) is universally superior: good performance comes from matching the algorithm to the structure of the problem.

27
Q

Multiobjective optimization with GA. Main advantages. How to assign the fitness and how to guarantee an even distribution?

A

GAs suit multiobjective optimization because they work on a population, so a whole approximation of the Pareto front is obtained in a single run, and they need neither gradients nor convexity. Fitness is assigned by Pareto ranking/non-dominated sorting (individuals are ranked by how many solutions dominate them); an even distribution along the front is promoted by niching/fitness-sharing or crowding-distance measures, which penalise solutions in densely populated regions.

28
Q

Global sensitivity analysis

A

Global sensitivity analysis quantifies how much each design variable contributes to the variability of the outputs over the whole design domain, unlike local, derivative-based sensitivities computed at a single point. The domain is sampled (e.g. Monte Carlo or low-discrepancy sampling) and the output variability is apportioned among the inputs, e.g. with variance-based (Sobol) indices or correlation measures; variables with negligible influence can then be fixed, reducing the size of the problem.

29
Q

Global approximation: least square

A

Global approximation replaces the expensive model with an analytic surrogate y ≈ Σ a_i φ_i(x) fitted to sampled data. The least-squares method chooses the coefficients minimising the sum of squared residuals Σ (y_k − ŷ(x_k))²; setting the derivatives to zero yields the normal equations (XᵀX)a = Xᵀy, solvable when the number of samples is at least the number of coefficients.
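For a straight-line surrogate the normal equations reduce to closed-form sums; the data set below is an illustrative assumption chosen so the fit is exact:

```python
def least_squares_line(xs, ys):
    """Fit y = a*x + b by minimising the sum of squared residuals."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope from normal equations
    b = (sy - a * sx) / n                            # intercept
    return a, b

# Data lying exactly on y = 2x + 1 is recovered exactly.
a, b = least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])
```

With noisy data the same formulas return the line of best fit rather than an interpolant, which is the whole point of least squares as a global approximation.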

30
Q

Machine learning: introduction to neural networks. Structure of an artificial neural network, example of a multilayer feed-forward network, and activation functions.

A

An artificial neural network is a set of simple units (neurons) organised in layers; each neuron computes a weighted sum of its inputs plus a bias and passes it through an activation function. In a multilayer feed-forward network the signal flows from the input layer through one or more hidden layers to the output layer, with no feedback loops, e.g. inputs → hidden layer of sigmoid neurons → linear output. Typical activation functions: sigmoid/logistic, hyperbolic tangent, ReLU, and linear (for outputs).

31
Q

Learning / training. Back propagation

A

Training means adjusting the weights and biases so that the network reproduces known input–output pairs, usually by minimising the sum of squared errors over the training set. Back-propagation computes the gradient of the error with respect to every weight by applying the chain rule backwards from the output layer; the weights are then updated by gradient descent, w ← w − η ∂E/∂w, iterating until the error converges.
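Back-propagation for a single sigmoid neuron trained on the OR function shows the chain rule and the update step in miniature (data set, learning rate and epoch count are illustrative assumptions):

```python
import math
import random

random.seed(1)

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Training data for the OR function (a minimal illustrative dataset).
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = [random.uniform(-1, 1) for _ in range(2)]
b, lr = 0.0, 0.5

for _ in range(2000):
    for x, t in data:
        y = sigmoid(w[0] * x[0] + w[1] * x[1] + b)           # forward pass
        delta = (y - t) * y * (1 - y)                        # dE/dz, E = (y-t)^2/2
        w = [wi - lr * delta * xi for wi, xi in zip(w, x)]   # backward pass
        b -= lr * delta

preds = [round(sigmoid(w[0] * x[0] + w[1] * x[1] + b)) for x, _ in data]
```

A real multilayer network repeats exactly this delta computation layer by layer, propagating each neuron's error signal back through the weights that feed it.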

32
Q

Cross validation and regularization

A

Cross-validation assesses generalisation: the data are split into training and validation subsets (in k-fold CV, k rotating splits) and the model is judged on data it was not trained on, which also guides the choice of model complexity and the stopping point (early stopping). Regularization limits overfitting by penalising large weights, e.g. adding λΣw² to the error function, trading a slightly higher training error for a smoother, more general model.
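The k-fold split can be sketched with plain index arithmetic (the interleaved assignment of samples to folds is an illustrative choice):

```python
def k_fold_indices(n, k):
    """Split indices 0..n-1 into k rotating (train, validation) pairs."""
    folds = [list(range(i, n, k)) for i in range(k)]   # k disjoint folds
    splits = []
    for i in range(k):
        val = folds[i]                                  # fold i validates
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((sorted(train), val))             # the rest trains
    return splits

splits = k_fold_indices(10, 5)   # 5 folds of 2 samples each
```

Every sample is used for validation exactly once, so the averaged validation error is an almost unbiased estimate of the generalisation error.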

33
Q

Architecture of the neural network

A

The architecture is defined by the number of layers, the number of neurons per layer, the activation functions and the connection pattern. The input and output layers are fixed by the problem; the number of hidden neurons is a trade-off: too few underfit, too many overfit and require more training data. It is usually selected by trial and error guided by the cross-validation error.

34
Q

k-optimality: selection of the final design solution

A

When the Pareto set is large, k-optimality helps select the final design: a Pareto-optimal solution is k-optimal if it remains Pareto optimal when any k of the objective functions are removed. Increasing k defines nested subsets of progressively stronger solutions, so the designer can restrict the final choice to the solutions with the highest k.

35
Q

Topology optimization

A

Topology optimization seeks the optimal distribution of material within a given design domain, under prescribed loads and boundary conditions, typically minimising compliance for a given mass fraction. The domain is discretised into finite elements whose relative densities are the design variables (with SIMP-type penalisation of intermediate densities), and the optimizer iterates FE analysis, sensitivity analysis and density update. It is used at the concept-design stage, followed by reinterpretation of the layout and size/shape refinement.

36
Q

Design of experiments: why is an OAT approach not useful? How can a 2-level, 2-design-variable problem be represented graphically? What is screening? Fractional factorial DOE

A

One-at-a-time (OAT) variation explores only a small slice of the domain and cannot detect interactions between variables, so its conclusions depend on the chosen baseline. A 2-level, 2-design-variable plan is drawn as a square whose corners are the 4 combinations of low/high levels (a cube for 3 variables). Screening is a cheap preliminary DOE used to rank the variables and discard the unimportant ones. A fractional factorial DOE runs only a fraction 2^(n−p) of the full 2^n plan, selected through generators, at the price of aliasing (confounding) some interactions with main effects.
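The 2-level plans can be generated mechanically; the half-fraction below uses the generator C = AB (a standard textbook choice, shown for illustration):

```python
import itertools

# Full factorial 2-level design for 3 factors: 2**3 = 8 runs,
# with levels coded as -1 (low) and +1 (high).
full = list(itertools.product([-1, 1], repeat=3))

# Half-fraction 2^(3-1) with generator C = A*B: 4 runs.
# The price is aliasing: the main effect of C is confounded with the
# A-B interaction, since C was defined as their product.
half = [(a, b, a * b) for a, b in itertools.product([-1, 1], repeat=2)]
```

Every run of the half-fraction belongs to the full plan, and each factor still appears at each level the same number of times (balance), which is what keeps the effect estimates usable.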

37
Q

Principles for defining a good fractional factorial DOE

A

Balance (each level of every factor appears the same number of times) and orthogonality (every pair of level combinations of two factors appears equally often), so that the effects can be estimated independently; and generators chosen to give the highest possible resolution, so that main effects are not confounded with low-order interactions.

38
Q

Increasing the number of levels in a DOE for a nonlinear model. Difference between the number of experiments required by a full factorial plan and the actual number of coefficients of the empirical model. Algorithms and methods for a high number of levels.

A

With more than two levels a DOE can capture curvature: a quadratic model in n variables needs (n+1)(n+2)/2 coefficients, while a full factorial plan with l levels requires l^n experiments, far more than the coefficients to be estimated as n grows. Efficient alternatives: central composite and Box–Behnken designs, Latin hypercube sampling, low-discrepancy (quasi-Monte Carlo) sequences, and D-optimal designs that select the most informative subset of runs.

39
Q

Integrated controls

A

a