As illustrated by the introductory example, an optimization problem is typically formulated as the minimization of an objective function with respect to a set of decision variables, subject to a set of constraints. More generally, an arbitrary optimization problem can be written in abstract form as:

$$
\min_{x \in \mathcal{X}} f(x)
$$
This formulation is intentionally general and not yet operational, but it suffices to introduce the core vocabulary and nomenclature of optimization.
## Vocabulary
- $x$: the vector of decision variables (or parameters) over which the optimization is performed.
- $n$: the number of decision variables; also referred to as the dimension of the problem.
- $f$: the objective function, also known as the cost function or criterion, which assigns a scalar value to each candidate solution.
- $\mathcal{X}$: the feasible set or constraint set, which contains all values of $x$ that satisfy the problem's constraints. In practice, $\mathcal{X}$ is often implicitly defined through equality and inequality constraints.
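To make this vocabulary concrete, consider a least-squares problem (an illustrative instance, not one taken from the course):

```latex
% Illustrative least-squares instance of the abstract formulation:
% decision variables x, objective f, feasible set X.
\min_{x \in \mathcal{X}} f(x),
\qquad f(x) = \|Ax - b\|_2^2,
\qquad \mathcal{X} = \mathbb{R}^n,
\qquad A \in \mathbb{R}^{m \times n},\ b \in \mathbb{R}^m.
```

Here $x$ is the decision vector, $n$ its dimension, $f$ the objective, and $\mathcal{X} = \mathbb{R}^n$ indicates that no constraints are imposed.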
## Nomenclature
The space $\mathcal{X}$ can be:
- Finite or discrete: leads to discrete or combinatorial optimization
- Continuous, e.g., $\mathcal{X} \subseteq \mathbb{R}^n$: leads to continuous optimization

If there are no constraints (i.e., $\mathcal{X} = \mathbb{R}^n$), the problem is called unconstrained optimization.
If constraints are present (i.e., $\mathcal{X} \subsetneq \mathbb{R}^n$), the problem is called constrained optimization.
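The distinction can be illustrated with a minimal sketch (the one-dimensional problem and step size below are made up for illustration): minimizing $f(x) = (x - 2)^2$, first without constraints and then over the feasible set $\{x : x \le 1\}$, using plain gradient descent with an optional projection step onto the feasible set.

```python
def grad_descent(grad, x0, step=0.1, iters=200, project=lambda x: x):
    """Gradient descent with an optional projection onto the feasible set."""
    x = x0
    for _ in range(iters):
        x = project(x - step * grad(x))  # take a step, then project back
    return x

grad = lambda x: 2.0 * (x - 2.0)  # f'(x) for f(x) = (x - 2)^2

# Unconstrained: the iterates converge to the minimizer x* = 2.
x_unc = grad_descent(grad, x0=0.0)

# Constrained to x <= 1: projection clips the iterates, giving x* = 1.
x_con = grad_descent(grad, x0=0.0, project=lambda x: min(x, 1.0))

print(round(x_unc, 4), round(x_con, 4))
```

The projection here is a simple example of how a constraint can change the solution: the unconstrained minimizer lies outside the feasible set, so the constrained optimum sits on its boundary.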
The nature of the objective function $f$ defines specific problem classes:
- $f$ is linear: linear optimization
- $f$ is quadratic: quadratic optimization
- $f$ is convex, and the constraints are convex: convex optimization
- $f$ may be nonconvex, non-differentiable, etc.: general nonconvex, nonsmooth optimization
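The first two classes correspond to canonical objective forms (standard notation, stated here for reference):

```latex
f(x) = c^\top x
  % linear objective, with c \in \mathbb{R}^n
f(x) = \tfrac{1}{2}\, x^\top Q x + c^\top x
  % quadratic objective, with Q \in \mathbb{R}^{n \times n} symmetric
```

When $Q$ is positive semidefinite, the quadratic objective is also convex, so these classes are nested: linear problems are quadratic, and (suitably restricted) quadratic problems are convex.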
## A word about discrete optimization (not covered in this course)
In discrete optimization (also called combinatorial optimization), the decision variables take values from a finite or countable set, such as $\{0,1\}^n$ or a set of permutations. These problems arise in areas such as scheduling, routing, assignment, and network design, and often involve integer or binary variables.
Discrete optimization problems are generally non-convex and combinatorially complex; they may require specialized algorithms such as:
- branch-and-bound,
- dynamic programming,
- greedy methods, etc.
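For very small instances, exhaustive search is still possible; the sketch below enumerates all $3! = 6$ solutions of a hypothetical 3×3 assignment problem (the cost matrix is made up for illustration). The factorial growth of this search space is precisely why the specialized algorithms above are needed in general.

```python
from itertools import permutations

# Hypothetical costs: cost[i][j] = cost of assigning task j to worker i.
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]

# Brute-force search over all n! one-to-one assignments (permutations).
best = min(permutations(range(3)),
           key=lambda p: sum(cost[i][p[i]] for i in range(3)))
best_cost = sum(cost[i][best[i]] for i in range(3))

print(best, best_cost)
```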
This course focuses exclusively on continuous optimization, where the variables belong to continuous spaces (e.g., $\mathbb{R}^n$), and where tools from calculus and convex analysis are applicable.