Classical optimization methods solve vector optimization problems by producing approximations of the Pareto-optimal set, from which we analyze their performance and limitations. Early research extended existing single-objective methods; Zeleny's simplex algorithm illustrates this approach by extending the simplex method to multiple linear objective functions. The simplex method is an iterative procedure that finds an optimal solution to a linear single-objective programming problem; it is one of the numerous techniques proposed for linear SOOP problems and terminates after a finite number of iteration steps. At the outset, the program is reformulated by introducing slack variables that turn the inequality constraints into equalities; these slack variables form the initial basis. The procedure then moves from one extreme point of the feasible region to an adjacent extreme point. Using similar principles and processes, the simplex tableau is augmented to solve linear MOO problems, with one row for each objective function. Numerical examples illustrate the complete computation procedure.
Weighting objective methods form another founding class of techniques for solving nonlinear MOO problems. These methods aggregate (that is, scalarize) the objective functions, which are normalized before aggregation. The aggregation can be a convex linear combination of the objectives with different weights, and the way the weights enter differs across methods such as the weighted sum method, the weighted metric method, and the weighted exponential method. Numerical examples illustrate each case.
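As a hedged sketch of weighted sum scalarization (the bi-objective problem and the weight grid are illustrative assumptions, not the chapter's examples), the following Python code sweeps convex weights over a simple problem and solves each resulting single-objective problem; every minimizer is a candidate Pareto-optimal point.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative bi-objective problem: f1(x) = x^2, f2(x) = (x - 2)^2.
f1 = lambda x: x ** 2
f2 = lambda x: (x - 2.0) ** 2

pareto_points = []
for w1 in np.linspace(0.0, 1.0, 11):      # convex weights: w1 + w2 = 1
    w2 = 1.0 - w1
    # Weighted sum scalarization: one single-objective problem per weight vector.
    res = minimize_scalar(lambda x: w1 * f1(x) + w2 * f2(x))
    pareto_points.append((f1(res.x), f2(res.x)))

for p in pareto_points:
    print(f"f1 = {p[0]:.3f}, f2 = {p[1]:.3f}")
```

Note that when a weight is zero, the weighted sum may return a point that is only weakly Pareto-optimal.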
Keywords: Basic feasible region, Best compromise solution, Chebychev’s
problem, Excess resource variable, Ideal point, Linear programming, Neighboring
nonbasic solution, Neighboring point, Non-inferior solution, Payoff table, Pivot,
Scalarization, Simplex tableau, Simplex-based algorithm, Slack variable, Weakly
Pareto-optimal, Weighted exponential method, Weighted metric method, Weighted
sum method, Zeleny’s simplex algorithm.