  3. Outline
    1. Advanced Mathematics
    2. Linear Algebra
    3. Reference Books
      1. 《工程最优化设计》
  4. Chapter 1
    1. Basic Steps
      1. mathematical modeling
          1. identifying the objective, variables, and constraints
      2. construction of an appropriate model
        1. Domain
            1. a region enclosed by multiple constraint boundaries
        2. Feasible Region
      3. once the model is formulated
        1. an optimization algorithm can be used to find the solution
      4. optimization is the process of minimizing or maximizing a function subject to constraints on its variables
    2. optimization is also called "mathematical programming"
      1. linear problems
          1. vertex-to-vertex search over the feasible region
          1. simplex method
      2. nonlinear problems
        1. unconstrained optimization
          1. one-dimensional search methods
            1. golden section search method (GSM)
            2. Fibonacci search method (FSM)
            3. quadratic interpolation method
            4. cubic interpolation method
          2. methods using derivative information
            1. steepest descent method
            2. Newton method
            3. quasi-Newton method
            4. conjugate gradient method
          3. methods not using derivative information
            1. Powell method
        2. constrained optimization
          1. solve directly
            1. Feasible direction method
          2. solve indirectly
            1. Penalty function method
            2. SQP method
    3. Graphical method
      1. determine the domains of the variables
      2. identify the feasible region of the solutions
      3. plot a few contours of the objective function to search for the descent direction
      4. find the optimal solution of the problem
    4. iteration methods
      1. descent methods
        1. search direction S0
        2. determine the optimal step length a that minimizes the function value
        3. obtain the new iteration point X1 = X0 + a*S0
        4. a good algorithm is expected to have good convergence and a fast speed
          1. rate of convergence
            1. linear rate: β = 1
            2. superlinear rate: 1 < β < 2
            3. quadratic rate: β = 2
          2. termination criteria
            1. based on point distance
            2. based on function value difference
            3. based on gradient
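The iteration scheme above (X1 = X0 + a*S0 with a gradient-based termination criterion) can be sketched as follows; the fixed-step rule, function names, and the test problem are illustrative assumptions, not from the notes:

```python
def descent(f, grad, x0, step=0.1, tol=1e-8, max_iter=1000):
    """Generic descent iteration: X1 = X0 + a * S0, with S0 = -grad(X0)
    and a termination criterion based on the gradient magnitude."""
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:          # termination based on gradient
            break
        x = x + step * (-g)       # new iteration point
    return x

# minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3)
xmin = descent(lambda x: (x - 3) ** 2, lambda x: 2 * (x - 3), x0=0.0)
```

Each iteration contracts the distance to the minimizer by a constant factor here, which is exactly the linear rate (β = 1) mentioned above.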
  5. Fundamental of Optimization
    1. partial derivative
      1. gradient
        1. the direction the function value increases most rapidly
          1. negative gradient direction decreases most rapidly
        2. a vector comprised of all the first-order partial derivatives at this point
        3. describes the local variation of the function values at one point
    2. directional derivative
      1. mind the direction of the transpose
        1. the gradient is a column vector
        2. the direction vector is also a column vector
        3. (gradient)^T * (direction vector) = directional derivative
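The transpose rule above (gradient transposed times a unit direction vector gives the directional derivative) can be checked numerically; the example function and point are made up for illustration:

```python
def directional_derivative(grad, d):
    """Directional derivative = (gradient)^T * (direction vector):
    the dot product of the gradient with a unit direction vector."""
    return sum(g_i * d_i for g_i, d_i in zip(grad, d))

# f(x, y) = x^2 + y^2 at (1, 2): gradient is (2, 4)
g = [2.0, 4.0]
d = [0.6, 0.8]                      # unit direction (3/5, 4/5)
dd = directional_derivative(g, d)   # 2*0.6 + 4*0.8 = 4.4
```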
    3. Taylor Expansion
      1. one variable
      2. multiple variables
        1. quadratic approximation
          1. first order: the gradient
          2. second order: the Hessian matrix
          3. significant in optimization
    4. Positive Definite Quadratic Function(PDQF)
      1. type of matrix
        1. positive definite
        2. positive semi-definite
        3. negative definite
        4. negative semi-definite
        5. indefinite
      2. characteristics
        1. the contours of PDQFs are a set of concentric ellipses; the center of this set is the minimum point of the PDQF
        2. for non-positive-definite QFs, the contours near the minimum point are approximately ellipses; the contours become irregular when they are far away from the minimum point
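The matrix types listed above can be distinguished, for the 2x2 symmetric case, by Sylvester's criterion on leading principal minors. This helper is a sketch for 2x2 matrices only, with the semi-definite cases lumped together:

```python
def classify_2x2(a, b, c):
    """Classify the symmetric matrix [[a, b], [b, c]] via its leading
    principal minors (Sylvester's criterion): D1 = a, D2 = a*c - b*b."""
    d1, d2 = a, a * c - b * b
    if d1 > 0 and d2 > 0:
        return "positive definite"
    if d1 < 0 and d2 > 0:
        return "negative definite"
    if d2 < 0:
        return "indefinite"
    return "semi-definite"          # boundary cases, not separated here

# Hessian of f(x, y) = x^2 + x*y + y^2 is [[2, 1], [1, 2]]
kind = classify_2x2(2.0, 1.0, 2.0)
```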
    5. Convexity
      1. convex set
        1. S is convex if the straight line segment connecting any two points in S lies entirely inside S
      2. convex function
        1. for any two points, the line segment connecting them lies above the graph of the function
    6. Minimizer
      1. global minimizer
      2. local minimizer
      3. strong (strict) minimizer
    7. Descent direction
      1. judged by the angle with the gradient: S is a descent direction if (gradient)^T * S < 0
    8. Extremum conditions
      1. only one variable
      2. multiple variables
        1. first order
        2. second order
    9. remember their expressions
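The first- and second-order extremum conditions can be combined into a small check for the two-variable case; the helper name and tolerance are assumptions:

```python
def is_local_min_candidate(grad_at, hess_at, tol=1e-12):
    """Extremum conditions for a 2-variable function at a point:
    first order (necessary): gradient = 0;
    second order (sufficient): Hessian positive definite."""
    gx, gy = grad_at
    (a, b), (_, c) = hess_at
    first_order = abs(gx) < tol and abs(gy) < tol
    second_order = a > 0 and a * c - b * b > 0   # Sylvester's criterion, 2x2
    return first_order and second_order

# f(x, y) = x^2 + 2*y^2 at (0, 0): gradient (0, 0), Hessian [[2, 0], [0, 4]]
ok = is_local_min_candidate((0.0, 0.0), ((2.0, 0.0), (0.0, 4.0)))
```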
  6. Linear search methods
    1. one-dimensional (1-D) search method
      1. STEP1: an initial interval including the minimum point needs to be determined
        1. unimodal function
        2. basic steps to determine the initial interval
      2. STEP2: then the length of the interval is reduced after every iteration until the minimum point is found
        1. interpolate two points inside the current known interval [a, b], then compare the function values at the two points
        2. when the interval length is within the given convergence accuracy, a point inside the interval can be approximately taken as the minimum point along this search direction
        3. ratio of interval reduction
      3. Determining Initial Interval
        1. some basic steps
    2. GSM
      1. ratio is 0.618
        1. the principle to determine the two points
        2. the principle to determine convergence
        3. the algorithm of the GSM
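A minimal golden section search following the 0.618 ratio and interval-reduction principle above; the interface and tolerance are assumptions:

```python
def golden_section(f, a, b, tol=1e-6):
    """Golden section search on a unimodal f over [a, b]. Interior
    points are placed at the 0.618 ratio; each iteration keeps the
    sub-interval that must contain the minimum."""
    r = 0.6180339887498949
    x1 = b - r * (b - a)      # left interior point
    x2 = a + r * (b - a)      # right interior point
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:            # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - r * (b - a)
            f1 = f(x1)
        else:                  # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + r * (b - a)
            f2 = f(x2)
    return 0.5 * (a + b)

xmin = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Note that only one new function evaluation is needed per iteration, because one interior point is reused — this is the practical advantage of the 0.618 ratio.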
    3. FSM
      1. Fn is increasing: 1, 1, 2, 3, 5, 8, 13, 21, ...
      2. Ln (the interval length) is decreasing
      3. how to determine n?
        1. each step uses a different ratio
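One common way to determine n: pick the smallest n with F_n >= (b - a) / eps, so that the final interval is shorter than eps. A sketch, with the indexing convention F_1 = F_2 = 1 assumed:

```python
def fibonacci_n(a, b, eps):
    """Smallest number of Fibonacci terms n with F_n >= (b - a) / eps,
    i.e. enough terms for the Fibonacci search to shrink [a, b]
    below the accuracy eps (convention F_1 = F_2 = 1)."""
    target = (b - a) / eps
    fibs = [1, 1]
    while fibs[-1] < target:
        fibs.append(fibs[-1] + fibs[-2])
    return len(fibs), fibs

# shrink [0, 10] below 0.1: need F_n >= 100
n, fibs = fibonacci_n(0.0, 10.0, 0.1)
```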
    4. Powell's quadratic interpolation method (QIM)
      1. similar to the method of undetermined coefficients
        1. the minimum point Xp
          1. determined by two conditions
        2. the opening direction of the parabola
        3. convergence criterion of QIM
          1. |X2 - Xp| is sufficiently small
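One step of the quadratic interpolation above fits a parabola through three points and takes its vertex as the estimate Xp; the formula below is the standard three-point vertex formula, and the test function is an illustrative assumption:

```python
def parabola_vertex(x1, x2, x3, f1, f2, f3):
    """Minimum point of the parabola through (x1,f1), (x2,f2), (x3,f3):
    one step of the quadratic interpolation method."""
    num = (x2**2 - x3**2) * f1 + (x3**2 - x1**2) * f2 + (x1**2 - x2**2) * f3
    den = (x2 - x3) * f1 + (x3 - x1) * f2 + (x1 - x2) * f3
    return 0.5 * num / den

def f(x):
    return (x - 2) ** 2   # itself quadratic, so one step is exact

xp = parabola_vertex(0.0, 1.0, 3.0, f(0.0), f(1.0), f(3.0))
```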
    5. Davidon's cubic interpolation method (CIM)
  7. Chapter4: Unconstrained Optimization Methods
    1. Steepest Descent Method
      1. searching direction
        1. negative gradient direction at iteration point
      2. Sk = -gk = -(gradient at Xk)
        1. the search direction
      3. X_{k+1} = Xk + a*Sk
        1. df/da = 0
          1. a is the step length
      4. Convergence judgement
      5. drawback
        1. needs many iterations
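A sketch of the steepest descent method for a quadratic f(x) = 0.5 x^T A x - b^T x, where the optimal step length along Sk = -gk has the closed form a = (g^T g)/(g^T A g); the quadratic test problem is an assumption for illustration:

```python
def steepest_descent(A, b, x, tol=1e-8, max_iter=500):
    """Steepest descent for f(x) = 0.5 x^T A x - b^T x (A symmetric PD).
    Search direction Sk = -gk; the optimal step length minimizing f
    along Sk is a = (g^T g) / (g^T A g)."""
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]
    for _ in range(max_iter):
        g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient A x - b
        gg = sum(gi * gi for gi in g)
        if gg < tol * tol:                                 # convergence judgement
            break
        Ag = matvec(A, g)
        alpha = gg / sum(gi * agi for gi, agi in zip(g, Ag))
        x = [xi - alpha * gi for xi, gi in zip(x, g)]      # X_{k+1} = Xk + a*Sk
    return x

# minimum of f with A = [[4, 0], [0, 1]], b = [4, 2] is x = [1, 2]
x = steepest_descent([[4.0, 0.0], [0.0, 1.0]], [4.0, 2.0], [0.0, 0.0])
```

Even with the exact step length, the iterates zigzag between successive orthogonal directions, which is why the notes list "needs many iterations" as the drawback.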
    2. Newton Method
      1. utilize the negative gradient and the Hessian matrix of the function
        1. Taylor Expansion of second order
      2. Newton's Direction
        1. Sk
      3. the step length α is 1
      4. drawback
        1. if the Hessian is not positive definite, the method is not applicable
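One Newton step for two variables, solving H s = -g by Cramer's rule and taking step length a = 1; on a quadratic objective a single step reaches the minimum. The test function is an illustrative assumption:

```python
def newton_step_2d(grad, hess):
    """One Newton step for two variables: solve H * s = -g and take
    step length 1 (exact for a quadratic objective)."""
    (a, b), (c, d) = hess
    gx, gy = grad
    det = a * d - b * c             # needs a nonsingular (ideally PD) Hessian
    sx = (-gx * d + gy * b) / det   # Cramer's rule for H s = -g
    sy = (-gy * a + gx * c) / det
    return sx, sy

# f(x, y) = 2x^2 + y^2 - 4x - 2y at (0, 0): grad (-4, -2), H [[4, 0], [0, 2]]
sx, sy = newton_step_2d((-4.0, -2.0), ((4.0, 0.0), (0.0, 2.0)))
x_new = (0.0 + sx, 0.0 + sy)        # one step reaches the minimum (1, 1)
```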
    3. Quasi-Newton Method
      1. use a positive definite matrix to approximate the matrix of second-order partial derivatives (the Hessian) of f
        1. two schemes to obtain the correction term
          1. DFP
            1. through iterations, H approaches the inverse of the Hessian
          2. BFGS
            1. better in practice
            2. through iterations, B approaches the Hessian
            3. B = H^{-1}
      2. superlinear convergence
        1. 1<β<2
    4. Conjugate Gradient Method
      1. PRP
      2. FR
      3. two different formulas for computing β
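A Fletcher-Reeves (FR) sketch on a 2-variable quadratic: β is the FR formula (g_new^T g_new)/(g^T g), and on an n-variable quadratic the method terminates in at most n iterations. The test problem is an assumption for illustration:

```python
def conjugate_gradient_fr(A, b, x, iters=2):
    """Fletcher-Reeves conjugate gradient for f(x) = 0.5 x^T A x - b^T x.
    beta_FR = (g_new^T g_new) / (g^T g); exact line search per step."""
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient A x - b
    s = [-gi for gi in g]                              # first direction: -g
    for _ in range(iters):
        As = matvec(A, s)
        alpha = -dot(g, s) / dot(s, As)                # exact line search
        x = [xi + alpha * si for xi, si in zip(x, s)]
        g_new = [gi + alpha * asi for gi, asi in zip(g, As)]
        beta = dot(g_new, g_new) / dot(g, g)           # Fletcher-Reeves formula
        s = [-gi + beta * si for gi, si in zip(g_new, s)]
        g = g_new
    return x

# A x = b has solution x = [1/11, 7/11]; 2 CG iterations recover it
x = conjugate_gradient_fr([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```

The PRP variant differs only in the β formula (it uses g_new^T (g_new - g) in the numerator); the iteration skeleton is the same.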
    5. main idea: utilize the information of the first-order and second-order derivatives of the objective function
  8. Linear Programming
    1. constraints
      1. constraint equations
        1. ought to be equalities
          1. if there are inequalities, introduce slack variables
            1. to convert the inequalities into equalities
      2. nonnegativity constraints
    2. solutions
      1. Basic solution
        1. only satisfies the constraint equations
      2. Basic feasible solution
        1. satisfies both constraint equations and the non-negative constraints for variables
      3. optimal solution
        1. the feasible solution that achieves the optimal objective value
      4. n variables ; m equations (n>m)
        1. set n-m variables to zero
          1. non-basic variables
        2. m variables
          1. Basic variables
    3. Basic Feasible Solution
      1. when all the constraints are inequalities
        1. introduce slack variables
          1. basic variables
          2. the slack variables have coefficient 0 in the objective function; the other variables keep the coefficients from the original expression
      2. when the constraints are equalities
        1. introduce artificial variables
          1. the objective function is the sum of all the artificial variables
          2. when the value of the objective function is zero, the solution is found
          3. the basic feasible solution is the part without the artificial variables
          4. because the objective is the sum of the artificial variables, the artificial variables have coefficient 1 and all other variables have coefficient 0
    4. The simplex method
      1. the pivot column is the one with the smallest (most negative) judgement number
      2. the pivot row is the one with the smallest quotient b/a (a in column k, a > 0)
      3. the judgement number for the basic variable is always zero
        1. and the judgement number for a non-basic variable is given by a formula
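The two pivot rules above can be sketched as small helpers; the tableau data in the example is hypothetical, for illustration only:

```python
def pivot_column(judgement):
    """Pivot column: index of the smallest (most negative) judgement
    number; None means the current solution is already optimal."""
    k = min(range(len(judgement)), key=lambda j: judgement[j])
    return k if judgement[k] < 0 else None

def pivot_row(b, a_col):
    """Pivot row: smallest ratio b_i / a_ik over rows with a_ik > 0;
    None means the problem is unbounded in this direction."""
    ratios = [(bi / ai, i) for i, (bi, ai) in enumerate(zip(b, a_col)) if ai > 0]
    return min(ratios)[1] if ratios else None

# hypothetical tableau: judgement numbers and the chosen column's entries
k = pivot_column([-3.0, -5.0, 0.0, 0.0])   # column 1 enters the basis
r = pivot_row([12.0, 6.0], [4.0, 3.0])     # ratios 3 and 2 -> row 1 leaves
```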
  9. Constrained Optimization Methods
    1. direct methods
        1. the iteration points always stay inside the feasible region, with all constraints taken into account
        1. Feasible Direction Method
    2. indirect methods
        1. constraints are put into the objective function so that constrained problems can be converted to unconstrained problems, or nonlinear problems are converted to relatively simple quadratic programming problems
        1. Penalty Function Method
        2. SQP (Sequential Quadratic Programming) Method
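A minimal exterior penalty sketch of the indirect idea above: the constraint enters the objective as r * max(0, g(x))^2 with increasing r, so each outer step is an unconstrained problem. The inner solver here is a crude fixed-step descent with a numerical derivative, purely for illustration; all names and constants are assumptions:

```python
def penalty_method(f, g, x0, r0=1.0, growth=10.0, outer=6):
    """Exterior penalty method sketch: minimize
    phi(x, r) = f(x) + r * max(0, g(x))^2 for increasing r;
    the unconstrained minima approach the constrained minimizer."""
    x, r = x0, r0
    for _ in range(outer):
        phi = lambda t: f(t) + r * max(0.0, g(t)) ** 2
        for _ in range(2000):                  # crude inner descent
            h = 1e-6
            d = (phi(x + h) - phi(x - h)) / (2 * h)   # numerical derivative
            x -= 0.01 * d / (1.0 + r)          # damp step as r grows
        r *= growth
    return x

# minimize f(x) = x^2 subject to x >= 1, written as g(x) = 1 - x <= 0;
# the constrained minimizer is x = 1
x = penalty_method(lambda t: t * t, lambda t: 1.0 - t, x0=0.0)
```

For each finite r the unconstrained minimizer sits slightly outside the feasible region (here x = r/(1+r) < 1), which is why the penalty parameter must grow across outer iterations.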
    3. effective constraints
      1. ECs for equality-constrained problems
        1. Kuhn-Tucker Condition
    4. ineffective constraints
      1. depending on whether the investigated point is on the constraint boundary or not
      2. ECs for inequality-constrained problems
        1. introducing slack variables