Adjoint-based optimization approach

TMG Correlation uses gradient-based optimization algorithms, which rely on computing the gradient of the objective function, that is, the derivative of the objective function with respect to the design variables. TMG Correlation uses the adjoint method to compute the gradient.

With the adjoint method [1], the gradient is obtained by adding the derivative of the objective function to the product of the adjoint vector and the derivative of the residuals:

dF/dx = ∂F/∂x + λᵀ (∂R/∂x)

Where:

  • x is the vector of the design variables.
  • ∂F/∂x is the derivative of the objective function with respect to the design variables.
  • λᵀ is the transpose of the adjoint vector corresponding to the temperature, which is computed from the adjoint equation.
  • ∂R/∂x is the derivative of the residuals of the governing equation with respect to the design variables, which is approximated by design variable perturbation.
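The gradient computation described above can be sketched with a minimal NumPy example. Writing x for the design variable, R(T, x) = K(x)T − q for the residual of a hypothetical linear thermal model, and λ for the adjoint vector, the adjoint gradient λᵀ(∂R/∂x) is compared against a direct finite difference of the objective. The model, the objective F(T) = ΣT, and all numerical values are illustrative assumptions, not the actual TMG formulation:

```python
import numpy as np

# Hypothetical linear steady-state model: residual R(T, x) = K(x) @ T - q = 0,
# where the conductance matrix K depends on a scalar design variable x.
def K(x):
    return np.array([[2.0 + x, -1.0],
                     [-1.0,    2.0 + x]])

q = np.array([1.0, 0.5])

def solve_T(x):
    return np.linalg.solve(K(x), q)

# Illustrative objective: F(T) = sum(T). It depends on T only, so dF/dx = 0.
def F(T):
    return T.sum()

x = 0.3
T = solve_T(x)

# Adjoint equation: (dR/dT)^T @ lam = -dF/dT, with dR/dT = K(x) for this model
dF_dT = np.ones_like(T)
lam = np.linalg.solve(K(x).T, -dF_dT)

# dR/dx approximated by design-variable perturbation, as in the text
eps = 1e-6
dR_dx = (K(x + eps) @ T - K(x) @ T) / eps

# Gradient: dF/dx = (partial F / partial x) + lam^T @ dR/dx; first term is 0 here
grad_adjoint = lam @ dR_dx

# Cross-check against a direct finite difference of the full objective
grad_fd = (F(solve_T(x + eps)) - F(solve_T(x))) / eps
```

The adjoint route needs one extra linear solve per objective, independent of the number of design variables, whereas direct finite differencing needs one model solve per design variable.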

Adjoint equation

The adjoint vector is computed from the Jacobian matrix of the conduction equation and the derivative of the objective function:

(∂R/∂T)ᵀ λ = −∂F/∂T

Where:

  • ∂R/∂T is the derivative of the residual with respect to the temperature.
  • ∂F/∂T is the derivative of the objective function with respect to the temperature.
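As a sketch of the adjoint solve, the Jacobian ∂R/∂T can be assembled by finite differences of the residual and the transposed system solved for the adjoint vector λ. The temperature-dependent residual below is a made-up stand-in for a conduction equation, not the TMG model:

```python
import numpy as np

# Hypothetical nonlinear residual with temperature-dependent conductivity:
# R(T) = k(T) * (A @ T) - q, evaluated at a fixed design point.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
q = np.array([1.0, 0.5])

def residual(T):
    k = 1.0 + 0.1 * T.mean()          # temperature-dependent conductivity
    return k * (A @ T) - q

# Jacobian dR/dT assembled column by column with finite differences
def jacobian(T, eps=1e-7):
    J = np.zeros((T.size, T.size))
    R0 = residual(T)
    for j in range(T.size):
        Tp = T.copy()
        Tp[j] += eps
        J[:, j] = (residual(Tp) - R0) / eps
    return J

T = np.array([0.6, 0.5])              # assumed converged temperature field
dF_dT = np.ones_like(T)               # e.g. objective F(T) = sum(T)

# Adjoint equation: (dR/dT)^T @ lam = -dF/dT
lam = np.linalg.solve(jacobian(T).T, -dF_dT)
```

In practice the Jacobian is typically the one already factorized by the nonlinear solver, so the adjoint solve reuses existing machinery on the transposed system.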

Update of the design variable vector

The new design variable vector is computed as follows:

xₖ₊₁ = xₖ + α dₖ

Where:

  • xₖ is the design variable vector at iteration k.
  • dₖ is the search direction determined by the optimization algorithm.
  • α is the step length, which is computed via the specified optimization algorithm.
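The update loop can be illustrated with a minimal steepest-descent sketch, using a backtracking (Armijo) line search as one common way a step length α might be computed. The quadratic objective stands in for the correlation error; the direction choice, line-search constants, and objective are all assumptions for illustration:

```python
import numpy as np

# Hypothetical quadratic objective standing in for the correlation error,
# minimized at x = [1, 1].
def f(x):
    return float((x - 1.0) @ (x - 1.0))

def grad(x):
    return 2.0 * (x - 1.0)

x = np.zeros(2)
for _ in range(50):
    d = -grad(x)                       # steepest-descent search direction
    # Backtracking (Armijo) line search for the step length alpha
    alpha, c, rho = 1.0, 1e-4, 0.5
    while f(x + alpha * d) > f(x) + c * alpha * (grad(x) @ d):
        alpha *= rho
    x = x + alpha * d                  # design-variable update x_{k+1} = x_k + alpha * d_k
```

Each iteration repeats the full cycle: solve the thermal model, solve the adjoint equation for the gradient, pick a direction and step length, and update the design variables.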