Nonlinear least squares cost function does not converge for a simple DAE example

Hi :wave:

I wanted to try the nonlinear LS cost function on a simple example. With the external cost function type it works. Do you have any idea why the nonlinear LS version does not converge correctly? Linear LS with Gauss-Newton also works, although I did not include it in the minimal example, so the Hessian approximation alone should not be the reason.

Here is the example:

import uuid

import casadi as cd
import numpy as np
from acados_template import AcadosModel, AcadosOcp, AcadosOcpSolver

def export_simple_array_dae_model(scalar: float = 1) -> AcadosModel:
    model_name = 'minimal_example' + str(uuid.uuid4()).replace('-', '_')
    # set up states & controls
    x = cd.SX.sym('x', (2, 1)) * scalar
    u = cd.SX.sym('u', (2, 1)) * scalar
    p = cd.SX.sym('p', (2, 1)) * scalar
    # xdot
    x_dot = cd.SX.sym('x_dot', (2, 1))
    # algebraic variables
    z = cd.SX.sym('z', (2, 1)) * scalar
    # dynamics
    f_expl = (u - p) / scalar
    f_impl = x_dot - f_expl
    alg = z - 4 + u
    f_impl = cd.vertcat(f_impl, alg)

    model = AcadosModel()

    model.f_impl_expr = f_impl
    model.f_expl_expr = f_expl
    model.x = x / scalar
    model.xdot = x_dot
    model.z = z / scalar
    model.u = u / scalar
    model.p = p / scalar
    model.name = model_name

    return model

def test_minimal_mpc_example_with_nonlinear_ls() -> None:
    # If I switch this value to False it does work.
    use_nonlinear_ls = True

    ocp = AcadosOcp()

    # set model
    model = export_simple_array_dae_model()
    ocp.model = model

    Tf = 1.0
    N = 2

    # set dimensions
    ocp.dims.N = N
    # The cost function should be (x - 1)**2
    # Because x starts at 1, the optimal solution should be u=1 which results
    # in a cost value of 0.
    cost = cd.sum1((model.x - 1)**2) + cd.sum1(  # type: ignore
        (model.z - 3)**2)  # type: ignore
    cost_e = cd.sum1((model.x - 1)**2)  # type: ignore
    if use_nonlinear_ls:
        ocp.cost.cost_type = 'NONLINEAR_LS'
        ocp.cost.cost_type_e = 'NONLINEAR_LS'
        ocp.solver_options.hessian_approx = 'GAUSS_NEWTON'
        ocp.model.cost_y_expr = cost
        ocp.model.cost_y_expr_e = cost_e
        ocp.cost.W = np.eye(1)
        ocp.cost.W_e = np.eye(1)
        ocp.cost.yref = np.array([0])
        ocp.cost.yref_e = np.array([0])
    else:
        ocp.cost.cost_type = 'EXTERNAL'
        ocp.cost.cost_type_e = 'EXTERNAL'
        ocp.solver_options.hessian_approx = 'EXACT'
        # Squared, so that it is equal to the nonlinear ls formulation.
        ocp.model.cost_expr_ext_cost = cost**2
        ocp.model.cost_expr_ext_cost_e = cost_e**2
    # set constraints
    ocp.constraints.x0 = np.array([1.0, 1.0])

    # set options
    ocp.solver_options.qp_solver = 'PARTIAL_CONDENSING_HPIPM'
    ocp.solver_options.integrator_type = 'IRK'
    ocp.solver_options.nlp_solver_type = 'SQP'  # SQP_RTI, SQP

    # set prediction horizon
    ocp.solver_options.tf = Tf

    # If I set it to [1, 1] it converges to the trivial solution.
    p_init = np.array([1, 2])
    ocp.parameter_values = p_init
    ocp_solver = AcadosOcpSolver(ocp, json_file='acados_ocp.json')
    ocp_solver.solve()
    cost_value = ocp_solver.get_cost()
    n_horizon: int = ocp_solver.acados_ocp.dims.N  # type: ignore
    result_x = []
    for stage_index in range(n_horizon + 1):
        result_x.append(ocp_solver.get(stage_index, 'x'))
    result_u = []
    for stage_index in range(n_horizon):
        result_u.append(ocp_solver.get(stage_index, 'u'))
    assert cost_value < 1, f'{cost_value=} is not smaller than 1.'

Thanks for help :pray:

Hi David,

I think you are not using the NLS cost module correctly.
You should specify the residual vector y, not the scalar least-squares term.

Please check this example for equivalent cost formulations: acados/ at master · acados/acados · GitHub


Hi Jonathan,

thanks for the quick response! You are right, and I actually did it on purpose: I am currently trying to implement an interface from the DoMPC library to acados, and in DoMPC I cannot access the individual terms. I know that just using the quadratic cost formulation for the vector y is not an optimal solution yet, but I thought it should still have the same optimum. If I take the square root of the cost function, it should be the same cost function, shouldn't it? Thus, I do not understand why the optimizer already has such difficulties with such a simple problem. Do you have any ideas how I could inspect the problem?


Hi David,

I actually did it on purpose. I am currently trying to implement an interface from the DoMPC library to acados.

That sounds really nice!

The NLS cost module needs to know which part is the inner nonlinear function and which is the outer weighted least-squares term in order to compute the GN Hessian approximation.
If you just use (\sqrt{\textrm{external cost}})^2, acados surely computes a different (worse) Hessian approximation.
I think you can also try this on the example I pointed to.
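To make that concrete, here is my sketch of the standard Gauss-Newton argument (using a \frac{1}{2}\|y - y_\text{ref}\|_W^2 cost convention; acados' exact scaling may differ, which does not change the picture). For a proper residual vector r(w) with cost \frac{1}{2}\|r(w)\|_W^2, GN uses

H_\text{GN} = J^\top W J, \qquad J = \frac{\partial r}{\partial w},

which is positive semi-definite and, for r = x - 1, even constant. If instead the scalar y(w) = \|r(w)\|^2 is passed as the residual, the cost becomes \frac{1}{2}\, y(w)^2 and

H_\text{GN} = \nabla y(w)\, \nabla y(w)^\top,

a rank-one matrix with \nabla y \to 0 at the minimizer, while the exact Hessian there tends to y^*\, \nabla^2 y. So near the solution the GN model has essentially no curvature left, which matches the poor convergence you see.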

What would really make sense, though, is writing an AUTO cost type in Python that detects which kind of cost function is used.
I did this in Matlab: acados/detect_cost_type.m at master · acados/acados · GitHub
It is surely possible to extend this to NLS cost functions.

Good luck with the do-mpc interface!


Yeah, I hope to publish it one day so that it can be of use to others.

Okay, true, I looked GN up again; I had it somehow wrong in my mind. It is a completely different approximation if I do not define each term separately :stuck_out_tongue:
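For my own understanding I wrote down a tiny 1-D check (my own sketch, using a plain cost = s^2 convention with W = 1; acados' scaling may differ by a factor, which doesn't change the picture). Both splits have the same minimizer x* = 1, but very different GN Hessians:

```python
# Same minimizer x* = 1, two least-squares splits:
#   (a) residual r(x) = x - 1        -> cost r^2  (proper NLS)
#   (b) "residual" y(x) = (x - 1)^2  -> cost y^2  (scalar cost passed as y)
# GN Hessian for a scalar residual s with W = 1: H = 2 * (ds/dx)^2.

def gn_hessian(ds_dx: float) -> float:
    return 2.0 * ds_dx ** 2

x = 1.0 + 1e-4                           # a point close to the optimum
h_proper = gn_hessian(1.0)               # dr/dx = 1      -> constant H = 2
h_squared = gn_hessian(2.0 * (x - 1.0))  # dy/dx = 2(x-1) -> H -> 0 at x*
```

So with the squared formulation the GN Hessian collapses to (nearly) zero exactly around the solution.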

Yeah, I actually already had the same idea. I have already implemented the extraction for the linear LS term. However, for the nonlinear term I have had no idea so far how to approach the extraction problem.

Thanks for your help,