Thanks for the amazing work and for providing the very accessible Python interface!
I am working on an MPC problem with a linear least-squares cost plus a gradient term, i.e.
0.5*||w - w_ref||^2_W + q.T*(w - w_ref), with w = [x; u]. I need the gradient term because the reference is characterized by active inequality constraints (the linear LS cost approximates an economic cost function). However, this option does not seem to be supported yet, and I did not find a way around the problem by using slacks and the corresponding linear penalty term.
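To make the cost concrete, here is a small NumPy sketch (dimensions and data are made up purely for illustration) of the cost above, its exact gradient, and the Gauss-Newton Hessian, which is simply W and is therefore unaffected by the extra gradient term:

```python
import numpy as np

# Hypothetical problem data, for illustration only
nw = 4
rng = np.random.default_rng(0)
A = rng.standard_normal((nw, nw))
W = A @ A.T + nw * np.eye(nw)      # symmetric positive definite weight
w_ref = rng.standard_normal(nw)    # reference (would come from the economic problem)
q = rng.standard_normal(nw)        # gradient term of the approximated economic cost

def cost(w):
    # 0.5*||w - w_ref||^2_W + q.T*(w - w_ref)
    d = w - w_ref
    return 0.5 * d @ W @ d + q @ d

def grad(w):
    # Exact gradient: W*(w - w_ref) + q; the GN Hessian is just W
    return W @ (w - w_ref) + q

# Sanity check of the gradient via central finite differences
w0 = rng.standard_normal(nw)
eps = 1e-6
fd = np.array([
    (cost(w0 + eps * np.eye(nw)[i]) - cost(w0 - eps * np.eye(nw)[i])) / (2 * eps)
    for i in range(nw)
])
print(np.allclose(fd, grad(w0), atol=1e-5))
```

The point of the sketch: since the gradient term is linear in w, it changes the cost gradient by the constant q but leaves the Gauss-Newton Hessian W untouched, so supporting it in the linear LS module should not interfere with the GN approximation.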
For now I can use the "external cost" module, but this deprives me of the Gauss-Newton Hessian approximation, which is a shame. Am I overlooking some other option?
If not, I would be very grateful if a gradient option could be added to the linear LS cost module!