
Making Convex.jl DPP compliant #383

Open
akshay326 opened this issue Mar 31, 2020 · 14 comments

Comments

@akshay326

Hi Team

I read about the project to enable JuMP.jl to differentiate the solution of a problem w.r.t. its parameters.

CVXPY has implemented this ability, and the article accompanying the feature describes a new grammar named DPP, a subset of the DCP ruleset.

Does Convex.jl define a parameter class, like a variable or constraint? (I didn't find any implementation like this one.)

@akshay326
Author

DPP compliance requires adding two more rules on top of the already-existing DCP ones. But first, supporting a parameter-type object is imperative 😅
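For reference, the two extra rules in the DPP paper restrict products and quotients involving parameters. A rough sketch of how such a check might look (hypothetical, not Convex.jl API; `DppExpr` and its flags are made up for illustration):

```julia
# Hypothetical sketch of the two extra DPP rules, assuming each expression
# carries flags describing how it depends on parameters and variables.
struct DppExpr
    is_constant::Bool          # no variables and no parameters
    is_parameter_affine::Bool  # affine in the parameters, no variables
    is_parameter_free::Bool    # contains no parameters
end

# DPP product rule: u * v is DPP-verifiable only if one side is constant,
# or one side is parameter-affine and the other is parameter-free.
is_dpp_product(u::DppExpr, v::DppExpr) =
    u.is_constant || v.is_constant ||
    (u.is_parameter_affine && v.is_parameter_free) ||
    (v.is_parameter_affine && u.is_parameter_free)

# DPP quotient rule: u / v is DPP-verifiable only if v is parameter-free.
is_dpp_quotient(u::DppExpr, v::DppExpr) = v.is_parameter_free

p = DppExpr(false, true, false)   # e.g. a bare parameter
x = DppExpr(false, false, true)   # e.g. a bare variable
is_dpp_product(p, x)  # true:  parameter * variable is DPP
is_dpp_product(p, p)  # false: parameter * parameter is not DPP
```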

@ericphanson
Collaborator

Convex doesn’t have such a type yet, though we could add one. One thing you can do now is `fix!` a variable to different values, which somewhat mimics a parameter, except we can’t do the DPP checking.
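A minimal sketch of that workaround, assuming SCS is installed as the solver (the problem is still re-canonicalized on every solve, and nothing checks DPP-ness):

```julia
using Convex, SCS

# Treat p as a "parameter" by fixing it to a value, then re-solve after
# each change.
x = Variable()
p = Variable()
fix!(p, 1.0)                 # p now acts like a constant with value 1.0

problem = minimize(sumsquares(x - p), [x >= 0])
solve!(problem, SCS.Optimizer)

fix!(p, 2.0)                 # update the "parameter" ...
solve!(problem, SCS.Optimizer)   # ... and re-solve from scratch
```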

If we added those, what would we need next to differentiate through programs? I haven’t read much about it yet, but it sounds cool.

@akshay326
Author

A distinction between parameters and variables will suffice, at least at the data-structure level. Then we find the derivative in the following steps:

  1. Convex.jl already canonicalizes a Model object to a cone program. We'll need to obtain a map from the parameters of the original problem to the data of the cone program. The article proves that this map is a matrix, affine in the parameters, if the problem is DPP-compliant.
  2. Obtain the perturbations of the cone program's data w.r.t. the original problem parameters (I still need to understand this in detail 😅; there's a whole article devoted to this step).
  3. Feed the perturbations to an AD system.
    (There's another step, obtaining a reverse map, similar to the first step.)
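The chain rule behind those steps can be illustrated on a toy problem where everything is available in closed form (this is a made-up least-squares example, not Convex.jl code): for min ‖A x − b(p)‖² with data b(p) = B p + b₀ affine in the parameter (step 1), the solution map x*(p) = (AᵀA)⁻¹Aᵀ(B p + b₀) has constant Jacobian (AᵀA)⁻¹AᵀB, so steps 2–3 collapse to one matrix.

```julia
using LinearAlgebra

A  = [1.0 0.0; 0.0 2.0; 1.0 1.0]
B  = reshape([1.0, 0.0, 0.0], 3, 1)  # how the data depends on the scalar p
b0 = [0.0, 1.0, 0.0]

xstar(p) = (A'A) \ (A' * (B * [p] + b0))  # closed-form solution map
jac = (A'A) \ (A'B)                        # closed-form Jacobian dx*/dp

# finite-difference check that the closed form matches
fd = (xstar(1.0 + 1e-6) - xstar(1.0)) / 1e-6
@assert isapprox(fd, vec(jac); atol = 1e-4)
```

In the general cone-program setting the solution map is not available in closed form, which is why step 2 (differentiating through the cone program) needs its own machinery.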

@ericphanson
Collaborator

Interesting. To me it sounds like a lot of this will have to be on the MathOptInterface side of things, since that’s what does many of the transformations to result in the final model / problem data. But Convex definitely seems like the right place for the DPP guarantees.

@JinraeKim
Contributor

JinraeKim commented Oct 27, 2021

Is there any timetable to support this feature? :)

EDIT: For me this is a very attractive feature, and I'm now considering moving to Python for cvxpylayers.
If this feature were supported by Convex.jl like cvxpylayers, it would be very useful for researchers and engineers who deal with machine learning and convex optimisation :) (especially for Julians like me)

@odow
Member

odow commented Oct 27, 2021

There is no timetable. @ericphanson has left academia, so Convex.jl is essentially in maintenance mode. Although we can still review and merge PRs if you're interested in contributing.

@JinraeKim
Contributor

> There is no timetable. @ericphanson has left academia, so Convex.jl is essentially in maintenance mode. Although we can still review and merge PRs if you're interested in contributing.

Thank you :)

@matbesancon
Contributor

This can also be closed, since it's done in DiffOpt.

@JinraeKim
Contributor

> This can also be closed since done in DiffOpt

Just out of curiosity, could you point to any PR, issue, or docs where I can check this?

@odow
Member

odow commented May 10, 2022

DiffOpt is here: https://github.com/jump-dev/DiffOpt.jl

But it'd need some integration work with Convex before I'd close this issue (even if just an example or a blog post to demonstrate how to use it).

@ericphanson
Collaborator

Yeah, I believe this issue is about creating a tracking system to verify DPPness, just like we do for convexity (DCPness). I don’t think DiffOpt does this.

(Similar in spirit to #138 (comment) which discusses extending the tracking system to operator convexity and operator concavity.)

@ericphanson
Collaborator

ericphanson commented May 12, 2024

I looked into this a bit; I think there are a few things here:

  1. support first-class parameters, such that we do not need to call conic_form! again to re-solve after changing a parameter value. Reading the paper, DPP-ness is about adding some small restrictions on how parameters can be used, to ensure the result can be efficiently re-solved with different parameter values without reformulating the whole thing, and perhaps also about ensuring differentiability
    • here it would probably make sense to use MOI.Parameter, since we can efficiently update those later
    • I suspect ParametricOptInterface.jl could be useful somehow, but I don't understand what it adds over just MOI.Parameter
    • the hardest part is supporting parameter * variable (or, more generally, parameter-affine * affine). In MOI, a MOI.Parameter is actually a special constraint set, and a parameter is a variable constrained to lie in such a set. So for us it is a special variable. But the whole current Convex.jl pipeline is built around affine expressions applied to a vector of variables, and we lower to MOI.VectorAffineFunctions. I believe we would instead need to lower to MOI.VectorQuadraticFunction whenever a parameter * variable product is present, and I'm not sure what the internal data structures would look like to achieve that. (edit: I guess the strict generalization would be a 3-dimensional sparse array, with 1 output dimension, 1 variable input dimension, and 1 parameter input dimension, maybe using Finch.jl)
  2. add DPP checking system, similar to our DCP checks
  3. check/test/document integration with DiffOpt
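The 3-dimensional sparse array from point 1 could be sketched like this (hypothetical, not Convex.jl internals; a plain dict of coefficient triples stands in for a real sparse tensor type):

```julia
# Store coefficients T[(out, var, par)] so that
#   output[out] = Σ_{var, par} T[(out, var, par)] * p[par] * x[var],
# with par == 0 reserved for the parameter-free part of the expression.
T = Dict{NTuple{3,Int},Float64}()
T[(1, 1, 0)] = 2.0   # 2 * x₁ in output row 1 (no parameter)
T[(1, 1, 1)] = 1.0   # p₁ * x₁ in output row 1
T[(2, 2, 1)] = 3.0   # 3 * p₁ * x₂ in output row 2

# Fixing the parameter values contracts the tensor back down to an ordinary
# sparse affine map over the variables (what would lower to a
# VectorAffineFunction).
function contract(T, p::Vector{Float64})
    out = Dict{NTuple{2,Int},Float64}()
    for ((o, v, k), c) in T
        w = k == 0 ? c : c * p[k]
        out[(o, v)] = get(out, (o, v), 0.0) + w
    end
    return out
end

contract(T, [2.0])   # row 1: (2 + 2)·x₁, row 2: 6·x₂
```

Contracting over the parameter dimension is cheap, which is exactly the efficient re-solve that DPP is meant to guarantee.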

@odow
Member

odow commented May 12, 2024

Yes, this is highly non-trivial. I don't know if it is worth pursuing unless we have a serious use/application. In the near term, using cvxpy is probably a better choice.

@JinraeKim
Contributor

I'm not a developer of Convex.jl, but I'd like to share my experience for your information:

During my PhD, I wandered between DiffOpt.jl, Convex.jl, and cvxpy because I needed differentiable optimization. I eventually settled on this for Julia: https://github.com/JuliaDecisionFocusedLearning/ImplicitDifferentiation.jl

If DPP is only for differentiable convex optimization, users can simply be pointed to other tools such as ImplicitDifferentiation.jl. If DPP is meant as full support like cvxpy's, you could refer to cvxpy or develop that functionality here, which seems like quite a lot of work.
