I have some general questions about gradients in CVXPY for simple convex optimization problems.
In the code below, I have enabled automatic differentiation through the optimization, but I cannot recover the gradient of the variable x (the sensitivity of the objective function w.r.t. x): x.gradient returns None.
I can recover gradients for Parameters, but not for Variables. Why is this the case? I would have thought the gradients of Variables would be more useful to know than those of Parameters.
Secondly, I know I can set x.gradient manually. Is there any use for that if x.gradient is supposed to be computed automatically via automatic differentiation?
import cvxpy as cp
import numpy as np

b = cp.Parameter()
x = cp.Variable()
quadratic = cp.square(x - 2 * b)
problem = cp.Problem(cp.Minimize(quadratic), [x >= 0])

b.value = 3.
problem.solve(requires_grad=True, eps=1e-10)  # enable differentiating through the solve
problem.backward()
print(x.gradient)  # prints None