How do I figure out the exact objective gradient and Hessian for a SciPy optimize problem?


Other answers have mentioned that providing the exact objective gradient and Hessian speeds up the optimization, since SciPy doesn't have to approximate them numerically with finite differences. If this is my objective function, how do I work out the gradient and Hessian?

def er(x, df=df):
    return -((x * df['pred']).sum())

Isn't the gradient just -df['pred'] and the Hessian zero, since the entries of df['pred'] are constants? x is the vector of variables I'm trying to optimize.
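That reasoning can be checked directly: because the objective is linear in x, the gradient is the constant vector -df['pred'] and the Hessian is the zero matrix. Below is a minimal sketch passing both to scipy.optimize.minimize. The data in df is made up for illustration, and box bounds are added because a linear objective is unbounded without constraints:

    import numpy as np
    import pandas as pd
    from scipy.optimize import minimize

    # Hypothetical stand-in for the real df['pred'] (assumed data).
    df = pd.DataFrame({'pred': [0.2, 0.5, 0.3]})
    pred = df['pred'].to_numpy()

    def er(x, pred=pred):
        # Objective: linear in x, so -(x . pred).
        return -(x * pred).sum()

    def er_grad(x, pred=pred):
        # Gradient of a linear objective: the constant coefficient vector.
        return -pred

    def er_hess(x, pred=pred):
        # Hessian of a linear objective: all zeros.
        return np.zeros((len(pred), len(pred)))

    # Bounds make the problem well-posed; trust-constr accepts an exact
    # gradient (jac=) and Hessian (hess=) as callables.
    res = minimize(er, x0=np.zeros(3), jac=er_grad, hess=er_hess,
                   method='trust-constr', bounds=[(0, 1)] * 3)
    print(res.x)

With all coefficients positive and x bounded in [0, 1], the minimizer pushes every component of x to its upper bound of 1.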
