python scipy using fmin_bfgs for logistic regression

I use the formula below as my hypothesis:

h_\theta(x) = \frac{1}{1 + e^{-\theta^T x}}

And the formula below as the cost function for one sample:

\mathrm{cost}(h_\theta(x), y) = -\left[ y \log\big(h_\theta(x)\big) + (1 - y) \log\big(1 - h_\theta(x)\big) \right]

So the objective function I try to minimize is:

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log\big(h_\theta(x^{(i)})\big) + (1 - y^{(i)}) \log\big(1 - h_\theta(x^{(i)})\big) \right]

And the gradient is:

\frac{\partial J(\theta)}{\partial \theta_j} = \frac{1}{m} \sum_{i=1}^{m} \big( h_\theta(x^{(i)}) - y^{(i)} \big)\, x_j^{(i)}

The CSV file is formatted like:

y0,x1,x2,x3,...
y1,x1,x2,x3,...
y2,x1,x2,x3,...

where y is either 1 or 0 (for classification). The training code is below:

import numpy as np
import scipy as sp
from scipy.optimize import fmin_bfgs
import pylab as pl



data = np.genfromtxt('../data/small_train.txt', delimiter=',')
y = data[:,0]
#add 1 as the first column of x, the constant term
x = np.append(np.ones((len(y), 1)), data[:,1:], axis = 1)

#sigmoid hypothesis
def h(theta, x):
    return 1.0/(1+np.exp(-np.dot(theta, x)))

#cost function
def cost(theta, x, y):
    tot = 0
    for i in range(len(y)):
        tot += y[i]*np.log(h(theta, x[i])) + (1-y[i])*(1-np.log(h(theta, x[i])))
    return -tot / len(y)

#gradient of the cost function: dJ/dtheta_j = (1/m) * sum_i (h(theta, x_i) - y_i) * x_i[j]
def deviation(theta, x, y):
    def f(theta, x, y, j):
        tot = 0.0
        for i in range(len(y)):
            tot += (h(theta, x[i]) - y[i]) * x[i][j]
        return tot / len(y)
    ret = []
    for j in range(len(x[0])):
        ret.append(f(theta, x, y, j))
    return np.array(ret)
    

init_theta = np.zeros(len(x[0]))
ret = fmin_bfgs(cost, init_theta, fprime = deviation, args=(x,y))
print(ret)

I run the code on a small data set, but it seems my implementation is not right. Can anyone help me? One more question: as you know, fmin_bfgs does not necessarily need the fprime argument, so what is the difference between providing it and not providing it?
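For reference, fmin_bfgs accepts fprime=None; in that case SciPy estimates the gradient numerically by finite differences, which typically costs extra evaluations of the cost function and can be slightly less accurate than an analytic gradient. A minimal sketch of the two call styles, reusing the cost and deviation functions above:

#with an analytic gradient: fewer cost evaluations, more precise search directions
theta_analytic = fmin_bfgs(cost, init_theta, fprime=deviation, args=(x, y))

#without fprime: the gradient is approximated by finite differences
#(the step size is controlled by the epsilon argument)
theta_numeric = fmin_bfgs(cost, init_theta, args=(x, y))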

There is 1 answer below.


I would like to correct something in the above code.

I think that the cost function should be as follows (the corrected term is the second log call):

#cost function
def cost(theta, x, y):
    tot = 0
    for i in range(len(y)):
        tot += y[i]*np.log(h(theta, x[i])) + (1-y[i])*np.log(1-h(theta, x[i]))
    return -tot / len(y)

Please let me know if it is better like this. Thank you very much!
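As a side note, the same cost and gradient can also be computed without the Python loops. The sketch below is my own vectorized NumPy formulation (the names cost_vec and grad_vec are not part of the original code); it should be interchangeable with cost and deviation when passed to fmin_bfgs:

#vectorized cost: J(theta) = -(1/m) * sum(y*log(p) + (1-y)*log(1-p))
def cost_vec(theta, x, y):
    p = 1.0/(1 + np.exp(-x.dot(theta)))  #predictions for all samples at once
    return -np.mean(y*np.log(p) + (1-y)*np.log(1-p))

#vectorized gradient: dJ/dtheta = (1/m) * x^T (p - y)
def grad_vec(theta, x, y):
    p = 1.0/(1 + np.exp(-x.dot(theta)))
    return x.T.dot(p - y)/len(y)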