scikit-learn (Python) - SVR regression: predicting multiple points results in the same repeating output


I'm trying to train and predict with an SVM as follows:

import numpy as np
import matplotlib.pyplot as plt
from sklearn import svm

for i in range(numStocks):
    y_p = []
    X, y, X_p, scaler, stockUsed = randomStockData(stockList)
    clf = svm.SVR(kernel='rbf', C=1, gamma=0.1)
    clf.fit(X, y)
    # Walk-forward prediction: feed each prediction back in as the
    # newest input value and drop the oldest one.
    reshapedX_p = X_p.reshape(1, -1)
    for j in range(5):
        y_p.append(clf.predict(reshapedX_p))
        reshapedX_p = np.append(reshapedX_p[0][1:], y_p[-1])
        reshapedX_p = reshapedX_p.reshape(1, -1)
    # Undo the scaling before plotting.
    y_p = [x * scaler for x in y_p]
    rescaledX_p = [x * scaler for x in X_p]
    print(y_p)
    plt.plot(rescaledX_p, label='closep test')
    plt.plot(range(len(rescaledX_p), len(rescaledX_p) + len(y_p)), y_p, label='predicted')
    plt.legend(loc='lower left', shadow=True)
    plt.savefig(homePath + 'predictResults/' + stockUsed + '.png',
                facecolor='#ffffff', edgecolor='#ffffff')
    plt.close()
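
randomStockData is my own helper, so here is a rough stand-in (hypothetical, just so the snippet above is runnable end-to-end):

import numpy as np

# Hypothetical stand-in for my real data loader: returns a training
# matrix X of sliding price windows, targets y (the next scaled close),
# the final window X_p to predict from, the scaling factor, and the ticker.
def randomStockData(stockList, window=50):
    stockUsed = np.random.choice(stockList)
    prices = np.cumsum(np.random.randn(500)) + 100.0  # fake close prices
    scaler = prices.max()
    scaled = prices / scaler
    X = np.array([scaled[i:i + window] for i in range(len(scaled) - window - 1)])
    y = scaled[window:-1]
    X_p = scaled[-window:]
    return X, y, X_p, scaler, stockUsed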

But the plots all come out like this: [two plot screenshots omitted; the 'predicted' segment is a flat line]

Looking at the last five values of the prediction input (reshapedX_p) at each iteration, together with y_p, by putting

print(reshapedX_p[0][-5:], y_p)

inside the inner for loop, I get:

(array([ 0.00458431,  0.00465051,  0.00470016,  0.00466706,  0.00462568]), [])
(array([ 0.00465051,  0.00470016,  0.00466706,  0.00462568,  0.00415898]), [array([ 0.00415898])])
(array([ 0.00470016,  0.00466706,  0.00462568,  0.00415898,  0.00415898]), [array([ 0.00415898]), array([ 0.00415898])])
(array([ 0.00466706,  0.00462568,  0.00415898,  0.00415898,  0.00415898]), [array([ 0.00415898]), array([ 0.00415898]), array([ 0.00415898])])
(array([ 0.00462568,  0.00415898,  0.00415898,  0.00415898,  0.00415898]), [array([ 0.00415898]), array([ 0.00415898]), array([ 0.00415898]), array([ 0.00415898])])

So the input to the prediction is in fact changing at each step, but after the first prediction the SVM spits out exactly the same value every time.

Do I need to re-fit the SVM every time as well? I wouldn't think I would have to.
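
One sanity check I can think of (a sketch; it assumes the fitted clf and reshapedX_p from the loop above) is to perturb the newest input value directly and see whether the prediction moves at all. If it doesn't, the model is simply flat in this region of input space (e.g. due to the epsilon-insensitive tube, or gamma/C being too small), rather than broken:

probe = reshapedX_p.copy()
for delta in [0.0, 0.001, 0.01, 0.1]:
    # Nudge only the most recent value and re-predict.
    probe[0, -1] = reshapedX_p[0, -1] + delta
    print(delta, clf.predict(probe))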
