The issue is getting the X for the prediction input into the same dimension as the training design matrix. From what I can gather, I need to "pad" the input to account for the added constant column, and that is why the dimensions do not match. Sadly, I haven't found a clean, canonical way to do that.
For example, my fitting code is the following:
X = sm.add_constant(X)
model = sm.OLS(y, X).fit()
and my prediction code is:
X_pred = sm.add_constant(X_pred)
r = model.predict(X_pred)
But I get the following error (X_pred is an out-of-sample example that I want to score):
ValueError: shapes (1,16) and (17,) not aligned: 16 (dim 1) != 17 (dim 0)
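For context, here is a minimal, self-contained sketch of what I think is going on (the data and shapes are made up for illustration, not my real variables). My guess is that add_constant silently skips the constant for X_pred because a single row looks like it already contains a constant column, and forcing it with has_constant='add' seems to restore the expected 17 columns:

import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for my data: 100 training rows, 16 features
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
y = X @ rng.normal(size=16) + rng.normal(size=100)

# Training design matrix becomes (100, 17) after the constant is prepended
model = sm.OLS(y, sm.add_constant(X)).fit()

X_pred = rng.normal(size=(1, 16))  # a single out-of-sample row

# With one row, every column looks constant, so add_constant appears to
# skip adding the intercept column and the shape stays (1, 16):
print(sm.add_constant(X_pred).shape)                        # (1, 16)
# Forcing the constant seems to give the expected (1, 17):
print(sm.add_constant(X_pred, has_constant='add').shape)    # (1, 17)

r = model.predict(sm.add_constant(X_pred, has_constant='add'))

Is has_constant='add' the canonical way to handle this, or is there a cleaner pattern for padding out-of-sample inputs?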