This is very similar to the question skflow regression predict multiple values. However, later versions of TensorFlow seem to have rendered the answer to that question obsolete.
I would like to be able to have multiple output neurons in a TensorFlow Learn regression neural network (DNNRegressor). I upgraded the code from the referenced question to account for breaking changes in TensorFlow, but still get an error.
import numpy as np
import tensorflow.contrib.learn as skflow
import tensorflow as tf
from sklearn.metrics import mean_squared_error
# Create random dataset.
rng = np.random.RandomState(1)
X = np.sort(200 * rng.rand(100, 1) - 100, axis=0)
y = np.array([np.pi * np.sin(X).ravel(), np.pi * np.cos(X).ravel()]).T
# Fit regression DNN model.
feature_columns = [tf.contrib.layers.real_valued_column("", dimension=X.shape[0])]
regressor = skflow.DNNRegressor(hidden_units=[5, 5], feature_columns=feature_columns)
regressor.fit(X, y)
score = mean_squared_error(regressor.predict(X), y)
print("Mean Squared Error: {0:f}".format(score))
But this results in:
ValueError: Shapes (?, 1) and (?, 2) are incompatible
I don't see any release notes about breaking changes indicating that the method for multiple outputs has changed. Is there another way to do this?
As mentioned in the tf.contrib.learn.DNNRegressor docs, you may use the label_dimension parameter, which is exactly what you are looking for. Your code line with this param will do what you want:
The standard predict() returns a generator object. To get an array, you have to add as_iterable=False: