How does sklearn.metrics.mean_squared_error behave for matrices?


As you know, the mean squared error for two vectors v=[v_1,v_2,...,v_n] and w=[w_1,w_2,...,w_n] is the mean of the squares of v_i - w_i:

$$\text{MSE}=\frac1n\sum_{i=1}^n(v_i-w_i)^2$$

For example, if v=[0,0,0,4] and w=[0,0,0,0], then the MSE should be 4 and the RMSE should be 2.

How about the mean squared error of two matrices? For two m times n matrices A and B, is it related to the Frobenius norm?
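For the squared case the two views do coincide: the elementwise mean of squared errors over an m times n matrix equals the squared Frobenius norm of A - B divided by the number of entries, mn. A quick numerical check (a sketch with NumPy, not sklearn's actual source):

```python
import numpy as np

# Example matrices from the question: A - B has a single nonzero entry of 4.
A = np.array([[0., 0., 0., 4.]])
B = np.zeros_like(A)
m, n = A.shape

mse = np.mean((A - B) ** 2)                          # elementwise mean of squared errors
frob = np.linalg.norm(A - B, 'fro') ** 2 / (m * n)   # squared Frobenius norm / mn

print(mse, frob)  # 4.0 4.0
```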

Here is some code whose results match the Frobenius-norm expectation, except for the fourth case: the RMSE for the row vector.

I don't know how to interpret the fourth result, and I wonder what sklearn actually computes for the MSE.

import numpy as np
from sklearn.metrics import mean_squared_error

y_true = [0, 0, 0, 4]            # 1-D vector
y_pred = np.zeros_like(y_true)
print(mean_squared_error(y_true, y_pred, squared=True))   # 4.0
print(mean_squared_error(y_true, y_pred, squared=False))  # 2.0

y_true = [[0, 0, 0, 4]]          # row vector: 1 sample, 4 outputs
y_pred = np.zeros_like(y_true)
print(mean_squared_error(y_true, y_pred, squared=True))   # 4.0
print(mean_squared_error(y_true, y_pred, squared=False))  # 1.0

y_true = [[0], [0], [0], [4]]    # column vector: 4 samples, 1 output
y_pred = np.zeros_like(y_true)
print(mean_squared_error(y_true, y_pred, squared=True))   # 4.0
print(mean_squared_error(y_true, y_pred, squared=False))  # 2.0
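The fourth result can be explained by sklearn's multioutput handling: it averages the squared errors over axis 0 (the samples) separately for each output column, takes the square root per column when squared=False, and only then averages across the columns. A minimal sketch of that behavior, assuming the default uniform output weighting (this is a reimplementation for illustration, not sklearn's actual source):

```python
import numpy as np

def mse_sketch(y_true, y_pred, squared=True):
    """Sketch of sklearn's uniform-average MSE/RMSE behavior (assumption)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    if y_true.ndim == 1:
        # A 1-D input is treated as one output with n samples
        y_true = y_true.reshape(-1, 1)
        y_pred = y_pred.reshape(-1, 1)
    # One error value per output column (mean over the samples, axis 0)
    output_errors = np.mean((y_true - y_pred) ** 2, axis=0)
    if not squared:
        # The sqrt is taken per output column BEFORE averaging the columns
        output_errors = np.sqrt(output_errors)
    return np.mean(output_errors)

# Row vector [[0, 0, 0, 4]]: per-output RMSE is [0, 0, 0, 4], mean = 1.0
print(mse_sketch([[0, 0, 0, 4]], [[0, 0, 0, 0]], squared=False))  # 1.0
```

Under this reading, the row vector is one sample with four outputs: the per-output RMSEs are [0, 0, 0, 4], and their uniform average is 1.0, whereas the 1-D vector and the column vector are four samples of a single output, giving sqrt(16/4) = 2.0.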

Six results for the MSE and RMSE of a vector, a row vector, and a column vector.
