Feature scaling/normalization with sparse data


I am having a problem training a neural network on sparse input data for a supervised regression problem. When I perform mean normalization (subtract the mean, then divide by the standard deviation) on the input data, I get a lot of NaN values. I am wondering if anyone has experience dealing with this kind of problem. What is the correct way to scale sparse input data?

Thanks, Joe

1 Answer

Sounds like your data is so sparse that the standard deviation is zero for some features.

Test for that, and if so, don't divide those features by it (standard-deviation normalization isn't needed for a constant feature anyway).
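A minimal sketch of that guard with NumPy, assuming a dense 2-D feature matrix; the function name `scale_features` and the sample data are just illustrative, not from the original post:

```python
import numpy as np

def scale_features(X):
    """Mean-normalize the columns of X, skipping the divide where std is zero."""
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    # Where a feature's std is zero (a constant column), divide by 1 instead,
    # so the centered values stay finite rather than turning into NaN.
    safe_std = np.where(std == 0, 1.0, std)
    return (X - mean) / safe_std

# Example: a sparse-ish matrix with an all-zero (constant) first column
X = np.array([[0.0, 2.0, 0.0],
              [0.0, 4.0, 1.0],
              [0.0, 6.0, 0.0]])
print(scale_features(X))  # no NaNs; the constant column stays all zeros
```

The same idea works with scikit-learn's `StandardScaler`, which already handles zero-variance features; the NaNs only show up when the division is done by hand.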