I have a pretty large fully connected network and I've started getting bothered by the fact that I store my weights and biases in a dictionary, and then compute each layer
layer_i+1 = relu(add(matmul(layer_i, weights['i']), biases['i']))
Surely there must be some "cleaner" way to do this? Or am I overthinking things?
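In full, my current setup looks roughly like this (the layer sizes and TF 1.x calls are just an example of the pattern above):

```python
import tensorflow as tf

sizes = [784, 512, 256, 10]  # example layer widths

# weights and biases live in plain dicts keyed by the layer index
weights = {str(i): tf.Variable(tf.random_normal([sizes[i], sizes[i + 1]]))
           for i in range(len(sizes) - 1)}
biases = {str(i): tf.Variable(tf.zeros([sizes[i + 1]]))
          for i in range(len(sizes) - 1)}

x = tf.placeholder(tf.float32, [None, sizes[0]])
layer = x
for i in range(len(sizes) - 1):
    layer = tf.nn.relu(tf.add(tf.matmul(layer, weights[str(i)]), biases[str(i)]))
```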
I manage my networks the following way:
layers.py
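For example, layers.py can hold a reusable fully connected layer that creates its own variables under a named scope. A minimal sketch, assuming TensorFlow 1.x (the name fc_layer and the initializers are just one way to write it):

```python
import tensorflow as tf

def fc_layer(inputs, out_dim, name, activation=tf.nn.relu):
    """Fully connected layer that owns its weights and bias under a named scope."""
    in_dim = inputs.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        w = tf.get_variable("weights", shape=[in_dim, out_dim],
                            initializer=tf.glorot_uniform_initializer())
        b = tf.get_variable("bias", shape=[out_dim],
                            initializer=tf.zeros_initializer())
        out = tf.matmul(inputs, w) + b
        return activation(out) if activation is not None else out
```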
tfmodel.py
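tfmodel.py then just stacks those layers in a loop; a minimal sketch (build_network and hidden_sizes are placeholder names):

```python
from layers import fc_layer

def build_network(x, hidden_sizes, num_classes):
    """Stacks fully connected layers; the final layer is linear (logits)."""
    net = x
    for i, size in enumerate(hidden_sizes):
        net = fc_layer(net, size, name="fc%d" % i)
    return fc_layer(net, num_classes, name="logits", activation=None)
```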
Then, while creating the graph, I simply call it like this (the sizes below are just placeholders):
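```python
import tensorflow as tf
from tfmodel import build_network

x = tf.placeholder(tf.float32, [None, 784])  # e.g. flattened MNIST images
logits = build_network(x, hidden_sizes=[512, 256, 128], num_classes=10)
```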
This also creates a clean graph in TensorBoard. It's not complete, but I'm sure you get the idea.
Another clean way is to use Keras to define layers or models. Check out the "Keras as a simplified interface to TensorFlow" tutorial.
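For instance, following that tutorial's approach, Keras layers can be called directly on TensorFlow tensors (the layer sizes here are just an example, and this assumes Keras is running on the TensorFlow backend):

```python
import tensorflow as tf
from keras.layers import Dense

x = tf.placeholder(tf.float32, shape=(None, 784))

# Keras layers can be applied directly to TensorFlow tensors
h = Dense(512, activation='relu')(x)
h = Dense(256, activation='relu')(h)
logits = Dense(10)(h)  # linear output layer
```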