I am trying to implement federated learning with an LSTM model using TensorFlow Federated (TFF).
def create_keras_model():
    model = Sequential()
    model.add(LSTM(32, input_shape=(3, 1)))
    model.add(Dense(1))
    return model

def model_fn():
    keras_model = create_keras_model()
    return tff.learning.from_keras_model(
        keras_model,
        input_spec=(look_back, 1),
        loss=tf.keras.losses.mean_squared_error(),
        metrics=[tf.keras.metrics.mean_squared_error()])
but I get this error when I try to define the iterative_process:
iterative_process = tff.learning.build_federated_averaging_process(
    model_fn,
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.001),
    server_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=1.0))
TypeError: Missing required positional argument
How do I fix it?
There are two problems in your model_fn. First, loss and metrics must be passed as objects, not as calls to the functional forms: tf.keras.losses.mean_squared_error is a function that requires the positional arguments y_true and y_pred, so calling it with empty parentheses raises the TypeError you see. Use the class forms tf.keras.losses.MeanSquaredError() and tf.keras.metrics.MeanSquaredError() instead. Second, input_spec must describe one batch of your client training data as a structure of tf.TensorSpec objects (features and labels), not the plain shape tuple (look_back, 1).
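The difference between the functional and class forms can be seen in plain Keras, outside of TFF. This minimal check reproduces the TypeError and shows the object form working:

```python
import tensorflow as tf

# Calling the functional form with no arguments fails, because
# tf.keras.losses.mean_squared_error expects (y_true, y_pred).
try:
    tf.keras.losses.mean_squared_error()
    raised = False
except TypeError:
    raised = True
print("functional form with no args raises TypeError:", raised)

# The class form is instantiated once and later called on batches.
loss_fn = tf.keras.losses.MeanSquaredError()
value = float(loss_fn([[1.0]], [[0.0]]))
print("MSE of 1.0 vs 0.0:", value)  # 1.0
```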