I've been following the helpful example here to create a custom environment in Gym, which I then want to train with RLlib.
My environment takes some optional parameters which I would like to set when training. I think the relevant part of the code is in train.py,
here:
# start Ray -- add `local_mode=True` here for debugging
ray.init(ignore_reinit_error=True)
# register the custom environment
select_env = "example-v0"
register_env(select_env, lambda config: Example_v0())
I've tried some obvious things, like
register_env(select_env, lambda config: Example_v0(optional_arg=n))
but nothing has worked. Is there a way to pass different arguments to the environment before training?
I think you should accept an env_config dict in the environment's constructor and read your parameter from it. RLlib passes the env_config dict from the trainer config into the lambda you registered as its config argument, so set env_config = {'optional_arg': n} in the trainer config and forward it: register_env(select_env, lambda config: Example_v0(config)).
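A minimal sketch of that pattern, with the actual Ray calls shown only in comments so the snippet runs standalone. Example_v0 and optional_arg come from the question; the class body here is a stand-in, not the real environment:

```python
class Example_v0:
    # Stand-in for the custom env from the question; a real one
    # would subclass gym.Env and define reset()/step() etc.
    def __init__(self, env_config=None):
        env_config = env_config or {}
        # Read the optional parameter out of the config dict,
        # falling back to a default when it is absent.
        self.optional_arg = env_config.get("optional_arg", 0)

# The env creator RLlib calls when it builds the environment;
# it receives the env_config dict from the trainer config.
env_creator = lambda config: Example_v0(config)

# With Ray installed this would look like:
#   from ray.tune.registry import register_env
#   register_env("example-v0", env_creator)
#   # then in the trainer config:
#   #   {"env": "example-v0", "env_config": {"optional_arg": 42}}

# Simulate what RLlib does when it instantiates the env:
env = env_creator({"optional_arg": 42})
print(env.optional_arg)  # → 42
```

The key point is that the lambda's config argument *is* the env_config dict from the trainer config, so any parameter you put there reaches the constructor.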