How to properly generate a multi-level composition?


Currently, my Hydra config is organized as follows:

configs/
├── config.yaml
├── data
│   ├── IMDB.yaml
│   └── REUT.yaml
└── model
    ├── BERT.yaml
    ├── GPT.yaml
    └── loss
        ├── CrossEntropyLoss.yaml
        └── TripletMarginLoss.yaml

config.yaml:

defaults:
  - model: BERT
  - data: IMDB

tasks: [ "fit", "eval" ]

The dataset settings (IMDB.yaml and REUT.yaml) follow this format:

name: IMDB
dir: resource/dataset/imdb_reviews/
folds: [0,1,2,3,4]
max_length: 256
num_classes: 10

The model settings (BERT.yaml and GPT.yaml) follow this format, with a nested defaults list selecting the loss:

defaults:
  - loss: TripletMarginLoss

name: BERT
architecture: bert-base-uncased
lr: 5e-5
tokenizer:
  architecture: ${model.architecture}
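
GPT.yaml is not shown in the question; a plausible counterpart, assuming a hypothetical gpt2 checkpoint and the same nested loss default, could look like:

defaults:
  - loss: TripletMarginLoss

name: GPT
architecture: gpt2
lr: 5e-5
tokenizer:
  architecture: ${model.architecture}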

Finally, the loss function settings (CrossEntropyLoss.yaml and TripletMarginLoss.yaml) adopt the following structure:

_target_: source.loss.TripletMarginLoss.TripletMarginLoss
params:
  name: TripletMarginLoss
  margin: 1.0
  eps: 1e-6
  reduction: mean
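
CrossEntropyLoss.yaml is not shown, but following the same structure it would presumably look something like this (the exact class path and parameters here are assumptions):

_target_: source.loss.CrossEntropyLoss.CrossEntropyLoss
params:
  name: CrossEntropyLoss
  reduction: mean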

Running the following entry point:

import hydra
from omegaconf import OmegaConf


@hydra.main(config_path="configs/", config_name="config.yaml")
def my_app(params):
    # Resolve interpolations such as ${model.architecture} before printing.
    OmegaConf.resolve(params)
    print(
        f"Params:\n"
        f"{OmegaConf.to_yaml(params)}\n")


if __name__ == '__main__':
    my_app()

$ python main.py

generates the correct config composition:

tasks:
- fit
- eval
model:
  loss:
    _target_: source.loss.TripletMarginLoss.TripletMarginLoss
    params:
      name: TripletMarginLoss
      margin: 1.0
      eps: 1.0e-06
      reduction: mean
  name: BERT
  architecture: bert-base-uncased
  lr: 5.0e-05
  tokenizer:
    architecture: bert-base-uncased
data:
  name: IMDB
  dir: resource/dataset/imdb_reviews/
  folds:
  - 0
  - 1
  - 2
  - 3
  - 4
  max_length: 256
  num_classes: 10

However, overriding the loss function generates the wrong config:

$ python main.py model.loss=CrossEntropyLoss

tasks:
- fit
- eval
model:
  loss: CrossEntropyLoss
  name: BERT
  architecture: bert-base-uncased
  lr: 5.0e-05
  tokenizer:
    architecture: bert-base-uncased
data:
  name: IMDB
  dir: resource/dataset/imdb_reviews/
  folds:
  - 0
  - 1
  - 2
  - 3
  - 4
  max_length: 256
  num_classes: 10

How, then, do I correctly generate a multi-level composition?

1 Answer:
Overriding nested config groups is done with / as the separator, as described in the config groups section of the Hydra documentation. The override model.loss=CrossEntropyLoss assigns the string "CrossEntropyLoss" to the key model.loss, which is why the loss node collapses to a plain string in the output above; model/loss=CrossEntropyLoss instead selects the CrossEntropyLoss option of the model/loss config group.

Try:

$ python main.py model/loss=CrossEntropyLoss
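
Once the group override composes correctly, the selected loss can also be instantiated straight from its _target_ entry with hydra.utils.instantiate. A minimal sketch, assuming the wrapper classes accept the params mapping shown above as a keyword argument:

import hydra
from hydra.utils import instantiate
from omegaconf import OmegaConf


@hydra.main(config_path="configs/", config_name="config.yaml")
def my_app(params):
    OmegaConf.resolve(params)
    # Builds whichever loss the composition selected: TripletMarginLoss by
    # default, or CrossEntropyLoss when run with model/loss=CrossEntropyLoss.
    loss_fn = instantiate(params.model.loss)
    print(loss_fn)


if __name__ == '__main__':
    my_app()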