AttributeError: 'GPT2TokenizerFast' object has no attribute 'max_len'


I am using the Hugging Face transformers library and get the following error when running run_lm_finetuning.py: AttributeError: 'GPT2TokenizerFast' object has no attribute 'max_len'. Has anyone else hit this problem, or does anyone have an idea how to fix it? Thanks!

My full experiment run:

mkdir experiments

for epoch in 5; do
    python run_lm_finetuning.py \
        --model_name_or_path distilgpt2 \
        --model_type gpt2 \
        --train_data_file small_dataset_train_preprocessed.txt \
        --output_dir experiments/epochs_$epoch \
        --do_train \
        --overwrite_output_dir \
        --per_device_train_batch_size 4 \
        --num_train_epochs $epoch
done


2 Answers

BEST ANSWER

The "AttributeError: 'BertTokenizerFast' object has no attribute 'max_len'" Github issue contains the fix:

The run_language_modeling.py script is deprecated in favor of language-modeling/run_{clm, plm, mlm}.py.
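For a causal LM such as distilgpt2, the replacement script is run_clm.py. As a rough translation of the command above (flag names taken from the current examples directory; note that --train_data_file became --train_file, so verify the flags against your installed version):

python run_clm.py \
    --model_name_or_path distilgpt2 \
    --train_file small_dataset_train_preprocessed.txt \
    --output_dir experiments/epochs_$epoch \
    --do_train \
    --overwrite_output_dir \
    --per_device_train_batch_size 4 \
    --num_train_epochs $epoch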

If you want to keep using the old script instead, the fix is to rename max_len to model_max_length: transformers 4.x removed the long-deprecated max_len attribute, and model_max_length holds the same value.
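Since the rename is mechanical, you can patch the old script in place with a one-liner. This assumes your copy of the script accesses the attribute literally as tokenizer.max_len; it uses GNU sed (the \b word boundary is a GNU extension, so on macOS use gsed or edit by hand):

# Rewrite every use of the removed attribute to its new name;
# \b keeps e.g. tokenizer.max_len_single_sentence untouched.
sed -i 's/tokenizer\.max_len\b/tokenizer.model_max_length/g' run_lm_finetuning.py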

Alternatively, pinning the library with pip install transformers==3.0.2 can also work, since max_len still existed (as a deprecated alias) in that version, and it has been reported to work for some people.


I used this command to solve it:

pip install transformers==3.0.2