How to use Bart with PretrainedTransformerEmbedder?


If I use encoder = PretrainedTransformerEmbedder(model_name, sub_module="encoder") as the encoder to pass to Bart(encoder=encoder), it reports an error because the embedder doesn't implement get_input_dim(). If I instead build encoder = PretrainedTransformerEmbedder(model_name, sub_module="encoder") and then take encoder = encoder.encoder as the input encoder, as mentioned here, it reports an error because PretrainedTransformerEmbedder doesn't have an attribute named encoder. So how can I use the full BART model (including token_embed, position_embed) for a seq2seq task in AllenNLP?
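
Roughly what I tried looks like this (a sketch, not exact code; import paths are as I understand them for allennlp / allennlp-models 2.x, and facebook/bart-base just stands in for whatever checkpoint is used):

    from allennlp.data import Vocabulary
    from allennlp.modules.token_embedders import PretrainedTransformerEmbedder
    from allennlp_models.generation.models import Bart

    model_name = "facebook/bart-base"  # example checkpoint
    vocab = Vocabulary()               # placeholder; normally built by the dataset reader

    # Attempt 1: pass the embedder itself as the encoder.
    # Bart expects a Seq2SeqEncoder here, so this fails complaining that
    # get_input_dim() is not implemented.
    encoder = PretrainedTransformerEmbedder(model_name, sub_module="encoder")
    model = Bart(model_name=model_name, vocab=vocab, encoder=encoder)

    # Attempt 2: try to pull an inner `encoder` attribute off the embedder.
    # PretrainedTransformerEmbedder has no such attribute, so this raises AttributeError.
    encoder = PretrainedTransformerEmbedder(model_name, sub_module="encoder")
    encoder = encoder.encoder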


1 Answer


If you just pass encoder=None (which is the default), the Bart model will use the native BART encoder. It sounds like that's what you want?
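
A minimal sketch of that default usage (same caveats as above about import paths and the example model name; remaining constructor arguments are left at their defaults):

    from allennlp.data import Vocabulary
    from allennlp_models.generation.models import Bart

    model_name = "facebook/bart-base"  # example checkpoint

    # With encoder=None (the default), Bart uses the pretrained BART encoder,
    # including its token and position embeddings, so nothing extra is needed.
    model = Bart(
        model_name=model_name,
        vocab=Vocabulary(),  # placeholder; normally built from your dataset reader
    )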