BERT model classification with many classes


I want to train a BERT model to perform multiclass text classification. I use the Hugging Face transformers library and followed this tutorial (https://towardsdatascience.com/multi-class-text-classification-with-deep-learning-using-bert-b59ca2f5c613) to train it on Google Colab.

The issue is that I have a huge number of classes (about 600), which I suspect is hurting the model: the performance is quite disappointing.

I looked around on Stack Overflow and found this thread (Intent classification with large number of intent classes) that addresses my question, but I don't know how to implement the suggestion.

The answer to the similar question was: "If you could classify your intents into some coarse-grained classes, you could train a classifier to specify which of these coarse-grained classes your instance belongs to. Then, for each coarse-grained class train another classifier to specify the fine-grained one. This hierarchical structure will probably improve the results. Also for the type of classifier, I believe a simple fully connected layer on top of BERT would suffice."
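To make the quoted advice concrete, here is a minimal sketch of the two-stage routing it describes. Everything here is a toy stand-in: the hidden size, the coarse classes, and the per-class fine counts are made up, and each `nn.Linear` stands in for a real fine-tuned classifier (in practice, a linear layer on top of BERT's pooled `[CLS]` output).

```python
import torch
import torch.nn as nn

# Hypothetical sizes: in a real setup HIDDEN would be BERT's hidden
# size (768) and the groupings would come from your label taxonomy.
HIDDEN, N_COARSE = 32, 3
FINE_PER_COARSE = {0: 4, 1: 5, 2: 6}  # fine-class count per coarse class

# Stage-1 classifier plus one stage-2 classifier per coarse class.
coarse_clf = nn.Linear(HIDDEN, N_COARSE)
fine_clfs = {c: nn.Linear(HIDDEN, n) for c, n in FINE_PER_COARSE.items()}

def predict(pooled):
    """Route one example: the coarse prediction selects which
    fine-grained classifier is applied next."""
    c = coarse_clf(pooled).argmax(-1).item()    # stage 1: coarse class
    f = fine_clfs[c](pooled).argmax(-1).item()  # stage 2: fine class within it
    return c, f
```

Each classifier here would be trained on its own slice of the data (the fine classifier for coarse class `c` only ever sees examples belonging to `c`), so at inference time the routing above is all the "if" logic you need.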

Do I have to train my models separately and use "if" conditions to build the workflow, or is there a way to train all the BERT models simultaneously and end up with one unified model?
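For the "one unifying model" option, one common alternative (not from the linked answer, just a standard multi-task pattern) is a single shared encoder with two heads, one coarse and one fine, trained jointly by summing the two cross-entropy losses. The sketch below uses toy sizes and omits the encoder itself; `pooled` stands in for BERT's `[CLS]` representation.

```python
import torch
import torch.nn as nn

# Hypothetical sizes; with real BERT, HIDDEN would be 768 and
# N_FINE would be your ~600 classes.
HIDDEN, N_COARSE, N_FINE = 32, 10, 600

class TwoHeadClassifier(nn.Module):
    """Two classification heads on top of one shared encoder output.

    In a real setup, `pooled` would come from the shared BERT encoder,
    e.g. AutoModel(...).last_hidden_state[:, 0]; here we only sketch
    the heads and the joint loss.
    """
    def __init__(self):
        super().__init__()
        self.coarse = nn.Linear(HIDDEN, N_COARSE)
        self.fine = nn.Linear(HIDDEN, N_FINE)

    def forward(self, pooled):
        return self.coarse(pooled), self.fine(pooled)

def joint_loss(coarse_logits, fine_logits, coarse_y, fine_y, alpha=0.5):
    # Weighted sum of the two losses: one backward pass then updates
    # the shared encoder and both heads at the same time.
    ce = nn.functional.cross_entropy
    return alpha * ce(coarse_logits, coarse_y) + (1 - alpha) * ce(fine_logits, fine_y)
```

With this setup there are no "if" conditions at training time; the coarse labels act as an auxiliary signal that regularizes the fine-grained head.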

Thanks in advance
