Suppose I have a dataset (let's call it Dataset1) consisting of 100 documents, and I'd like to create a custom feature extraction model. After tagging specific features within these documents, I train and generate a model named 'model1'. Due to the sensitive nature of the data (medical information), I am required to delete these files afterward.
However, I want to enrich 'model1' with another dataset (let's call it Dataset2), which also consists of 100 documents with different features. How can I achieve this? Do I need to create a separate model (model2) from Dataset2 and then combine 'model1' and 'model2' into a new model named 'model3'? Additionally, how many times can I perform this type of model composition without losing the integrity or performance of the original model?
Yes, you can use the compose model approach to combine your custom models into a single composed model.
As for the limit, you can assign up to 100 trained custom models to a single model ID.
Regarding performance, whenever you submit a document for analysis, the composed model classifies the form, selects the best-matching component model, and returns that model's results. Composition does not retrain or modify the component models, so the original models keep their integrity.
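To make the routing behavior concrete, here is a minimal sketch of the idea: each component model reports a classification confidence for the submitted document, and the composed model returns results from the best match. The function and the confidence values are purely illustrative, not the service's actual implementation.

```python
def route_to_component(doc_confidences):
    """Pick the component model with the highest classification confidence.

    doc_confidences: dict mapping model ID -> confidence score for one document.
    Returns the winning model ID and its score.
    """
    model_id, confidence = max(doc_confidences.items(), key=lambda kv: kv[1])
    return model_id, confidence

# Example: hypothetical confidences reported for one submitted document
confidences = {"model1": 0.31, "model2": 0.94}
best_model, score = route_to_component(confidences)
print(best_model)  # → model2
```

In other words, adding more component models does not dilute any single model's accuracy; it only widens the set of form types the composed model can recognize.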
For more information, refer to the compose custom models documentation.
In your case, train a separate custom model for each dataset (e.g. 'model1' from Dataset1 and 'model2' from Dataset2), then compose them into a single model ('model3'). You can repeat this for further datasets until you reach the 100-model limit.