How can I make BertForSequenceClassification use a token other than [CLS]?

BertForSequenceClassification feeds the [CLS] token's representation to a linear classifier. I want to use the representation of another token in the input sequence (say [X]) instead of [CLS]. What is the most straightforward way to implement this in Transformers?
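For reference, my understanding of the stock behavior, as a rough sketch (bert-base-uncased is just a placeholder checkpoint):

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer("some example text", return_tensors="pt")
hidden = model.bert(**inputs).last_hidden_state  # (batch, seq_len, hidden)
cls_repr = hidden[:, 0]  # [CLS] is the first token; the pooler/classifier reads this position
```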
You can define the special tokens when creating the tokenizer. Because BertForSequenceClassification classifies from the hidden state of the first token in the sequence (normally [CLS]), repointing the tokenizer's cls_token to your own token puts that token in the position the classifier reads. Here is an example of how to modify a special token of a pretrained tokenizer.
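A minimal sketch, assuming the bert-base-uncased checkpoint and a hypothetical replacement token [X]:

```python
from transformers import BertTokenizer

# Override the cls_token when loading a pretrained tokenizer.
# "[X]" and "bert-base-uncased" are placeholders for your own token/checkpoint.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased", cls_token="[X]")

print(tokenizer.cls_token)  # -> "[X]"
encoded = tokenizer("hello world")
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))
# e.g. ['[X]', 'hello', 'world', '[SEP]']
```

If [X] is not already in the checkpoint's vocabulary it gets added as a new special token, so remember to resize the model's embedding matrix afterwards with model.resize_token_embeddings(len(tokenizer)).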
Please check the BertTokenizer documentation to see which other special tokens you can modify (unk_token, sep_token, pad_token, cls_token, mask_token, etc.).
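If you would rather keep [CLS] intact and pool [X] from an arbitrary position, one option is to take the hidden states from the base BertModel yourself and put your own linear head on top. A sketch under the same assumptions (hypothetical token [X], two labels), not the library's built-in behavior:

```python
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
tokenizer.add_tokens(["[X]"], special_tokens=True)   # register the new token

model = BertModel.from_pretrained("bert-base-uncased")
model.resize_token_embeddings(len(tokenizer))        # room for the new token
classifier = torch.nn.Linear(model.config.hidden_size, 2)  # 2 labels, for example

inputs = tokenizer("a sentence with [X] inside", return_tensors="pt")
hidden = model(**inputs).last_hidden_state           # (batch, seq_len, hidden)

# Find where [X] sits in each sequence and gather its representation.
x_id = tokenizer.convert_tokens_to_ids("[X]")
batch_idx, token_idx = (inputs["input_ids"] == x_id).nonzero(as_tuple=True)
pooled = hidden[batch_idx, token_idx]                # (num_matches, hidden)

logits = classifier(pooled)
```

You would then train the classifier (and optionally fine-tune the encoder) with a standard cross-entropy loss, just as BertForSequenceClassification does for [CLS].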