I'm using Kaggle to generate poetry samples with GPT-2. My notebook uses a dataset from Gwern's poetry generator and nshepperd's GPT-2 fork.
This all works fine in my notebook when generating unconditional samples:
!python src/generate_unconditional_samples.py --top_k 40 --nsamples 1 --temperature 0.9 --model_name=1.5b-model --length=300
However, I want to generate samples with the "interactive conditional" method:
!python src/interactive_conditional_samples.py --top_k 40 --nsamples 10 --temperature 0.9 --model_name=1.5b-model --length=300
The problem is that when the script asks for a "Model prompt", I have no way of entering one.
Typing a prompt into Kaggle's console doesn't work.
If I ran this locally on my own desktop, I could simply type a response when the prompt appeared.
Is there a way for me to enter a prompt in Kaggle?
I've tried auto-responding with flags, the way you'd pass -y to auto-accept a yes/no prompt during installs, but nothing has worked so far.
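For reference, this is the general pattern I'd expect to work: feeding the prompt to the child process's stdin from Python instead of typing it. This is a minimal sketch with a stand-in script that blocks on input() the way interactive_conditional_samples.py does (the stand-in script and the prompt text are mine, not from the repo):

```python
import subprocess
import sys
import textwrap

# Stand-in for interactive_conditional_samples.py: a tiny script that
# blocks on input(), like the real script does at its model-prompt line.
child_script = textwrap.dedent("""
    prompt = input("Model prompt >>> ")
    print("received:", prompt)
""")

# Feed the prompt over the child's stdin instead of typing it interactively.
result = subprocess.run(
    [sys.executable, "-c", child_script],
    input="Shall I compare thee to a summer's day?",
    capture_output=True,
    text=True,
)
print(result.stdout)
```

In principle the same idea should apply to the real script, but I don't know whether Kaggle's shell cells wire up stdin at all, which is what I'm asking about.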