I'm new to the PaLM API and I'm trying to generate results in CSV format. No matter what I try, I get markdown. Here is an example prompt:
const result = await client.generateText({
  model: MODEL_NAME,
  prompt: {
    text: 'Generate a list of top 10 websites and return results in comma separated value format, not markdown format. Put field names in first row.',
    format: 'csv',
    delimiter: ',',
    quote: '"',
  },
});
The output is always in markdown format, like the following:
| Rank | Website | Alexa Global Rank |
|---|---|---|
| 1 | Google | 1 |
| 2 | YouTube | 2 |
| 3 | Facebook | 3 |
| 4 | Baidu | 4 |
| 5 | Wikipedia | 5 |
| 6 | Amazon | 6 |
| 7 | Tencent | 7 |
| 8 | Twitter | 8 |
| 9 | Instagram | 9 |
| 10 | Reddit | 10 |
Is there a way to have it output CSV format, or do I need to convert the markdown to CSV myself?
First, you can remove the format, delimiter, and quote fields, as the API simply ignores them (see the API docs).
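As a sketch, the trimmed-down request keeps only the fields generateText actually accepts (the model name here is an assumption; client and MODEL_NAME are set up as in the question):

```javascript
// Hypothetical trimmed request: the prompt object carries only the text
// field, since format/delimiter/quote are not part of the generateText API.
const MODEL_NAME = 'models/text-bison-001'; // assumed model name
const request = {
  model: MODEL_NAME,
  prompt: {
    text: 'Generate a list of top 10 websites in comma separated value format. Put field names in first row.',
  },
};
// const result = await client.generateText(request);
```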
There are a few approaches that will probably help. Since LLMs are essentially auto-complete services, you can give the model the start of the CSV in a zero-shot request. I would also remove the phrase "not markdown format", to avoid putting a reference to markdown format into the prompt itself.

Zero-shot prompt:
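For illustration, such a zero-shot prompt might state the task and then begin the CSV itself (the wording and the seeded header are assumptions, not the answer's original example):

```javascript
// Hypothetical zero-shot prompt: state the task, then start the CSV
// ourselves so the auto-complete naturally continues in CSV form.
const zeroShotPrompt = [
  'Generate a list of the top 10 websites in comma-separated value format.',
  'Put field names in the first row.',
  '',
  'Rank,Website', // seeding the header steers the model away from markdown tables
  '1,',
].join('\n');
```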
You can often do better with a few-shot prompt by giving examples.
Few-shot prompt (1-shot really):
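A one-shot version might pair an example task with its CSV answer before asking the real question (the example content is illustrative, not from the original answer):

```javascript
// Hypothetical one-shot prompt: one worked example in the target format,
// followed by the actual request.
const fewShotPrompt = [
  'List the top 3 colors as CSV with field names in the first row.',
  'Rank,Color',
  '1,Blue',
  '2,Red',
  '3,Green',
  '',
  'List the top 10 websites as CSV with field names in the first row.',
].join('\n');
```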
Another approach is to get the model to come up with results in stages.
Prompt 1:
Response:
Prompt 2:
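The staged approach could be sketched like this, assuming the Node client's tuple response shape with a `candidates[0].output` string (prompt wording is illustrative):

```javascript
// Hypothetical two-stage flow: ask for the list first, then feed the model's
// own answer back and ask for it as CSV. The candidates[0].output response
// shape is an assumption about the client library.
async function listAsCsv(client, modelName) {
  const [first] = await client.generateText({
    model: modelName,
    prompt: { text: 'Generate a list of the top 10 websites.' },
  });
  const [second] = await client.generateText({
    model: modelName,
    prompt: {
      text:
        'Rewrite the following list as comma-separated values, ' +
        'with field names in the first row:\n\n' +
        first.candidates[0].output,
    },
  });
  return second.candidates[0].output;
}
```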
You can even do this dynamically.
Prompt:
Try to evaluate the results: if the output can't be coerced into CSV, append a correction stanza to the prompt and retry.
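A sketch of that loop, with a naive CSV check (both the heuristic and the correction wording are assumptions):

```javascript
// Naive CSV check: no markdown pipes, and every row has the same field count.
function looksLikeCsv(text) {
  const lines = text.trim().split('\n').filter((line) => line.length > 0);
  if (lines.length === 0 || lines.some((line) => line.startsWith('|'))) return false;
  const fields = lines[0].split(',').length;
  return fields > 1 && lines.every((line) => line.split(',').length === fields);
}

// Hypothetical retry loop: if the reply is not CSV, append a correction
// stanza quoting the bad output and ask again.
async function generateCsv(client, modelName, basePrompt, maxTries = 3) {
  let prompt = basePrompt;
  for (let attempt = 0; attempt < maxTries; attempt += 1) {
    const [response] = await client.generateText({
      model: modelName,
      prompt: { text: prompt },
    });
    const output = response.candidates[0].output;
    if (looksLikeCsv(output)) return output;
    prompt =
      basePrompt +
      '\n\nYour previous answer was:\n' + output +
      '\nThat was not valid CSV. Reply with plain comma-separated values only.';
  }
  throw new Error('could not coerce the output into CSV');
}
```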
This last approach is a bit better suited to the generateMessage API, but you can make it work with generateText too.