Hacker News

Does this system prompt count toward my token usage?

Is this system prompt included with every prompt I enter, or only once per new chat on the web?

That file is quite large; does the LLM actually respect every single rule in it?

This is very fascinating to me.



I'm pretty sure the system prompt is cached in an already-processed state, so you should only pay for the extra tokens you send on top of it.
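To make the caching point concrete, here is a minimal back-of-the-envelope sketch of why a cached system prompt is cheap per turn. All prices and multipliers below are illustrative assumptions, not official figures from any provider; the function name `chat_input_cost` is hypothetical.

```python
# Hedged sketch: rough input-token cost of a large system prompt over a
# multi-turn chat, with and without prompt caching. All numbers here are
# illustrative assumptions, not any provider's actual pricing.

def chat_input_cost(system_tokens, user_tokens, turns,
                    price_per_mtok=3.00,    # assumed base input price, USD per million tokens
                    cache_write_mult=1.25,  # assumed surcharge when the prompt is first cached
                    cache_read_mult=0.10,   # assumed discount for subsequent cache hits
                    cached=True):
    """Estimate total input-token cost over `turns` turns of one chat."""
    per_tok = price_per_mtok / 1_000_000
    if cached:
        # System prompt paid once at the cache-write rate,
        # then read cheaply on every later turn.
        system_cost = system_tokens * per_tok * (
            cache_write_mult + cache_read_mult * (turns - 1))
    else:
        # System prompt resent and reprocessed at full price every turn.
        system_cost = system_tokens * per_tok * turns
    user_cost = user_tokens * per_tok * turns
    return system_cost + user_cost

# A 20k-token system prompt, 500-token user messages, 10 turns:
uncached = chat_input_cost(20_000, 500, 10, cached=False)
cached = chat_input_cost(20_000, 500, 10, cached=True)
print(f"uncached: ${uncached:.4f}  cached: ${cached:.4f}")
```

Under these assumed rates, the cached chat costs a fraction of the uncached one, because the large system prompt is only fully processed once rather than on every turn.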



