Giedrius FLT said 2 months ago on Streaming LLM Responses :
What about that context and [INST]? Why do we need to add this? There wasn't enough explanation of this part. Thanks!

David Kimura PRO said 2 months ago on Streaming LLM Responses :
This is considered the template or prompt format for communicating with the particular model that you're using. You can find more information and the specific templates on https://ollama.com/library/mixtral:latest. Since Mistral is based on Llama, it uses the [INST]prompt[/INST] template. If you were using Gemma, then the template would look something like <start_of_turn>model {{ .Response }}<end_of_turn>. It's basically formatting that yields better results from the model that is being used.
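To illustrate the idea (this is a rough sketch, not code from the episode): if you send a raw prompt to a local Ollama server, you can wrap it in the Mistral [INST] template yourself. The endpoint and field names below follow Ollama's documented /api/generate API, but the model name, URL, and helper function are assumptions for the example.

```python
# Minimal sketch: wrap a prompt in the Mistral [INST] template and stream
# the reply from a local Ollama server. Assumes Ollama is running on the
# default port and the "mistral" model has been pulled.
import json
import requests

def stream_completion(prompt, model="mistral"):
    # With raw=True, Ollama skips its built-in template, so we add
    # the [INST] ... [/INST] markers ourselves.
    payload = {
        "model": model,
        "prompt": f"[INST] {prompt} [/INST]",
        "raw": True,
        "stream": True,
    }
    with requests.post("http://localhost:11434/api/generate",
                       json=payload, stream=True) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if line:
                # Each streamed line is a JSON object; "response" holds
                # the next chunk of generated text.
                yield json.loads(line).get("response", "")

if __name__ == "__main__":
    for token in stream_completion("Why do prompt templates matter?"):
        print(token, end="", flush=True)
```

If you let Ollama apply the model's own template (the default, without raw=True), you don't need to add these markers yourself; the point is just that each model family expects its prompts framed in a particular way.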

abner.figueroa PRO said 2 months ago on Streaming LLM Responses :
Love this episode, awesome job David!

olimart PRO said about 2 months ago on Streaming LLM Responses :
Really fond of those LLM/ML-related episodes.

ahmad19 PRO said about 2 months ago on Streaming LLM Responses :
Love this episode. It has helped me get started with AI models. Thank you!
