Ollama Configuration for Larger Context Windows
By default, Ollama models are configured with a context window of 2048 tokens. This can be quite small when analyzing larger sets of data, so it is highly recommended to extend the context window for better results with tools I’m working on, such as PRIscope or Raven.
To increase the context window size:
Generate the model config:
ollama show mistral-small --modelfile > ollama_conf.txt
Edit the ollama_conf.txt file by adding the following line directly below the FROM … line:
PARAMETER num_ctx 32768
This sets the context window to 32k tokens.
Build a new model from the modified file:
ollama create mistral-small-32K -f ollama_conf.txt
Update your config.json file with the new model name:
{
"model_name": "mistral-small-32K"
}
A sample ollama_conf.txt file is included below:
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM mistral-small:latest
FROM mistral-small:latest
PARAMETER num_ctx 32768
TEMPLATE """{{- if .Messages }}
{{- range $index, $_ := .Messages }}
{{- if eq .Role "user" }}
{{- if and (le (len (slice $.Messages $index)) 2) $.Tools }}[AVAILABLE_TOOLS] {{ $.Tools }}[/AVAILABLE_TOOLS]
{{- end }} [INST] {{ if and $.System (eq (len (slice $.Messages $index)) 1) }}{{ $.System }}
{{ end }}{{ .Content }} [/INST]
{{- else if eq .Role "assistant" }}
{{- if .Content }} {{ .Content }}
{{- if not (eq (len (slice $.Messages $index)) 1) }}</s>
{{- end }}
{{- else if .ToolCalls }}[TOOL_CALLS] [
{{- range .ToolCalls }}{"name": "{{ .Function.Name }}", "arguments": {{ .Function.Arguments }}}
{{- end }}]</s>
{{- end }}
{{- else if eq .Role "tool" }}[TOOL_RESULTS] {"content": {{ .Content }}}[/TOOL_RESULTS]
{{- end }}
{{- end }}
{{- else }} [INST] {{ if .System }}{{ .System }}
{{ end }}{{ .Prompt }} [/INST]
{{- end }}
{{- if .Response }} {{ end }}{{ .Response }}
{{- if .Response }}</s> {{ end }}"""
PARAMETER stop [INST]
PARAMETER stop [/INST]
PARAMETER stop </s>
LICENSE """# Mistral AI Research License
If You want to use a Mistral Model, a Derivative or an Output for any purpose that is not expressly authorized under this Agreement, You must request a license from Mistral AI, which Mistral AI may grant to You in Mistral AI's sole discretion. To discuss such a license, please contact Mistral AI via the website contact form: https://mistral.ai/contact/
"""