LocalAI/gallery/lfm.yaml
Ettore Di Giacinto fff0e5911b chore(model gallery): add liquidai_lfm2-350m-extract (#6387)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-10-06 09:03:37 +02:00

---
name: "lfm"
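# Gallery base config for LiquidAI LFM2 ("lfm") models served through the
# llama-cpp backend; individual model entries point at this file and
# override fields such as the model name and download URL.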
config_file: |
  backend: "llama-cpp"
  mmap: true
  template:
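    # chat_message renders a single conversation turn in LFM2's ChatML-style
    # format; tool calls and tool responses get dedicated special tokens.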
    chat_message: |
      <|im_start|>{{ .RoleName }}
      {{ if .FunctionCall -}}
      <|tool_call_start|>
      {{ else if eq .RoleName "tool" -}}
      <|tool_response_start|>
      {{ end -}}
      {{ if .Content -}}
      {{.Content }}
      {{ end -}}
      {{ if eq .RoleName "tool" -}}
      <|tool_response_end|>
      {{ end -}}
      {{ if .FunctionCall -}}
      {{toJson .FunctionCall}}
      {{ end -}}<|im_end|>
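    # function builds the system prompt advertising the available tools as a
    # JSON-like list, then replays the conversation and cues the assistant.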
    function: |
      <|im_start|>system
      You are a function calling AI model. You are provided with functions to execute. You may call one or more functions to assist with the user query. Don't make assumptions about what values to plug into functions.
      List of tools: <|tool_list_start|>[
      {{range .Functions}}
      {'type': 'function', 'function': {'name': '{{.Name}}', 'description': '{{.Description}}', 'parameters': {{toJson .Parameters}} }}
      {{end}}
      ]<|tool_list_end|>
      <|im_end|>
      {{.Input -}}
      <|im_start|>assistant
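    # chat and completion pass the pre-rendered prompt through; chat appends
    # the assistant header so the model starts its reply.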
    chat: |
      {{.Input -}}
      <|im_start|>assistant
    completion: |
      {{.Input}}
  context_size: 4096
  f16: true
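  # Tokens that terminate generation.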
  stopwords:
  - '<|im_end|>'
  - '<dummy32000>'
  - '</s>'
  - '<|endoftext|>'
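
# Illustrative example (not part of the upstream file): for a plain user
# message with Content "Hello", the chat_message template above renders as:
#
#   <|im_start|>user
#   Hello
#   <|im_end|>
#
# A tool call instead opens with <|tool_call_start|> and carries the call as
# JSON, while a tool result is wrapped in <|tool_response_start|> and
# <|tool_response_end|>.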