
bug: AMD GPU: Error: Model appears to have crashed! #6628

@elmodor

Description


Version:
AppImage 0.6.10

Describe the Bug

No models work. If I enter any prompt, the model loads and then the error:
"Error: Model appears to have crashed! Please reload!" pops up.

This was tested with multiple llama.cpp models, including the Jan models.

It happens with all backends: linux-vulkan-x64, linux-avx-x64, linux-noavx-x64 (I tested several versions of each, including at least the newest one, b6324).

Under "Hardware" the GPU is recognized (AMD Radeon RX6700 XT (RADV NAVI22)) with correct VRAM. Though it does not matter if it is checked or unchecked, the error still occurs.

I also tried an older version, AppImage 0.5.14, and everything worked there, on both CPU and GPU.

Steps to Reproduce

  1. Start Jan
  2. Enter prompt
  3. Get error

Screenshots / Logs

Startup and test prompt logs

[2025-09-27][09:17:56][app_lib::core::setup][INFO] Installing extensions. Clean up: false, Stored version: 0.6.10, App version: 0.6.10
[2025-09-27][09:17:56][app_lib::core::mcp::helpers][INFO] MCP server initialization complete: 0 successful, 0 failed
[2025-09-27][09:17:57][app_lib::core::extensions::commands][INFO] get jan extensions, path: "/home/XXX/.local/share/Jan/data/extensions/extensions.json"
[2025-09-27][09:17:57][tauri_plugin_hardware::vendor::nvidia][ERROR] Unable to initialize NVML: a libloading error occurred: libnvidia-ml.so.1: cannot open shared object file: No such file or directory
[2025-09-27][09:17:57][tauri_plugin_hardware::vendor::nvidia][ERROR] Failed to get NVIDIA GPUs: an internal driver error occured
[2025-09-27][09:17:57][app_lib::core::mcp::commands][INFO] read mcp configs, path: "/home/XXX/.local/share/Jan/data/mcp_config.json"
[2025-09-27][09:17:57][tauri_plugin_updater::updater][DEBUG] checking for updates https://github.com/menloresearch/jan/releases/latest/download/latest.json
[2025-09-27][09:17:57][tauri_plugin_updater::updater][DEBUG] checking for updates https://github.com/menloresearch/jan/releases/latest/download/latest.json
[2025-09-27][09:17:57][reqwest::connect][DEBUG] starting new connection: https://github.com/
[2025-09-27][09:17:57][reqwest::connect][DEBUG] starting new connection: https://github.com/
[2025-09-27][09:17:57][reqwest::connect][DEBUG] starting new connection: https://api.github.com/
[2025-09-27][09:17:57][reqwest::connect][DEBUG] starting new connection: https://api.github.com/
[2025-09-27][09:17:57][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Calling Tauri command getDevices with arg --list-devices
[2025-09-27][09:17:57][tauri_plugin_llamacpp::device][INFO] Getting devices from server at path: "/home/XXX/.local/share/Jan/data/llamacpp/backends/b6324/linux-vulkan-x64/build/bin/llama-server"
[2025-09-27][09:17:57][tauri_plugin_llamacpp::device][INFO] Device list output:
Available devices:
  Vulkan0: AMD Radeon RX 6700 XT (RADV NAVI22) (12032 MiB, 12032 MiB free)

[2025-09-27][09:17:57][tauri_plugin_llamacpp::device][INFO] Parsing device line: 'Vulkan0: AMD Radeon RX 6700 XT (RADV NAVI22) (12032 MiB, 12032 MiB free)'
[2025-09-27][09:17:57][tauri_plugin_llamacpp::device][INFO] Parsed device - ID: 'Vulkan0', Name: 'AMD Radeon RX 6700 XT (RADV NAVI22)', Mem: 12032, Free: 12032
[2025-09-27][09:17:57][reqwest::connect][DEBUG] starting new connection: https://release-assets.githubusercontent.com/
[2025-09-27][09:17:57][reqwest::connect][DEBUG] starting new connection: https://release-assets.githubusercontent.com/
[2025-09-27][09:17:58][tauri_plugin_hardware::vendor::amd][ERROR] Failed to get memory usage for AMD GPU 0x73df: GPU not found
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Total VRAM: 12616466432 bytes, Total RAM: 14191427584 bytes, Total Memory: 26807894016 bytes
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  modelSize: 2497281632
[2025-09-27][09:17:58][tauri_plugin_updater::updater][DEBUG] update response: Object {"notes": String(""), "platforms": Object {"darwin-aarch64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo="), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz")}, "darwin-x86_64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo="), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz")}, "linux-x86_64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTHJ2OXNVRkxLak9ldXFtdnNBZzQxOWh5WFZrMkJTYjZyWVhFMDgvREdRa0plNEl4YWNRKzFDTVcwQU5DTjl4cDJucG5JNUVOS1d2T0pnVUN3OFlJL0E0PQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjMzCWZpbGU6SmFuXzAuNi4xMF9hbWQ2NC5BcHBJbWFnZQpCM2ZiSjA3R0pIMnhMTTlKZ0s0SUVNdWdVRk5QTFlsckE5UjB2QjIxWWoxbEpTK3JWeGNEMTdVdGhsR2FFdVBTQ1VkNVVOcjZUbmpzbnhnOXh4bHBEZz09Cg=="), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_amd64.AppImage")}, "windows-x86_64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGplcUdheTJld2VTcUhtanQvQ2VCZ1FrVkVtK0RaalBtZ3ZUdHFLcHFsY1pmcUlhTG1RQUhnTG5DK1oyVm80ZnVNTkg3cngxUndPSFpWUEhRMkxhOEFFPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMzMzCWZpbGU6SmFuXzAuNi4xMF94NjQtc2V0dXAuZXhlCnB0emJDak5DSzRxOEhGVjE1cnlOLzdCNCtwNGZVdmFpWUI4cDF1eVZ4TDlSSjBoNDVwK0lkYW55bUlTUjdyd2RBb2RNYU1qNkt3NitJN1hZUUcxaERRPT0K"), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_x64-setup.exe")}}, "pub_date": String("2025-09-17T15:02:43.880Z"), "version": String("0.6.10")}
[2025-09-27][09:17:58][tauri_plugin_updater::updater][DEBUG] parsed release response RemoteRelease { version: Version { major: 0, minor: 6, patch: 10 }, notes: Some(""), pub_date: Some(2025-09-17 15:02:43.88 +00:00:00), data: Static { platforms: {"windows-x86_64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_x64-setup.exe", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGplcUdheTJld2VTcUhtanQvQ2VCZ1FrVkVtK0RaalBtZ3ZUdHFLcHFsY1pmcUlhTG1RQUhnTG5DK1oyVm80ZnVNTkg3cngxUndPSFpWUEhRMkxhOEFFPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMzMzCWZpbGU6SmFuXzAuNi4xMF94NjQtc2V0dXAuZXhlCnB0emJDak5DSzRxOEhGVjE1cnlOLzdCNCtwNGZVdmFpWUI4cDF1eVZ4TDlSSjBoNDVwK0lkYW55bUlTUjdyd2RBb2RNYU1qNkt3NitJN1hZUUcxaERRPT0K" }, "linux-x86_64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_amd64.AppImage", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTHJ2OXNVRkxLak9ldXFtdnNBZzQxOWh5WFZrMkJTYjZyWVhFMDgvREdRa0plNEl4YWNRKzFDTVcwQU5DTjl4cDJucG5JNUVOS1d2T0pnVUN3OFlJL0E0PQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjMzCWZpbGU6SmFuXzAuNi4xMF9hbWQ2NC5BcHBJbWFnZQpCM2ZiSjA3R0pIMnhMTTlKZ0s0SUVNdWdVRk5QTFlsckE5UjB2QjIxWWoxbEpTK3JWeGNEMTdVdGhsR2FFdVBTQ1VkNVVOcjZUbmpzbnhnOXh4bHBEZz09Cg==" }, "darwin-aarch64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo=" }, "darwin-x86_64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo=" }} } }
[2025-09-27][09:17:58][tauri_plugin_updater::updater][DEBUG] update response: Object {"notes": String(""), "platforms": Object {"darwin-aarch64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo="), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz")}, "darwin-x86_64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo="), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz")}, "linux-x86_64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTHJ2OXNVRkxLak9ldXFtdnNBZzQxOWh5WFZrMkJTYjZyWVhFMDgvREdRa0plNEl4YWNRKzFDTVcwQU5DTjl4cDJucG5JNUVOS1d2T0pnVUN3OFlJL0E0PQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjMzCWZpbGU6SmFuXzAuNi4xMF9hbWQ2NC5BcHBJbWFnZQpCM2ZiSjA3R0pIMnhMTTlKZ0s0SUVNdWdVRk5QTFlsckE5UjB2QjIxWWoxbEpTK3JWeGNEMTdVdGhsR2FFdVBTQ1VkNVVOcjZUbmpzbnhnOXh4bHBEZz09Cg=="), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_amd64.AppImage")}, "windows-x86_64": Object {"signature": String("dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGplcUdheTJld2VTcUhtanQvQ2VCZ1FrVkVtK0RaalBtZ3ZUdHFLcHFsY1pmcUlhTG1RQUhnTG5DK1oyVm80ZnVNTkg3cngxUndPSFpWUEhRMkxhOEFFPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMzMzCWZpbGU6SmFuXzAuNi4xMF94NjQtc2V0dXAuZXhlCnB0emJDak5DSzRxOEhGVjE1cnlOLzdCNCtwNGZVdmFpWUI4cDF1eVZ4TDlSSjBoNDVwK0lkYW55bUlTUjdyd2RBb2RNYU1qNkt3NitJN1hZUUcxaERRPT0K"), "url": String("https://github.com/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_x64-setup.exe")}}, "pub_date": String("2025-09-17T15:02:43.880Z"), "version": String("0.6.10")}
[2025-09-27][09:17:58][tauri_plugin_updater::updater][DEBUG] parsed release response RemoteRelease { version: Version { major: 0, minor: 6, patch: 10 }, notes: Some(""), pub_date: Some(2025-09-17 15:02:43.88 +00:00:00), data: Static { platforms: {"linux-x86_64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_amd64.AppImage", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTHJ2OXNVRkxLak9ldXFtdnNBZzQxOWh5WFZrMkJTYjZyWVhFMDgvREdRa0plNEl4YWNRKzFDTVcwQU5DTjl4cDJucG5JNUVOS1d2T0pnVUN3OFlJL0E0PQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjMzCWZpbGU6SmFuXzAuNi4xMF9hbWQ2NC5BcHBJbWFnZQpCM2ZiSjA3R0pIMnhMTTlKZ0s0SUVNdWdVRk5QTFlsckE5UjB2QjIxWWoxbEpTK3JWeGNEMTdVdGhsR2FFdVBTQ1VkNVVOcjZUbmpzbnhnOXh4bHBEZz09Cg==" }, "windows-x86_64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan_0.6.10_x64-setup.exe", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGplcUdheTJld2VTcUhtanQvQ2VCZ1FrVkVtK0RaalBtZ3ZUdHFLcHFsY1pmcUlhTG1RQUhnTG5DK1oyVm80ZnVNTkg3cngxUndPSFpWUEhRMkxhOEFFPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMzMzCWZpbGU6SmFuXzAuNi4xMF94NjQtc2V0dXAuZXhlCnB0emJDak5DSzRxOEhGVjE1cnlOLzdCNCtwNGZVdmFpWUI4cDF1eVZ4TDlSSjBoNDVwK0lkYW55bUlTUjdyd2RBb2RNYU1qNkt3NitJN1hZUUcxaERRPT0K" }, "darwin-x86_64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo=" }, "darwin-aarch64": ReleaseManifestPlatform { url: Url { scheme: "https", cannot_be_a_base: false, username: "", password: None, host: Some(Domain("github.com")), port: None, path: "/menloresearch/jan/releases/download/v0.6.10/Jan.app.tar.gz", query: None, fragment: None }, signature: "dW50cnVzdGVkIGNvbW1lbnQ6IHNpZ25hdHVyZSBmcm9tIHRhdXJpIHNlY3JldCBrZXkKUlVSRXJUVWE2ekJCTGkzU1lyMmtUTDV6SkVRRWZGMkpqZTRmMDIyNXNZNEN2Vk1wMW1mU0dmcmMrSGwvYXdPMnNwK290dW5GOWJHcnQxdlk5VlZNbTVrUDVwWXRsN0xlZFFVPQp0cnVzdGVkIGNvbW1lbnQ6IHRpbWVzdGFtcDoxNzU4MTIxMjg3CWZpbGU6SmFuLmFwcC50YXIuZ3oKOURIQlhpSjBtQzlldGRiK2gxM09FQ3Z2aFAyRDBldGtNRU9KKzNOSnNVcnBESUduRWN2MStvQXk5TkNxek55Mmt6OEVLTm83VUU0WGc4QmEvYUFoRFE9PQo=" }} } }
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Using explicit key_length: 128, value_length: 128
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Final context length used for KV size: 8192
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nLayer: 36, nHead: 32, headDim (K+V): 256
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  ctxLen: 8192
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nLayer: 36
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nHead: 32
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  headDim: 256
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  isModelSupported: Total memory requirement: 7329119840 for /home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Using stored backend preference: b6324/linux-vulkan-x64
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Initial UI default for version_backend set to: b6324/linux-vulkan-x64
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Auto-update engine is enabled. Current backend: b6324/linux-vulkan-x64. Best available: b6324/linux-vulkan-x64
[2025-09-27][09:17:58][reqwest::connect][DEBUG] starting new connection: https://api.github.com/
[2025-09-27][09:17:58][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Already at latest version: b6324 = b6324
[2025-09-27][09:17:59][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Auto-update: Already using the latest version of the selected backend
[2025-09-27][09:17:59][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Final check: Backend b6324/linux-vulkan-x64 is already installed
[2025-09-27][09:18:07][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Calling Tauri command getDevices with arg --list-devices
[2025-09-27][09:18:07][tauri_plugin_llamacpp::device][INFO] Getting devices from server at path: "/home/XXX/.local/share/Jan/data/llamacpp/backends/b6324/linux-vulkan-x64/build/bin/llama-server"

[2025-09-27][09:18:07][tauri_plugin_llamacpp::device][INFO] Device list output:
Available devices:
  Vulkan0: AMD Radeon RX 6700 XT (RADV NAVI22) (12032 MiB, 12032 MiB free)

[2025-09-27][09:18:07][tauri_plugin_llamacpp::device][INFO] Parsing device line: 'Vulkan0: AMD Radeon RX 6700 XT (RADV NAVI22) (12032 MiB, 12032 MiB free)'
[2025-09-27][09:18:07][tauri_plugin_llamacpp::device][INFO] Parsed device - ID: 'Vulkan0', Name: 'AMD Radeon RX 6700 XT (RADV NAVI22)', Mem: 12032, Free: 12032
[2025-09-27][09:18:08][tauri_plugin_hardware::vendor::amd][ERROR] Failed to get memory usage for AMD GPU 0x73df: GPU not found
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Total VRAM: 12616466432 bytes, Total RAM: 14235467776 bytes, Total Memory: 26851934208 bytes
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  modelSize: 2497281632
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Using explicit key_length: 128, value_length: 128
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Final context length used for KV size: 8192
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nLayer: 36, nHead: 32, headDim (K+V): 256
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  ctxLen: 8192
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nLayer: 36
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nHead: 32
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  headDim: 256
[2025-09-27][09:18:08][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  isModelSupported: Total memory requirement: 7329119840 for /home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf
[2025-09-27][09:18:10][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Calling Tauri command llama_load with args:  --no-webui,--jinja,-m,/home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf,-a,Jan-v1-4B-Q4_K_M,--port,3407,-ngl,100,--batch-size,2048,--ubatch-size,512,--no-mmap,--ctx-size,8192
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] Attempting to launch server at path: "/home/XXX/.local/share/Jan/data/llamacpp/backends/b6324/linux-vulkan-x64/build/bin/llama-server"
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] Using arguments: ["--no-webui", "--jinja", "-m", "/home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf", "-a", "Jan-v1-4B-Q4_K_M", "--port", "3407", "-ngl", "100", "--batch-size", "2048", "--ubatch-size", "512", "--no-mmap", "--ctx-size", "8192"]
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] Waiting for model session to be ready...
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] ggml_vulkan: Found 1 Vulkan devices:
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] ggml_vulkan: 0 = AMD Radeon RX 6700 XT (RADV NAVI22) (radv) | uma: 0 | fp16: 1 | bf16: 0 | warp size: 32 | shared memory: 65536 | int dot: 1 | matrix cores: none
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] build: 1 (2bdc9f1) with cc (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0 for x86_64-linux-gnu
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] system info: n_threads = 6, n_threads_batch = 6, total_threads = 12
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] system_info: n_threads = 6 (n_threads_batch = 6) / 12 | CPU : SSE3 = 1 | SSSE3 = 1 | AVX = 1 | AVX2 = 1 | F16C = 1 | FMA = 1 | BMI2 = 1 | LLAMAFILE = 1 | OPENMP = 1 | REPACK = 1 |
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] Web UI is disabled
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] main: binding port with default address family
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] main: HTTP server is listening, hostname: 127.0.0.1, port: 3407, http threads: 11
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] main: loading model
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] srv    load_model: loading model '/home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_load_from_file_impl: using device Vulkan0 (AMD Radeon RX 6700 XT (RADV NAVI22)) - 12032 MiB free
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: loaded meta data with 35 key-value pairs and 398 tensors from /home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf (version GGUF V3 (latest))
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   0:                       general.architecture str              = qwen3
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   1:                               general.type str              = model
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   2:                               general.name str              = Jan-v1-4B
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   3:                           general.finetune str              = Jan-v1-4B
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   4:                         general.size_label str              = 4B
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   5:                            general.license str              = apache-2.0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   6:                   general.base_model.count u32              = 1
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   7:                  general.base_model.0.name str              = Qwen3 4B Thinking 2507
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   8:               general.base_model.0.version str              = 2507
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv   9:          general.base_model.0.organization str              = Qwen
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  10:              general.base_model.0.repo_url str              = https://huggingface.co/Qwen/Qwen3-4B-...
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  11:                               general.tags arr[str,1]       = ["text-generation"]
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  12:                          general.languages arr[str,1]       = ["en"]
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  13:                          qwen3.block_count u32              = 36
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  14:                       qwen3.context_length u32              = 262144
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  15:                     qwen3.embedding_length u32              = 2560
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  16:                  qwen3.feed_forward_length u32              = 9728
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  17:                 qwen3.attention.head_count u32              = 32
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  18:              qwen3.attention.head_count_kv u32              = 8
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  19:                       qwen3.rope.freq_base f32              = 5000000.000000
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  20:     qwen3.attention.layer_norm_rms_epsilon f32              = 0.000001
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  21:                 qwen3.attention.key_length u32              = 128
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  22:               qwen3.attention.value_length u32              = 128
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  23:                       tokenizer.ggml.model str              = gpt2
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  24:                         tokenizer.ggml.pre str              = qwen2
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  25:                      tokenizer.ggml.tokens arr[str,151936]  = ["!", "\"", "#", "$", "%", "&", "'", ...
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  26:                  tokenizer.ggml.token_type arr[i32,151936]  = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  27:                      tokenizer.ggml.merges arr[str,151387]  = ["Ġ Ġ", "ĠĠ ĠĠ", "i n", "Ġ t",...
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  28:                tokenizer.ggml.eos_token_id u32              = 151645
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  29:            tokenizer.ggml.padding_token_id u32              = 151643
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  30:                tokenizer.ggml.bos_token_id u32              = 151643
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  31:               tokenizer.ggml.add_bos_token bool             = false
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  32:                    tokenizer.chat_template str              = {%- if tools %}\n    {{- '<|im_start|>...
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  33:               general.quantization_version u32              = 2
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - kv  34:                          general.file_type u32              = 15
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - type  f32:  145 tensors
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - type q4_K:  216 tensors
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_model_loader: - type q6_K:   37 tensors
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: file format = GGUF V3 (latest)
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: file type   = Q4_K - Medium
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: file size   = 2.32 GiB (4.95 BPW)
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load: printing all EOG tokens:
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load:   - 151643 ('<|endoftext|>')
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load:   - 151645 ('<|im_end|>')
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load:   - 151662 ('<|fim_pad|>')
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load:   - 151663 ('<|repo_name|>')
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load:   - 151664 ('<|file_sep|>')
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load: special tokens cache size = 26
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load: token to piece cache size = 0.9311 MB
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: arch             = qwen3
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: vocab_only       = 0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_ctx_train      = 262144
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_embd           = 2560
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_layer          = 36
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_head           = 32
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_head_kv        = 8
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_rot            = 128
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_swa            = 0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: is_swa_any       = 0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_embd_head_k    = 128
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_embd_head_v    = 128
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_gqa            = 4
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_embd_k_gqa     = 1024
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_embd_v_gqa     = 1024
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: f_norm_eps       = 0.0e+00
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: f_norm_rms_eps   = 1.0e-06
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: f_clamp_kqv      = 0.0e+00
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: f_max_alibi_bias = 0.0e+00
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: f_logit_scale    = 0.0e+00
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: f_attn_scale     = 0.0e+00
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_ff             = 9728
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_expert         = 0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_expert_used    = 0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: causal attn      = 1
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: pooling type     = -1
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: rope type        = 2
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: rope scaling     = linear
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: freq_base_train  = 5000000.0
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: freq_scale_train = 1
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_ctx_orig_yarn  = 262144
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: rope_finetuned   = unknown
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: model type       = 4B
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: model params     = 4.02 B
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: general.name     = Jan-v1-4B
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: vocab type       = BPE
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_vocab          = 151936
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: n_merges         = 151387
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: BOS token        = 151643 '<|endoftext|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOS token        = 151645 '<|im_end|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOT token        = 151645 '<|im_end|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: PAD token        = 151643 '<|endoftext|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: LF token         = 198 'Ċ'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: FIM PRE token    = 151659 '<|fim_prefix|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: FIM SUF token    = 151661 '<|fim_suffix|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: FIM MID token    = 151660 '<|fim_middle|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: FIM PAD token    = 151662 '<|fim_pad|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: FIM REP token    = 151663 '<|repo_name|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: FIM SEP token    = 151664 '<|file_sep|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOG token        = 151643 '<|endoftext|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOG token        = 151645 '<|im_end|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOG token        = 151662 '<|fim_pad|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOG token        = 151663 '<|repo_name|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: EOG token        = 151664 '<|file_sep|>'
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] print_info: max token length = 256
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load_tensors: loading model tensors, this can take a while... (mmap = false)
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load_tensors: offloading 36 repeating layers to GPU
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load_tensors: offloading output layer to GPU
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load_tensors: offloaded 37/37 layers to GPU
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load_tensors:          CPU model buffer size =   304.28 MiB
[2025-09-27][09:18:10][tauri_plugin_llamacpp::commands][INFO] [llamacpp] load_tensors:      Vulkan0 model buffer size =  2375.91 MiB
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] ................................................................................
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: constructing llama_context
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: n_seq_max     = 1
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: n_ctx         = 8192
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: n_ctx_per_seq = 8192
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: n_batch       = 2048
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: n_ubatch      = 512
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: causal_attn   = 1
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: flash_attn    = 0
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: kv_unified    = false
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: freq_base     = 5000000.0
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: freq_scale    = 1
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: n_ctx_per_seq (8192) < n_ctx_train (262144) -- the full capacity of the model will not be utilized
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: Vulkan_Host  output buffer size =     0.58 MiB
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_kv_cache:    Vulkan0 KV buffer size =  1152.00 MiB
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_kv_cache: size = 1152.00 MiB (  8192 cells,  36 layers,  1/1 seqs), K (f16):  576.00 MiB, V (f16):  576.00 MiB
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context:    Vulkan0 compute buffer size =   558.01 MiB
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: Vulkan_Host compute buffer size =    25.02 MiB
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: graph nodes  = 1410
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] llama_context: graph splits = 2
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: added <|endoftext|> logit bias = -inf
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: added <|im_end|> logit bias = -inf
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: added <|fim_pad|> logit bias = -inf
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: added <|repo_name|> logit bias = -inf
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: added <|file_sep|> logit bias = -inf
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: setting dry_penalty_last_n to ctx_size = 8192
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] common_init_from_params: warming up the model with an empty run - please wait ... (--no-warmup to disable)
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] srv          init: initializing slots, n_slots = 1
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] slot         init: id  0 | task -1 | new slot n_ctx_slot = 8192
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] main: model loaded
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] main: chat template, chat_template: {%- if tools %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {{- '<|im_start|>system\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- if messages[0].role == 'system' %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- messages[0].content + '\n\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {{- "In this environment you have access to a set of tools you can use to answer the user's question. You can use one tool per message, and will receive the result of that tool use in the user's response. You use tools step-by-step to accomplish a given task, with each tool use informed by the result of the previous tool use.\n\nTool Use Rules\nHere are the rules you should always follow to solve your task:\n1. Always use the right arguments for the tools. Never use variable names as the action arguments, use the value instead.\n2. Call a tool only when needed: do not call the search agent if you do not need information, try to solve the task yourself.\n3. If no tool call is needed, just answer the question directly.\n4. Never re-do a tool call that you previously did with the exact same parameters.\n5. For tool use, MARK SURE use XML tag format as shown in the examples above. Do not use any other format.\nNow Begin! If you solve the task correctly, you will receive a reward of $1,000,000.\n\n" }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {{- "# Tools\n\nYou may call one or more functions to assist with the user query.\n\nYou are provided with function signatures within <tools></tools> XML tags:\n<tools>" }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- for tool in tools %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- "\n" }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- tool | tojson }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- endfor %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {{- "\n</tools>\n\nFor each function call, return a json object with function name and arguments within <tool_call></tool_call> XML tags:\n<tool_call>\n{\"name\": <function-name>, \"arguments\": <args-json-object>}\n</tool_call><|im_end|>\n" }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- else %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- if messages[0].role == 'system' %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- '<|im_start|>system\n' + messages[0].content + '<|im_end|>\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- set ns = namespace(multi_step_tool=true, last_query_index=messages|length - 1) %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- for message in messages[::-1] %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- set index = (messages|length - 1) - loop.index0 %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- if ns.multi_step_tool and message.role == "user" and message.content is string and not(message.content.startswith('<tool_response>') and message.content.endswith('</tool_response>')) %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- set ns.multi_step_tool = false %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- set ns.last_query_index = index %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- endfor %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- for message in messages %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- if message.content is string %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- set content = message.content %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- else %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- set content = '' %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- if (message.role == "user") or (message.role == "system" and not loop.first) %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- '<|im_start|>' + message.role + '\n' + content + '<|im_end|>' + '\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- elif message.role == "assistant" %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- set reasoning_content = '' %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- if message.reasoning_content is string %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- set reasoning_content = message.reasoning_content %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- else %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- if '</think>' in content %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- set reasoning_content = content.split('</think>')[0].rstrip('\n').split('<think>')[-1].lstrip('\n') %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- set content = content.split('</think>')[-1].lstrip('\n') %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- if loop.index0 > ns.last_query_index %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- if loop.last or (not loop.last and reasoning_content) %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {{- '<|im_start|>' + message.role + '\n<think>\n' + reasoning_content.strip('\n') + '\n</think>\n\n' + content.lstrip('\n') }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- else %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {{- '<|im_start|>' + message.role + '\n' + content }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- else %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {{- '<|im_start|>' + message.role + '\n' + content }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- if message.tool_calls %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- for tool_call in message.tool_calls %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- if (loop.first and content) or (not loop.first) %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                     {{- '\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- if tool_call.function %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                     {%- set tool_call = tool_call.function %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {{- '<tool_call>\n{"name": "' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {{- tool_call.name }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {{- '", "arguments": ' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- if tool_call.arguments is string %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                     {{- tool_call.arguments }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- else %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                     {{- tool_call.arguments | tojson }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]                 {{- '}\n</tool_call>' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {%- endfor %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- '<|im_end|>\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- elif message.role == "tool" %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- if loop.first or (messages[loop.index0 - 1].role != "tool") %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {{- '<|im_start|>user' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- '\n<tool_response>\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- content }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {{- '\n</tool_response>' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- if loop.last or (messages[loop.index0 + 1].role != "tool") %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]             {{- '<|im_end|>\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]         {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {%- endif %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- endfor %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- if add_generation_prompt %}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp]     {{- '<|im_start|>assistant\n' }}
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] {%- endif %}, example_format: '<|im_start|>system
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] You are a helpful assistant<|im_end|>
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] <|im_start|>user
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] Hello<|im_end|>
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] <|im_start|>assistant
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] Hi there<|im_end|>
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] <|im_start|>user
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] How are you?<|im_end|>
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] <|im_start|>assistant
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] '
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] main: server is listening on http://127.0.0.1:3407 - starting the main loop
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] Model appears to be ready based on logs: 'main: server is listening on http://127.0.0.1:3407 - starting the main loop'
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] srv  update_slots: all slots are idle
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] Model is ready to accept requests!
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] Server process started with PID: 172 and is ready
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Calling Tauri command getDevices with arg --list-devices
[2025-09-27][09:18:11][tauri_plugin_llamacpp::device][INFO] Getting devices from server at path: "/home/XXX/.local/share/Jan/data/llamacpp/backends/b6324/linux-vulkan-x64/build/bin/llama-server"
[2025-09-27][09:18:11][reqwest::connect][DEBUG] starting new connection: http://localhost:3407/
[2025-09-27][09:18:11][tauri_plugin_llamacpp::device][INFO] Device list output:
Available devices:
  Vulkan0: AMD Radeon RX 6700 XT (RADV NAVI22) (12032 MiB, 12032 MiB free)

[2025-09-27][09:18:11][tauri_plugin_llamacpp::device][INFO] Parsing device line: 'Vulkan0: AMD Radeon RX 6700 XT (RADV NAVI22) (12032 MiB, 12032 MiB free)'
[2025-09-27][09:18:11][tauri_plugin_llamacpp::device][INFO] Parsed device - ID: 'Vulkan0', Name: 'AMD Radeon RX 6700 XT (RADV NAVI22)', Mem: 12032, Free: 12032
[2025-09-27][09:18:11][tauri_plugin_llamacpp::process][INFO] Sending SIGTERM to PID 172
[2025-09-27][09:18:11][tauri_plugin_llamacpp::commands][INFO] [llamacpp] srv    operator(): operator(): cleaning up before exit...
[2025-09-27][09:18:11][tauri_plugin_llamacpp::process][INFO] Process exited gracefully: exit status: 0
[2025-09-27][09:18:11][tauri_plugin_hardware::vendor::amd][ERROR] Failed to get memory usage for AMD GPU 0x73df: GPU not found
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Total VRAM: 12616466432 bytes, Total RAM: 15147728896 bytes, Total Memory: 27764195328 bytes
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  modelSize: 2497281632
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Successfully unloaded model with PID 172
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Using explicit key_length: 128, value_length: 128
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  Final context length used for KV size: 8192
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nLayer: 36, nHead: 32, headDim (K+V): 256
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  ctxLen: 8192
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nLayer: 36
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  nHead: 32
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  headDim: 256
[2025-09-27][09:18:11][webview:info@asset://localhost/%2Fhome%2FXXX%2F.local%2Fshare%2FJan%2Fdata%2Fextensions%2F%40janhq%2Fllamacpp-extension%2Fdist%2Findex.js:2246:7][INFO]  isModelSupported: Total memory requirement: 7329119840 for /home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf

Not sure if this is the culprit:
[2025-09-27][09:17:58][tauri_plugin_hardware::vendor::amd][ERROR] Failed to get memory usage for AMD GPU 0x73df: GPU not found
However, the GPU is correctly shown and recognized under "Hardware". Also, unchecking the GPU, which should then fall back to CPU only (or the other AVX backends), does not help either; those also fail.
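
In case it helps with debugging, the server launch from the log above can be reproduced manually outside Jan. This is a minimal sketch using the exact backend, model path, and arguments Jan logged for my install; the curl request assumes llama-server's OpenAI-compatible chat endpoint:

  # Start llama-server by hand with the same arguments Jan used
  /home/XXX/.local/share/Jan/data/llamacpp/backends/b6324/linux-vulkan-x64/build/bin/llama-server \
      --no-webui --jinja \
      -m /home/XXX/.local/share/Jan/data/llamacpp/models/Jan-v1-4B-Q4_K_M/model.gguf \
      -a Jan-v1-4B-Q4_K_M --port 3407 \
      -ngl 100 --batch-size 2048 --ubatch-size 512 --no-mmap --ctx-size 8192

  # In a second terminal, send a test prompt directly to the server
  curl http://127.0.0.1:3407/v1/chat/completions \
      -H "Content-Type: application/json" \
      -d '{"model": "Jan-v1-4B-Q4_K_M", "messages": [{"role": "user", "content": "Hello"}]}'

If llama-server answers the prompt here but Jan still reports the crash, the problem would seem to be on Jan's side rather than in the backend itself.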

Operating System

  • MacOS
  • Windows
  • Linux
