docs/src/pages/docs/llama-cpp-server.mdx (2 changes: 1 addition & 1 deletion)
@@ -24,7 +24,7 @@ import { Settings } from 'lucide-react'
 `llama.cpp` is the core **inference engine** Jan uses to run AI models locally on your computer. This section covers the settings for the engine itself, which control *how* a model processes information on your hardware.
 
 <Callout>
-Looking for API server settings (like port, host, CORS)? They have been moved to the dedicated [**Local API Server**](/docs/local-server/api-server) page.
+Looking for API server settings (like port, host, CORS)? They have been moved to the dedicated [**Local API Server**](/docs/api-server) page.
 </Callout>
 
 ## Accessing Engine Settings
docs/src/pages/docs/server-settings.mdx (2 changes: 1 addition & 1 deletion)
@@ -174,7 +174,7 @@ This includes configuration for:
 - CORS (Cross-Origin Resource Sharing)
 - Verbose Logging
 
-[**Go to Local API Server Settings &rarr;**](/docs/local-server/api-server)
+[**Go to Local API Server Settings &rarr;**](/docs/api-server)
 
 ## Emergency Options
 
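For context on the page both hunks now link to: the Local API Server exposes an OpenAI-compatible HTTP endpoint once enabled in Jan's settings. Below is a minimal sketch of calling it, assuming the default host and port (`127.0.0.1:1337`) and a placeholder model id; both are assumptions that depend on the user's configuration.

```python
# Minimal sketch: query Jan's Local API Server via its OpenAI-compatible endpoint.
# Assumptions: the server is enabled in Settings, listens on the default
# 127.0.0.1:1337, and "llama3.2-1b-instruct" is a hypothetical model id --
# substitute a model you have downloaded in Jan.
import json
import urllib.request

payload = {
    "model": "llama3.2-1b-instruct",  # hypothetical model id
    "messages": [{"role": "user", "content": "Hello from the local API server"}],
}

req = urllib.request.Request(
    "http://127.0.0.1:1337/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)
    print(reply["choices"][0]["message"]["content"])
```

If CORS is enabled in the server settings mentioned above, the same endpoint can also be reached from browser-based clients.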