Name and Version
llama-server --version
version: 8878 (5a4cd67)
built with GNU 13.3.0 for Linux x86_64
Operating systems
Linux
Which llama.cpp modules do you know to be affected?
llama-server
Command line
#!/usr/bin/env bash
LLAMA_SERVER="${HOME}/ml/git/github.com/llama.cpp/build/bin/llama-server"
# Get the number of physical cores (not logical cores!)
PHY_CORES=$(lscpu | awk '/^Core\(s\) per socket:/{cores=$NF} /^Socket\(s\):/{sockets=$NF} END{print cores*sockets}')
LLM_PARAMS=(
--host $(hostname -I | awk '{ print $1 }')
--port 7860
--api-key-file ${HOME}/ml/etc/api-keys.txt
--models-preset ${HOME}/ml/etc/llama-config.ini
--models-max 1
--parallel 1
--threads ${PHY_CORES}
--threads-batch ${PHY_CORES}
--mmap
--no-direct-io
--numa distribute
--cache-prompt
--webui-mcp-proxy
)
"${LLAMA_SERVER}" --version
"${LLAMA_SERVER}" "${LLM_PARAMS[@]}" 2>&1 | tee "${HOME}/ml/log/llama-server.log.$(date "+%Y-%b-%d_%H:%M:%S")"
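For reference, the physical-core detection used above can be checked in isolation by feeding sample `lscpu` output through the same awk program (the core/socket counts below are made-up example values, not from my machine):

```shell
#!/usr/bin/env bash
# Sample lscpu output: 8 cores per socket, 2 sockets, SMT enabled.
sample_lscpu='Thread(s) per core:  2
Core(s) per socket:  8
Socket(s):           2'

# Same awk program as in the launch script: multiply cores/socket by sockets,
# deliberately ignoring "Thread(s) per core" so SMT threads are not counted.
phy=$(printf '%s\n' "$sample_lscpu" | awk '/^Core\(s\) per socket:/{cores=$NF} /^Socket\(s\):/{sockets=$NF} END{print cores*sockets}')
echo "$phy"   # 8 cores/socket * 2 sockets = 16 physical cores
```

This is why `--threads` ends up set to the physical-core count rather than `nproc`, which would include logical (SMT) cores.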
Problem description & steps to reproduce
I start llama-server and open its web UI in my browser. The server log then shows:
main: NOTE: router mode is experimental
main: it is not recommended to use this mode in untrusted environments
srv log_server_r: done request: GET / 192.168.1.101 200
srv log_server_r: done request: GET /bundle.css 192.168.1.101 200
srv log_server_r: done request: GET /bundle.js 192.168.1.101 200
Unauthorized: Invalid API Key
srv log_server_r: done request: HEAD /cors-proxy 192.168.1.101 401
Note that, as far as I can tell, this bug does not affect actually using llama-server; only the `HEAD /cors-proxy` request is rejected with 401.
This may be a left-over bug from this issue:
#21229
Or possibly from this issue:
#21193
First Bad Commit
No response
Relevant log output
main: NOTE: router mode is experimental
main: it is not recommended to use this mode in untrusted environments
srv log_server_r: done request: GET / 192.168.1.101 200
srv log_server_r: done request: GET /bundle.css 192.168.1.101 200
srv log_server_r: done request: GET /bundle.js 192.168.1.101 200
Unauthorized: Invalid API Key
srv log_server_r: done request: HEAD /cors-proxy 192.168.1.101 401