Description
Caddy version: 2.9.1
The Problem:
When using the cache directive, the throughput of the config block below drops dramatically. On my system (Ryzen 9 7950X), testing with ab like so:
ab -k -t 60 -n 100000 -c 16 http://localhost:8080/image.png
Without caching: 85000-87000 requests per second
With caching: <3600 requests per second
image.png is a small (<20 KiB) PNG file, 185x250 px.
Context:
For comparison, nginx in a similar scenario manages slower non-cached throughput (50-60k r/s), but dramatically faster cached throughput (15-17k r/s).
I tested against nginx because my project currently uses its fork, OpenResty (nginx with Lua scripting), to serve and cache static assets and to reverse-proxy the main app. The expected production traffic consists of a high rate of requests against object-storage-backed image and thumbnail storage. It is crucial that this is cached (object storage egress costs money), and that the cache is performant enough to keep up with a high volume of requests.
If I were to switch to Caddy, the performance could not be too much worse.
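I no longer have the exact nginx config from the comparison at hand, but it was roughly equivalent to the following sketch (hypothetical reconstruction: a proxy_cache zone named `static` in front of a local backend serving /var/www; names and TTLs are illustrative, not the values actually used):

```nginx
# Hypothetical approximation of the nginx comparison setup.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=static:10m max_size=1g;

server {
    listen 80;

    location / {
        proxy_cache static;
        proxy_cache_valid 200 10m;
        # backend serving the same /var/www content
        proxy_pass http://127.0.0.1:8081;
    }
}
```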
The Config
```
{
	cache
}

:80 {
	route {
		cache # the "no cache" test was done with this line commented out
		file_server {
			root /var/www
		}
	}
}
```
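For reference, the otter storage-backend variant mentioned below looked roughly like this. I am going from memory of the cache-handler README here: the exact syntax may differ between versions, and newer cache-handler releases appear to require the storage to be compiled in separately (e.g. via github.com/darkweak/storages/otter/caddy), so treat this as a sketch rather than a known-good config:

```
{
	cache {
		otter
	}
}

:80 {
	route {
		cache
		file_server {
			root /var/www
		}
	}
}
```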
Caddy is built via Docker like so:
ARG CADDY_VERSION="2.9.1"
FROM caddy:$CADDY_VERSION-builder-alpine AS builder
RUN xcaddy build \
--with github.com/caddyserver/cache-handler
FROM caddy:$CADDY_VERSION-alpine
COPY --from=builder /usr/bin/caddy /usr/bin/caddy
EXPOSE 80
and run like so:
docker run --rm \
-it \
-v /home/luna/code/caddytest/config:/etc/caddy \
-v /home/luna/code/caddytest/data:/data \
-v /home/luna/code/caddytest/static:/var/www \
-p 8080:80 \
caddytest:latest
Needless to say, a very basic and barebones setup.
What I tried:
- Using a different storage backend for the caching module (tried otter, etcd)
- Putting it in a `handle` block
- Using it without any `route` or `handle` blocks
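For completeness, the `handle` variant I tried looked like this (reconstructed from memory; the measured throughput was not meaningfully different from the `route` version):

```
:80 {
	handle {
		cache
		file_server {
			root /var/www
		}
	}
}
```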
This performance discrepancy compared to our current nginx setup is completely unacceptable and prevents me from moving ahead with Caddy in production, as my resources rely on the cache having high throughput.