Lessons I’ve learned after using Nginx’s proxy cache for a high traffic website.
- How to maximize Nginx caching performance
- The Nginx cache is not filling up the available cache disk space
- How to keep serving stale content when your backend is having an outage
- How to configure Nginx proxy cache for Tomcat servers
- Use this Nginx cache key for multilingual sites
- How to purge cache items
How to maximize Nginx caching performance
- Put the cache on a dedicated SSD/NVMe disk. On an HDD, the performance hit caused by the cache manager's constant disk reads and writes will dwarf any caching benefit.
- Don't cache HTTP responses on the first hit. Use proxy_cache_min_uses 2; to cache only items that have been requested more than once. This alleviates the proxy cache's write pressure and prevents your cache from filling up with content that's seldom accessed.
- If you run a high-traffic website, you may want to avoid unexpected load spikes caused by cache misses:
  - Set proxy_cache_lock on. When there is a cache miss and many simultaneous requests hit a single popular URL, you only want to send one request through to your backend. With this flag on, the remaining requests wait on the Nginx side and are served the newly cached response once the backend has answered the first request.
  - Add proxy_cache_use_stale updating; to your Nginx config. This flag instructs Nginx to keep serving stale cached content to users while the cache entry is being refreshed.
  - If your backend supports conditional GET requests, set proxy_cache_revalidate on. With this flag on, Nginx refreshes its stale cache entries using conditional GET requests, saving bandwidth and disk I/O when the content hasn't changed.
- When you cold start an Nginx instance with a big existing cache on disk, give it a few minutes before you expose it to the public. Nginx needs to scan the entire cache to load its metadata into memory.
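Putting the tips above together, here is a minimal sketch of the relevant configuration. The cache path, zone name, sizes, and backend address are placeholders; adjust them to your setup:

```nginx
# Cache storage on a dedicated SSD/NVMe mount (path and sizes are examples)
proxy_cache_path /mnt/nvme/nginx-cache levels=1:2 keys_zone=my_cache:100m
                 max_size=50g inactive=7d use_temp_path=off;

server {
    listen 80;

    location / {
        proxy_pass http://backend_upstream;  # placeholder backend
        proxy_cache my_cache;

        # Only cache items requested at least twice
        proxy_cache_min_uses 2;

        # Collapse concurrent misses for the same URL into one backend request
        proxy_cache_lock on;

        # Keep serving stale content while an entry is being refreshed
        proxy_cache_use_stale updating;

        # Refresh stale entries with conditional GETs if the backend supports them
        proxy_cache_revalidate on;
    }
}
```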
The Nginx cache is not filling up the available cache disk space
Check your Nginx cache's keys_zone value (set in the proxy_cache_path directive). You may have run out of cache key space because you cache a large number of small items: the shared memory zone that holds the keys fills up before the disk does, and Nginx evicts least-recently-used entries to make room.
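The fix is to widen keys_zone. As a rough sketch (zone name, path, and sizes below are placeholders), note that the Nginx docs state one megabyte of key zone stores roughly 8,000 keys:

```nginx
# keys_zone=my_cache:100m ≈ room for ~800,000 cached items.
# Size it so keys_zone capacity roughly matches max_size divided by
# your typical cached item size.
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:100m
                 max_size=50g inactive=7d use_temp_path=off;
```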
How to keep serving stale content when your backend is having an outage
proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
Consider adding updating to the list as well, as explained above in the performance section.
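Combined, the directive would look like this (a single line; the order of the parameters doesn't matter):

```nginx
proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
```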
How to configure Nginx proxy cache for Tomcat servers
Java web apps running on Apache Tomcat always set a JSESSIONID cookie in the HTTP response, even if the visitor is not logged into any account. To prevent session IDs from leaking into your Nginx cache, filter out any Set-Cookie response headers returned by your backend:
proxy_ignore_headers Set-Cookie;
proxy_hide_header Set-Cookie;
Bypass the cache for dynamic content and logged-in users: have your backend set a nocache cookie in the HTTP responses for dynamic content, then include proxy_cache_bypass $cookie_nocache; in your Nginx configuration to bypass the cache for those requests.
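A minimal sketch of a location block combining both tips (the upstream name and cache zone are placeholders):

```nginx
location / {
    proxy_pass http://tomcat_backend;  # placeholder upstream
    proxy_cache my_cache;              # placeholder zone name

    # Don't let Set-Cookie disable caching, and don't cache the cookie itself
    proxy_ignore_headers Set-Cookie;
    proxy_hide_header Set-Cookie;

    # Skip the cache when the backend flagged the response as dynamic
    proxy_cache_bypass $cookie_nocache;
}
```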
Use this Nginx cache key for multilingual sites
The default Nginx cache key ($scheme$proxy_host$request_uri) doesn't work well for sites served across multiple subdomains, as multilingual sites often are, because it keys on the configured proxy_host rather than the Host header the visitor sent. You can customize the key with the proxy_cache_key directive. This one has worked well for me:
proxy_cache_key $scheme$host$uri$is_args$args;
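To illustrate how this key separates subdomains, here is how two example requests resolve (domains are placeholders):

```nginx
# Request: https://en.example.com/page?x=1
#   key = "https" + "en.example.com" + "/page" + "?" + "x=1"
#       = httpsen.example.com/page?x=1
# Request: https://fr.example.com/page?x=1
#   key = httpsfr.example.com/page?x=1
# Each subdomain therefore gets its own cache entry.
proxy_cache_key $scheme$host$uri$is_args$args;
```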
How to purge cache items
If your Nginx runs on a Debian server, I highly recommend installing the nginx-extras package from the Debian repos instead of the nginx package from the official repos. The nginx-extras package comes with the ngx_cache_purge community module, whereas the nginx package from the official repos doesn't.
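With the module installed, a purge endpoint can be sketched like this. The zone name my_cache and the /purge prefix are placeholders, and the key passed to proxy_cache_purge must match the proxy_cache_key used when storing; lock the endpoint down to trusted IPs:

```nginx
# Purge via a plain GET, e.g.: curl http://localhost/purge/some/page
location ~ /purge(/.*) {
    allow 127.0.0.1;  # only allow purges from localhost
    deny all;

    # $1 is the captured path, so this matches a
    # proxy_cache_key of $scheme$host$uri$is_args$args
    proxy_cache_purge my_cache $scheme$host$1$is_args$args;
}
```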
(If you liked this, you might enjoy 6 Nginx pitfalls that bit me in the ass)