1. Overview
An HTTP response may pass through one or more proxy servers on its way to the client. A proxy can also cache HTTP responses. If a response exists in the cache and it’s fresh, then the proxy may return the cached response instead of contacting the web server. This principle also applies to browser caches and CDNs.
This tutorial will examine how we can set the expiration time of an HTTP response on an Nginx server. This setting can enable or disable caching in intermediate servers.
2. HTTP Headers for Caching
We can use the Cache-Control and the Expires HTTP headers to define how the intermediate nodes should cache an HTTP response.
2.1. Cache-Control Header
The Cache-Control header value consists of one or more directives, each with an optional argument. It can be used in both HTTP requests and responses. When set in an HTTP request, we let intermediate caches know whether we accept a cached response. Notably, intermediate nodes aren’t required to honor the directives of the request header.
In contrast, intermediate nodes must conform to the Cache-Control header if there’s one in the HTTP response. The purpose of this header in the response is to define whether intermediate caches should store the response.
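For instance, we could ask intermediate caches to revalidate with the origin server before serving a stored copy by sending the no-cache request directive. The URL below is just a placeholder:
$ curl -H "Cache-Control: no-cache" http://example.com/image.jpg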
Two directives deal with expiration time:
- max-age=N – how long, in seconds, the response should be considered fresh after its creation
- s-maxage=N – same as max-age but applies only for shared caches
Importantly, when s-maxage exists, it overrides the value of max-age in shared caches.
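For example, the following response header tells private caches such as browsers to treat the response as fresh for ten minutes, while shared caches may keep it fresh for an hour:
Cache-Control: max-age=600, s-maxage=3600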
2.2. Expires Header
We can set the Expires header only in HTTP responses. The value of Expires is a timestamp that denotes the point in time after which a cache should consider the response stale:
Expires: Mon, 04 Dec 2023 12:35:23 GMT
Notably, caches favor the Cache-Control header with the max-age directive over Expires. In other words, if a cache sees the former, it ignores the latter. Therefore, the Expires header is mainly useful for caches that don’t support the Cache-Control header.
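For example, if a response carries both headers, a cache that understands Cache-Control uses max-age and ignores the Expires value:
Cache-Control: max-age=3600
Expires: Mon, 04 Dec 2023 12:35:23 GMT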
3. The add_header Nginx Directive
We can add an HTTP header in Nginx with the add_header directive. For that, we set the header name and the header value. By default, the header is added only to responses with the following status codes:
- 200
- 201
- 204
- 206
- 301
- 302
- 303
- 304
- 307
- 308
However, if we add the always keyword after the header value, the header is sent regardless of the response code. In addition, we may use the add_header directive in the http, server, and location blocks.
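As a quick illustration, a hypothetical location block could set a Cache-Control header on every response, including error responses, by appending the always keyword:
location /static/ {
    add_header Cache-Control "max-age=600" always;
}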
4. Cache-Control Example
To demonstrate how we can set the Cache-Control header in Nginx, we’ll configure it to act as a web and proxy server.
4.1. Web Server Configuration
First, let’s create a new virtual server with a location block that serves content from the /data/service1 directory:
server {
    listen 8001;
    access_log /var/log/nginx/access_service1.log;
    root /data/service1;

    location ~* \.(gif|jpeg|jpg)$ {
        add_header Cache-Control "max-age=3600";
    }
}
As can be seen, our server listens on port 8001 and writes logs to the /var/log/nginx/access_service1.log file. Moreover, we defined a location block that matches gif, jpeg, and jpg files with a regular expression.
Last but not least, we include the Cache-Control header with add_header in the location block and set max-age to 3600 seconds. This means intermediate servers will consider the response fresh for an hour after its generation.
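As a side note, Nginx also offers the expires directive, which sets both the Expires header and the max-age directive of Cache-Control from a single value. A minimal sketch of the same location block using it could look like this:
location ~* \.(gif|jpeg|jpg)$ {
    # Sets "Expires" to now + 1 hour and "Cache-Control: max-age=3600"
    expires 1h;
}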
4.2. Proxy Server Configuration
Next, we’ll create a proxy server with a cache that passes requests to port 8001 of localhost:
proxy_cache_path /var/cache/nginx keys_zone=mycache:10m;

server {
    listen 9001;
    proxy_cache mycache;
    access_log /var/log/nginx/access_proxyservice1.log;

    location ~* \.(gif|jpeg|jpg)$ {
        proxy_pass http://localhost:8001;
        proxy_cache_key $scheme://$host$uri$is_args$query_string;
    }
}
As we can see, our proxy listens on port 9001 and writes logs to the /var/log/nginx/access_proxyservice1.log file. Moreover, we created a new location block that matches requests for jpeg and gif images and passes them to port 8001 of localhost.
In addition, we set up a cache for storing HTTP responses with the proxy_cache_path directive, naming its shared memory zone mycache and allocating 10 MB of shared memory for the cache keys. We also define the format of the key used to store an HTTP response in the cache with the proxy_cache_key directive.
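Optionally, to make cache hits and misses easier to see while testing, we could expose Nginx’s built-in $upstream_cache_status variable in a custom response header. The header name X-Cache-Status below is just a convention, not something Nginx requires:
location ~* \.(gif|jpeg|jpg)$ {
    proxy_pass http://localhost:8001;
    proxy_cache_key $scheme://$host$uri$is_args$query_string;
    # Reports HIT, MISS, EXPIRED, and so on, depending on how the request was served
    add_header X-Cache-Status $upstream_cache_status;
}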
4.3. Testing
Next, let’s create a sample file and test our setup by sending an HTTP request:
$ sudo touch /data/service1/test.jpg
$ curl -v http://localhost:9001/test.jpg
* Trying ::1:9001...
...
< HTTP/1.1 200 OK
< Server: nginx/1.18.0 (Ubuntu)
< Date: Tue, 03 Jan 2023 22:09:27 GMT
< Content-Type: image/jpeg
< Content-Length: 0
< Connection: keep-alive
< Last-Modified: Tue, 03 Jan 2023 21:50:23 GMT
< ETag: "63b4a31f-0"
< Cache-Control: max-age=3600
< Accept-Ranges: bytes
<
* Connection #0 to host localhost left intact
In the response output, we can see that the Nginx server has set the Cache-Control header. Let's send another request and check the log files:
$ curl http://localhost:9001/test.jpg
$ tail -n 2 /var/log/nginx/access_proxyservice1.log
127.0.0.1 - - [04/Jan/2023:00:09:27 +0200] "GET /test.jpg HTTP/1.1" 200 0 "-" "curl/7.68.0"
127.0.0.1 - - [04/Jan/2023:00:10:17 +0200] "GET /test.jpg HTTP/1.1" 200 0 "-" "curl/7.68.0"
$ tail -n 1 /var/log/nginx/access_service1.log
127.0.0.1 - - [04/Jan/2023:00:09:27 +0200] "GET /test.jpg HTTP/1.0" 200 0 "-" "curl/7.68.0"
Indeed, we can see in the logs that the second request doesn’t reach the web server. This means that the response came from the proxy server’s cache, and it will keep doing so until the max-age time elapses.
5. Conclusion
In this article, we briefly introduced the Cache-Control and Expires HTTP headers. Then, we talked about the add_header Nginx directive. Finally, we demonstrated an example where we set the expiration time of an HTTP response on an Nginx server using the Cache-Control header.