Monday, October 1, 2012

Varnish reports error 503 for large files

I have been working with a startup where I had to build a network of Varnish servers, sitting in front of the dynamic storage, of course. It worked absolutely fine for smaller files of any kind, e.g. images, videos, binaries, etc.
But for large files, say greater than 1 GB, the request just hung and after some time returned error 503. A 503 usually tells me that Varnish can no longer reach the backend, but after checking on the system, Varnish was still there and accepting requests.
The next thing I wanted to look into was the VCL (Varnish Configuration Language), since when we talk about Varnish behavior, we are really talking about the VCL we configured.
Varnish gets exhausted by large files, so we have to "pipe" those requests instead of doing a "lookup". Piping means Varnish stops inspecting the request and just shuttles the data back and forth between client and backend. It won't cache, it streams.
So your VCL will look like this:
added to vcl_recv:
    /* Bypass the cache for large files. The x-pipe header is
       set in vcl_fetch when a too-large file is detected. */
    if (req.http.x-pipe && req.restarts > 0) {
        remove req.http.x-pipe;
        return (pipe);
    }

added to vcl_fetch:
    /* Don't try to cache large files: a Content-Length of
       8+ digits means roughly 10 MB and up. It appears
       Varnish just crashes if we don't filter them. */
    if (beresp.http.Content-Length ~ "[0-9]{8,}") {
        set req.http.x-pipe = "1";
        return (restart);
    }
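Putting the two hooks together, a minimal sketch of the full VCL might look like this (Varnish 2/3-era syntax, where vcl_fetch can still read and set the req object; the 8-digit threshold is the same ~10 MB cutoff as above, so adjust the digit count to taste):

```vcl
sub vcl_recv {
    /* On the restarted request, drop the marker header and
       pipe the request straight through to the backend. */
    if (req.http.x-pipe && req.restarts > 0) {
        remove req.http.x-pipe;
        return (pipe);
    }
}

sub vcl_fetch {
    /* A Content-Length with 8 or more digits is roughly
       10 MB or larger: mark the request and restart it so
       vcl_recv can pipe it instead of caching it. */
    if (beresp.http.Content-Length ~ "[0-9]{8,}") {
        set req.http.x-pipe = "1";
        return (restart);
    }
}
```

Note that in Varnish 4 and later, vcl_fetch was renamed to vcl_backend_response and no longer has access to req, so this pattern needs adjusting on newer versions.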
Hope it helps!!!