What is your Maximum Cookie Size? Well, let’s start from the top.
From time to time I visit the Microsoft website; I also have a habit of using it (and a few others) to check whether my Internet connection is stable and working.
After all, it is Microsoft and their website should always work, so I was very surprised a few days ago to find out that “this site can’t be reached”.
I checked my Internet connection – it was fine, other websites loaded, and microsoft.com even worked in Firefox, but not in my Google Chrome.
The next day, a friend of mine in another part of the world posted on Facebook that the www.microsoft.com website didn’t work for him either.
This is intriguing, I thought. Why can’t I open the www.microsoft.com website in my Google Chrome browser?
Well, I understand these two companies compete with each other, but Microsoft blocking Google Chrome, or Google Chrome blocking a specific website – that just can’t happen, so I decided to investigate.
My initial idea was that the new version of Google Chrome was using some new HTTP/2 optimisation technique which somehow conflicted with the software used at microsoft.com and resulted in interrupted connections.
But other PCs using the same version of Google Chrome were working. It could still be related to my computer’s network settings or something else, so I jumped on Wireshark and… Well, those who have done HTTPS sniffing will understand; for everyone else: I gave up without finding what was causing the problem. Microsoft.com unexpectedly, and with no explanation, just closed the connection with my browser.
The next thing to try was Google Chrome’s built-in Net Internals feature (open a new tab in your Google Chrome browser and type: chrome://net-internals/).
You get to see a lot of very technical data, like in the screenshot below. This is a very nice feature that gives a detailed view of what is actually happening, and it is especially useful for debugging SSL issues.
In the bottom-right part of the screenshot, we can see the HTTP headers, which look normal except for the size of the cookie. 13,161 bytes of cookies looked unusually high to me.
13,161 bytes of cookies is a bit much
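If you want a quick, rough idea of how much cookie data your own browser attaches to requests for a site, a console snippet along these lines works as a sanity check. This is only a sketch: document.cookie does not include HttpOnly cookies, so the real Cookie header can be somewhat larger than the number it reports.

```ts
// Approximate size, in bytes, of the cookies the browser will send with each
// request to the current site. HttpOnly cookies are not visible here, so the
// actual Cookie header may be bigger.
const approxCookieBytes = new TextEncoder().encode(document.cookie).length;
console.log(`Approximate cookie size: ${approxCookieBytes} bytes`);
```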
I was onto something here! So the next step was to use the Fiddler proxy to examine what was really being sent, and hey presto – the website actually worked when I used Fiddler as a proxy!
Apparently, Fiddler had silently trimmed the cookies being sent down to 9 kilobytes, and that made it work!
From my time as a web developer, I remember a cookie limit of four kilobytes per domain.
While the RFC encourages browsers to support arbitrarily large cookies, in practice it’s best not to rely on cookies too much, as they are sent with every request to the server, consuming bandwidth and slowing the website down even on pages that don’t use them.
That is why I normally see much smaller cookie sizes. A quick search on the Microsoft website revealed the cookie limits which Microsoft considers reasonable (https://msdn.microsoft.com/en-us/library/ms178194.aspx):
- 4096 bytes for all cookies per domain (website)
- 20 cookies per domain (website)
- 300 cookies from all websites
Well, four kilobytes of cookies is certainly smaller than the 13 kilobytes my Google Chrome browser was trying to send.
On top of this, web servers enforce a limit on the maximum HTTP header size, and cookies are part of those headers! Every web server is configured a bit differently, so you would probably need to contact your dev team to be sure, but by default the limits are around 8 kilobytes. Cookies weighing in at around 13 kilobytes definitely exceed those limits!
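To make this concrete with one example (an illustration only, and certainly not microsoft.com’s setup): Node.js’s built-in HTTP server exposes exactly this kind of limit as an option, assuming a reasonably recent Node version that supports maxHeaderSize.

```ts
import { createServer } from "node:http";

// Minimal sketch: any request whose combined headers (cookies included) exceed
// maxHeaderSize is rejected by Node before this handler ever runs.
const server = createServer({ maxHeaderSize: 8 * 1024 }, (_req, res) => {
  res.writeHead(200, { "content-type": "text/plain" });
  res.end("ok");
});

server.listen(8080, () => console.log("Listening on :8080"));
```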
Obviously, a web server as popular as microsoft.com has a custom configuration created by professionals, but it seems that cookies that are too big trigger a connection-close operation.
After clearing my cookies, I was able to visit the microsoft.com website in my Google Chrome browser.
But what caused the problem in the first place? Luckily, I had saved my cookies and was able to calculate their sizes (one way to do this for your own site is sketched after this list):
- Optimizely cookie – 2.3 kilobytes
- Two mixpanel cookies – 0.5 kilobytes
- Three marketo cookies sized 0.3 kilobytes each
- A few other cookies of up to one kilobyte each which I couldn’t quickly identify
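If you want to produce a similar breakdown for your own site, something along these lines in the browser console gets you most of the way. Again, only a sketch: HttpOnly cookies will not show up in document.cookie.

```ts
// List each cookie visible to JavaScript with its approximate size in bytes,
// largest first. HttpOnly cookies are not included in document.cookie.
const cookieSizes = document.cookie
  .split("; ")
  .filter((entry) => entry.length > 0)
  .map((entry) => ({
    name: entry.split("=")[0],
    bytes: new TextEncoder().encode(entry).length,
  }))
  .sort((a, b) => b.bytes - a.bytes);

console.table(cookieSizes);
```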
None of the above are actually Microsoft products or in use by the Microsoft website! Basically, when I was browsing the microsoft.com website, I was sending more than 10 kilobytes of garbage with each request! Well, not really garbage – they hold important marketing information.
I can understand the Marketo cookies (their size is kind of OK) and a few others, but companies as sophisticated as Optimizely, Mixpanel, etc. – please, and please again, use Local Storage!
If you don’t know what that is, please read this page: https://www.w3schools.com/html/html5_webstorage.asp
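To make the difference concrete: data kept in Local Storage stays in the browser and is never attached to outgoing requests, unlike cookies. Below is a minimal sketch; the key name and the stored value are made up purely for illustration.

```ts
// Cookies travel with every request; localStorage data never leaves the browser
// unless your script explicitly sends it.

// Hypothetical example: persisting an A/B test bucket on the client side.
localStorage.setItem(
  "experimentBucket",
  JSON.stringify({ experiment: "homepage-test", variant: "B" })
);

// Read it back later and send it only when it is actually needed.
const raw = localStorage.getItem("experimentBucket");
const bucket = raw ? JSON.parse(raw) : null;
console.log("Current experiment bucket:", bucket);
```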
The Google Analytics cookie is tiny compared to the above – only 31 bytes.
So, what does it all mean for Maximum Cookie Size?
Imagine the situation: you have your nice-looking website, it works, and at some point you add a third-party script for analytics, marketing or a content experiment – and you are no longer in control of your website visitors’ cookies!
If your visitors’ cookies accumulate to a larger size than your web server can handle, users will be prevented from visiting your website. And the worst part is that unless one of your visitors contacts you (and from my experience, only one in fifty will report the problem), you may not know about the issue at all!
Is your website safe?
Don’t get me wrong – you still need to use marketing and analytics scripts, because their positive impact on your business is huge.
What you need to do is check your website’s cookie limits and proactively monitor them.
First things first: a little tool called Hurl allows you to craft an HTTP request (or use telnet if you are technical enough).
Enter your website URL in the Destination field, add a Host header with your website’s hostname, add a User-Agent header, and add a Cookie header with a very long string (more than 10 kilobytes) as its value.
Launch the request, and if you don’t get back an Internal Server Error, you are probably OK.
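If you would rather script the same check than click through Hurl, a small Node script along these lines does the job. It is only a sketch: replace www.example.com with your own hostname, and keep in mind that the exact failure mode (a 400, a 431 Request Header Fields Too Large, or simply a dropped connection) depends on your server.

```ts
import { request } from "node:https";

// Send a request carrying an oversized Cookie header (~12 KB) and report how
// the server responds. Replace the hostname with your own website.
const bigCookie = "test=" + "x".repeat(12 * 1024);

const req = request(
  {
    hostname: "www.example.com", // placeholder target
    path: "/",
    method: "GET",
    headers: {
      "User-Agent": "cookie-size-check",
      Cookie: bigCookie,
    },
  },
  (res) => {
    console.log(`Status with ~12 KB of cookies: ${res.statusCode}`);
    res.resume(); // drain the response body
  }
);

req.on("error", (err) => console.error("Connection failed:", err.message));
req.end();
```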
However, as a general rule, it seems reasonable to keep the total size of your cookies under four kilobytes.
This gives you a little bit of space and time in case one of your cookies starts to grow quickly, so you can fix the issue before it affects customers – being proactive rather than reactive.
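One simple way to stay proactive is to have the page itself check its own cookie size and report when it crosses a threshold. The sketch below assumes a 4-kilobyte threshold and a /monitoring/cookie-size reporting endpoint, both of which are illustrative rather than part of any particular product.

```ts
// Warn (and optionally report) when the cookies visible to JavaScript approach
// the four-kilobyte comfort zone discussed above.
const COOKIE_WARN_BYTES = 4 * 1024; // illustrative threshold

const cookieBytes = new TextEncoder().encode(document.cookie).length;
if (cookieBytes > COOKIE_WARN_BYTES) {
  console.warn(`Cookie size is ${cookieBytes} bytes, above ${COOKIE_WARN_BYTES} bytes`);
  // Hypothetical reporting endpoint: swap in your own analytics or logging call.
  navigator.sendBeacon("/monitoring/cookie-size", String(cookieBytes));
}
```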
In the article A Guide To Tracking Cookie Size In Google Analytics, I explain how to start tracking the size of your website visitors’ cookies and how to proactively diagnose and remedy the problem before it does any real damage to your business.