I’ve been working on an Intranet project that involves some hefty (and perhaps inadvisable) setting and retrieval of cookies via JavaScript. It’s led to some interesting discoveries – most importantly, what seems like a bug in IE6. The project is for users with IE6 only.

Most of the documentation I’ve found (and there isn’t much) suggests that web browsers must support a minimum of:

  • 300 cookies in total
  • 20 cookies per domain
  • 4096 bytes per cookie

It seems as though this minimum requirement comes from the original cookie RFC, RFC 2109 – see section 6.3 (“Implementation Limits”) specifically.

I’ve been testing exclusively in IE6 (specifically, version 6.0.2900.2180.xpsp_sp2_dgr.050301-1519). I have found that, for IE6, the 20-cookies-per-domain limit is not just a minimum but also the maximum. If you set a 21st cookie for a given domain, the 1st cookie is forgotten – so only the 20 most recently created cookies are kept.
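A quick way to watch this eviction happen is to count what’s visible in `document.cookie` before and after setting the 21st cookie, and to check whether a sentinel cookie set first has survived. The helpers below are an illustrative sketch of my own (the function names are not from any library); they take the cookie string as a parameter rather than reading `document.cookie` directly, so they aren’t tied to the browser:

```javascript
// Counts the cookies visible in a document.cookie-style string,
// e.g. "name1=value1; name2=value2". In the browser you'd pass document.cookie.
function countCookies(cookieString) {
  if (!cookieString) {
    return 0;
  }
  return cookieString.split("; ").length;
}

// Checks whether a named cookie is still present. To observe the eviction:
// set a sentinel cookie, write 20 more, then see if the sentinel survived.
function hasCookie(cookieString, name) {
  return cookieString.split("; ").some(function (pair) {
    return pair.indexOf(name + "=") === 0;
  });
}
```

In IE6, after the 21st write, `hasCookie(document.cookie, "sentinel")` would come back `false` even though the cookie was never explicitly deleted.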

(As an aside, it’s possible to confuse what is meant by “cookies” with IE. IE6 keeps a separate text file in C:\Documents and Settings\[username]\Cookies\ for each user@domain; all cookies for a user in a given domain are stored in that one text file. There isn’t a text file per cookie.)

The real problem, however, comes when you try to set cookies with a large size. The standards state that a browser must support a minimum of 4096 bytes per cookie. IE6 doesn’t do this. Instead, it seems to treat 4096 bytes as the maximum for all cookies from a domain combined. And, even worse, once this maximum is exceeded, you can’t read or write any further cookies for that domain. The only solution I’ve been able to find is for the user to “Delete cookies…” from Tools > Internet Options > General, and start again.
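Since the broken state is unrecoverable without user intervention, a script could refuse to write any cookie that would push the per-domain total past the apparent ceiling. This is a sketch under my own assumptions – the names and the simple length arithmetic are mine, and real cookie attributes (path, expires) would add overhead not counted here:

```javascript
// IE6's apparent per-domain ceiling, per the testing described above.
var IE6_DOMAIN_LIMIT = 4096;

// Bytes already used by everything in a document.cookie-style string.
function cookieBytesUsed(cookieString) {
  return cookieString ? cookieString.length : 0;
}

// Would writing name=value stay under the ceiling? Accounts for the
// "; " separator that joins the new cookie to any existing ones.
function canSetCookie(cookieString, name, value) {
  var extra = name.length + 1 + value.length + (cookieString ? 2 : 0);
  return cookieBytesUsed(cookieString) + extra <= IE6_DOMAIN_LIMIT;
}
```

In the browser you would call `canSetCookie(document.cookie, name, value)` before assigning to `document.cookie`, and handle the failure case gracefully instead of bricking the domain’s cookies.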

(I initially ran into a sub-problem when testing this theory. I would set four cookies, each of 1,000 bytes, all set via JavaScript, within one page. I’d then try to set a fifth of the same size, and the bug mentioned above would be triggered. However, if I then refreshed this page, I got an HTTP 400 error from the server. I have no firm idea why, though I’d guess the now-oversized Cookie request header exceeded some server limit. To remove this from the equation, I created 5 separate pages, each of which set a 1,000-byte cookie. This removed the HTTP error, but still left me with the problem mentioned above once the 5th cookie was set.)

I’ve not been able to find any information about this problem elsewhere, which surprised me. Nor have I been able to find a workaround. Perhaps everyone else stays within more sensible limits for their cookies :-) For a public web site, these limits (20 cookies per domain, 4096 bytes per cookie – allegedly) are pretty sensible. For an Intranet like the one I’m working on, however, they become a problem. 20 cookies is a very small limit if your entire Intranet is served from one domain, and on large intranets, different development teams don’t always tell each other how many cookies they are using. (Which makes squishing values into a single cookie a better idea than spreading them over a cookie each.) And that 4k limit – while perhaps rarely reached in most cases, due to the 20-cookie limit – could quickly become a problem too.
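The “squishing” approach can be sketched as a pair of helpers that serialise a flat object into a single cookie value and parse it back. The function names and the “&”/“=” delimiter convention here are my own assumptions, not any standard – the escaping via `encodeURIComponent` just keeps the delimiters from colliding with the data:

```javascript
// Serialise a flat object of string values into one cookie value,
// e.g. {a: "1", b: "two words"} -> "a=1&b=two%20words".
function packCookieValue(obj) {
  var parts = [];
  for (var key in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, key)) {
      parts.push(encodeURIComponent(key) + "=" + encodeURIComponent(obj[key]));
    }
  }
  return parts.join("&");
}

// Parse a packed cookie value back into an object.
function unpackCookieValue(value) {
  var obj = {};
  if (!value) {
    return obj;
  }
  var parts = value.split("&");
  for (var i = 0; i < parts.length; i++) {
    var eq = parts[i].indexOf("=");
    obj[decodeURIComponent(parts[i].slice(0, eq))] =
      decodeURIComponent(parts[i].slice(eq + 1));
  }
  return obj;
}
```

One cookie holding many packed values costs a single slot out of the 20, at the price of every team agreeing on the shared cookie’s name and format.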

Moral? Use databases for big things, and make sure your JavaScript code can cope with any one of its cookies mysteriously disappearing :-)
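To that end, a defensive read might look something like this – a hypothetical helper (the name is mine) that takes a `document.cookie`-style string and falls back to a caller-supplied default when IE6 has silently evicted the cookie:

```javascript
// Return the named cookie's value from a document.cookie-style string,
// or the fallback if the cookie has mysteriously disappeared.
function getCookieOrDefault(cookieString, name, fallback) {
  var pairs = cookieString ? cookieString.split("; ") : [];
  for (var i = 0; i < pairs.length; i++) {
    if (pairs[i].indexOf(name + "=") === 0) {
      return pairs[i].slice(name.length + 1);
    }
  }
  return fallback;
}
```

Calling code then never assumes a cookie it wrote earlier is still there – it always gets a usable value either way.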

(Cookie testing was completed with many thanks to the excellent log4javascript.)