Storing hundreds of millions of simple key-value pairs in Redis

At Instagram, they’re using Redis (comparable to Memcached, but with richer data types) to map photo IDs to user IDs.

After tweaking their setup — by using Redis hashes (dictionaries that can be encoded in memory very efficiently) — they’ve brought memory usage down from 70 MB to 16 MB for 1,000,000 records.
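That compact encoding only applies while a hash stays below Redis’s configured size thresholds, which is why the records are split across many small hashes rather than one big one. A rough illustration of raising those thresholds with redis-py — the values are illustrative, and the setting name varies across Redis versions:

    import redis

    r = redis.Redis(host="localhost", port=6379)

    # Hashes keep the compact in-memory encoding only while they have fewer
    # fields than this threshold (and while each value is short enough).
    # Older Redis releases call the setting hash-max-ziplist-entries;
    # Redis 7+ renames it to hash-max-listpack-entries. Values are examples.
    r.config_set("hash-max-ziplist-entries", 1024)
    r.config_set("hash-max-ziplist-value", 64)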

From Instagram’s write-up: While prototyping this solution, we found that Redis needed about 70 MB to store 1,000,000 keys this way. Extrapolating to the 300,000,000 keys we would eventually need, it was looking to be around 21 GB worth of data.

With our 1,000,000 key prototype (encoded into 1,000 hashes of 1,000 sub-keys each), Redis only needs 16 MB to store the information. Expanding to 300 million keys, the total is just under 5 GB.

By comparison: Memcached needed 52 MB of memory to hold the same 1,000,000 records.
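For concreteness, here’s a minimal Python sketch of that bucketing scheme using redis-py; the key prefix, bucket size, and helper names are illustrative, not Instagram’s actual code:

    import redis

    r = redis.Redis(host="localhost", port=6379)

    BUCKET_SIZE = 1000  # sub-keys per hash, mirroring the 1,000 x 1,000 layout above

    def set_owner(photo_id, user_id):
        # Photo IDs that share the same thousand land in the same hash,
        # so each hash stays small enough to keep the compact encoding.
        bucket = photo_id // BUCKET_SIZE
        r.hset(f"photos:{bucket}", photo_id, user_id)

    def get_owner(photo_id):
        bucket = photo_id // BUCKET_SIZE
        user_id = r.hget(f"photos:{bucket}", photo_id)
        return int(user_id) if user_id is not None else None

    set_owner(1155315, 939)
    print(get_owner(1155315))  # -> 939

Reads and writes stay O(1) HGET/HSET calls; the only change from plain string keys is computing the bucket before each access.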

Storing hundreds of millions of simple key-value pairs in Redis →
