Is using Redis right for this situation?

I plan to create a Rails application that will have a very large collection of users. It will start small, but I would like it to be able to handle a million or more.

I want to create a system that can handle 2500+ requests per second. Each request requires a write (for logging) as well as a read from a huge list of users indexed by username (I was recommended MongoDB for this purpose), and the result of the read is sent back to the user.

I'm a bit unsure how Mongo would handle both the reads and the writes, so I had the idea of using Mongo as the permanent store for the records and then loading them into Redis every time the server starts, for even faster access, so that Mongo doesn't have to deal with anything but the writes.
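To make the idea concrete, here is a minimal sketch of that boot-time warm-up, assuming the Ruby `mongo` and `redis` gems and made-up database, collection, key, and field names (`myapp`, `users`, `user:<username>`, `email`, `name`):

```ruby
require 'mongo'
require 'redis'

mongo = Mongo::Client.new(['127.0.0.1:27017'], database: 'myapp')
redis = Redis.new(url: 'redis://localhost:6379')

# On boot, copy every user document out of Mongo into a Redis hash keyed
# by username, so request-time reads never have to touch Mongo.
mongo[:users].find.each do |doc|
  redis.mapped_hmset("user:#{doc['username']}",
                     'email' => doc['email'].to_s,
                     'name'  => doc['name'].to_s)
end
```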

Does this sound reasonable, or is it a huge abuse of Mongo and Redis?

Delivery speed is paramount.

3 answers

It is actually possible to build the entire application using only Redis. What you want to do is look at data-design patterns for Redis. A good place to start is Karl Seguin's free PDF, The Little Redis Book.

For example, use Redis hashes to store information for all users.
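As a minimal sketch with the Ruby `redis` gem (the `user:<username>` key layout and field names are just examples):

```ruby
require 'redis'

redis = Redis.new

# One Redis hash per user, keyed by username.
redis.mapped_hmset('user:alice', 'email' => 'alice@example.com', 'plan' => 'free')

# A lookup by username is then a single key access.
redis.hgetall('user:alice')        # => {"email"=>"alice@example.com", "plan"=>"free"}
redis.hget('user:alice', 'email')  # => "alice@example.com"
```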

Also, if you plan well, you don't need another persistent store like Mongo or MySQL alongside Redis, since Redis itself is persistent. You just need to choose a good sharding / replication strategy that leaves you flexible enough for future changes to the system.


I think the stack you are asking about is certainly a very good solution, and one that has been well battle-tested on high-performance sites. Trello (created by the same people who created this site) uses a similar architecture, and so does Craigslist.

Trello Tech Stack Writeup

Craigslist also uses this

Redis is fast and has an excellent pub/sub mechanism on top of the usual features like expiration and invalidation, which makes it an excellent cache for most purposes. Mongo is a db I am very familiar with and believe is good for all kinds of data stores; it is also a reliable production db that scales well, protects data integrity, and ticks a lot of the boxes on an SLA checklist.
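As a rough illustration of the pub/sub side, a sketch with the Ruby `redis` gem (channel name and message are made up):

```ruby
require 'redis'

# SUBSCRIBE blocks its connection, so run the subscriber on its own
# connection in a background thread (in practice, a separate process).
Thread.new do
  Redis.new.subscribe('events') do |on|
    on.message do |channel, message|
      puts "#{channel}: #{message}"
    end
  end
end

sleep 0.5  # give the subscriber a moment to register

# Any other connection can broadcast to every subscriber of the channel.
Redis.new.publish('events', 'user alice logged in')

sleep 0.5  # let the message arrive before the script exits
```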

I think this is a great combination, but really the question should be: do I even need this? For your load, I think Mongo by itself could handle it pretty well (and ensure data integrity), especially if you can run it on a server with enough memory that your dataset fits in RAM (denormalization and good schema design are key). Foursquare runs exclusively on Mongo held in memory.

So think about whether this is really necessary, but remember that simpler always wins. Redis/Mongo is super powerful, but running two data stores also means a lot more work.

Thanks, Prasith


As already mentioned, using a single service makes more sense to me. However, is there really a reason to keep the logging data in memory at all? I would try something simple: a plain log file if possible, or Scribe or Flume if you need distributed logging.
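For example, with nothing more than Ruby's standard Logger (the file path and line format are just placeholders):

```ruby
require 'logger'

# Append one line per request to a local file; rotate daily so it
# doesn't grow without bound.
logger = Logger.new('requests.log', 'daily')
logger.info('user=alice action=lookup duration_ms=4')
```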

