Implementing an event feed - will it scale?

Situation:

I am currently developing a submission system for a social website in which each user has the power of their friends. I have two possible methods for creating feeds, and I would like to ask which is better in terms of scalability.

Events from all users are collected in a central database table event_log. In the table, friendsusers mate as friends. The DBMS we use is MySQL.

Standard method: When a user requests a page for his submission, the system generates a feed by internal connection event_logusing friends. The result is then cached and set to timeout after 5 minutes. Scaling is achieved by changing this timeout.

Intended method: The task runs in the background and for each new, unprocessed item in event_log, it creates entries in the database table user_feedthat associate this event with all users who are friends with the user who initiated this event. One row of the table combines one event with one user.

Problems with the standard method are well known - what if a lot of people's caches expire at the same time? The solution also doesn't scale very well - a short message saying that channels can be updated as close to real time as possible

The supposed solution in my eyes seems much better; all processing is performed offline, so the user does not expect to create the page, and there are no joins, so the database tables can be distributed across physical machines. However, if the user has 100,000 friends and creates 20 events in one session, this leads to the insertion of 2,000,000 rows into the database.

Question:

The question comes down to two points:

  • , , , .. MySQL - ?
  • - , ?
+5
2

, ; -, user_feed, -, , ( ); -, , 1000 , 100 - 100 000 .

, , last_user_feed_update, , , .

, , last_user_feed , . , -, , , - .

+1

, . , Facebook iirc. "", 100 . .

, , , . , .

- , MySQL . , memcached, , " " ( , )... . .

0

All Articles