I am building an application (using the Django ORM) that will ingest a lot of events, say 50/s (1-2 kb each). Initially I need some "real-time" processing and monitoring of these events, so I will use Redis to keep some of this data around for making decisions, expiring it when it no longer makes sense to hold. I was planning to persist all entities, including the events, in Postgres as the "at-rest" storage.
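For a sense of scale, here is a back-of-envelope estimate of the daily volume implied by those numbers. The ~2 KB per-event size is an assumption on my part, since the original figure ("1-2 kbit/s") is ambiguous about whether it is per event or aggregate:

```python
# Rough volume estimate for the stated load.
# Assumptions: ~50 events/s, ~2 KB per event (upper end of the quoted size).
EVENTS_PER_SECOND = 50
BYTES_PER_EVENT = 2 * 1024

events_per_day = EVENTS_PER_SECOND * 60 * 60 * 24
bytes_per_day = events_per_day * BYTES_PER_EVENT

print(f"{events_per_day:,} events/day")          # 4,320,000 events/day
print(f"{bytes_per_day / 1024**3:.1f} GiB/day")  # ~8.2 GiB/day
```

A few million rows and under 10 GiB per day is a volume a single well-tuned Postgres instance can usually absorb, which is relevant to the scaling question below.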
In the future I will need "analytical" capability for dashboards and other features, and I want to use Amazon Redshift for that. I thought about just going straight to Redshift and skipping Postgres entirely. But I also see people saying that Redshift should play a more passive role. Perhaps I could instead keep a rolling window of recent data in the backend database and periodically archive it into Redshift via SQL.
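The "window and archive" idea could be sketched roughly like this. `split_window` is a hypothetical helper of my own, shown here on plain tuples for illustration; in a real Django setup it would be a queryset filter plus a bulk copy into Redshift (e.g. via S3 and `COPY`):

```python
from datetime import datetime, timedelta

def split_window(events, now, window=timedelta(days=7)):
    """Partition events into (keep, archive) around a retention cutoff.

    `events` is an iterable of (timestamp, payload) pairs. Everything
    older than `now - window` goes into the archive batch destined for
    Redshift; the rest stays in the Postgres "hot" window.
    """
    cutoff = now - window
    keep = [e for e in events if e[0] >= cutoff]
    archive = [e for e in events if e[0] < cutoff]
    return keep, archive
```

In production the same split would typically be done server-side, e.g. `DELETE ... WHERE created < cutoff RETURNING *`, so rows are moved atomically rather than read into application memory.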
My question is:
Can something like Redshift be used as the backend for a web application, or does it typically play a passive role? If it can't, is it realistic to think I can scale Postgres enough to handle this event volume on its own? And if that's not the case either, does the "window and archive" approach make sense?
EDIT: Here are some things I looked at before writing this post: