The question of how to handle time comes up in spades once you get into distributed systems, users in different time zones, and mapping events between different data sources.
I highly recommend that all of your logging systems log in UTC. That lets you aggregate logs from servers (which are, hopefully, synchronized in their view of the current UTC time) located anywhere in the world.
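For example, with Python's standard logging module you can force UTC timestamps by pointing the formatter's time converter at time.gmtime. A minimal sketch, where the logger name and format string are just placeholders:

```python
import logging
import time

# Format records with UTC timestamps (note the trailing "Z" marker).
formatter = logging.Formatter(
    fmt="%(asctime)sZ %(levelname)s %(name)s %(message)s",
    datefmt="%Y-%m-%dT%H:%M:%S",
)
formatter.converter = time.gmtime  # render record times in UTC, not local time

handler = logging.StreamHandler()
handler.setFormatter(formatter)

logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("request received")  # e.g. 2024-01-15T09:30:00Z INFO app request received
```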
Then, as queries come in, you can convert from the user's time zone to UTC. At that point you have a decision to make: run the query against the raw data in real time, or possibly hit data that has already been summarized.
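A minimal sketch of that conversion, assuming Python's standard zoneinfo module; the zone and the dates are arbitrary examples:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

user_tz = ZoneInfo("America/New_York")
utc = ZoneInfo("UTC")

# The user asks for "January 15, local time" ...
local_start = datetime(2024, 1, 15, 0, 0, tzinfo=user_tz)
local_end = datetime(2024, 1, 16, 0, 0, tzinfo=user_tz)

# ... which becomes a UTC window you can run against your UTC-keyed logs.
utc_start = local_start.astimezone(utc)  # 2024-01-15 05:00:00+00:00
utc_end = local_end.astimezone(utc)      # 2024-01-16 05:00:00+00:00

print(utc_start, utc_end)
```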
Whether you want to pre-aggregate data will depend on many things: whether doing so reduces the amount of data you store, whether it cuts the processing needed to serve queries, how often the queries run, and even the cost of building the system compared with how much use it will actually see.
As for best practices: keep display concerns (like time zone) separate from data processing.
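One way to keep that separation, sketched in Python: persist and process timestamps in UTC, and only convert to the viewer's zone at render time. The helper name and the zones chosen here are purely illustrative:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

stored_utc = datetime(2024, 1, 15, 14, 30, tzinfo=timezone.utc)  # what you persist

def render(ts_utc, viewer_tz):
    # Hypothetical presentation helper: conversion happens only at display time.
    return ts_utc.astimezone(ZoneInfo(viewer_tz)).strftime("%Y-%m-%d %H:%M %Z")

print(render(stored_utc, "Europe/Paris"))   # 2024-01-15 15:30 CET
print(render(stored_utc, "Asia/Kolkata"))   # 2024-01-15 20:00 IST
```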
If you have not already done so, make sure you think about the lifetime of the data you store. Do you need data from ten years ago? Hopefully not. Do you have a strategy for discarding old data once it is no longer needed? Do you know how much data you will end up with if you keep every record (estimate it under a few different traffic growth rates)?
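A back-of-envelope sketch of that last estimate; every number here (record size, baseline traffic, growth rates) is an assumed placeholder you would replace with your own:

```python
# Storage needed if every record is kept, under a few assumed growth rates.
BYTES_PER_RECORD = 200          # assumed average record size
RECORDS_PER_DAY = 5_000_000     # assumed current traffic
YEARS = 5

for yearly_growth in (0.0, 0.25, 0.50):   # flat, +25%/yr, +50%/yr
    total_records = 0
    daily = RECORDS_PER_DAY
    for _ in range(YEARS):
        total_records += daily * 365
        daily *= (1 + yearly_growth)
    terabytes = total_records * BYTES_PER_RECORD / 1e12
    print(f"growth {yearly_growth:.0%}: ~{terabytes:.1f} TB over {YEARS} years")
```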
Again, the best practice for large data sets is to understand up front how you will deal with their size and how you will manage the data over time as it accumulates. That may include long-term archival, deletion, or reduction to a summarized form.
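As a rough illustration of the "summarized form" idea, here is a sketch that rolls raw UTC-timestamped events up into hourly counts; the sample data is invented:

```python
from collections import Counter
from datetime import datetime, timezone

raw_events = [  # hypothetical raw log records (UTC timestamps)
    datetime(2024, 1, 15, 9, 12, tzinfo=timezone.utc),
    datetime(2024, 1, 15, 9, 47, tzinfo=timezone.utc),
    datetime(2024, 1, 15, 10, 3, tzinfo=timezone.utc),
]

# Truncate each timestamp to the hour and count events per hour; the summary
# rows can eventually replace the raw rows once those expire.
hourly_counts = Counter(
    ts.replace(minute=0, second=0, microsecond=0) for ts in raw_events
)
for hour, count in sorted(hourly_counts.items()):
    print(hour.isoformat(), count)
# 2024-01-15T09:00:00+00:00 2
# 2024-01-15T10:00:00+00:00 1
```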
Oh, and to borrow from the Matrix: what's really going to bake your noodle about "correctness" is that there is no single correct answer here. Each time zone sees a different traffic pattern during its own local "day", and every one of those views is "correct". Even the time zones whose offset from yours isn't a whole number of hours.