Let's say we have a program that receives user input, or other unpredictable events, at arbitrary points in time.
For each type of event, the program must perform some computation, or access some resource, that is costly enough to matter. The program should output the result as quickly as possible. If subsequent events arrive, it may be acceptable to discard the results of earlier computations in favor of the new ones.
To complicate things, some computations or resource accesses can be interdependent, i.e. one may produce data that is used by another.
Importantly, we know something about how these events usually occur: for example, their frequency relative to one another, or the typical order and time intervals in which they arrive.
The challenge is to design an algorithm that handles the problem in the most statistically efficient way. Approaches that yield suboptimal solutions may be more than sufficient.
Is there a concept that covers the design of such algorithms?
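To make the setup concrete, here is a minimal sketch of one possible approach (all names and the priority scheme are hypothetical, not from any specific framework): events are ordered by a statically known priority, and a newer event of the same kind supersedes any pending work for that kind, so stale computations are simply discarded.

```python
import heapq
import itertools

class Scheduler:
    def __init__(self, priorities):
        # priorities: event kind -> priority (lower number = more urgent),
        # e.g. estimated from how often each kind of event occurs.
        self.priorities = priorities
        self.counter = itertools.count()   # tie-breaker for stable ordering
        self.queue = []                    # heap of (priority, seq, kind, task)
        self.latest = {}                   # kind -> seq of the newest submission

    def submit(self, kind, task):
        seq = next(self.counter)
        self.latest[kind] = seq            # older entries of this kind become stale
        heapq.heappush(self.queue, (self.priorities[kind], seq, kind, task))

    def run(self):
        results = []
        while self.queue:
            _, seq, kind, task = heapq.heappop(self.queue)
            if self.latest[kind] != seq:   # superseded by a newer event: discard
                continue
            results.append(task())
        return results

# Hypothetical usage: a key press is more urgent than a page render,
# and a second render request supersedes the first.
sched = Scheduler({"keypress": 0, "render": 1})
sched.submit("render", lambda: "render v1")
sched.submit("render", lambda: "render v2")   # makes "render v1" stale
sched.submit("keypress", lambda: "echo key")
print(sched.run())  # ['echo key', 'render v2']
```

This only captures the priority and discard-stale parts of the question; handling interdependent computations would additionally require something like topological ordering of the pending tasks.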
Example:
A tabbed internet browser.
When loading several web pages across multiple tabs, the browser must decide: should the page in the active tab be loaded with higher priority? Should it render only the visible part of a page, or pre-render the full page? And if it pre-renders, what should come first: pre-rendering the entire page of the active tab, or rendering the other tabs, and so on.
(I don't know anything about how browsers actually work, but hopefully that doesn't hurt the example.)