Design pattern for multiple customers and one data source

I am developing a web interface for a piece of equipment that exposes its own API. The web interface can control several devices simultaneously. Data is retrieved from each device by polling through the custom API, so it would be preferable to do this asynchronously.

The most obvious design is a polling thread that fetches the data and stores it in a singleton shared across the process; the web server threads would then read from that singleton and display the data. I'm not a big fan of singletons or of tightly coupling components, so I thought I might detach the poller from the web server entirely, bind it to the loopback interface, and expose the data over something like XML-RPC.
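The "obvious" single-process design described above does not strictly require a singleton: the polling thread and the web handlers only need to share one thread-safe cache object. A minimal sketch, in which `DeviceDataCache`, `poll_loop`, and the `poll_fn` callback are hypothetical names standing in for the real equipment API:

```python
import threading


class DeviceDataCache:
    """Thread-safe cache holding the latest polled reading per device.

    The web server threads read from this object; the polling thread
    writes to it. Passing it explicitly avoids a global singleton.
    """

    def __init__(self):
        self._lock = threading.Lock()
        self._data = {}

    def update(self, device_id, reading):
        with self._lock:
            self._data[device_id] = reading

    def get(self, device_id):
        with self._lock:
            return self._data.get(device_id)


def poll_loop(cache, device_id, poll_fn, interval, stop_event):
    """Background-thread body: poll the device API and store each result.

    `poll_fn` stands in for whatever call your equipment's custom API
    provides; `stop_event.wait(interval)` doubles as the poll delay and
    a prompt shutdown signal.
    """
    while not stop_event.is_set():
        cache.update(device_id, poll_fn())
        stop_event.wait(interval)
```

A web handler would then just be given the cache instance and call `cache.get(device_id)`, which keeps the polling logic and the presentation logic in separate objects even though they live in one process.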

The application does not have to be "enterprisey" or scalable, since at most a few people will use it at the same time, but I would rather make it reliable and avoid mixing the two kinds of logic together. The current Python implementation uses CherryPy, and it is the worst design mishmash I have ever seen. I worry that if I go with the most obvious design, I will just end up re-implementing the same terrible thing in my own way.

1 answer

If you use Django and Celery, you can build the web interface as a Django project and run the polling in the background as a Celery task. Inside that task you can import your Django models, so saving the polling results to the database is straightforward.
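A minimal sketch of that split, assuming a hypothetical `Reading` model and a hypothetical `DeviceClient` wrapper around the equipment's API (neither comes from the question; substitute your own model and client):

```python
# tasks.py -- runs in the Celery worker process, completely separate
# from the web server process, which addresses the wish to keep the
# polling logic and the web logic apart.
from celery import shared_task


@shared_task
def poll_device(device_id):
    # Imports are done lazily so the task module loads even before
    # Django app registry setup; both names below are assumptions.
    from myapp.models import Reading            # hypothetical Django model
    from myapp.device_api import DeviceClient   # hypothetical API wrapper

    value = DeviceClient(device_id).read()
    Reading.objects.create(device=device_id, value=value)
```

To poll continuously you would typically schedule this with Celery beat, e.g. a `beat_schedule` entry in the Celery configuration that runs `poll_device` every few seconds per device; the web views then only query `Reading` and never touch the device API directly.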

