Working in real time with Python

I'm an inexperienced Python coder, so what I've put together may be a bit of a stretch for me. I'm a cognitive scientist, and I need accurate stimulus display and button-press detection. I've been told that the best way to get this is to run in real time, but I don't know how to do that. Ideally, for each trial the program would run in real time, and then, once the trial is complete, the OS could go back to being less strict about timing. There would be about 56 trials. Is there a way to do this from my Python script?

(To be clear, all I really need to know is when the stimulus is actually displayed. Running in real time would assure me, top-down, that the stimulus is displayed when I want it to be. Alternatively, if it's easier, I could take a more bottom-up approach and simply record when the computer actually got around to displaying it.)

+7
5 answers

When people talk about real-time computing, they mean that the latency from an interrupt (most often raised by a timer) to the application code that handles it is both small and predictable. This means that a control process can be run repeatedly at very precise time intervals or, as in your case, that external events can be timestamped very precisely. The variation in latency is usually called "jitter" - a maximum jitter of 1 ms means that successive interrupts will be handled with delays that vary by no more than 1 ms.

"Small" and "predictable" are both relative terms, and when people talk about real-time performance they may mean 1 μs of maximum jitter (for example, people building inverters for power transmission work to that kind of spec), or they may mean a few milliseconds of maximum jitter. It all depends on the requirements of the application.
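
To get a feel for the jitter on your own machine, one rough check you can run from Python itself is to request the same short sleep many times and look at how much the actual delay varies. This is only an illustration of the concept, not code from the answer; the 1 ms target and 1000 repetitions are arbitrary choices:

    import time

    target = 0.001                        # ask for a 1 ms sleep each time
    errors = []
    for _ in range(1000):
        t0 = time.perf_counter()
        time.sleep(target)
        errors.append(time.perf_counter() - t0 - target)

    print("max overshoot: %.3f ms, min: %.3f ms"
          % (max(errors) * 1000, min(errors) * 1000))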

In any case, Python is unlikely to be the right tool for this kind of work, for several reasons:

  • Python runs primarily on desktop operating systems, and desktop operating systems impose a lower bound on maximum jitter; in the case of Windows it is several seconds. Multi-second stalls are rare, perhaps once every day or two, and with luck they won't coincide with what you are trying to measure, but sooner or later one will; jitter in the region of a few hundred milliseconds happens more often, perhaps every hour, and jitter of tens of milliseconds is quite common. The numbers for desktop Linux are probably similar, although you can improve things considerably with different compile-time options and patch sets for the Linux kernel - look up PREEMPT_RT_FULL.
  • Python's stop-the-world garbage collector makes latency non-deterministic. When Python decides it needs to run the garbage collector, your program is paused until it finishes. You may be able to avoid this by managing memory carefully and tuning the garbage collector yourself (see the sketch after this list), but depending on which libraries you use, you may not.
  • Other aspects of Python's memory management also make deterministic latency hard. Most real-time systems avoid heap allocation (i.e. C's malloc or C++'s new) because the time it takes is unpredictable. Python conveniently hides this from you, which makes latency hard to control. Again, using lots of those nice off-the-shelf libraries only makes things worse.
  • In the same vein, it is important that a real-time process keep all of its memory resident in physical RAM and never get paged out to swap. There is no good way to control this from Python, especially on Windows (on Linux you might be able to fit an mlockall call in somewhere, but any new allocation will upset things).
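
A minimal sketch of one way to work around the garbage-collector point above: disable the cyclic collector for the duration of each trial and collect explicitly between trials, timestamping with a monotonic clock. The trial body is a placeholder and the count of 56 is simply taken from the question; this does not remove OS-level jitter.

    import gc
    import time

    def run_trial():
        gc.disable()                  # no automatic collections during the trial
        try:
            t0 = time.perf_counter()  # monotonic, high-resolution timestamp
            # ... present the stimulus and wait for the button press here ...
            t1 = time.perf_counter()
            return t1 - t0
        finally:
            gc.enable()
            gc.collect()              # pay the collection cost between trials

    for trial in range(56):           # roughly the 56 trials from the question
        rt = run_trial()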

On a simpler note: you don't say whether your button is a physical one or a button on the screen. If it is on the screen, the operating system will impose an unpredictable amount of latency between the physical mouse click and the event arriving in your Python application. How will you account for that? Without a more accurate way of measuring it, how will you even know it is there?

+9

By purist standards, Python is not a real-time language - it has too many libraries and features that get in the way of lean, bare-metal code. If you are already going through an OS, as opposed to an embedded system, you have already given up a lot of true real-time capability. (When I hear "real time", I think of the time it takes VHDL code to travel through the wires of an FPGA. Other people use it to mean "I press a button and, from my slow human perspective, it does something instantly." I will assume you mean the latter interpretation of real time.)

By displaying a stimulus and detecting a button press, I assume you mean something like a perception test, where you show a person an image and they press a button to identify it or to confirm they saw it, possibly to measure reaction time. If you are not worried about millisecond accuracy (which should be negligible compared with human reaction time), you can do that kind of test in Python. For a GUI, check out Tkinter: http://www.pythonware.com/library/tkinter/introduction/ . For handling the timing between stimulus and button press, see the time module docs: http://docs.python.org/library/time.html
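
If it helps, here is a minimal sketch of a single reaction-time trial using Tkinter and the time module along those lines; the fixation cross, the "X" stimulus, the space-bar response, and the 1-second delay are all arbitrary placeholders:

    import time
    import tkinter as tk

    root = tk.Tk()
    label = tk.Label(root, text="+", font=("Arial", 48))   # fixation cross
    label.pack(expand=True, padx=100, pady=100)

    stimulus_onset = None

    def show_stimulus():
        global stimulus_onset
        label.config(text="X")            # stand-in for the real stimulus
        root.update_idletasks()           # ask Tk to flush the pending redraw
        stimulus_onset = time.perf_counter()

    def on_response(event):
        if stimulus_onset is not None:
            rt = time.perf_counter() - stimulus_onset
            print("Reaction time: %.1f ms" % (rt * 1000))
            root.destroy()

    root.bind("<space>", on_response)
    root.after(1000, show_stimulus)       # show the stimulus after 1 s
    root.mainloop()

Note that this records when Tkinter was asked to draw the stimulus, not when the pixels actually changed on the monitor, which is exactly the uncertainty the other answers discuss.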

Good luck

+4

Since you are trying to make a scientific measurement of a time delay accurate to the millisecond, I cannot recommend any process that gets time-sliced on a general-purpose computer. Whether it is implemented in C or Java or Python, if it runs under a time-slicing scheduler, how can the result be verified? You could be asked to prove that the CPU never interrupted the process mid-measurement and skewed the results.

It sounds like you may need to build a dedicated device for this purpose, with a clock circuit that ticks at a known rate and can count the discrete number of ticks between stimulus and response. That device can then be controlled by software that is not under any such timing constraints. Perhaps you should post this question on the Electrical Engineering site.

Without a dedicated device, you would have to write genuinely real-time software, which on modern operating systems generally means running in the kernel and not being subject to task switching. That is not easy to do, and it takes a lot of effort to get right - more time, I would guess, than you would spend building a dedicated device and the software to drive it.

+4

The interrupt handling of common operating systems is variable enough to wreck the timing in your experiment, regardless of your programming language, and Python adds its own uncertainty on top. Windows interrupts are particularly bad: on Windows, most interrupts are serviced in about 4 milliseconds, but occasionally an interrupt takes longer than 35 milliseconds! (Windows 7.)

I would recommend trying the PsychoPy application to see if it works for you. It approaches this problem by pushing as much of the work as possible onto the graphics card via OpenGL; however, some of its code still runs off the graphics card and is subject to OS interrupts. Your existing Python code may not be compatible with PsychoPy, but at least you would stay in Python. PsychoPy is particularly good at presenting visual stimuli without timing problems. See this page in its documentation for how to handle a button press: http://www.psychopy.org/api/event.html
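
For orientation, here is a hedged sketch of one trial in PsychoPy along the lines of that documentation; the text stimulus, the space-bar response, and the 2-second response window are placeholder choices, so check the current PsychoPy API docs before relying on it:

    from psychopy import visual, core, event

    win = visual.Window(fullscr=False)        # full screen generally times better
    stim = visual.TextStim(win, text="X")     # stand-in for the real stimulus
    clock = core.Clock()

    stim.draw()                               # render into the back buffer
    win.callOnFlip(clock.reset)               # zero the clock at the actual flip
    win.flip()                                # stimulus appears on this refresh

    keys = event.waitKeys(maxWait=2.0, keyList=["space"], timeStamped=clock)
    if keys:
        key, rt = keys[0]
        print("Response %s after %.1f ms" % (key, rt * 1000))

    win.close()
    core.quit()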

To solve your problem properly you need a real-time operating system such as LinuxRT or QNX. You could try your Python application on one of those to see whether running Python under a real-time OS is good enough, but even then Python introduces variability: if Python decides to garbage-collect, you will get a hiccup. Python itself is not real-time.

National Instruments sells a package that lets you do real-time programming in the very easy-to-use LabVIEW RT language. LabVIEW RT pushes your code down onto a real-time FPGA daughterboard. It is expensive.

I strongly recommend that you not just minimize this problem but actually solve it; otherwise your reviewers will be uncomfortable.

+2

If you are running your Python code on a Linux machine, build the kernel with low latency (preemption) enabled; there is a compile-time flag for this.

Keep the other processes running on the machine to a minimum, so that they do not interrupt yours.

Assign a higher task priority to your Python script.
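
On Linux, one way to do that last step from inside the script itself is to request a real-time scheduling class via the os module; this is only a sketch (it needs root or CAP_SYS_NICE, and the priority value 50 is just an example):

    import os

    try:
        # Request the SCHED_FIFO real-time scheduler for this process (pid 0 = self).
        os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(50))
    except PermissionError:
        print("Need root (or CAP_SYS_NICE) to switch to a real-time scheduler.")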

+1
