How to track memory usage per Python script

We have a system with a single Python interpreter, and many user scripts run through it. We want to cap the memory usage of each script. There is only one process, and this process invokes each script in turn. Since we have only one interpreter and one process, we don't know how to set a per-script memory limit. What is the best way to do this?

+4
2 answers

I don't think this is possible at all. Your question implies that the memory used by your tasks is completely separate, which is probably not the case. Python optimizes small objects such as integers: as far as I know, every 3 in your code uses the same object, which is not a problem because it is immutable. So if two of your tasks use the same (small?) integer, they are already sharing memory. ;-)
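To illustrate the point about shared small integers (a sketch added here, not part of the original answer): CPython caches the integers -5 through 256, so two independently constructed 256s are one and the same object, while larger integers are allocated fresh each time.

```python
# CPython interns small integers (-5 through 256): any expression
# that evaluates to one of them yields the same shared object.
a = int("256")   # built at runtime, still the cached singleton
b = int("256")
print(a is b)    # True: both names refer to the interned object

# Larger integers are separate objects with equal values.
c = int("257")
d = int("257")
print(c is d)    # False: two distinct heap objects
```

So "memory used by one task" is not a well-defined quantity when objects like these are shared across all scripts in the interpreter.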

+3

The memory is shared at the OS level. There is no easy way to tell which task or thread a particular object belongs to.

In addition, there is no easy way to add a custom bookkeeping allocator that would track which task or thread is allocating memory and prevent any one of them from allocating too much. You would also need to hook into the garbage collector to credit back the memory of objects that get freed.
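For strictly sequential tasks you can at least measure (though not cap) per-task allocations with the standard-library tracemalloc module. The helper below is an illustrative sketch, with a made-up name; it breaks down as soon as tasks allocate concurrently, which is exactly the problem described above.

```python
import tracemalloc

def task_peak_bytes(fn):
    """Run fn() and return the peak number of bytes traced during the
    call. Only meaningful if nothing else allocates concurrently."""
    tracemalloc.start()
    tracemalloc.reset_peak()  # requires Python 3.9+
    fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak
```

For example, `task_peak_bytes(lambda: bytearray(10**6))` reports a peak of roughly one megabyte, but any allocation made by another thread during the call would be counted against this task as well.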

If you do not want to write your own Python interpreter, the best approach is to use one process per task.

You don't even need to kill and restart the interpreters each time you run another script. Pool several interpreters and kill only those that exceed a certain memory threshold after running a script. If you need to, limit the interpreters' memory usage with the tools your OS provides.
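A minimal sketch of the process-per-script idea, assuming Linux: `resource.setrlimit` with `RLIMIT_AS` caps the child process's address space, so an over-allocating script gets a `MemoryError` instead of affecting the parent. The limit value and helper names here are illustrative, not prescribed by the answer.

```python
import multiprocessing as mp
import resource

def _run_capped(source, limit_bytes, conn):
    # The rlimit applies only to this child process, not the parent.
    resource.setrlimit(resource.RLIMIT_AS, (limit_bytes, limit_bytes))
    try:
        exec(source, {"__name__": "__script__"})
        conn.send("ok")
    except MemoryError:
        conn.send("memory limit exceeded")

def run_script(source, limit_bytes=1024 * 1024 * 1024):
    """Run one user script in its own capped process; return its status."""
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=_run_capped, args=(source, limit_bytes, child_conn))
    p.start()
    p.join()
    return parent_conn.recv() if parent_conn.poll() else "crashed"
```

With this sketch, `run_script("x = sum(range(1000))")` returns "ok", while a script that tries to allocate several gigabytes returns "memory limit exceeded". A pooled variant would reuse workers instead of forking one per call.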

If you need to share large amounts of data between tasks, use shared memory; for smaller interactions, use sockets (with a messaging layer on top of them, if necessary).
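A sketch of the shared-memory option using the standard library's `multiprocessing.shared_memory` (Python 3.8+); the helper names are made up for illustration.

```python
from multiprocessing import shared_memory

def publish(data: bytes) -> str:
    """Copy data into a new shared-memory block and return its name,
    which another process can use to attach to the same block."""
    shm = shared_memory.SharedMemory(create=True, size=len(data))
    shm.buf[:len(data)] = data
    shm.close()          # drop this handle; the block itself persists
    return shm.name

def consume(name: str, size: int) -> bytes:
    """Attach to an existing block by name, read it, and free it."""
    shm = shared_memory.SharedMemory(name=name)
    data = bytes(shm.buf[:size])
    shm.close()
    shm.unlink()         # remove the block from the system
    return data
```

Because the block is addressed by name, the producer only has to send the name and size over a socket or pipe; the payload itself is never copied through the messaging channel.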

Yes, this may be slower than your current setup. But given that you are using Python, I assume you are not doing performance-critical computing in these scenarios anyway.

0
