I have two separate Python programs (for example myp1.py and myo2.py) that must keep running on Debian. They appear to consume a lot of system memory, so I want to detect the memory usage of each .py and decide whether to relaunch it from an sh script triggered by crontab.
Is there any hint you can give me about the commands to use to grab the memory usage of each .py in the sh script?
2 Answers
In /proc/[pid]/status there is a lot of information about the process. You would be interested in the fields VmPeak and VmSize.
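For example, a quick way to pull those two fields for one of your scripts (a sketch that assumes exactly one matching process; the pgrep pattern is just the file name from the question):

#!/bin/bash
# Find the PID of the process whose command line matches myp1.py
pid=$(pgrep -f myp1.py | head -n 1)
# VmPeak = peak virtual memory size, VmSize = current virtual memory size
grep -E 'VmPeak|VmSize' "/proc/$pid/status"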
You state that you have a lot of memory consumption. I don't know where you got that value from, but you should look at the output of

vmstat 5 5

if there is a memory problem on your system. Memory on Linux is used as much as possible. In the vmstat output, you will see that quite a lot is used for cache. That makes the system faster. Do not look at the free column; it says little about the actual use of memory for processes.

Memory becomes a problem when the columns si and so are high. On most modern systems, they are 0 or occasionally 1 or 2. These columns indicate how much memory is swapped in and out to disk.
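If you want to check that from a script, a rough sketch (assuming the default vmstat layout, where si and so are columns 7 and 8) is:

# Skip the two header lines, then average the swap-in/swap-out columns
vmstat 5 5 | awk 'NR > 2 {si += $7; so += $8; n++} END {printf "avg si=%.1f, avg so=%.1f\n", si/n, so/n}'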
Here is the bash code to run (make sure the .sh file is executable – use chmod):
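A minimal sketch of such a script (the column selection is my choice; the [p]ython pattern is a common trick that keeps the grep process itself out of the results):

#!/bin/bash
# FILENAME.sh - list every running python process with its memory usage.
# Columns: PID, memory percentage, resident set size in KB, full command line.
ps -eo pid,pmem,rss,args | grep -E '[p]ython'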
Save the file (FILENAME.sh), then execute

sudo chmod 777 FILENAME.sh

Then you can execute the following to see all PIDs that use python and how much memory they are using:
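That is simply a matter of running the script, for example from the directory it is saved in:

./FILENAME.sh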
In order to filter based on specific Python script names, you could use grep with the bash script to filter them out. Make sure to use the flag -E to use the regex engine, and then pass the regex_pattern (the filename pattern) you are looking for:
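Something along these lines, where regex_pattern is a placeholder for your actual pattern:

./FILENAME.sh | grep -E "regex_pattern"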
An example I used to test it:
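Using the file names from the question as the pattern, that could look like:

./FILENAME.sh | grep -E "myp1\.py|myo2\.py"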
I got the following:
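The output is one line per matching python process; the values below are placeholders, not real measurements:

1234  1.5  62344 python3 /path/to/myp1.py
1240  2.1  88120 python3 /path/to/myo2.py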
I hope this helps!