Hi,
I have a Python client running on all our machines; every few seconds it logs what the machine is doing, listens for commands (pyzmq), etc.
Currently, each client writes a report file, and my manager Python app then reads all the reports.
It works great, but in some circumstances clients can't write their report, mainly when Max is opening a scene via Backburner and loading all the maps. I guess there is an I/O bottleneck on the system and the report write hangs. Everything comes back fine a few minutes later, once the system is responsive again.
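For context, here is a stripped-down sketch of what each client does today (the report path, port number and field names are just simplified placeholders, not the real ones):

```python
import json
import socket
import time

import zmq

# placeholder path on a shared folder that the manager app reads
REPORT_PATH = r"\\server\reports\%s.json" % socket.gethostname()

ctx = zmq.Context()
cmd_sock = ctx.socket(zmq.REP)
cmd_sock.bind("tcp://*:5555")  # placeholder port for incoming commands

poller = zmq.Poller()
poller.register(cmd_sock, zmq.POLLIN)

while True:
    # this is the write that hangs when Max/Backburner saturates disk I/O
    report = {"host": socket.gethostname(),
              "time": time.time(),
              "status": "rendering"}
    with open(REPORT_PATH, "w") as f:
        json.dump(report, f)

    # wait a few seconds for a command, then loop and report again
    if poller.poll(timeout=5000):
        cmd = cmd_sock.recv_string()
        cmd_sock.send_string("ack: " + cmd)
```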
So, what would be the best way to transmit this report data?
I'm thinking about sending it to an SQL server and updating a database field instead. Our SQL server sees low use (our intranet and a few custom project-management tools); would having around 50 machines updating its database make it hang? It is an EasyPHP install which currently runs fine.
Would reading 50 database fields be faster than reading 50 files of 5 KB each?
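To make the idea concrete, here is roughly what I have in mind, with one row per machine (the table name, columns, credentials and the pymysql driver are only placeholders, nothing is set up yet):

```python
import socket

import pymysql  # placeholder driver; any MySQL client library would do


def push_report(status):
    # each client would run one small write like this every few seconds
    conn = pymysql.connect(host="sqlserver", user="farm",
                           password="...", db="farm")
    try:
        with conn.cursor() as cur:
            # assumes hostname is the primary key of machine_status
            cur.execute(
                "REPLACE INTO machine_status (hostname, updated_at, status) "
                "VALUES (%s, NOW(), %s)",
                (socket.gethostname(), status),
            )
        conn.commit()
    finally:
        conn.close()


def read_all_reports():
    # the manager would get all 50 machines in one query
    # instead of opening 50 report files
    conn = pymysql.connect(host="sqlserver", user="farm",
                           password="...", db="farm")
    try:
        with conn.cursor(pymysql.cursors.DictCursor) as cur:
            cur.execute("SELECT hostname, updated_at, status FROM machine_status")
            return cur.fetchall()
    finally:
        conn.close()
```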
What is the best practice for reliability in this kind of use?
Any better ideas?
Thanks