So currently our dept’s tool shelf is kind of a mess and not very flexible. I’m almost done building a framework for a tool suite that I feel will expand better to meet our future needs.
In doing so I’d like to see which tools people use most, so I want to keep a log of button clicks. My first thought was to have a decorator for each function that captures the objectName and updates some file on a shared network drive.
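For what it’s worth, a minimal sketch of that decorator idea might look like this. The log path and function names here are hypothetical, and appending to a flat file is the simplest possible version (no locking, no schema):

```python
import functools
import getpass
import time

# Hypothetical log location; in practice this would be a shared network path.
LOG_PATH = "tool_usage.log"

def log_click(func):
    """Append user, timestamp, and function name to the shared log."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            with open(LOG_PATH, "a") as f:
                f.write("%s\t%s\t%s\n" % (
                    time.strftime("%Y-%m-%d %H:%M:%S"),
                    getpass.getuser(),
                    func.__name__,
                ))
        except (IOError, OSError):
            pass  # never let logging break the tool itself
        return func(*args, **kwargs)
    return wrapper

@log_click
def open_exporter():
    """Stand-in for a real shelf-button entry point."""
    return "exporter opened"
```

Concurrent appends from many users to one network file can collide, which is one reason the replies below push toward a database instead.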
You might prefer to do it to a database – that way you don’t have to reinvent the datatypes or the way to collect stats. @Jeff Hanna and @Adam Plechter have a very extensive in-house system for this, I bet they have some advice.
I second the database suggestion. A database was the way I went when I last built a tool-usage tracking system and it worked really well. I also worked with a web engineer who was able to take my SQL database and set up a nice intranet web front for going through the data in a more human-readable form.
Yep, at Lionhead we did the same thing. We set up a decorator which we applied to our entry points (at the tool level rather than the shelf level, as people may make their own buttons etc.). The decorator then connected to our database and stored the user, action, time etc.
We then set up a second decorator which would catch any failures and decorated critical functionality with it. This proved to be incredibly useful when determining which tools are solid and which needed more work.
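The two-decorator setup described above can be sketched roughly like this. sqlite3 stands in for whatever shared database server was actually used, and the table and function names are illustrative:

```python
import functools
import getpass
import sqlite3
import time
import traceback

DB_PATH = "tool_usage.db"  # stand-in for a shared database server

def _connect():
    conn = sqlite3.connect(DB_PATH)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS usage (user TEXT, action TEXT, stamp TEXT)")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS failures "
        "(user TEXT, action TEXT, stamp TEXT, trace TEXT)")
    return conn

def track_usage(func):
    """Decorator one: record who ran which entry point, and when."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        conn = _connect()
        conn.execute("INSERT INTO usage VALUES (?, ?, ?)",
                     (getpass.getuser(), func.__name__,
                      time.strftime("%Y-%m-%d %H:%M:%S")))
        conn.commit()
        conn.close()
        return func(*args, **kwargs)
    return wrapper

def track_failures(func):
    """Decorator two: record the traceback of any failure, then re-raise."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            conn = _connect()
            conn.execute("INSERT INTO failures VALUES (?, ?, ?, ?)",
                         (getpass.getuser(), func.__name__,
                          time.strftime("%Y-%m-%d %H:%M:%S"),
                          traceback.format_exc()))
            conn.commit()
            conn.close()
            raise

    return wrapper

@track_usage
@track_failures
def export_selection():
    """Hypothetical critical entry point wearing both decorators."""
    return "exported"
```

With both tables populated, a nightly report like the one described below is just a couple of GROUP BY queries over `usage` and `failures`.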
Once they were in place we set up a nightly trigger to parse through the database and produce a report on which users were using which tools and how often - along with the failure rates. This was great for trimming unused tools but gave a secondary benefit of learning how people are working. For instance we have an asset management tool which is dockable - therefore there is little need to re-open it very often. Most users opened it a couple of times a day but there were a few users that were hitting it 80+ times a day… at which point we realised they didn’t know it was dockable, so we docked it by default.
If you distribute tools to outsourcers though, don’t forget to create a simple mechanism to turn the logging off - as they will likely not have access to the storage db.
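That kill switch can be as small as one environment-variable check that every logging call goes through. The variable name here is made up; any config flag your tools already read would do:

```python
import os

def logging_enabled():
    """Kill switch for usage logging.

    Outsourcers without access to the storage DB set TOOL_LOGGING=0
    (a hypothetical variable name) and every logging call becomes a no-op
    before any connection is even attempted.
    """
    return os.environ.get("TOOL_LOGGING", "1") != "0"

def log_event(user, action):
    """Example gate: bail out early when logging is disabled."""
    if not logging_enabled():
        return None  # skip silently; no DB connection attempted
    return (user, action)  # stand-in for the real database insert
```

Checking the flag before connecting matters: a tool that blocks on an unreachable database is worse than one that logs nothing.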
Our Python logging system is pretty simple in concept. I wanted to use a database, so I wrote a “Database_Handler” class that can be used with regular (and excellent) Python logging.Logger objects. There are lots of pre-built handler classes in the standard library, but none that fires into a database (in Python 2.6, anyway). My class is a subclass of logging.Handler and overrides the “emit” function to take the log record passed in, pull data off it, and post it to our MS SQL database. Each “tool” we have creates a logger object for itself and adds our Database_Handler, just like you would with a regular handler.
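A minimal sketch of that handler, with sqlite3 standing in for the MS SQL connection and the class/table names invented for illustration:

```python
import logging
import sqlite3

class DatabaseHandler(logging.Handler):
    """Fires each log record into a database table.

    sqlite3 is a stand-in here; the real system posts to MS SQL.
    """
    def __init__(self, db_path):
        logging.Handler.__init__(self)
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS log "
            "(logger TEXT, level TEXT, message TEXT)")

    def emit(self, record):
        # Pull the fields we care about off the LogRecord and post them.
        self.conn.execute("INSERT INTO log VALUES (?, ?, ?)",
                          (record.name, record.levelname, record.getMessage()))
        self.conn.commit()

# Each tool makes its own logger and attaches the handler, as described above.
log = logging.getLogger("ToolX")
log.setLevel(logging.INFO)
log.addHandler(DatabaseHandler("tool_log.db"))
```

A real LogRecord also carries timestamps, module names, and exception info, so `emit` can post as much or as little as your schema wants.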
At that point we just add one-line events like these, where desired:
log.info( 'Tool X launched' )
log.info( 'Whatever process took X seconds' )
log.warning( 'Something is not right' )
log.error( 'ERROR: Help me!' )
… That kind of thing. You could also put calls like that inside a decorator, of course.
I’m glossing over some stuff, but that’s the gist of it. I’d be glad to go into the gory details if anyone likes.