How do I verify my Return On Investment?

I've been advocating ReSharper adoption to the senior management of our company for some time now, and adoption rates have been steadily increasing.  We have over forty .NET developers at present, and that number is growing.  Most are now equipped with R#.  However, a new project has recently been allocated that requires a significant number of new licenses.  To justify that expenditure, I have been asked to verify the return on investment we have received from our current adoption of the tool.

This is a difficult prospect without tooling, as you might imagine.  Is there by any chance any tooling that might assist with answering this question?  It would be nice, for example, to be able to determine a breakdown of green-status files versus red- and yellow-status files within any open solution at any point in time.  The ability to drill down into any particularly low-quality solution would also be a nice help for enforcing developer compliance with quality standards.  In fact, there's a whole suite of metadata about my developers' habits I'd like to be able to snoop on in real time with ReSharper, and possibly track the quality history of the files they create over time.

I think a tool like this would make a nice companion product actually.

Regards, Anthony Geoghegan, .NET Architect, Decare Systems.


Anthony,

Thanks for an interesting question.

First of all, there's currently no tooling to help with estimating ROI.

I assume that technically it's possible to create a plug-in for ReSharper that would help track different kinds of user activities with the product - in a similar fashion to the plug-in created by Industrial Logic for their unit testing courseware.

Usually the demand is to compare (and measure) the benefits provided by ReSharper against plain Visual Studio, which unfortunately can't be measured, because it would require having one developer perform a single task (or set of tasks) in two different environments.

However, if the goal is not to compare ReSharper to Visual Studio without ReSharper, but rather to analyze the workflow of a developer who uses ReSharper and whether using ReSharper influences his/her habits, it's critical to know the exact criteria to track. You have cited one criterion: "a breakdown of green status files versus red and yellow status files".

Could you please suggest more criteria that you personally would be willing to track, so that we can consider developing a plug-in that performs this kind of estimation in the future? In particular, when you say

In fact, there's a whole suite of metadata about my developers' habits I'd like to be able to snoop on in real time with ReSharper, and possibly track the quality history of the files they create over time.


that sounds interesting but requires clarification: what kinds of habits do you mean, and how would you expect those habits to evolve when using ReSharper, such that statistical data reflecting this evolution would qualify as an ROI indicator?

Thank you!


Firstly, what a refreshing environment where vendors actually listen to their customers :^O.

Anyway, back to business.  The breakdown of green to yellow and red files is based on a simple premise: green files are of better quality.  If I can see a trend where projects are achieving a higher percentage of green files, I can assume that quality is improving, and I can supply numbers to support this.
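
To make that premise concrete, here's a rough C# sketch of how such a trend could be computed from two per-file status snapshots.  The types and data are purely hypothetical (this isn't any real ReSharper API); it just shows the arithmetic I have in mind:

    using System;
    using System.Linq;

    enum FileStatus { Green, Yellow, Red }

    static class QualityTrend
    {
        // Percentage of files whose inspection status is green.
        static double GreenPercent(FileStatus[] snapshot) =>
            snapshot.Length == 0 ? 0.0
                : 100.0 * snapshot.Count(s => s == FileStatus.Green) / snapshot.Length;

        static void Main()
        {
            // Two hypothetical snapshots, e.g. last sprint versus this sprint.
            var before = new[] { FileStatus.Green, FileStatus.Red, FileStatus.Yellow, FileStatus.Red };
            var after  = new[] { FileStatus.Green, FileStatus.Green, FileStatus.Yellow, FileStatus.Red };

            Console.WriteLine($"Green files: {GreenPercent(before):F1}% -> {GreenPercent(after):F1}%"
                + (GreenPercent(after) > GreenPercent(before) ? " (improving)" : " (not improving)"));
        }
    }

Run against real snapshots gathered per sprint or per release, that comparison gives exactly the numbers I'd hand to management.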

As for the other kinds of metrics I'd like to track:

  • How many files are touched versus how many have status changes, i.e., are files being edited without their overall quality improving?  (The sketch after this list includes a check for exactly this.)
    • This is a big deal for me.  The tool can save our clients money, but not if people ignore it.
  • Simple reports to track the worst offenders for discussion and remediation or retraining as necessary.
    • Knowing who to target makes it more effective.
  • Who is consistently producing the most green files?  I'd like to use a carrot with those implementing policies well.
    • BTW, I don't consider save time the right event for capturing these metrics; check-in to source control is a better snapshot point, though I'm not sure how viable that might be.  I know that a project still under development versus one in production would show a very different quality breakdown, but we can assume managers would recognize that.  The implicit point is that committing code signals the developer considers it done, so that's the right moment to measure its quality.
  • We could implement a style-infringement burn-down chart, rather like a bug burn-down chart, as a means of showing the amortization of any technical debt incurred in rushing a project to market, say.
  • Each solution could publish a distinct quality score based on the relative percentages of green, yellow, and red files, giving you good data for a quality dashboard (see the sketch after this list for one way to compute such a score).  It wouldn't be difficult to include code coverage and CI-related information, such as build failures, on that too.  Many a CTO or CIO would love to have something as insightful as this.
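
To illustrate both the quality score and the "edited without improvement" check from the list above, here's another rough sketch.  The weights (green = 1.0, yellow = 0.5, red = 0.0) and all the types are my own invention, purely illustrative:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    enum FileStatus { Green, Yellow, Red }

    record FileSnapshot(string Path, FileStatus Before, FileStatus After, bool Touched);

    static class QualityDashboard
    {
        // Assumed weights -- tune to taste.
        static double Weight(FileStatus s) => s switch
        {
            FileStatus.Green  => 1.0,
            FileStatus.Yellow => 0.5,
            _                 => 0.0
        };

        // Solution-level score: average file weight, scaled to 0..100.
        static double SolutionScore(IReadOnlyList<FileSnapshot> files) =>
            files.Count == 0 ? 0.0 : 100.0 * files.Average(f => Weight(f.After));

        // Files edited between snapshots without any improvement in status.
        static IEnumerable<FileSnapshot> TouchedButNotImproved(IEnumerable<FileSnapshot> files) =>
            files.Where(f => f.Touched && Weight(f.After) <= Weight(f.Before));

        static void Main()
        {
            var files = new List<FileSnapshot>
            {
                new("Orders.cs",  FileStatus.Red,    FileStatus.Green,  Touched: true),
                new("Billing.cs", FileStatus.Yellow, FileStatus.Yellow, Touched: true), // edited, no gain
                new("Legacy.cs",  FileStatus.Red,    FileStatus.Red,    Touched: false),
            };

            Console.WriteLine($"Solution quality score: {SolutionScore(files):F0}/100");
            foreach (var f in TouchedButNotImproved(files))
                Console.WriteLine($"Touched without improvement: {f.Path}");
        }
    }

A per-solution score like that could feed the dashboard alongside coverage and build-failure figures.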

I'm not saying that these metrics will actually do all they claim (or appear to claim), but they may go some way towards showing that the tool is being used and is delivering greater benefit, while providing ongoing, easily consumed insight for the less technical managers.

Something like this could be of benefit to you as well.  If an analysis tool examined how often problem resolutions are actually adopted in the real world, and which techniques are used to resolve them, you might get some good business intelligence on how your tool is used and what your users do when confronted with issues.  For example, when confronted with an analysis issue without a simple resolution, is the online help consulted, etc.?

Hopefully some of these ideas might be considered viable.
Regards, Anthony.
