Developer Productivity Report 2013 – How Engineering Tools & Practices Impact Software Quality & Delivery
TL;DR – Let’s Close This Out!
For those of you too lazy or distracted to read the entire report, this section collects the juiciest statistical morsels, along with the observations and conclusions we drew from the data we collected from over 1000 engineers.
Summary of overall findings and a goodbye comic :-)
Part I – Metrics: How to Measure Quality & Predictability
Neither quality nor predictability was significantly affected by industry or company size, which is good to see.
In terms of Quality, nearly 60% of releases on average go to production free of critical bugs, so those teams can feel proud. The bottom 10% get apps out the door bug-free only 25% of the time, whereas the rock stars of the group enjoy releasing their apps into production without critical bugs 75% of the time.
When it comes to Predictability, the industry as a whole predicts its deliveries accurately only 60% of the time, which matches up with the anecdotal data on late releases, features that needed to be cut and unplanned scope creep. The rock stars reach that enviable 80% predictability, which as we said is probably the reasonable limit for predicting your delivery.
Part II – Practices: How the things you do affect Quality & Predictability
There are certain practices that significantly influence the quality and predictability of software releases. Fixing code quality issues (up to +7% and +9% respectively) and automating tests (up to +9% and +12% respectively) are excellent practices to adopt across the board.
In terms of improving software quality alone, having developers pair up (up to +7%), allowing developers to test code (+5%) and avoiding too many meetings each week (too many cost -4%) do the most to increase quality.
In terms of predictability of releases alone, doing code reviews for commits (up to +11%) is the single most beneficial practice, along with estimating tasks as a team (+6%); however, involving management in estimates drops predictability by 6%.
Part III – Tools: How the tools you use affect Quality & Predictability
We can also see a relationship between the tools respondents use and how these tools affect predictability (quality was not significantly affected by any tool sets, prompting us to remember that quality is based mainly on practices, not tools).
The tool types that increase the predictability of releases most are Version Control (+9%) and IDEs (+8%), but Code Quality Monitoring, CI, Profiler, Issue Tracker, and IaaS solutions (up to +5% for the group) also improve release predictability. The top 3 individual tools that enhance predictability are JRebel (+8%), Jenkins / Bamboo (+4%) and Confluence (+3%).
In terms of popularity, here are the technologies used by over 50% of respondents: IDE (97%), Version Control (91%), Issue Tracker (79%), Debugger (71%) and Continuous Integration (68%).
The Top 10 tools/technologies used by respondents: Subversion (58%), JIRA (57%), Jenkins (56%), Git (47%), Skype (39%), Confluence (30%), Google Drive (27%), GitHub (22%), Google Hangout (17%) and Bamboo (10%).
Finally, we display what the rock stars of the group (those in the top 10% for quality and predictability metrics) are using. Basically, the message from these folks is: use Jenkins, look for something other than Google Drive, choose Git instead of Subversion, and opt for Google Hangout over Skype.