
Developer Productivity Report 2012: Java Tools, Tech, Devs & Data

Introduction to the Java Developer Productivity Survey

Digging into data in search of insights is always exciting. Last year’s Java EE Productivity Report focused on the tools, technologies, and standards in use (exclusive selections) and on turnaround time in Java (i.e. how much time per hour is spent redeploying and restarting).

In this year’s Java Productivity Report, we expanded the selection of technologies and tools that Java development teams could choose from, made the selections non-exclusive, and covered more areas, for example Version Control Systems and Code Quality Tools. We also focused more on the question, “What makes developers tick?” and learned a lot about how devs spend their work week, which elements of developer work life increase or decrease efficiency, and what stresses developers out. We found a lot of interesting trends and insights, and broke them down into 4 parts:

Part I: Developer Tools & Technologies Usage Report
coverage of Java versions, JVM languages, IDEs, Build Tools, Application Servers (containers), Web Frameworks, Application (server-side) Frameworks, JVM Standards, Continuous Integration Servers, Frontend Technology, Code Quality Tools, Version Control Systems

Part II: Developer Timesheet
How do devs spend their work week?

Part III: Developer Efficiency
What aspects of your job make devs more/less efficient?

Part IV: Developer Stress
What keeps devs awake at night?

But before we go any deeper, let’s review the data as a whole.


Although we ran the survey through an open call on the web, a certain amount of bias is present in it. Obviously, our customers are more likely to reply to the call than anyone else. It is also likely that people who find and take such surveys are less conservative in their technology choices than those who don’t. To protect against such bias, we asked for company size and industry, so that we could normalize the data if any group was overrepresented. However, no size or industry was overrepresented, so aside from some “early adopter” bias there shouldn’t be much misrepresentation.
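The normalization idea mentioned above can be sketched as simple post-stratification: if a company-size bracket or industry is overrepresented among respondents relative to the market, its answers get a proportionally smaller weight. This is a minimal illustration, not the report’s actual methodology, and the strata names and shares below are made up for the example.

```java
import java.util.Map;

public class StratumWeights {

    /**
     * Post-stratification weight for one stratum: responses from a group
     * that makes up 30% of the market but 45% of the sample are weighted
     * by 0.30 / 0.45 (about 0.67) so the group no longer dominates.
     */
    static double weight(double populationShare, double sampleShare) {
        return populationShare / sampleShare;
    }

    public static void main(String[] args) {
        // Hypothetical strata: {share of the market, share of the sample}
        Map<String, double[]> strata = Map.of(
                "small company",  new double[]{0.50, 0.40},
                "medium company", new double[]{0.30, 0.45},
                "large company",  new double[]{0.20, 0.15});

        strata.forEach((name, s) ->
                System.out.printf("%s weight = %.2f%n", name, weight(s[0], s[1])));
    }
}
```

Since the survey found no stratum to be overrepresented, all of these weights would have come out close to 1.0, which is why the raw percentages were usable directly.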

Another thing to understand is what we are measuring. Popularity can be measured in different ways: number of users, number of organizations, number of lines of code, and so on. In this survey we measure the penetration of tools and technologies in the market. So a 10% result means that roughly 10% of the organizations that do Java development use that tool or technology somewhere in the organization. It does not tell us whether the tool is critical to them, whether it is used heavily or only in rare cases, or whether it is used by everyone or only by the respondent.
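The penetration measure described above is just the share of responding organizations that report using a tool anywhere, expressed as a percentage. A minimal sketch (class and numbers are hypothetical, not from the report):

```java
public class Penetration {

    /**
     * Penetration of a tool: the percentage of all responding
     * organizations that use it somewhere in the organization.
     */
    static double penetration(int orgsUsingTool, int totalRespondents) {
        return 100.0 * orgsUsingTool / totalRespondents;
    }

    public static void main(String[] args) {
        // e.g. 120 of 1200 responding organizations mention a tool
        System.out.println(penetration(120, 1200)); // prints 10.0
    }
}
```

Note what this number deliberately does not capture: intensity of use. One team experimenting with a tool counts the same as a whole company standardized on it.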

[Image: Tools and tech leaderboard]


Responses (9)

  1. May 15, 2012 @ 1:21 pm

    Need more robots.

  2. Orest Ivasiv

     May 17, 2012 @ 10:22 am

    Guys, it’s a very nice report. Keep going. Looking forward for the next report ;-)

  3. Phillip VU

     August 23, 2012 @ 10:29 am

    It looks like some helpful tools were missed in this report.

  4. J. Johnson

     September 18, 2012 @ 4:19 am

    It would be great if it covered ZK and specific JSF solutions, such as RichFaces and PrimeFaces.

  5. September 18, 2012 @ 5:50 am

    As with any statistical slice, there are tools that are more popular, and some cool tools might be mentioned just a couple of times, which makes their numbers negligible. It is very possible that very, very cool tools just didn’t make it to the list because not many people mentioned ’em.

  6. September 18, 2012 @ 5:53 am

    Surprisingly, ZK received a negligible number of votes for the survey. Next time be sure to vote for it! :)

  7. Pieter Humphrey

     October 15, 2012 @ 5:28 pm

    Great report — do you publish any details about survey size, sample, and methodology?

  8. neil stockton

     November 12, 2013 @ 7:32 pm

    LOL. JDO lacked a good implementation for years and that is your reason it “lags behind”? Suggest you do a bit more background reading. DataNucleus (previously JPOX) was the RI from 2006, and there were some very good commercial impls before that. What caused it to “lag behind” was politics from RedHat, Oracle and IBM. This is common knowledge; just ask anyone involved in JSR 243. Suggest you at least get the facts straight if pushing out any further such “reports”

  9. August 4, 2014 @ 9:33 pm

    Agree – I thought that comment was interesting myself. There’s a JPA kool-aid drinking factor in play too. “Everybody’s doing it, so it must be the best.”
