
Developer Productivity Report 2015: Java Performance Survey Results

Tools and Processes


Process, process, process!

Processes that are put in place (with the best of intentions, usually!) vary widely from organization to organization. Some organizations have no process whatsoever, and only the goodwill and diligence of a developer keeps an application from spiraling into a fireball of stack traces and Twitter hatred. Conversely, others (less fortunate than ourselves) have more processes than lines of code, and are twice as buggy for it. One aspect we care about very much is *when* performance tests are run, because you gain different value from testing at different stages. It’s widely accepted that testing as early as you possibly can lets you find and fix issues faster and more cheaply than discovering them later. This mentality applies to functional testing and performance testing alike.

While there is obvious value in performance testing against a production or staging environment, if we can shift as much of that effort as possible to the left, earlier in the cycle, we should expect better applications that are cheaper to develop. Figure 1.7 shows that 37% of respondents test their application for performance while it’s being developed. That’s a great statistic and shows how important people believe it is to test early. The most common phase in which to run performance tests was system/integration testing. Overall, there is a reasonable spread throughout the application lifecycle, which is reassuring. As this was a multiple-choice question, respondents could select several stages; on average, each picked 1.87.

Figure 1.7: At what stage do you perform profiling and performance tuning on your applications?
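If you want a concrete picture of what “performance testing while developing” can look like, a microbenchmark harness is one lightweight option. Here’s a minimal sketch using JMH, the OpenJDK microbenchmark harness (a separate library; the survey doesn’t prescribe it, and the class, method, and workload below are invented purely for illustration):

```java
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.Setup;
import org.openjdk.jmh.annotations.State;

// A minimal JMH microbenchmark: it lives next to the code and can run in
// the same cycle that introduces a change, so slowdowns surface early.
@State(Scope.Benchmark)
public class StringConcatBenchmark {

    private String[] parts; // illustrative workload, not from the survey

    @Setup
    public void setUp() {
        parts = new String[1_000];
        for (int i = 0; i < parts.length; i++) {
            parts[i] = "part-" + i;
        }
    }

    @Benchmark
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.MICROSECONDS)
    public String joinWithBuilder() {
        StringBuilder sb = new StringBuilder();
        for (String part : parts) {
            sb.append(part);
        }
        return sb.toString(); // return the result so the JIT can't discard it
    }
}
```

Because the benchmark ships with the code, a regression shows up while the change is still fresh in the author’s head, which is exactly the shift-left effect described above.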

Irrespective of who finds the performance problems, it’s clear from figure 1.8 who fixes the issues that emerge from performance testing. At almost 94%, it’s the developers who apply the fix. Yep, the same people who wrote the code in the first place. That makes it even more sensible to performance test your code as you develop it, as the fix might as well be applied while the bug is being written!

Figure 1.8: Who fixes the performance problems?
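One cheap way to put that guard into the developer’s own hands is a coarse performance check in the test suite they already run. Below is a sketch using JUnit 4’s timeout attribute; the workload and threshold are invented, and wall-clock timeouts are environment-sensitive, so treat this as a tripwire for gross regressions rather than a precise benchmark:

```java
import java.util.Arrays;
import java.util.Random;

import org.junit.Test;

public class SortPerformanceGuardTest {

    // JUnit 4 fails the test if it runs longer than the timeout (in ms),
    // so a gross performance regression turns the build red immediately.
    @Test(timeout = 500)
    public void sortsAMillionIntsQuickly() {
        int[] data = new Random(42).ints(1_000_000).toArray();
        Arrays.sort(data);
    }
}
```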

Now we move to the question of test frequency. Profiling is an important part of performance testing, as it lets us really understand execution paths, bottlenecks and so forth. However, figure 1.9 shows it to be an extremely reactive activity: over 40% of respondents state they profile their code only when issues arise, rather than on a regular basis. Beyond that answer, the results are spread fairly evenly across the other options, with almost one in ten stating they never profile their code at all.

Figure 1.9: How often do you profile your code?
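To make “understanding execution paths” concrete: a sampling profiler works by repeatedly capturing thread stacks and aggregating where time is spent. The toy sketch below does only the capturing step, using plain JDK calls (the worker thread and sample rate are invented for illustration; real tools like VisualVM do this far more cleverly and cheaply):

```java
import java.util.Map;

public class ToyStackSampler {

    public static void main(String[] args) throws InterruptedException {
        // Something to observe: a busy worker thread (purely illustrative).
        Thread worker = new Thread(ToyStackSampler::busyWork, "worker");
        worker.setDaemon(true);
        worker.start();

        // Capture every thread's stack a few times, the way a sampling
        // profiler does; real profilers aggregate these into hot spots.
        for (int i = 0; i < 5; i++) {
            for (Map.Entry<Thread, StackTraceElement[]> entry
                    : Thread.getAllStackTraces().entrySet()) {
                StackTraceElement[] stack = entry.getValue();
                if (stack.length > 0) {
                    System.out.println(entry.getKey().getName() + " -> " + stack[0]);
                }
            }
            Thread.sleep(100);
        }
    }

    private static void busyWork() {
        long sink = 0;
        while (true) {
            sink += System.nanoTime() % 7; // keep the CPU busy
        }
    }
}
```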

Before we look at how much time is spent performance testing during a release, we need to understand how long releases typically are. We can see the usual bell curve here, albeit a bell on its side. With almost half (46%) of respondents saying their releases take between one and six months, the bulk sit in medium-sized release cycles. 21% of respondents release their application every two weeks or less, a further 21% release every two to four weeks, and 10% release every six months or more. But how does this affect the amount of time spent performance testing? We’ll find out later on!

Figure 1.10: How long is your typical application release?

Sadly, our next data point shows that one in four respondents (26%) spend no time on performance testing during a release. Perhaps this says something about the type or size of changes that tend to go into releases. Over 55% of respondents state they spend 1-5 days performance testing each release, which is much more encouraging. The remaining respondents test for a week or more each release.

Figure 1.11: How much time per release do you spend on profiling and performance monitoring?

The Toolset

The proverb suggests that a good worker never blames their tools. We won’t be looking into how effective each tool is just yet, but we will look at what is being used. VisualVM was an extremely popular choice, as were JProfiler and Java Mission Control, the performance tool bundled with Oracle’s distribution of Java since version 7u40. Interestingly, 20% of respondents state they have their own custom in-house tools. Developers love to develop, right? XRebel is also worth mentioning with over 3% of the votes, particularly because it’s barely a year old!

Figure 1.12: Which tools do you use for application profiling?
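That 20% in-house figure is easy to believe, because the simplest home-grown tool is only a few lines. Here’s a sketch of the classic try-with-resources stopwatch (names, output format, and the workload are invented; real in-house tools obviously grow far beyond this):

```java
// A minimal sketch of a home-grown timing helper: try-with-resources
// prints the elapsed time when the measured block exits.
public class Timed implements AutoCloseable {

    private final String label;
    private final long startNanos = System.nanoTime();

    public Timed(String label) {
        this.label = label;
    }

    @Override
    public void close() {
        long micros = (System.nanoTime() - startNanos) / 1_000;
        System.out.println(label + " took " + micros + " µs");
    }

    public static void main(String[] args) {
        try (Timed t = new Timed("concatenate 10,000 ints")) {
            String s = "";
            for (int i = 0; i < 10_000; i++) {
                s += i; // deliberately slow: repeated string concatenation
            }
            System.out.println(s.length());
        }
    }
}
```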

Another important point to recognize is that there isn’t a single killer tool that does everything. In fact, if we look at the number of tools used by our respondents, almost half of those who took the survey claimed to use more than one performance profiler for their application. On average, each respondent picked 1.7 tools.

Figure 1.13: Number of tools picked


