
Developer Productivity Report 2015: Java Performance Survey Results

Part 3

Well, you made it to the final part of the report, great job! In this section, we’ll analyze the survey results for tools, testing frequency, stage and duration. We’ll go on to look at best practices to understand the differences between respondents who say their users are unaffected by performance issues and those who say their users are affected. But first, we’ll look at the tools respondents use.

Tooling


Using the right tool for the job

Tools are often treated as the most important part of a task; too much importance gets placed on tooling, while process, timing and other aspects are forgotten. Let’s take a look at our toolset and compare which tools find the most performance issues.

In figure 3.1 we see that the two leaders are Custom in-house tools and JProfiler, with 8.75 and 8 issues found per release respectively. One might assume that if someone is knowledgeable enough about their environment and performance to write their own tools, they will find more bugs than the average person anyway! In third place is XRebel, one of the new kids on the block, with 5.84 issues found per release, followed closely by the NetBeans profiler, JProbe and Java Mission Control.

Figure 3.1: Which tools find the most bugs?

With performance testing there is no such thing as a silver bullet. In fact, it’s very common to use multiple tools, each fit for a specific purpose. For example, XRebel is designed for development, while Java Mission Control has a very low overhead, so it works really well in production. So let’s see if people who use multiple tools really do see more issues per release, or whether there’s a diminishing return if you’re a multi-tool pro. Figure 3.2 shows a steady increase in the number of issues found per release as more tools are used, with a slight anomaly at two tools; the values for one and two tools are close enough that there’s not much we can read into it, other than it being an anomaly.

Figure 3.2: Does using multiple tools reveal more bugs?


Profiling Techniques


I love deadlines. I like the whooshing sound they make as they fly by.

– Douglas Adams

When, what, where, how long?

As mentioned previously, there’s more to performance testing than just tools – the process is equally important. For instance: how frequently should you profile your code, when during the release lifecycle should you test, and how long should you spend testing in each release? We wouldn’t dare go 1-2 months of development without running unit or functional tests, would we? (The answer is no, we wouldn’t.) So we should set the same standard for our performance testing.
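If performance checks are going to run as routinely as unit tests, it helps to have something developers can kick off from the same build. As a rough illustration only (JMH isn’t part of the survey data, and the HashMap lookup is just a hypothetical stand-in for your own hot path), here’s a minimal JMH microbenchmark sketch that can run locally on every change, assuming the JMH core and annotation-processor dependencies are on the classpath:

    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.TimeUnit;

    import org.openjdk.jmh.annotations.Benchmark;
    import org.openjdk.jmh.annotations.BenchmarkMode;
    import org.openjdk.jmh.annotations.Mode;
    import org.openjdk.jmh.annotations.OutputTimeUnit;
    import org.openjdk.jmh.annotations.Scope;
    import org.openjdk.jmh.annotations.Setup;
    import org.openjdk.jmh.annotations.State;
    import org.openjdk.jmh.runner.Runner;
    import org.openjdk.jmh.runner.RunnerException;
    import org.openjdk.jmh.runner.options.Options;
    import org.openjdk.jmh.runner.options.OptionsBuilder;

    // Hypothetical example: measures a plain HashMap lookup, standing in
    // for whatever hot path your application actually cares about.
    @State(Scope.Benchmark)
    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.NANOSECONDS)
    public class LookupBenchmark {

        private Map<Integer, String> cache;

        @Setup
        public void populate() {
            cache = new HashMap<>();
            for (int i = 0; i < 10_000; i++) {
                cache.put(i, "value-" + i);
            }
        }

        @Benchmark
        public String lookup() {
            // Returning the result stops the JIT from optimizing the call away.
            return cache.get(4_242);
        }

        public static void main(String[] args) throws RunnerException {
            // Small, fast configuration so it can run as often as a unit test.
            Options opts = new OptionsBuilder()
                    .include(LookupBenchmark.class.getSimpleName())
                    .warmupIterations(5)
                    .measurementIterations(5)
                    .forks(1)
                    .build();
            new Runner(opts).run();
        }
    }

Keeping the iteration and fork counts small is what makes this cheap enough to run frequently; a nightly job can always re-run the same class with a heavier configuration.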

In figure 3.3, we compare how frequently people profile their application code with how much faster their application is after performance testing. We’re not saying anything about the total duration of performance testing here, just the frequency at which it’s done. There’s a clear trend: those who test frequently are more likely to achieve a better performing, faster application than those who test infrequently. Those who test monthly or quarterly follow the same trend, with results that sit neatly between the other two sets of statistics.

This is exactly what we’d expect for unit or functional testing too, which is why we run unit tests during development and functional or system tests in CI. We should expect nothing less from the way we do performance testing.
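To make that concrete, a coarse guard can sit right next to the functional tests in CI. The sketch below is our own illustration rather than anything from the survey: a plain JUnit 4 test with a deliberately generous time budget, so only a severe regression fails the build. The sorted list is a hypothetical stand-in for whatever operation you actually care about, and the budget needs tuning for your own CI hardware.

    import static org.junit.Assert.assertTrue;

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;
    import java.util.concurrent.TimeUnit;

    import org.junit.Test;

    public class PerformanceGuardTest {

        // Deliberately generous and machine-dependent: this is a smoke test
        // for severe regressions, not a measurement. Tune it for your CI box.
        private static final long BUDGET_MILLIS = 2_000;

        @Test
        public void sortingStaysWithinBudget() {
            List<Integer> data = new ArrayList<>();
            Random random = new Random(42);
            for (int i = 0; i < 500_000; i++) {
                data.add(random.nextInt());
            }

            long start = System.nanoTime();
            Collections.sort(data); // stand-in for the real operation under test
            long elapsedMillis = TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);

            assertTrue("Took " + elapsedMillis + " ms, budget is " + BUDGET_MILLIS + " ms",
                    elapsedMillis <= BUDGET_MILLIS);
        }
    }

It won’t replace a profiler, but it turns “the release got slower” from a production surprise into a red build.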

Figure 3.3: Do teams which profile more frequently achieve bigger performance increases?

Does Time Really Heal Everything?

We’ve talked about the frequency of profiling, so now let’s move on to the duration of performance testing within a release. Obviously releases differ in length, so let’s see whether there are any trends between the length of a release and the total time spent performance testing in it. We’d hope that teams with substantially longer release cycles spend more time testing. Let’s see if that’s the case.

Figure 3.4 shows a line graph plotting the length of a release against the total time spent performance testing in that release. Looking at the first point plotted for all the data sets, we notice that you’re more likely to omit performance testing altogether if your release cycle is shorter; in fact, you’re over twice as likely to spend no time at all on it. At the next data point, < 1 week, we see that you’re most likely to spend less than a week testing performance if you have a shorter release cycle, and in general shorter cycles mean less time spent on performance testing. If your release cycle is every 6 months, you’re 5 times less likely to spend only 1-2 days on performance testing and much more likely to spend 1-4 weeks running it. Again, our mid data point roughly follows the same trend, which means our prediction was thankfully right: with longer release cycles, teams do indeed put additional effort into their performance testing, as we’d hope!

Figure 3.4: As an application release gets longer, does the time spent performance testing increase as well?


DOWNLOAD THE PDF
