If I asked you to pick one aspect of RebelLabs that rocks, you might well pick the beautiful reports we’ve produced over the years. If I asked you to pick one standout report, it might well be one of the developer productivity reports we’re well known for. This is the flagship report that RebelLabs produces year after year, so it gives me great pleasure to be so heavily involved in the 2015 edition. In fact, I’ve been happier than a kitten under a leaky cow – mooooooooo!
This year we decided to dive back into a specific area, rather than looking at a broader topic. Given our interest in the performance arena, having created XRebel, our new developer-focused Java profiler, it seemed a great time to explore a discipline with very few industry experts.
Back in March 2015, I created a list of questions about Java performance that would give us insight into how teams and organizations go about their performance testing. A few months later, I collated the data and examined it thoroughly to find trends that I could share with you in this report. We received 1562 responses to our survey this year.
Ideally, we’d have liked to pass the 2000 mark, but we’re happy that we have enough information on which to base our opinions and find trends. We also raised loads of money for a great charity from completed surveys, so great job to all those who answered all our questions!
I sincerely hope you enjoy reading this report as much as I have enjoyed putting it together and I wish all your performance dreams come true!
— Simon Maple
Developer Productivity Report 2015 at a Glance
Java performance is often considered the dark art of software development. In fact, tasks you might think are among the simplest, like benchmarking a piece of code, can turn out to be among the most complex. It’s inevitable that different people and organizations will approach performance testing in different ways, which in turn means that the benefits you see, if any, will vary from person to person. Many factors, including who runs the performance tests, which toolsets are used, the stage at which you run your performance tests and more, will ultimately affect the performance of your application and, with it, your end users’ satisfaction.
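To see why even a “simple” benchmark is deceptive, here is a minimal, hypothetical sketch in plain Java (the class name and workload are invented for illustration, and this is no substitute for a proper harness like JMH). A cold measurement includes JIT compilation and class-loading costs, so a warm-up pass is needed before timing, and the results must be consumed so the JIT cannot eliminate the work as dead code:

```java
// Hedged sketch of a naive microbenchmark and its two classic pitfalls:
// measuring before the JIT has warmed up, and letting the JIT treat the
// measured work as dead code.
public class NaiveBenchmark {

    // Illustrative workload: sum of squares 0^2 + 1^2 + ... + (n-1)^2
    static long sumOfSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        // Cold run: this timing includes JIT compilation and other
        // one-off JVM costs, so it usually looks much slower.
        long t0 = System.nanoTime();
        long cold = sumOfSquares(1_000_000);
        long coldNanos = System.nanoTime() - t0;

        // Warm-up: call the method many times so the JIT compiles it
        // before we take the "real" measurement.
        for (int i = 0; i < 10_000; i++) {
            sumOfSquares(1_000);
        }

        long t1 = System.nanoTime();
        long warm = sumOfSquares(1_000_000);
        long warmNanos = System.nanoTime() - t1;

        System.out.println("cold: " + coldNanos + " ns, warm: " + warmNanos + " ns");
        // Consume the results so the computation cannot be optimized away.
        System.out.println("checksum: " + (cold + warm));
    }
}
```

The absolute timings vary from run to run and machine to machine, which is exactly the point: a single naive `nanoTime` measurement tells you very little on its own.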
This report sets out to understand how performance testing is currently handled by different organizations and by teams within those organizations. Using the data we have collected, we aim to identify trends, best practices and pitfalls that could be avoided, based on how things are done today.
The survey itself was released in March 2015 and contained 19 questions, profiling the respondent, their application, their processes, tools and more. Overall we received 1562 responses. Although we wanted to hit the 2000 mark, 1562 is a sufficiently large sample in which to see trends.
While we expect everyone to be eagerly waiting to complete our survey each year, we understand that your time is valuable. We again decided to make a charity donation for each respondent who took the time and effort to complete our survey. This year we chose Dogs for the Disabled as our charity. Here’s a description of Dogs for the Disabled from their website.
Dogs for the Disabled is working to provide solutions to help people with a wide variety of different disabilities and conditions; from assistance dogs helping children and adults with physical disabilities and families with a child with autism, to pet dog autism workshops, and innovative new projects working in schools and residential care settings.
– DOGS FOR THE DISABLED, http://www.dogsforthedisabled.org/
We committed to donating $0.50 (USD) to Dogs for the Disabled for every completed survey, which equates to $781. We rounded this up to $1000 as we’re kind and because it’s such a deserving and rewarding charity! So if you were one of the many who completed the survey, you should feel good that you contributed and have made disabled children’s lives better. High-fives all round!
The report is split into three sections. The first is a representation of the raw answers given to the survey questions. No fluff, no pivoting, just answers! Parts 2 and 3 provide a more in-depth analysis, pivoting on the data points to understand trends. Pivoting? Wasn’t that a thing in my old physics class with a see-saw, some forces and some kind of formula? Well, maybe, but in this case, we’re asking questions about our data based on the answers given to other questions. For instance, do those who state their application has over 100 screens find more bugs? Or perhaps: do organizations with dedicated performance teams have fewer complaints from their end users? We’ll ask a raft of questions like this and let our data try to answer them. But for now, let’s start with Part 1 and the raw data.
P.S. Oh, and by the way, if you’re just getting started with performance testing, a RebelLabs report that will help you learn and understand the performance basics is also available for you to download – The Developers Guide to Understanding Performance Problems. It will give you a solid base of knowledge on all things related to Java performance, including terminology and tooling. Make sure you read this one too!