Tricentis, in collaboration with Flood, SpecFlow, and TestProject, conducted our annual State Of Open Source survey, with nearly 2,000 responses.
In this post, we'll explore the findings relating to performance and load testing for 2020.
Our biggest surprise this year was the combination of load testing and continuous integration, which has seen a considerable increase from previous years despite the consensus that it was a more niche workflow.
Less surprising is the continued adoption of open-source tools across organizations of all sizes, including large enterprises that have historically shunned open-source products due to existing commercial vendor agreements. However, as we'll see, security and training remain roadblocks for around a sixth of the companies surveyed.
In terms of geography and demographics, Asia leads the adoption of open-source testing tools at 61% of respondents, with Europe leading the Western market at 16%, the United States at 14%, and Australia trailing at 3%.
India and Vietnam are growing faster than any other region, and I expect we'll see substantial growth here in the coming years.
The main questions we want to answer from this survey are: is the role of performance testing changing? And is open-source testing here for the long haul?
Major roles in testing
It comes as no surprise that QA teams are still responsible for about half of all testing, while engineers handle the remainder with broad functional responsibilities across operations, development, security, and performance.
We've seen a steady increase in our customers' technical skills over the last 5 years due to the technical nature of load testing. Still, we're now seeing this spread into broader testing disciplines, driven by the adoption of open-source tools.
Additionally, 33% of companies surveyed had dedicated performance engineers or teams conducting performance tests regularly, which means QA teams are doing slightly less load testing overall than these specialized teams.
And yet, around 10% of companies surveyed said that nobody was explicitly responsible for load testing.
Due to the rise of automated testing over the last 20 years, we haven't seen any breakout group in terms of years of experience.
There is perhaps a slight increase in the number of people who started in the field between 2010 and 2015, those who now have 5-10 years under their belts.
Top programming languages
If you're not doing model-based testing, you're likely writing tests in some form of programming language.
— Jeff Atwood, Cofounder of StackOverflow
Biggest Roadblocks to Adoption
While a wide range of organizations have adopted open-source tools, they still face roadblocks in some enterprises.
We'll break these down as Support & Training, Security, and Technical Capabilities.
Support and Training
Support and training remain a roadblock to adoption across all open-source projects, with very few investing the time required to write proper documentation, tutorials, and training material. Training your team is a critical challenge when adopting any new tool.
For open-source, this has spawned multiple companies that focus specifically on supporting the implementation of open-source testing tools within organizations that would usually get training directly from a vendor.
Another risk is how often a project is updated, as most open-source projects don't follow a regular release cadence or provide an easy upgrade path between major versions, a challenge with all software, whether you pay for it or not.
As more and more testing tools move to the cloud, updates are becoming a thing of the past.
Security
Interestingly, 15% of organizations still say that security and perceived vulnerabilities are a roadblock. However, when you talk to security experts, they say open source provides greater assurance of security because it increases the pool of eyes looking for vulnerabilities.
That said, how quickly maintainers implement fixes can either strengthen security or leave projects open to exploitation. But at least the vulnerabilities are known.
In recent years an entire ecosystem of code analysis tools has been developed, which help automate the discovery and repair of vulnerabilities in open source projects, so it will be interesting to see how security trends as a concern next year.
Technical Capabilities
Support for more esoteric protocols within a chosen tool, setting up a test environment, maintaining test scripts, and managing test data were all viewed as technical impediments to regularly conducting performance tests.
Testing systems built on SAP or Citrix that don't expose accessible testing interfaces compounds the problems faced by open-source tools in the enterprise.
With SAP, organizations spend millions not only on the software and its internal adoption, but ultimately on the transformation from one legacy system to something a few generations newer.
The CEO of one of the world's largest pharmaceutical companies once said it would be a competitive advantage to implement SAP for less money and time than their competitors.
These types of enterprises are still the holdout for open-source adoption.
While 16% of customers identified customer support as a roadblock, a similarly sized cohort prefers the community-based support offered by open-source tools.
The number one driver of open-source adoption is still cost, which is no surprise as commercial testing tools have historically run into the hundreds of thousands of dollars before training and professional services. In the current climate, organizations can quickly make significant savings on their QA effort by adopting open source.
Performance testing tools have a more storied history than many modern QA testing tools, which is why performance testers view cost as less of a motivator for adoption than technical capabilities.
Lastly, freedom from vendor lock-in and ease of customization is still a strong driver for most surveyed companies.
The role of load testing
Automated testing has well and truly unseated manual testing, and load testing makes up nearly a third of testing overall.
At Flood, we've seen a strong trend to browser-based load testing in the last two years, as it now makes up 35% of all tests run on Flood. As a result of this, we hired three more engineers for the Element Core Team this year.
An interesting finding is that performance testing has moved from a reactive to a proactive testing discipline, with nearly two-thirds of customers conducting performance tests within a typical sprint cycle.
Thanks to easy integrations between Flood and most continuous deployment services, you can add load testing to your release process and catch regressions before they reach production.
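As an illustration, a smoke-level load test can gate a release in almost any CI service. Here is a minimal sketch of a GitHub Actions job that runs a k6 script via the official Docker image; the workflow name, script path, and trigger are assumptions for the example, not part of the survey. k6 exits non-zero when the thresholds defined in the script fail, which fails the job and blocks the release.

```yaml
# .github/workflows/load-test.yml -- illustrative sketch; file names and
# trigger are assumptions, adapt to your own pipeline.
name: load-test
on: [push]

jobs:
  smoke-load-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Pipe the script into the grafana/k6 image via stdin ("k6 run -").
      # If a threshold in scripts/smoke.js (e.g. a p(95) latency limit)
      # is breached, k6 returns a non-zero exit code and the build fails.
      - name: Run k6 smoke test
        run: docker run --rm -i grafana/k6 run - < scripts/smoke.js
```

The same pattern applies to most CI services: run the load-testing tool as a build step and let its exit code decide whether the pipeline proceeds.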
For performance testers, open-source tooling is critical to the job, which shows a significant move away from vendor apps of the past decades.
JMeter is still the most popular, with a little over half the market. Developer-focused tools such as Element, K6, Locust, and Gatling have seen an increase in adoption and now make up most of the remainder.
Is the role of performance testing changing? I think there is strong evidence that more generalists are conducting performance testing in development teams.
It is clear that open-source testing is here for the long haul, as we see a substantial move away from commercial closed-source tools wherever organizations can make that move. I have no doubt this will increase in the coming years.
Still curious about the results? Check out the survey results infographic here.