Performance Test Automation - Why Does it Matter?

Quality Engineering is now recognized as a key driver of business value and improved customer experience. According to Technavio’s Global Software Testing Services Market report, published in July 2018, the software testing industry will see significant growth during the period 2018-2022, with revenues reaching close to $68 billion by 2022. These numbers give a clear sense of the market potential and the rising importance of Quality Engineering within the IT lifecycle.

While functional testing has always been an important part of quality assurance, companies are realizing that performance issues are critical to customers’ perception of the “experience” of using applications. Organizations that are slow to adopt a complete performance testing strategy will quickly find that having no strategy hurts their bottom-line goals.

Here are a few of the ways that performance testing affects business ROI, according to Radware:

  • 51% of online shoppers in the US say that site slowness is the top reason they’d abandon a purchase.
  • Shoppers remember online wait times as being 35% longer than they actually are.
  • A 2-second delay in load time during a transaction results in abandonment rates of up to 87%.
  • The total cost of abandoned shopping carts for online retailers has been estimated at more than $18 billion per year.
  • 64% of smartphone users expect pages to load in less than 4 seconds.
  • When faced with a negative mobile shopping experience, 43% of consumers will go to a competitor’s site next.

Tools for automated performance tests have always existed, from the old WinRunner to modern toolsets that can simulate interactions from tens of thousands of users across the globe. As these tools evolve with the requirements of the industry, what is changing is how sophisticated automated performance tests are becoming and how they are being integrated into a Continuous Delivery model.

Test Automation in Non-Functional Testing (NFT)

Automated testing of functional requirements is common, but it is rare in the NFT world. The challenge most companies face is the difficulty of automating NFT: more often than not, automating the entire NFT process is left out of the scope of CI/CD because of the complexity and breadth of large-scale NFT. At some point, the test scenario becomes so heavy that the only way forward is to run the tests manually.

QA teams have been advocating for non-functional testing to be performed with automation tools that exercise the system or application under different scenarios. The aim is for the tests to approximate real-world conditions: for example, increasing the load on all the CPU cores on which the application is running and checking the application’s performance under that load. To achieve true end-to-end automation, including non-functional testing, teams need to focus on developing specific point solutions that avoid manual intervention.
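Below is a minimal sketch of that CPU-saturation scenario in Python. The endpoint URL, the 30-second load window, and the sample count are all assumptions for illustration; a real point solution would target the application’s actual health or transaction endpoints.

```python
# Saturate every CPU core with busy-work processes, then measure how a
# (hypothetical) application endpoint responds under that load.
import multiprocessing
import time
import urllib.request

APP_URL = "http://localhost:8080/health"  # hypothetical endpoint


def burn_cpu(stop_time: float) -> None:
    """Busy-loop until the deadline to keep one core saturated."""
    while time.time() < stop_time:
        pass


def timed_request(url: str) -> float:
    """Return the response time of a single GET request, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10):
        pass
    return time.perf_counter() - start


if __name__ == "__main__":
    deadline = time.time() + 30  # saturate all cores for 30 seconds (assumed window)
    workers = [
        multiprocessing.Process(target=burn_cpu, args=(deadline,))
        for _ in range(multiprocessing.cpu_count())
    ]
    for w in workers:
        w.start()

    # Sample response times while the CPUs are under load.
    samples = [timed_request(APP_URL) for _ in range(20)]
    print(f"avg={sum(samples)/len(samples):.3f}s max={max(samples):.3f}s")

    for w in workers:
        w.join()
```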

In an environment where tests are rerun frequently, QA automation can be a reasonable investment. Often used as part of a continuous integration process and an agile software development approach, test automation can help test the whole product at every iteration, with minimal effort and quick turnaround. The trend continues to gain wider adoption: according to one survey, 86 percent of companies intend to apply automation in their testing processes.

Why Performance Test Automation

  • Quick and effective: Executed by special software tools, automated testing requires significantly less time and effort.
  • Better long-term ROI: While requiring some upfront investment, test automation proves more cost-efficient in the long run.
  • Transparent and meticulous: Test automation gives the team better transparency, collaboration, and visibility.

Integrating Automated Performance Tests in CI/CD Pipeline

Both functional and non-functional testing are critical parts of the software development lifecycle. Most companies have no problem implementing automated functional testing within a CI/CD process, but implementing performance test automation can be demanding.

Non-functional tests such as performance tests have limitations when it comes to running them within a CI/CD process. These tests are especially sensitive to the details of the runtime environment: for a performance test to be reliable, the infrastructure in which it runs must be consistent and representative. Some NFR tests also require a runtime environment specially provisioned for the purpose of the tests. The environment is not the only limitation, however; execution time is also an impediment, as is the cost involved.

Despite the challenges, CI/CD can support reliable, automated performance tests when a level-based approach is taken. There will still be times, however, when parts of the automated non-functional testing must take place outside the CI/CD pipeline. In that scenario, companies can opt for a cloud-based testing service that provides the automation tools, software, and reporting capabilities necessary to perform stringent, long-running tests on a pay-as-you-go basis. A cloud-based service can be economical while providing the hardware and runtime capacity required.

When conducting performance tests in a continuous-integration environment, testers must design test cases with a minimal run or execution time. When the execution time is short, however, test accuracy is susceptible to small fluctuations. Dynamic architecture validation helps identify potential changes in performance and in the internal processing of application use cases. This includes the analysis of response times, changes in the execution of database statements, and an examination of remoting calls and object allocations.
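To make that concrete, here is a minimal sketch of a short, thresholded response-time check that could act as a performance gate in a CI stage. The /checkout endpoint, the 95th-percentile budget, and the sample count are assumptions; the key idea is that repeated samples dampen the fluctuations short tests are prone to, and a non-zero exit code fails the pipeline.

```python
# Run a short burst of requests against a hypothetical endpoint, compare
# the 95th-percentile response time against a budget, and fail the CI
# stage (non-zero exit code) if the budget is exceeded.
import statistics
import sys
import time
import urllib.request

APP_URL = "http://localhost:8080/checkout"  # hypothetical endpoint
BUDGET_P95_SECONDS = 0.5                    # assumed response-time budget
SAMPLES = 50


def timed_get(url: str) -> float:
    """Return the response time of one GET request, in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10):
        pass
    return time.perf_counter() - start


def main() -> int:
    times = sorted(timed_get(APP_URL) for _ in range(SAMPLES))
    p95 = times[int(0.95 * (len(times) - 1))]
    print(f"median={statistics.median(times):.3f}s p95={p95:.3f}s")
    # Any CI system that treats a non-zero exit code as a failed stage
    # can run this script as-is.
    return 1 if p95 > BUDGET_P95_SECONDS else 0


if __name__ == "__main__":
    sys.exit(main())
```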

It is not possible to extrapolate results from the continuous-integration environment to the later production stage. The point, rather, is to streamline load testing by identifying possible performance issues earlier in development, thereby making the overall testing process more agile. It is also important to be realistic in educating the company’s technical stakeholders and contributors about the limitations of automating performance testing within the CI/CD process. For test automation to succeed, companies need to chart out the differences between performance and functional testing, and they need to create a level-based test plan that determines which layers of the application stack are most suitable for automated performance testing. For the most part, functional tests can be automated without hurdles, while automating performance tests poses a tougher challenge. That difficulty can be mitigated when a level-based approach is taken.


Creating a Level-Based Test Plan for Automated Performance Testing

The trick to automating performance testing in a meaningful way is to take a level-based approach. Level-based performance testing is a process by which automated performance tests are executed on components at the various levels of the technology stack. Performance testing, particularly automated performance testing, is best done in isolation at each level of the stack, where each level refers to different components or modules of the application: APIs, web services, and database-specific tests.

Running short automated performance test scripts against the various levels of the technology stack is a more realistic approach than a top-level assault on the system as a whole; there are simply too many parts in play to be adequately covered by a single, high-level approach to performance test automation. A level-based approach allows a good deal of the testing to be automated.
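As an illustration, here is a minimal sketch of a level-based plan in Python: each level of the stack gets its own short, isolated timing check with its own budget. The target functions and budgets are placeholders; in practice each would invoke a real database statement, API endpoint, or business component.

```python
# Time each stack level in isolation and compare against a per-level budget.
import time


def time_call(fn) -> float:
    """Return the wall-clock time taken by one call to fn, in seconds."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start


# Placeholder targets for each level of the stack.
def db_level():
    time.sleep(0.01)   # stand-in for executing a critical SQL statement

def api_level():
    time.sleep(0.05)   # stand-in for calling one REST endpoint

def component_level():
    time.sleep(0.02)   # stand-in for invoking one business module directly


# Each level is checked against its own budget (assumed values, in seconds).
LEVELS = {
    "database":  (db_level, 0.05),
    "api":       (api_level, 0.20),
    "component": (component_level, 0.10),
}

for name, (target, budget) in LEVELS.items():
    elapsed = time_call(target)
    status = "PASS" if elapsed <= budget else "FAIL"
    print(f"{name:10s} {elapsed:.3f}s (budget {budget:.2f}s) {status}")
```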

An end-to-end, business-flow-specific performance test strategy that runs a sequence of these component-level tests is still critical to testing the overall response of the application, but testing components at the early stages (a shift-left approach) reduces testing time and helps detect performance issues sooner.

To conclude, whether a company goes cloud, hybrid, or keeps its performance test automation infrastructure in-house, the important thing is to make performance testing – from level-based testing focused on isolated components to full-scale pre-release regression testing – an essential part of the company’s automated QA process, within and beyond the Continuous Integration/Continuous Deployment pipeline.