A post by Jason Bloomberg [1] implicitly encourages us to paraphrase the old joke with buzzwords such as 'digital transformation.' In fact, the joke applies well to many buzzwords, but not all buzzwords reach stage 4. Take 'cloud computing.' Even though the cloud is now well understood and mature, the terminology is in no danger of disappearing. Why? Because you use the cloud yourself, or you know people who use the cloud: you can point to AWS (IaaS) or Salesforce (SaaS) or whatever and say: that's the cloud. What about test automation? Let us discuss and compare two editions (2015-16 [2] and 2016-17 [3]) of a well-known QA & Testing annual survey, the World Quality Report (WQR).

Stage 1: What is test automation?

All the analysts of the Testing & QA market share two predictions/prescriptions: in a more or less distant future, service/API testing will overtake UI (both desktop and mobile) testing, and test automation will take charge of more than 80% of the testing activity. We are speaking about black-box testing of single components, with the help of simulated downstream (stubs) and upstream (clients) services, and gray-box testing of distributed architectures, with the help of intercepting proxies placed on the service dependency wires. All the analysts champion test automation as a kind of ideal remedy that overcomes the majority of the testing pains for businesses. What test automation is seems clear to everybody: just do it!
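To make the black-box setup concrete, here is a minimal sketch in Python. The price-conversion component and its downstream exchange-rate service are hypothetical examples, not taken from the survey; the point is only that the downstream dependency is replaced by a stub, so the component can be stimulated through its public interface alone.

```python
import unittest
from unittest.mock import Mock

# Hypothetical component under test: it converts a price by calling a
# downstream exchange-rate service through the injected `rates_client`.
def convert_price(amount, currency, rates_client):
    rate = rates_client.get_rate("EUR", currency)
    return round(amount * rate, 2)

class ConvertPriceBlackBoxTest(unittest.TestCase):
    def test_conversion_uses_downstream_rate(self):
        # Stub the downstream service: no network, deterministic answer.
        rates_stub = Mock()
        rates_stub.get_rate.return_value = 1.10

        # Black-box stimulation: only the public interface is exercised.
        self.assertEqual(convert_price(100.0, "USD", rates_stub), 110.0)

        # Gray-box flavor: inspect the call placed on the dependency wire,
        # much as an intercepting proxy would observe it.
        rates_stub.get_rate.assert_called_once_with("EUR", "USD")

if __name__ == "__main__":
    unittest.main()
```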

Stage 2: We need test automation!

The average IT budget allocated to QA and Testing in 2015 had increased by nine percentage points since 2014, reaching 35%. While this astounding increase suggests a growing awareness of the value of QA and Testing, it also implies that the practice was not able to meet increasing demands and still maintain a sound level of efficiency all at once. Furthermore, in 2015, 54% of the respondents of the WQR 2015-16 had adopted Agile development and DevOps, whose shorter time frames can be met only by increasing levels of testing velocity, efficiency, and effectiveness, supported by virtualization and cloud-based test environment solutions (adopted by 43% of the respondents).

Speaking about test environments, we should distinguish between (i) the testbed and (ii) the test harness. The testbed is the environment where you deploy your system under test. If your system is a distributed service architecture, it is more appropriate to implement a distributed testbed, which approaches the conditions of the production environment. If your architecture is cross-organizational, your testbed is inevitably distributed. The test harness is the environment where you deploy your test engine, which is in charge of stimulating the system under test, collecting its responses, and eventually evaluating them. If you can deploy both environments on the cloud, you gain in flexibility, but, in the real world, you have to overcome several obstacles that we do not detail here.
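As a rough illustration of this division of labor, here is a minimal test-engine sketch; the endpoint URL and the response schema are hypothetical stand-ins for whatever system under test is deployed on the testbed. The harness does three things: stimulate, collect, evaluate.

```python
import json
import urllib.request

# Hypothetical endpoint of the system under test, deployed on the testbed.
TESTBED_URL = "http://testbed.example.com/api/orders"

def stimulate(payload):
    """Send one stimulus to the system under test and collect its response."""
    request = urllib.request.Request(
        TESTBED_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))

def evaluate(response, expected_status):
    """Evaluate the collected feedback against the expected outcome."""
    return response.get("status") == expected_status

if __name__ == "__main__":
    # One test case: stimulate, collect, evaluate.
    feedback = stimulate({"item": "book", "quantity": 1})
    print("PASS" if evaluate(feedback, "accepted") else "FAIL")
```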

The WQR 2015-16 relates that the average percentage of test case automation adopters had grown from 28% in 2014 to 45% in 2015. On the other hand, the 'Inability to apply test automation at appropriate levels' in Agile approaches and DevOps processes, as cited by the respondents, had gone down from 55% in 2014 to 23% in 2015. It seems that companies had successfully navigated their way from stage 1 to stage 2 of test automation. Nevertheless, 39% of the WQR 2015-16 respondents still considered 'Reliance on manual testing' the number one technical challenge in the application development lifecycle. Furthermore, as top test automation pains, 31% of the interviewees cited 'We don't have the right automation tool' and 'We don't have the right automation testing method/process.' OK, we can think that those people are the laggards of test automation, which is otherwise destined for brilliant growth.

In the WQR 2016-17, the perspective changes radically. The survey labels as a notable trend the 'Significant drop in test automation after last year's increase.' Among the top challenges in application development, WQR 2016-17 respondents cite 'Lack of effective build/integration automation' (45%, up from 23% in 2015), 'Reliance on manual testing' (41%, up from 39% in 2015), and 'Manual environment management and deployment' (39%, up from 22% in 2015). Furthermore, the 'Inability to apply test automation at appropriate levels' in Agile approaches and DevOps processes jumps from 23% of citations in 2015 to 41% in 2016.

The top pain is 'We don't have the right automation tool,' jumping from 31% in 2015 to 45% in 2016. Other pains follow the same trend with greater or lesser amplitude: respondents cite 'Lack of skilled and experienced test automation resources' (34% in 2016, up from 12% in 2015) and 'We don't have the right automation testing method/process' (35% in 2016, up from 31% in 2015).

Furthermore, respondents cite challenges that they did not take into account in 2015, when 'what is test automation' seemed clear to everybody. These are methodological and organizational challenges, such as test data and environment availability, tight time constraints, and the use of multiple development lifecycles. The automation tools themselves are a further source of annoyance: they differ across the testing cycle, carry high license costs but deliver low utilization and ROI, and are difficult to integrate with one another. Other pains concern the instability and high maintenance costs of ad hoc, tailor-made, custom-built, and error-prone test environments (harnesses and testbeds).

Confronted with such pains, we could ask: why test automation? The test automation top gains identified in both the 2015-16 and 2016-17 surveys are: (i) better detection of defects, (ii) better reuse of test cases, (iii) reduction of test cycle time, (iv) reduction of test costs, and (v) better control and transparency of test activities. The reported attainment of these benefits goes down too, from about 70% of the interviewees in 2015 to about 40% in 2016. In conclusion, with test automation, pains are increasing and gains are decreasing. What can we do next?

Stage 3: We need whatever comes next after test automation!

The WQR 2016-17 survey asked participants which emerging test automation technologies and methods they foresaw using in the coming years. Test design automation and robotics automation topped the list, each with 42% of responses. They were followed closely by cognitive automation (41%), test data automation (40%), machine learning (40%), predictive analysis (40%), self-remediation (40%), and test environment virtualization (39%).

Some of these terms are more or less well understood. Test design automation means the ability of the test engine to automatically conceive synthetic test cases at the business level, endowed with strong failure detection potential (efficacy) and a high failure detection rate (efficiency). Test data automation is the ability of the test engine to generate the executable format of the test cases (we sketch both abilities in code below). Robotics automation means a configurable test harness set up to autonomously perform the test tasks that humans assign and control. Once configured, the robot can be left to carry out the work by itself and can interact with the system under test the same way a human does. Cognitive automation means the combination of intelligence-led (optimizing) and knowledge-based (context-aware) automatic search, detection, and diagnosis of failures (troubleshooting). Test environment virtualization concerns the ability of the testbed and the test harness to be quickly deployed on private and public clouds.

But machine learning - another buzzword - of what? Predictive analysis of what? Self-remediation of what? It is not certain that the meanings of all these terms, and even of the terms that we have defined above, were crystal clear in the minds of the respondents. So, inevitably, we go forth to the next stage and come back to the first question: What is (really) test automation?
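To ground 'test design automation' and 'test data automation' in something tangible, property-based testing is one existing technique that approximates both. The sketch below uses the Hypothesis library; the discount rule is a hypothetical example, and we do not claim this is what the survey respondents had in mind.

```python
# Requires the Hypothesis library: pip install hypothesis
from hypothesis import given, strategies as st

# Hypothetical business rule under test: a discount must never produce a
# negative price and must never exceed the original price.
def apply_discount(price, percent):
    return price * (1 - percent / 100)

# Hypothesis conceives the synthetic test cases (test design automation)
# and generates their executable inputs (test data automation); we only
# state the business-level property that every output must satisfy.
@given(
    price=st.floats(min_value=0, max_value=1e6, allow_nan=False),
    percent=st.integers(min_value=0, max_value=100),
)
def test_discount_stays_within_bounds(price, percent):
    discounted = apply_discount(price, percent)
    assert 0 <= discounted <= price

if __name__ == "__main__":
    # Calling the decorated test runs the whole generated test suite.
    test_discount_stays_within_bounds()
    print("Property holds on all generated cases.")
```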

The sequel is here.

[1] Jason Bloomberg, Surviving Digital Transformation Fatigue - https://www.linkedin.com/pulse/surviving-digital-transformation-fatigue-jason-bloomberg

[2] World Quality Report 2015-16 - https://www.capgemini-consulting.com/thought-leadership/world-quality-report-2015-16

[3] World Quality Report 2016-17 - http://www8.hp.com/us/en/software-solutions/capgemini-world-quality-report/