Increased productivity and
effectiveness, reduced time-to-market, enhanced product quality, lower spending
on testing, and faster test cycles are some of the many benefits offered
to entice organizations to invest in test automation tools. Test automation, if
executed judiciously, can streamline the testing process and deliver high
returns on investment (ROI). It can speed up testing to accelerate product
releases, improve test coverage and reliability, ensure consistency, and offer significant
financial savings.
Nevertheless, studies
indicate that globally close to 50 percent of all test automation projects
fail. The test automation outcomes either fail to deliver on financial
expectations or do not satisfy stakeholder expectations. There have been
several cases in the past where organizations have abandoned their expensive
test automation tools and have resorted to manual testing to get the development
project back on track. Reasons for failure are many—lack of a clearly defined and
appropriate automation framework for project execution and delivery being the
most common. This generally stems from a skewed understanding of the subject.
This paper will look at the common reasons for test automation failure, and
measures that organizations can take to ensure success.
Problems Plaguing Test Automation
Automating the testing process
improves bug discovery, compresses testing cycle time, enhances productivity
and testing effectiveness, reduces time-to-market, and delivers very effective
QA. It directly impacts organizational top line as well as bottom line by
providing organizations a first-mover advantage, and thereby a virtual price
monopoly until the competition enters the market.
Yet, billions of dollars are
lost each year due to failed or abandoned testing automation projects that
invariably lead to product release delays and bugs seeping unnoticed into the
post-production environment. Why do so many test automation projects fail to
achieve their potential? What are the common pitfalls that organizations
regularly fall prey to? And how can organizations avoid these issues to
extract the full benefits of automating the testing environment? Before we proceed
to the best practices application developers can leverage to ensure testing
automation success, listed below are a few of the most common reasons for
failure.
Unrealistic Expectations
Most organizations consider
automation the panacea for all their testing ailments. The common belief is
that automation will eliminate the need for all manual testing and is the
perfect answer to time and resource crunches. This leads to attempts to automate
all tests, which is not practical. Automation, in itself, does not improve
testing. It may seem more reliable than humans in detecting and reporting bugs.
However, a test script does not have the intuitive ability to analyze the
impact or usefulness of the final output. It can repeatedly run the same
non-critical tests without providing any real value when the basic premise of
the script's creation is flawed.
Another common reason is expecting immediate
payback. This is also unrealistic, as payback usually occurs only after
several rounds of testing using the test scripts in question. Nor is automation
a quick route to cutting testing costs. Setting up the facility involves
significant startup costs, and it incurs regular maintenance costs during the
lifecycle of the project. In the short term, it can free manual
testers from mundane and laborious testing work to focus on value addition.
However, immediate cost reduction is a fallacy. A few key reasons for failure
are mentioned below:
Unclear Objectives
Stakeholders might have differing expectations from
the project. For the management, the goal might be improving time-to-market for
new releases by reducing test cycle time; for the IT department it might be
cost reduction; and for the development team it might be improvement in quality
while simultaneously reducing manual effort. This divergent set of goals often
leads to disappointment. Also, many organizations fail to share the objectives
of the project with the automation team, leading to a lack of alignment with the
business objectives of the organization.
Lack of Focus
Most organizations do not give testing its due
credit and priority. Instead of engaging full-time resources to automate tests,
most organizations have their developers work on test automation as a
back-burner project. Some organizations also look at it as a means of building
employee skill sets.
Not Choosing the Right Test Cases
Test automation is not an alternative to manual
testing. However, many organizations indulge in indiscriminate automation without
distinguishing between tests that can benefit from it and those that need
to be conducted manually. It is necessary to clearly identify the tests that
can most benefit from automation (e.g., test cases repeated across
releases or executed over multiple datasets) and the return on investment that
can be expected from them.
Wrong Choice of Tools
License fees for commercial automation tools tend to
be quite expensive. There are a multitude of vendors offering myriad test
automation tools with different capabilities and features. Organizations
need to exercise care in selecting the tools best suited to their requirements before
planning and implementing automated tests. Not all tools suit the
needs of all organizations. Many projects fail because an inappropriate tool was
chosen, or because the capabilities and skills to use it efficiently were lacking.
Ambiguity in Cost Structure
Most organizations are left floundering when it
comes to understanding the various cost components of a test automation
project. Lack of clarity about the actual costs involved in automation often leads
to sudden surprises and budget overshoots. Test automation involves certain fixed
costs on hardware, software licenses, software maintenance and support, tools,
and training. License fees for automation tools are expensive and often recurring. An
automation testing project involves script development, which needs to be planned,
estimated, and managed like any other software development project. Also,
creating test automation scripts requires specialized skills, which come at a
cost. Organizations need to look at all these cost components in their entirety
to gain a clear understanding of the return on investment.
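These cost components determine how quickly automation pays back. As a rough
illustration, the break-even point can be sketched as below; the
`break_even_runs` helper and all figures are hypothetical assumptions for the
sake of the example, not benchmarks.

```python
# Illustrative break-even sketch for automation ROI.
# All figures here are hypothetical assumptions, not benchmarks.

def break_even_runs(setup_cost, script_dev_cost,
                    manual_cost_per_run, automated_cost_per_run,
                    maintenance_per_run):
    """Number of test cycles before automation pays for itself."""
    fixed = setup_cost + script_dev_cost  # one-time investment
    saving_per_run = manual_cost_per_run - (automated_cost_per_run
                                            + maintenance_per_run)
    if saving_per_run <= 0:
        return None  # automation never pays back for this suite
    # Round up: a partial cycle does not recover its share of the cost
    return -(-fixed // saving_per_run)

# Hypothetical figures (currency units): tool licenses and hardware
# = 20000, script development = 15000, one manual regression cycle
# = 4000, one automated cycle = 500, per-cycle script maintenance = 500.
print(break_even_runs(20000, 15000, 4000, 500, 500))  # 12 cycles
```

With these assumed numbers the investment is recovered only after a dozen
regression cycles, which is why a single release rarely justifies automation
on cost alone.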
Steps to ensure success
Basing decisions on the expected
return on investment is probably the first step towards ensuring that
the organization is moving in the right direction. The benefits accrued should
have a direct impact on the top line or bottom line, or offer significant
advantages over other available options, such as manual testing. Listed below are
eight best practices that organizations can leverage to ensure the success of
their test automation projects.
1) Clearly define automation objectives
First and foremost, get management support for the
project. This can be achieved by clearly demonstrating the short-term as well
as long-term financial benefits to be accrued by investing quality resources
and time in the project. Once this stage is crossed, ensure that all stakeholders
are in agreement on the end objective of the project. Different stakeholders
(top management, the IT team, the developers, the testers, and the automation
team) may have different requirements from the project. However, success will
be elusive unless they all agree on a common goal.
2) Treat automation as software development
Treat the testing automation project like any other
software development project—plan well in advance; set realistic goals; assess
existing skill sets and resources; evaluate the testing tools; assess
automation feasibility; identify appropriate frameworks; and finally study the
applications to be tested to ensure their suitability. Dedicate quality resources
to test automation, and put in place a clear-cut development plan and
governance procedure, including a detailed documentation procedure. Prepare a plan
to track bugs and rectify them. While developing an automation solution,
identify reusable functions across automation scripts to increase returns. This
is key to ensuring the success of the project.
3) Design and develop a framework
This should be the natural progression from the
previous best practice, i.e., developing a framework that supplements the automation
effort. Design and document an extensible automation architecture, which
clearly outlines the foundation and rules for the project. Incorporating
features like flexibility and scalability into the architecture right at
inception ensures that the automation system is able to cope with growth in the
requirements of the applications being tested. Also, provide the testing team
with a proof-of-concept test suite and make test case maintenance a top
priority. Ensure that these test cases run across supported platforms and are
not machine-dependent. The framework should be scalable and well suited to the
application under test.
4) Choose the right automation tool
As discussed earlier, choosing the right automation tool
plays a critical role in ensuring the success of the project. Validate the tools
and approach as early as possible and evaluate their compatibility with the
organization's requirements. This might require the guidance of an expert. If
the expertise is not available internally, it would be advisable to engage the
services of an external consultant towards this end. While assessing the
various options available in the market, it is necessary to understand the
learning curve of the testing tool in comparison with the available skill set. Also,
check whether the answers to the following questions are in the affirmative.
Do existing resources have the necessary technical knowledge to use
the tools without a huge investment in training? Are the test scripts easy
to maintain? Is the tool well supported for the application-under-test technology?
Can components developed using the script be reused later? In addition,
also evaluate whether the test tool supports all the required platforms on
which the application is executed. In fact, the proof-of-concept test
suite would be an excellent way to evaluate a test tool.
5) Decide what needs to be automated
Resist
the temptation to automate all the existing manual test cases. Not all test
cases can benefit from automation. Identify the ones that stand to benefit the
most from automation. For instance, if automating a test consumes more man-hours
than doing it manually, it makes more sense to continue with manual
testing. Good test cases for automation include short and simple transactions
and tests that are executed regularly. Test cases executed with multiple
datasets are also candidates for automation. Refrain from automating tests
whose results are difficult to predict or that are not going to be repeated. And
unless there is a strong business case, avoid automating long or complex
transactions, and those that cross multiple applications.
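The multiple-dataset criterion above is worth a concrete sketch: one scripted
check run across many datasets, which is where automation earns its keep.
`apply_discount` and the figures below are hypothetical examples, not taken
from any real system.

```python
# Minimal data-driven test sketch: one scripted check executed over
# several datasets. `apply_discount` is a hypothetical unit under test.

def apply_discount(price, percent):
    """Hypothetical business rule: percentage discount, never below zero."""
    return max(0.0, price * (1 - percent / 100))

# Each tuple is one dataset: (price, percent, expected result).
DATASETS = [
    (100.0, 10, 90.0),
    (50.0, 0, 50.0),
    (80.0, 100, 0.0),
    (20.0, 150, 0.0),   # over-discount clamps to zero
]

def run_data_driven_tests():
    """Return the datasets that failed; empty list means all passed."""
    failures = []
    for price, percent, expected in DATASETS:
        actual = apply_discount(price, percent)
        if abs(actual - expected) > 1e-9:
            failures.append((price, percent, expected, actual))
    return failures

print(run_data_driven_tests())  # [] when every dataset passes
```

Adding a new dataset costs one line here, whereas the equivalent manual check
costs a full pass through the application each time.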
6) Spend time on preparing the automation test plan and strategy
Prepare
automation test plans containing the list of activities that need to be performed.
A plan should be created for every phase of automation testing, i.e., identification
of an appropriate automation framework, framework design, development, scripting,
maintenance, etc. Prioritize activities such as the identification and development
of reusable component libraries. Overall, planning is the key to success.
Test
plans should be structured around the organization’s test objectives, as this
ensures more efficient automation. Plans that do not provide adequate structure
lead to automated tests that are long and complex. Also, ensure separate
scripts are available for different functions. Using the same script to perform
multiple functions is not advisable. Instead, design test cases and test
scripts to be modular and generic. This will also ensure reusability.
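The modular, reusable design described above can be sketched as small shared
steps composed into test cases. `Session` and the step functions below are
hypothetical stand-ins for a real automation tool's API.

```python
# Sketch of modular script design: small reusable steps composed into
# test cases, instead of one monolithic script per function.
# `Session` and its methods are hypothetical stand-ins for a real tool.

class Session:
    def __init__(self):
        self.log = []  # records the actions the "tool" would perform
    def do(self, step):
        self.log.append(step)

# Reusable building blocks, shared by every script:
def login(session, user):
    session.do(f"login:{user}")

def open_orders(session):
    session.do("navigate:orders")

def logout(session):
    session.do("logout")

# Each test case composes the shared steps with its own checks:
def test_view_orders():
    s = Session()
    login(s, "alice")
    open_orders(s)
    logout(s)
    return s.log

print(test_view_orders())  # ['login:alice', 'navigate:orders', 'logout']
```

When the login flow changes, only the `login` step is updated; every script
that reuses it is fixed at once, which is the maintenance saving modularity
buys.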
7) Run automation as a full-time effort
A half-hearted attempt at automation is a sure-fire
way of ensuring that the project ends in complete failure. Executing tests
includes running the test scenarios, debugging failed scenarios, and maintaining
the automation platform and scripts for the application under test. Effective
automation demands disciplined focus. Dedicate qualified, full-time resources to
the project. Understand the complexity of the project and ensure requisite
skills are available on demand. If skilled resources are not available internally,
engage the services of an experienced external testing service provider.
8) Look at it as a long-term investment
Look at the investment in test automation tools
and resources from a long-term perspective. Contrary to common perception,
automation involves high startup costs. And while immediate pay-offs are
possible, it can usually take anywhere between one and three years before the
organization sees any returns. This makes it necessary to clearly analyze
the ROI the organization stands to gain from the investment, including the
various intangible benefits associated with automation. The focus should not be
on just getting the automation to work. Rather, it should be on building an
extensible automation solution that remains relevant as new product releases
are developed. Plan for the future by ensuring thorough documentation, standard
hand-off procedures, and well-planned knowledge transition procedures.
Conclusion
In
addition to the above best practices, organizations can also work on improving
the product itself to make it easier to test. Automation testing identifies
defects, which, in turn, helps improve product quality. Automation
frameworks, if properly implemented, will accurately identify failures. Develop
automation standards to ensure faster, more efficient coding and lower
maintenance costs.
Careful
planning, disciplined execution, meticulous maintenance procedures, and a
dedicated, skilled staff are the keys to increased ROI and cost savings.
Expecting an immediate return on investment is, however, tantamount to expecting
miracles. Nevertheless, if carefully planned, the total cost will prove significantly
lower than the cost of manual testing in the long run and help achieve quality
testing. This will help detect defects earlier and reduce the possibility of them
spilling over into production.
This article first appeared in IT Next