Beginning Your Test Automation Journey
Part 1 of a three-part series on introducing and building test automation into your application development and deployment pipeline
April 01, 2019

Drew Horn
Applause

Many companies are embracing test automation in an effort to move more quickly and efficiently while saving on costs. Test automation continues to grow year over year. In fact, in 2017 KPMG found an 85 percent increase in test automation across all industry domains over a two-year period.

Automation is a tool that all companies should prioritize when operating in an Agile development environment. However, while one of the main goals of test automation is to move faster, maturing test automation within a deployment pipeline should be a progressive, iterative process. If you try to automate too much, or automate the wrong things, you can end up slowing down and even hurting development and testing efforts.

The success you have will hinge on how you set up your automation practice, so the more attention you pay to the early stages, the more benefits you will realize in the long run.

In your journey to mature test automation within your deployment pipeline, you will move through three distinct stages: beginner, intermediate, and expert. We will explore the beginner stage in part one of this series.

Taking the First Step

At its core, automation is all about doing things better, faster, and cheaper. It improves coverage by increasing the number of tests you can run and improves test quality through repeatability. However, you need to crawl before you can walk and eventually run, so, as with most things, it is advisable to start small with your test automation efforts.

The beginner stage of test automation starts with assessing the maturity of your testing organization and defining a goal of where you want to go. You’ll also need to develop a framework (or integrate an existing one) on which to run your unit, integration, and functional tests. You can assess your testing maturity by self-evaluating across five key criteria:

1. Team – What is the team makeup? Where are the expertise gaps? Are QA and Dev teams working in silos?

2. Technology – What does the deployment pipeline look like? How are automated tests triaged? How is automation integrated with your VCS, test case management (TCM), and bug tracking (BTS) systems?

3. Process – When are tests written? How are bugs triaged and tests updated? How fast are sprint cycles? How does feedback guide test strategy?

4. Reporting – What is the test coverage? What are the common devices used? How are test results viewed? How is a "go/no-go" decision made? What is the automation ROI?

5. Pain – What are the existing pain points? What bugs have been missed?

Once you’ve assessed your maturity and developed a framework, you will want to focus on one main thing: Creating quick wins for your team. This will help to build confidence in your automation practice and get the full weight of the team involved.

Your development team can begin by checking in unit tests to immediately receive pass/fail feedback. After seeing these unit tests pass on a consistent basis, the team should then begin to prioritize developing a core set of functional smoke tests.
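To make this concrete, here is a minimal sketch of the kind of unit test a developer might check in, assuming a Python project using pytest; the pricing module and calculate_discount function are hypothetical names used only for illustration.

```python
# test_pricing.py -- a small unit test that runs on every check-in and gives
# immediate pass/fail feedback. The `pricing` module is hypothetical.
import pytest

from pricing import calculate_discount


def test_discount_applies_percentage():
    # 10 percent off a $200.00 order should come to $180.00
    assert calculate_discount(order_total=200.00, percent=10) == pytest.approx(180.00)


def test_discount_rejects_negative_percent():
    # Invalid input should fail fast rather than silently mis-price an order
    with pytest.raises(ValueError):
        calculate_discount(order_total=200.00, percent=-5)
```

Tests like these run in seconds, so they can gate every commit without slowing the team down.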

How often a set of smoke tests should be run initially will depend on your team's experience and bandwidth. A good rule of thumb is to start with a nightly smoke test that you can review each morning. This allows you to focus on code quality, test reliability, and process as opposed to performance and optimization. Eventually, though, you will want these automated tests up and running as part of your overarching build efforts so they are automatically triggered upon a successful build.
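One way to structure such a nightly smoke suite, sketched here with pytest and Selenium WebDriver (a toolchain assumption on my part; the article does not prescribe one), is to tag a handful of critical-path tests so a scheduled job can run just those. The staging URL and locators below are placeholders.

```python
# test_smoke.py -- critical-path checks tagged "smoke" so a nightly job can
# run only this subset, e.g. `pytest -m smoke`. URLs and locators are
# placeholders for illustration.
import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://staging.example.com"  # hypothetical staging environment


@pytest.fixture
def browser():
    driver = webdriver.Chrome()  # assumes a local chromedriver is available
    yield driver
    driver.quit()


@pytest.mark.smoke
def test_home_page_loads(browser):
    browser.get(BASE_URL)
    assert browser.title  # sanity check that the app is up and rendering


@pytest.mark.smoke
def test_login_form_is_present(browser):
    browser.get(f"{BASE_URL}/login")
    assert browser.find_element(By.NAME, "username").is_displayed()
```

Registering the "smoke" marker in pytest.ini and triggering `pytest -m smoke` from a nightly CI schedule keeps the run small enough to review each morning; the same command can later be wired to fire on every successful build.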

Steady iteration, where success is built on success, will help you gain valuable experience with your automation. Once you see your early unit and smoke tests passing consistently, you can consider adding more smoke tests to your automation suite. In general, you will want to keep the runtime of your tests to 15 minutes or less. As you advance your automation practice, you will find additional ways to run more smoke tests in a 15-minute period, such as test parallelization. We'll cover these strategies in the next article.

Though it’s not the most exciting part of test automation, setting up a documentation process is an important step in building a successful automation practice. The entire team should agree on a definition of "done" for the automation tasks that are written out during sprint planning. While developing and documenting your automation processes, you should also look to answer the following questions: When should tests be run initially? How is the team notified when tests fail? Who is responsible for fixing failed tests? What is the process for triaging failures and logging bugs? Document the answers so everyone understands the protocol; otherwise, you can expect to run into problems in the future as you scale up the speed and breadth of your practice.
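As one illustrative way to answer the notification question above (not a prescription from the article), a small pytest hook can push failures to a team chat webhook; the webhook URL here is a placeholder.

```python
# conftest.py -- a sketch of a failure-notification hook, assuming pytest.
# WEBHOOK_URL is a placeholder; point it at your team's chat webhook.
import json
import urllib.request

WEBHOOK_URL = "https://hooks.example.com/automation-alerts"  # hypothetical


def pytest_runtest_logreport(report):
    # pytest calls this for every test phase; only report real test failures.
    if report.when == "call" and report.failed:
        payload = {"text": f"Automated test failed: {report.nodeid}"}
        request = urllib.request.Request(
            WEBHOOK_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        try:
            urllib.request.urlopen(request, timeout=5)
        except OSError:
            pass  # never let a notification problem break the test run
```

Whatever mechanism you choose, the point is that the answer is written down and automated, not left to whoever happens to notice a red build.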

How long you spend in the beginner stage of your automation journey will depend on a number of factors. In general, the hardest part of this stage is just getting the ball rolling. Try to set a goal of about eight weeks to stand up your initial test automation practice; this is a good pace that will allow you to keep momentum. When time constraints, lack of prior experience, and competing priorities are large factors, consider pulling in turn-key solutions to get up and running quickly. Remember that your strategy can always be adjusted later. It's much easier to make tweaks with an existing practice in place that can provide value to the team almost immediately.

Read Part 2: Building Confidence in Automation.

Drew Horn is Senior Director of Automation at Applause.