Time Waster Maintenance - How Your Peers Keep It Low

September 25, 2024

Navigating the Maze of Maintenance in Software Test Automation

In the fast-evolving world of software development, test automation stands as a cornerstone, ensuring reliability and efficiency. Yet beneath its streamlined surface lies a less-talked-about challenge: maintenance.

Often perceived as a necessary evil, the maintenance of test automation can quickly escalate from a routine task to a time-consuming quagmire, draining resources and diverting attention from innovation.

But why is maintenance such a thorny issue in software test automation?

The answer lies in the dynamic nature of software projects. As applications grow and evolve, so must the tests that ensure their functionality. This constant need for updates and adjustments turns maintenance into a persistent hurdle, often overshadowing the benefits of automation itself.

However, it's not all doom and gloom.

Solutions do exist, and who better to shed light on them than industry leaders themselves?

In this blog post, we delve into the insights of five opinion leaders in the field of software testing.

Each brings a unique perspective to the table, addressing our burning question:

"Time waster maintenance! How can the maintenance effort for test automation be kept within limits? How do you ensure this? Which approaches or features help you?"

From streamlined processes to cutting-edge tools, these experts share their strategies and experiences, offering a beacon of hope for those navigating the maintenance maze:

Prachi Dahibhate

Software Quality Assurance Engineer | Women in IT Awards Asia 2023 (Next Generation Leader) Finalist | Blogger✍️ | Community contributor

When it comes to minimizing maintenance efforts in test automation, a few strategies help keep things in check. Firstly, creating robust test cases that are resistant to small changes in the application is key. It's also essential to regularly review and refactor test code to keep it clean and maintainable. Implementing smart automation frameworks that allow easy modifications and reusability of code across different tests helps save time and effort.
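
As one concrete illustration of such reusable, change-resistant test code, here is a minimal page-object-style sketch (assuming Python with Selenium; the page and locators are hypothetical): locators live in one place, so a UI change is fixed once rather than in every test.

```python
from selenium.webdriver.common.by import By


class LoginPage:
    """Single source of truth for the login screen's locators and actions."""

    # Stable, unique attributes are more resistant to small UI changes.
    USERNAME = (By.ID, "username")
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def log_in(self, user, password):
        self.driver.find_element(*self.USERNAME).send_keys(user)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()


# Every test reuses the same action instead of duplicating locators:
# LoginPage(driver).log_in("alice", "secret")
```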

Moreover, prioritizing test cases based on their criticality and frequency of use ensures that maintenance efforts focus on the most valuable tests. Regularly monitoring and analyzing test results also helps identify flaky tests or areas needing improvement, reducing maintenance needs.

In my experience, leveraging version control systems and continuous integration tools streamlines the maintenance process by tracking changes and automating test runs. Additionally, fostering a culture of collaboration between developers, testers, and stakeholders ensures everyone understands the importance of maintaining test automation and actively contributes to its upkeep.

Rahul Parwal

Jerry Weinberg Excellence Awardee | ifm 🧡 | Test Specialist ✌️ | Blogger ✒️ | Speaker🎙 | RST PA 💡

Key Approaches to Maintain Test Automation

 

Develop with Maintainability in Mind

- Use modular and reusable code.

- Implement proper error handling.

- Regularly review and refactor the scripts.

- Maintainability should be a key factor in decision-making.
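
As a small illustration of modular, reusable code with proper error handling (Python assumed; the helper and names are illustrative, not taken from the original), a single shared retry helper can replace copy-pasted try/except blocks across scripts:

```python
import logging
import time

logger = logging.getLogger(__name__)


def retry(action, attempts=3, delay=1.0, exceptions=(Exception,)):
    """Run a callable up to `attempts` times, logging every failure."""
    for attempt in range(1, attempts + 1):
        try:
            return action()
        except exceptions as exc:
            logger.warning("Attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise  # out of attempts: surface the real error
            time.sleep(delay)


# Reused everywhere instead of re-implementing error handling per script:
# retry(lambda: driver.find_element(By.ID, "save").click())
```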

 

Leverage Version Control Systems

- Keep track of changes.

- Maintain proper logs.

- Have a clear history of the test automation code.

 

Maintainable Test Scripts

- Consider modularity, reusability, and design principles.

- Avoid creating a short-term mess that rots in the long run.

 

Avoid Machine- or User-Specific Code

- No hard-coded values or reckless assumptions in the scripts.

- Keep things configurable.

- The code should not just work on your machine.
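
A brief sketch of what "keep things configurable" can look like (Python assumed; the variable and environment names are hypothetical): settings come from the environment with sane defaults, and no path is tied to one machine.

```python
import os
from pathlib import Path

# Read from the environment so the same script runs on any machine or CI agent.
BASE_URL = os.environ.get("APP_BASE_URL", "https://staging.example.com")
BROWSER = os.environ.get("TEST_BROWSER", "chrome")
TIMEOUT_SECONDS = int(os.environ.get("TEST_TIMEOUT_SECONDS", "30"))

# No absolute, user-specific paths: resolve test data relative to the project.
DATA_DIR = Path(os.environ.get("TEST_DATA_DIR",
                               Path(__file__).parent / "testdata"))
```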

 

Treat as a Genuine Programming Project

- Automation equals development.

- It's a project in its own right.

 

Document Your Work

- The next new engineer on your team doesn't know the "obvious" things.

- Document the work to avoid re-engineering the same thing.

- Please document!

 

Deal Thoughtfully with Legacy Project Code

- Old code can be messy.

- Don't assume.

- Test before making a change in production.

 

Don't Ignore Library Updates

- Be prepared for breaking changes.

- Be aware of library bugs.

- Be open to realities.

 

Continuous Monitoring and Proactive Maintenance

- Regularly review test results.

- Monitor system changes.

- Update test automation scripts accordingly.

Sidharth Shukla

SDET@Amazon Canada | Featured in TimeSquare | API-UI-Mobile Automation | AWS-DevOps | Technical Blogger & Trainer | LinkedIn Top Voice

To keep the maintenance effort for test automation within limits, I would follow these best practices:

  • Use better exception handling: Exception handling is a way of dealing with unexpected errors or failures in the test code. It helps avoid crashing the test execution or showing unwanted messages to the user. I would use try-catch blocks, assert statements, or custom exceptions to handle different types of errors and provide meaningful feedback or recovery actions. For example, I would use a try-catch block to handle a NoSuchElementException when a web element is not found on the page, and display a relevant message or retry the operation (a sketch of this appears after this list).

  • Avoid dynamic locators: Locators are the identifiers that help to find and interact with web elements on a page. They can be based on attributes such as id, name, class, xpath, css selector, etc. However, some of these attributes may change dynamically depending on the state of the page or the data entered by the user. This can make the locators unreliable and cause the tests to fail. Therefore, I would avoid dynamic locators and instead use static or stable ones that do not change frequently. For example, I would use id or name attributes if they are unique and consistent, or use a relative xpath or css selector if they are more robust and flexible.

  • Use soft assert: Assertions are the statements that verify the expected outcome of a test case. They can be hard or soft assertions. Hard assertions stop the execution of the test case if they fail, while soft assertions continue the execution and collect all the failures at the end. I would avoid using multiple hard assertions in a single test case, as they can cause the test to abort prematurely and miss some important checks. Instead, I would use soft assertions to verify multiple conditions in a test case and report all the failures at the end (see the soft-assertion sketch after this list). This way, I can ensure that the test is comprehensive and reliable.

  • Use self-healing test technique: The self-healing test technique is a method of using artificial intelligence (AI) and machine learning (ML) to adapt the test automation dynamically to any changes in the application interface or development environment. It helps to reduce the maintenance effort by automatically updating the test cases or locators when the application changes. I would use a tool that supports the self-healing test technique. These tools can detect changes in the application under test and make necessary adjustments to ensure that the test cases are functional and up-to-date.

  • Try to add API calls in UI tests (where possible): API calls are the requests and responses that communicate with the application’s backend or external services. They can be used to perform actions or verify data that are not possible or convenient to do through the UI. I would try to add API calls in UI tests where possible, as they can enhance test coverage and reliability. For example, I would use API calls to set up or tear down test data, to check the database or server status, or to mock or stub some dependencies (a test-data sketch follows below). I would use a tool that supports API testing, such as SoapUI, Swagger UI, or Postman. These tools can help me create, execute, and validate API calls in a user-friendly and intuitive way.
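
To make the first point concrete, here is a minimal sketch of the NoSuchElementException handling described above (assuming Python with Selenium; the locator and timings are illustrative, not from the original):

```python
import time

from selenium.common.exceptions import NoSuchElementException
from selenium.webdriver.common.by import By


def click_with_retry(driver, locator, retries=2, delay=2.0):
    """Click an element, retrying after a short pause instead of crashing."""
    for attempt in range(retries + 1):
        try:
            driver.find_element(*locator).click()
            return
        except NoSuchElementException:
            if attempt == retries:
                raise AssertionError(
                    f"Element {locator} not found after {retries + 1} attempts"
                )
            time.sleep(delay)  # give the page a moment to render, then retry


# Usage in a test (the locator is hypothetical):
# click_with_retry(driver, (By.ID, "checkout"))
```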
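
For soft assertions, frameworks such as TestNG ship their own SoftAssert; as a framework-neutral illustration, here is a minimal hand-rolled collector in Python (an assumed sketch, not any specific tool's API):

```python
class SoftAssert:
    """Collects failures instead of aborting at the first failed check."""

    def __init__(self):
        self.failures = []

    def check(self, condition, message):
        # Record the failure, but keep the test running.
        if not condition:
            self.failures.append(message)

    def assert_all(self):
        # Single point where all collected failures are reported together.
        if self.failures:
            raise AssertionError(
                "Soft assertion failures:\n" + "\n".join(self.failures)
            )


soft = SoftAssert()
soft.check("Checkout" in "Checkout Page", "wrong page title")
soft.check(2 + 2 == 4, "wrong order total")
soft.assert_all()  # raises only if any of the checks above failed
```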
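
And for the last point, a small sketch of using an API call for test-data setup inside a UI test (Python with the requests library; the endpoint, payload, and token are hypothetical):

```python
import requests


def create_test_user(base_url, token):
    """Create a user through the API instead of clicking through the UI."""
    response = requests.post(
        f"{base_url}/api/users",
        json={"name": "test-user", "role": "customer"},
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()  # fail fast if the setup call itself breaks
    return response.json()["id"]


# In the UI test: user_id = create_test_user(BASE_URL, API_TOKEN)
# ...then drive the browser as the freshly created user.
```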

Nithin S.S

Head of QA at Lodgify | Ex-Fave | Founder Synapse QA | QA Strategist | Blogger & Speaker | Community Builder | Mentor at The Mentoring Club

Automation in testing can be a double-edged sword if not properly maintained. However, it can become manageable if approached with the right strategies. Throughout my career, I have come across people who believe automated checks, once scripted, are done forever and do not require any modifications or maintenance. Yet this is a misconception held by many in our industry who regard automation as a "Silver Bullet" solution.

 

Automated tests do not create, modify, or maintain themselves. Maintenance is necessary to ensure they serve their intended purpose. Test automation is a software development process that faces the same challenges as any other development activity. The more tests we intend to automate, the more costly and time-consuming creating and maintaining them becomes.

 

Adopting a proactive approach is critical to reduce maintenance efforts. Regularly reviewing and updating your test scripts to ensure they align with the evolving application and its features is essential. Establishing a robust version control system to keep track of changes makes it easier to identify and rectify issues promptly. Breaking down the test scripts into smaller, independent modules enhances reusability and simplifies the process of updating or replacing specific components without disrupting the entire system.

 

Peer reviews and sessions to understand more about the features and gather diverse opinions from a fresh perspective are also highly beneficial. Continuous collaboration between developers and testers is crucial. A fresh set of eyes can often catch issues that have been overlooked, preventing potential maintenance headaches. By fostering open communication, you can stay informed about upcoming changes in the software, allowing engineers to proactively adjust the automated tests accordingly.

 

Maintaining test data is always a pain in automation. Data-driven testing and parameterization can significantly ease these maintenance challenges. They allow you to manage test data separately, making it easier to update and modify without altering the core test script. Documenting your test scripts and the reasoning behind certain design choices can also be a lifesaver during maintenance. It gives anyone updating or troubleshooting the automated tests a clear picture.
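
As a concrete illustration of that separation, here is a minimal data-driven sketch (assuming Python with pytest; the login helper and data are hypothetical stand-ins): the data lives in one table and can grow or change without touching the test logic.

```python
from dataclasses import dataclass

import pytest


@dataclass
class LoginResult:
    success: bool


def attempt_login(username, password):
    """Hypothetical stand-in for the real login flow under test."""
    return LoginResult(
        success=(username == "alice" and password == "correct-password")
    )


# Test data lives in one place; editing it never touches the test logic below.
LOGIN_CASES = [
    ("alice", "correct-password", True),
    ("alice", "wrong-password", False),
    ("", "any-password", False),
]


@pytest.mark.parametrize("username,password,should_succeed", LOGIN_CASES)
def test_login(username, password, should_succeed):
    assert attempt_login(username, password).success == should_succeed
```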

 

Most importantly, test your tests quite often instead of waiting for the moment when you need to spend time maintaining them. Remember to be smart and innovative. Realize that when you automate a mess, all you end up having is an automated mess!

 

I wrote a post a while ago on four simple steps for creating meaningful automated tests. Keeping these tips in mind while designing a test will also go a long way toward reducing the maintenance effort later.

 

In short, remember these 4R3I's when you develop tests to proactively reduce the maintenance effort later: Revisit, Review, Revise, Reframe, Ideate, Innovate, and Iterate the tests.

Tobias Müller

Managing Director | Founder of TestResults.io

I think in test automation most of the struggles come from an imperfect balance between abstraction and extremely specific implementations. For example, if you have automated test cases without any layer of abstraction, you’ll find yourself in trouble as soon as central functionality in the system under test changes. On the other hand, too many abstraction layers make the automation extremely complicated to maintain, because you need to understand the interdependencies between these layers.

This is why I usually tell everyone: Abstract your application, abstract your controls. Done. No need for inheritance or any other technology that is typically used in development projects. If you think about it, you automate an application; it is not that you are building a library that is reused in different projects in different layers.

  1. Abstract your application: You want to centralize logic. A typical example is login. You might use this from all test scenarios. Have it centralized.
  2. Abstract your controls: The application you automate might require some special handling, e.g., you must click 3x in a text field before you can enter text. Create a control library and use these controls in your application abstraction (see 1).
  3. Write your automation against the application abstraction(s).
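
Here is a minimal sketch of that layering (Python with Selenium assumed; the application, its three-click quirk, and the locators are hypothetical): a control library at the bottom, one application abstraction on top, and test cases that talk only to the abstraction.

```python
from selenium.webdriver.common.by import By


# Control library: encapsulates application-specific quirks once.
class TextField:
    def __init__(self, driver, locator):
        self.driver = driver
        self.locator = locator

    def enter_text(self, text):
        element = self.driver.find_element(*self.locator)
        for _ in range(3):  # this (hypothetical) app needs 3 clicks first
            element.click()
        element.send_keys(text)


# Application abstraction: centralizes logic such as login, built on controls.
class CrmApp:
    LOGIN_BUTTON = (By.ID, "login")

    def __init__(self, driver):
        self.driver = driver
        self.username = TextField(driver, (By.ID, "user"))
        self.password = TextField(driver, (By.ID, "pass"))

    def log_in(self, user, password):
        self.username.enter_text(user)
        self.password.enter_text(password)
        self.driver.find_element(*self.LOGIN_BUTTON).click()


# Test cases talk only to the application abstraction, never to raw controls:
# CrmApp(driver).log_in("tester", "secret")
```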

DON’T inherit controls from controls in your abstraction. It is not required. There are just a few controls, and most of their functionality does not overlap from a usage point of view. Yes, from a developer’s point of view, a text field and a drop-down share a lot of common functionality; from an automation point of view, it is only the “enter text” part. The other 90% of the functionality is different.

DON’T inherit applications from applications. You are automating a single app with an abstraction. You might be tempted to say, “But they all share the same navigation frame.” That might be true, but they might also be on a different release cadence in the future. Just abstract a single application with a single model, without any dependency on any other application abstraction.

DON’T combine logic in abstractions. Workflows, like entering information in one application and checking for the result in another, are functional testing that belongs in the test case. That is automation. Don’t put code that checks across applications in either an application abstraction or an “overall” application abstraction. One application -> one abstraction. Complex interactions -> automation.

 

We follow these guidelines on all projects, and they keep the maintenance effort extremely low. Naturally, these rules are based on our experience with TestResults.io, which inherently includes things like self-healing thanks to its underlying approach to automation.

As we've journeyed through the insights of our esteemed experts, a clear narrative emerges: maintenance in software test automation, while challenging, is far from insurmountable. The experiences shared by Prachi Dahibhate, Rahul Parwal, Sidharth Shukla, Nithin S.S., and Tobias Müller underscore the importance of adopting a proactive, strategic approach to maintenance.

Here is your space!

We are always looking for new perspectives and lively debates.

Do you have something to say about maintenance in software test automation? The community wants to read it!

Click here to message us and give your statement a stage right here!