How an Imbalanced Test Automation Strategy Hurts Business Agility – InApps Technology 2022


Anyone involved in software delivery will tell you that automation — wherever possible — is an inherently good thing. But can test automation for its own sake somehow throw our software lifecycle out of balance?

We’re all becoming acutely aware lately of the need for balance in ensuring our own health, and the health of those we care about. We consume content from epidemiologists, nutritionists and personal trainers in search of this balance.

The performance of our body has a lot to do with achieving balance over time. Controlling caloric intake is important for a diet, but you also need the right amounts of whole-food fats and cholesterol for long-term success. Exercise is beneficial when training for a sport, but taking it to extremes without enough rest can weaken the body's ability to recover and cause injuries in the field. Vitamin D and potassium may help ward off infection, but too much of either can be dangerous.

That’s not even addressing our state of mind, which can either stabilize us or produce the most destructive effects on our health if suddenly subjected to undue mental stress.

Likewise, automation of builds, tests, deployments and observability are all beneficial for the health of our software — but only in moderation. We can replace most functional UI tests with automation, for instance, but the need for some level of manual user acceptance testing (UAT) and human verification will always remain.

If too much test automation is applied at the least opportune times and for the wrong reasons, the software lifecycle, and with it your business agility, will be pushed out of balance.


Blind Automation Is Risky

Testing is the yang to the yin of software development; without it, there is no way to know whether software will meet its requirements.

As agile software development and delivery accelerate, it seems obvious that we can (and should) conduct continuous automated testing as much as possible. But we need to do it strategically, not blindly. Otherwise, the following bad habits could cause test automation to become counterproductive and actually undermine our business agility:

  • Tipping the scales away from customer outcomes. Test automation goals should always be aligned with customer goals: better software functionality, reliability, security, performance. It is easy to forget to tie business incentives into every instance of test automation that is introduced and maintained, but without that alignment, teams are creating fruitless labor and costs, merely for the sake of checking a box.
  • Feeling a false sense of security. Symptoms of this habit may include claims of 99% or higher levels of “test coverage” created by the execution of hundreds of thousands of static code checks, unit tests, data comparisons and regressions. High-volume test procedures are useful for quality control gateways at each phase of the software delivery lifecycle. But saying “our million unit tests passed” at any one level doesn’t automatically translate to better user experience; and such statistics can’t provide more than a fig leaf’s worth of coverage across a complex application portfolio.
  • Inflexibility to change. If the test strategy isn't architected for change, then every new update, component, or contribution makes test automation unusable, test data invalid, and results hard to reproduce. Brittle tests (those that can't survive changes, especially in agile environments) produce 60 to 80 percent of the false positives and negatives seen by testers; the locator sketch after this list shows a typical culprit. When assets are not responsive to change, teams give up on the wasted effort of repairing existing tests and building new ones, which limits the organization's ability to move forward.
  • Test bloat and burn. The reflexive response to imbalanced test automation is creating more and more of the easy tests, or slight variations of existing ones. Because failures of redundant tests are hard to trace back to their source, nearly as much time is spent re-creating tests as maintaining them, on the assumption that the tests, rather than the application, are at fault. Test bloat drives up system costs and cloud opex for running test workloads and for gathering and cleansing test data, and it is followed by a higher labor burn rate. If integration partners are involved and incentivized to write more tests, they may burn budget at an alarming rate while internal testers experience higher burnout.
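As a brief aside, here is what brittleness often looks like in practice: a minimal sketch in Python with Selenium, against a hypothetical checkout page, contrasting a locator that breaks on any layout refactor with one anchored to a stable attribute. The URL and the data-testid value are illustrative only, and the snippet assumes a local Chrome/chromedriver setup.

```python
# Hypothetical illustration of brittle vs. change-resilient locators
# (assumes Selenium 4 and a page that exposes stable data-testid attributes).
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.com/checkout")  # placeholder URL

# Brittle: any layout refactor (a new wrapper div, a reordered column)
# silently breaks this locator and produces a false failure.
submit = driver.find_element(
    By.XPATH, "/html/body/div[3]/div[2]/form/div[7]/button[1]"
)

# More resilient: anchored to a stable, intent-revealing attribute that
# survives cosmetic and structural changes to the presentation layer.
submit = driver.find_element(By.CSS_SELECTOR, "[data-testid='submit-order']")

submit.click()
driver.quit()
```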

Together, these challenges consume costly resources that could be better invested in the highest-value work, which can't be automated. Over time this erodes the organization's confidence in testing, severely limiting your ability to rapidly release software that meets the ever-changing needs of customers.

The Right Resources for Success

Fortunately, it’s possible to break this vicious cycle by automating the right kinds of tests at the right time, at the right place, and with the right resources for success.

  • Testing at the right time: There is a lot more we can do to weave critical functional, regression, integration, performance and security testing into every phase of the SDLC, making the whole delivery experience smoother. The problem is usually not "too much automation" so much as making sure the test automation we do adds business value, avoiding an imbalanced and potentially unhealthy concentration of any one kind of testing at any one phase of software development.
  • Testing at the right place: To keep our own bodies balanced over the years, we stretch to retain flexibility and a better range of motion. In software terms, we likewise train for flexibility so that testing remains resilient and useful wherever there is risk, at every layer of the application architecture. This includes resilient UI tests, API and microservices-layer validation, end-to-end data testing, and zeroing in on high-risk change impacts to reduce test scope (a minimal API-layer sketch follows this list).
  • Testing with the right resources: The best-performing companies always emphasize improving the productive capacity of all team members — through a cultural combination of professional achievement, organizational design, education, and skill development — balanced with putting the procedures, tooling and infrastructure in place to make them successful.
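To make the "right place" point concrete, here is a minimal sketch of pushing a check down to the API layer, assuming a hypothetical orders microservice and pytest as the runner; validating the contract here is typically faster and less brittle than driving the same scenario through the UI. The endpoint, payload and response fields are placeholders, not a real API.

```python
# A minimal sketch of risk-based testing below the UI, assuming a hypothetical
# orders service at ORDERS_API; run with pytest.
import requests

ORDERS_API = "https://api.example.com/v1/orders"  # placeholder endpoint


def test_create_order_contract():
    payload = {"sku": "ABC-123", "quantity": 2}
    response = requests.post(ORDERS_API, json=payload, timeout=5)

    # Validate status, response shape, and business rules rather than pixels.
    assert response.status_code == 201
    body = response.json()
    assert body["status"] == "PENDING"
    assert body["quantity"] == 2
```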

Essential Ingredients

These essential ingredients should always be present to catalyze high-quality, agile software delivery:

  • Fast feedback loops. Achieving near-instant results in response to test runs, along with fast feedback from both simulated environments and live customer usage, is considered the "superfood" of test-driven development. High test responsiveness lets teams debug problems and pinpoint root causes so developers can resolve issues without context switching.
  • Requirements test automation. If you really want to shift testing all the way to the left, why not start testing the requirements themselves? This approach may sound absurd, but requirements testing has been an essential design proof point in highly regulated environments (such as electronics and defense) for decades. Solutions like Tricentis Tosca allow requirements to be weighted and prioritized by business value and risk, so that tests with less relevance and impact can be left off the budget entirely.
  • Self-service automated environments. All the goodness of software-defined infrastructure as code (IaC) and elastically scaled public/private cloud capacity isn't just for software build and deployment teams. The automated setup and teardown of complete test environments, replenished with golden-state test data, drastically reduces cycle times and overhead costs. Self-service makes all the difference here, because test teams thrive when they can provision their own resources without having to log an IT support request (see the environment-provisioning sketch after this list).
  • Service virtualization for simulation. Sometimes you need to defy reality and settle for virtual environments. Service virtualization (SV) allows the capture, configuration and simulation of dependent systems, so that you no longer need access to the real thing. Why can this be better? Not only does SV eliminate availability conflicts over constrained systems, but the virtual environments can also more predictably model the "to-be" state and scenario data, including edge conditions that are hard to reproduce in the real world, for more consistent test results (the stub-service sketch after this list illustrates the idea).
  • Welcome our AI collaborators. Call it machine learning or augmented intelligence, but we’re starting to see a new class of AI-driven testing that can visually detect and identify on-screen elements, and understand how the human user’s interactions with these objects are tied to application logic. Tricentis introduced Vision AI so that functional, use case and non-functional tests can remain valid and stable without maintenance, even when the presentation layer changes. There’s no “magic” here — teams still need to tell Vision AI where to look — but once engaged, this cognitive engine enables automation to keep pace with software change, with fast feedback in line with coding and testing tools.
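For the self-service environments point above, here is a rough sketch of what automated setup and teardown can look like at test-suite scale, assuming pytest and a locally available Docker CLI. The image, port and credentials are illustrative only; a production setup would provision complete environments through IaC tooling rather than a single container.

```python
# A rough sketch of self-service, throwaway test infrastructure: each test
# session provisions its own database container and tears it down afterwards,
# with no ticket to an infrastructure team. Assumes Docker is installed.
import subprocess
import time

import pytest


@pytest.fixture(scope="session")
def ephemeral_test_db():
    container_id = subprocess.check_output(
        [
            "docker", "run", "-d", "--rm",
            "-e", "POSTGRES_PASSWORD=test",
            "-p", "55432:5432",
            "postgres:16",
        ],
        text=True,
    ).strip()
    time.sleep(5)  # crude wait; a real setup would poll for readiness
    yield "postgresql://postgres:test@localhost:55432/postgres"
    subprocess.run(["docker", "stop", container_id], check=False)


def test_can_connect(ephemeral_test_db):
    # Placeholder assertion; a real test would load golden-state data and
    # exercise the application against this connection string.
    assert ephemeral_test_db.startswith("postgresql://")
```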
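And for the service-virtualization point, a deliberately tiny stand-in (not a real SV product): a local stub that simulates a hypothetical downstream payments dependency, including an edge condition that is hard to trigger against the real system, so tests can run consistently without access to it. All endpoints and responses are invented for illustration.

```python
# A minimal stand-in for the service-virtualization idea: a local stub that
# returns canned "to-be" responses for a hypothetical payments dependency.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSES = {
    "/payments/42": {"status": "SETTLED", "amount": 19.99},
    # Edge case the real sandbox rarely produces on demand:
    "/payments/99": {"status": "CHARGEBACK", "amount": -19.99},
}


class VirtualPaymentsService(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED_RESPONSES.get(self.path)
        self.send_response(200 if body else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(body or {"error": "not found"}).encode())

    def log_message(self, *args):  # keep test output quiet
        pass


def start_virtual_service(port=8099):
    server = HTTPServer(("localhost", port), VirtualPaymentsService)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


if __name__ == "__main__":
    import urllib.request

    server = start_virtual_service()
    with urllib.request.urlopen("http://localhost:8099/payments/99") as resp:
        print(resp.read().decode())  # tests can now exercise the chargeback path
    server.shutdown()
```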

The Intellyx Take

Given the chaos inflicted upon our applications in the real world, over-engineering and over-automation of testing are only natural human responses.

Test automation doesn’t have to become unhealthy. Smart, strategic automation is both the best preventative measure and the best medicine for what ails software delivery. Properly incentivized test teams with a balanced approach can overcome the false positives and negatives, and the data and alert exhaust that accompany an ever-expanding test automation suite.

It shouldn’t matter what development languages or tools are in use, what infrastructure you are delivering into, or what stage of development your apps are in.

Achieving a balanced software test automation practice, one that uses intelligence to focus on the critical challenges, will free up human minds to focus on creative solutions and critical thinking — resulting in less risk, more output and more real innovation.

Read the full version of this report — as well as more from Intellyx and other analysts — in Tricentis’ analyst research library.

Feature image via Pixabay.

InApps Technology is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: Tricentis.



