
    Automated Software Testing Return on Investment (ROI)


     If implemented effectively, Automated Software Testing can contribute to solving the ever-increasing software testing challenge. An Automated Software Testing ROI was recently demonstrated on an application for the Navy. Here is a high-level description of one part of this effort:

    Versions of a component used for communications onboard Navy ships and in other DoD areas are delivered to Navy labs by vendors for testing and verification prior to release to the respective programs and, ultimately, to war-fighters. Each component consists of nearly one million lines of highly complex code. Currently, it takes several months to thoroughly test multi-vendor component versions for performance, quality, and functionality. An initial Automated Software Testing implementation and ROI analysis have shown that substantial time savings can be achieved with Automated Software Testing (see Figure 2).

    Figure 2 shows the initial findings: based on the actual results of the initial component testing, one can project a 97% reduction in test days over the course of ten years. Implementing Automated Testing to conduct testing in new and innovative ways, shortening the testing and certification timeline while maintaining or improving product quality, can accomplish a significant reduction in overall software development costs.

    Ideally, automation in ten years would include self-testable automated components. Today, there are already many reasons why the Software Testing Lifecycle (STL) should be automated. The quality of the test effort is improved through automated regression testing, build verification testing, and multi-platform compatibility tests, and software problems become easier to reproduce, since automated testing takes the human error out of recreating test steps. Test procedure development, test execution, test result analysis, documentation, and problem status tracking should all be reduced with automated testing, allowing the overall test effort and schedule to shrink. Since Automated Software Testing applies to all phases of the STL, this would include an automated requirements traceability matrix (i.e., traceability from requirements to design, development, test cases, etc.), automated test environment setup, automated testing, automated defect tracking, and so on.

    Most importantly, some tests can hardly be accomplished through manual testing at all, such as memory leak detection, stress or performance testing, and achieving high test coverage with a large amount of test data input.

    The challenges of testing complex software systems, and the desire to reduce the cost and schedule associated with testing, are not unique to the DoD. Commercial businesses, large and small, also face increasingly large and sophisticated software projects while, at the same time, wanting to deliver new and more capable products to market faster and at the lowest possible cost. In response to these challenges, automated testing tools and methodologies have been developed and continue to emerge. In addition, the emphasis on iterative, incremental development approaches, where incremental software builds are delivered and repeated incremental testing is required, has further contributed to the growth of automated test tools and capabilities.

    The IDC Software Research Group's report "Worldwide Distributed Automated Software Quality Tools 2005-2009 Forecast and 2004 Vendor Shares"[5] begins by stating that the "automated software quality tools market was once again the growth leader across application life-cycle markets". The report goes on to state, "The criticality of software to business, the increasing complexity of software applications and systems, and the relentless business pressures for quality, productivity, and faster time to market have all been positive drivers (resulting in growth in the market) and will continue to be in the foreseeable future."

    The IDC Software Research Group attributes automated software quality tools growth primarily to businesses' desire for higher quality, increased productivity, and faster time to market.
    Benefits of Automated Testing That Should Also Be Considered

        Types of tests that manual testing cannot accomplish effectively, if at all, such as concurrency, soak, memory leak, or performance testing:

        Concurrency testing uncovers concurrent user access issues, while soak and persistence testing often uncovers memory leaks as the application runs over a period of time. Automated testing tools allow these types of tests to run in a stand-alone fashion, whereas conducting them manually is very time-consuming and resource-intensive.

        Using test automation, automated scripts can be run over an extended period of time to determine whether there is any type of performance degradation or memory leak. At the same time, timing statements can be inserted to track performance timing of each event tested.
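        The soak-and-timing approach described above can be sketched in Python. This is a minimal illustration, not a real tool: `operation_under_test` is a hypothetical stand-in for the application action being exercised, and the standard library's `tracemalloc` supplies the memory samples.

```python
import time
import tracemalloc

def operation_under_test():
    # Hypothetical placeholder for the real application action being soaked.
    return [i * i for i in range(10_000)]

def soak_test(iterations=1000):
    """Run the operation repeatedly, recording per-event timing and memory."""
    tracemalloc.start()
    timings, memory_samples = [], []
    for _ in range(iterations):
        start = time.perf_counter()
        operation_under_test()
        timings.append(time.perf_counter() - start)
        current, _peak = tracemalloc.get_traced_memory()
        memory_samples.append(current)
    tracemalloc.stop()
    # A steadily climbing memory trend across iterations suggests a leak;
    # rising timings suggest performance degradation over the run.
    return timings, memory_samples

timings, memory = soak_test(iterations=50)
print(len(timings), len(memory))
```

        In a real soak run the iteration count would be large and the loop would run for hours; the per-event timings correspond to the inserted timing statements mentioned above.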

        Automated scripts can also be kicked off on numerous PCs to simulate concurrency testing, i.e., numerous users accessing the same application resources at the same time while the system is monitored for any potential issues.
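        A minimal sketch of that kind of concurrency simulation, using threads on a single machine rather than numerous PCs; `SharedResource` is a hypothetical stand-in for the shared application resource under load.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

class SharedResource:
    """Hypothetical stand-in for an application resource accessed by many users."""
    def __init__(self):
        self._lock = threading.Lock()
        self.count = 0

    def access(self):
        # The lock models the synchronization the application should provide;
        # removing it would expose lost-update defects under concurrency.
        with self._lock:
            self.count += 1

def simulate_user(resource, requests_per_user):
    for _ in range(requests_per_user):
        resource.access()

resource = SharedResource()
users, requests_per_user = 20, 100
with ThreadPoolExecutor(max_workers=users) as pool:
    for _ in range(users):
        pool.submit(simulate_user, resource, requests_per_user)

# Every simulated request should be accounted for; a shortfall would
# indicate a concurrency defect.
print(resource.count)  # 2000
```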

        Automated testing tools should be used for these types of testing efforts to make them more feasible.
        Effective Smoke (or Build Verification) Testing

        Whenever a new software build or release is received, a test (generally referred to as a "smoke test") is run to verify that previously working functionality still works. An entire smoke test can sometimes require numerous hours to complete, only to determine that a faulty software build has been received. Testing time is then wasted, because the build has to be rejected and testing has to start all over again.

        If the smoke test is automated, developers can run the smoke test scripts to verify build quality before the build is handed over to the testing team, saving valuable testing time and cost.
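        A developer-runnable smoke suite might look like the following sketch, using the standard library's `unittest`; `build_version` and `core_feature` are hypothetical hooks into the delivered build.

```python
import unittest

def build_version():
    # Hypothetical hook into the freshly delivered build.
    return "1.4.2"

def core_feature():
    # Hypothetical previously working functionality being re-verified.
    return sum(range(5))

class SmokeTest(unittest.TestCase):
    """Fast checks a developer can run before handing a build to test."""
    def test_build_identifies_itself(self):
        self.assertRegex(build_version(), r"^\d+\.\d+\.\d+$")

    def test_core_feature_still_works(self):
        self.assertEqual(core_feature(), 10)

suite = unittest.TestLoader().loadTestsFromTestCase(SmokeTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
# A failing smoke suite means the build should be rejected before
# the full test cycle starts.
print("BUILD OK" if result.wasSuccessful() else "REJECT BUILD")
```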
        Standalone "Lights-Out" Testing

        Automated testing tools can be programmed to kick off a script at a specific time.

        Because automated testing can run standalone, it can be kicked off automatically, overnight if needed, and the testers can simply analyze the results when they are back in the office the next day.
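        One way to sketch such unattended, scheduled kickoff in plain Python. In practice a scheduler such as cron, or the testing tool's own timer, would be used; the `run_nightly` helper and the 02:00 slot here are illustrative assumptions.

```python
import time
from datetime import datetime, timedelta

def seconds_until(hour, minute, now=None):
    """Seconds from `now` until the next occurrence of hour:minute."""
    now = now or datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # today's slot already passed: run tomorrow
    return (target - now).total_seconds()

def run_nightly(suite, hour=2, minute=0):
    """Sleep until the scheduled time, then kick off the suite unattended."""
    time.sleep(seconds_until(hour, minute))
    return suite()

# Example: a 02:00 run scheduled at 23:30 waits 2.5 hours.
wait = seconds_until(2, 0, now=datetime(2024, 1, 1, 23, 30))
print(wait)  # 9000.0
```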
        Increased Repeatability

        Often a manually executed test uncovers a defect that then cannot be reproduced: the tester has forgotten which combination of test steps led to the error message. Automated testing scripts take the guesswork out of test repeatability.
        Testers Can Focus on Advanced Issues

        As tests are automated, most system issues are uncovered. The automated scripts can be baselined and rerun for regression testing, which generally yields fewer new defects than testing and automating new functionality. Testers can therefore focus on newer or more advanced areas, where the most defects are likely to be uncovered, while the automated scripts verify the regression testing area. New features are incrementally added to the automated regression test suite.
        Higher Functional Test Coverage

        Automated Testing allows an increase in the number of test case data combinations beyond what manual testing could cover. Data-driven testing allows numerous test data combinations to be executed using one automated script. For example, during one of our prototype efforts we wanted to baseline numerous charts and routes used in an application we were testing. To automate this test efficiently, we only needed to write one test script that calls and baselines the numerous charts and routes and runs a bitmap comparison against a recorded baseline.
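        Data-driven testing of this kind can be sketched with the standard library's `unittest`. The function under test (`classify_route_length`) and the data table are hypothetical, but the pattern is the one described above: one script, and a table of input/expected pairs that can grow without touching the test logic.

```python
import unittest

def classify_route_length(nautical_miles):
    # Hypothetical function under test from the charting application.
    if nautical_miles < 0:
        raise ValueError("negative distance")
    return "short" if nautical_miles < 100 else "long"

# The test data lives in a table; adding a combination means adding a row.
TEST_DATA = [
    (0, "short"),
    (99, "short"),
    (100, "long"),
    (2500, "long"),
]

class DataDrivenTest(unittest.TestCase):
    def test_route_classification(self):
        # One test method drives every data combination via subTest,
        # so a failure reports exactly which row broke.
        for distance, expected in TEST_DATA:
            with self.subTest(distance=distance):
                self.assertEqual(classify_route_length(distance), expected)

dd_suite = unittest.TestLoader().loadTestsFromTestCase(DataDrivenTest)
dd_result = unittest.TextTestRunner(verbosity=0).run(dd_suite)
print(dd_result.wasSuccessful())  # True
```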

        Additionally, if a chart is off by just one pixel, the naked eye would probably have a difficult time detecting the difference during manual test analysis; the automated bitmap comparison feature of the testing tool, however, will point out that difference immediately. The accuracy of an automated test is therefore higher in most cases.
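        The pixel-level comparison can be illustrated with a small pure-Python diff. A real tool decodes actual image formats, but the logic is the same; the 3x3 "images" below are illustrative.

```python
def diff_pixels(baseline, candidate):
    """Return coordinates where two equally sized images differ.

    Images are modeled as 2-D lists of pixel values; a real comparator
    would decode PNG/BMP data first, but compares the same way.
    """
    return [
        (x, y)
        for y, (row_a, row_b) in enumerate(zip(baseline, candidate))
        for x, (a, b) in enumerate(zip(row_a, row_b))
        if a != b
    ]

baseline = [[0, 0, 0], [0, 255, 0], [0, 0, 0]]
candidate = [[0, 0, 0], [0, 255, 0], [0, 0, 1]]  # off by a single pixel

# The single-pixel difference the naked eye would miss is reported exactly.
print(diff_pixels(baseline, candidate))  # [(2, 2)]
```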

    Automated Software Testing Pitfalls

    The above list is just a subset of the automated testing options and potential benefits. With so many benefits, why are so few automated testing efforts underway, or even successful? There are various reasons why automated testing efforts can fail, and over many years of experience in automated testing many lessons learned have been accumulated.[6] Here are just a few of the common mistakes programs make when implementing automated testing:

    Treating automated testing as a side activity: It is important that automated testing not be treated as a side activity, i.e., asking a tester to automate whenever he gets free time. Testers rarely have free time, and deadlines are always looming. Automated testing requires a mini development lifecycle of its own, with test requirements, test design, test implementation, and verification.

    Thinking anyone can automate a test: Testing requires skill, and automated testing requires software development skills. The automation effort is only successful if implemented with the appropriate expertise.[7]

    A structured approach to automated testing is necessary to help steer the test team away from some of the common test program mistakes below:

        Implementing an automated test tool without a testing process in place, resulting in an ad-hoc, non-repeatable, non-measurable test program
        Implementing a test design without following any design standards, resulting in the creation of test scripts that are not repeatable and therefore not reusable for incremental software builds
        Using the wrong tool
        Test tool implementation initiated too late in the application development life cycle, not allowing sufficient time for tool setup and test tool introduction process (i.e. learning curve)
        Test engineer involvement initiated too late in the application development life cycle resulting in poor understanding of the application and system design, which results in incomplete testing
        Not including software developers, so they can keep automated testing in mind when they make changes to the code. Developers need to understand the impact their code changes could have on an automated testing framework and can consider alternatives, as appropriate.

    Automated testing enables rapid regression testing while comprehensive manual regression testing is almost prohibitive to conduct because of the time required.

    A common mistake is to assume that a "manual" tester can pick up an automated testing tool and simply hit record and playback. Much more is involved, and a development background is required. Automated Software Testing, when done effectively, should be treated as a software development effort that includes test requirements, automated test design, script development, and automated script verification.
    How to Automate Software Testing

    Automated Testing can be accomplished using vendor-provided tools, open-source tools, in-house developed tools, or a combination of the above:

    Vendor-provided automated testing tools generally mimic the actions of the test engineer via the tool's "recording" feature. During testing, the engineer uses the keyboard and mouse to perform test steps, while the recording feature captures all keystrokes, saving the recorded baselines and test results in the form of an automated test script. During subsequent playback, the scripts compare the latest test output against the previous baseline. Testing tools generally provide built-in test functions, code modules, DLLs, and code libraries that the test engineer can reuse. Most test tools provide for non-intrusive testing, i.e., they interact with the application-under-test without affecting its behavior, as if the test tool were not involved. Vendor-provided tools use a variety of test scripting languages, e.g., JavaScript, VBScript, C, or vendor-proprietary languages, and various storage mechanisms, with generally no specific standard applied across the vendor community. This type of automation can be tedious and time-consuming, with possibly the lowest Return on Investment in an environment where the application-under-test is still constantly changing.

    The problem with this type of "record/playback" automation is that the script baselines contain hard-coded values: if the test engineer clicks on today's date as part of her test steps, today's date will be recorded and baselined, and trying to play back the script on a subsequent date will fail. The hard-coded values need to be replaced with variables. Tool-generated scripts generally require much modification and coding expertise, such as an understanding of reusable functions and libraries, looping constructs, conditional statements, and so on. Software development knowledge is required to use vendor-provided automated testing tools effectively.
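    The hard-coded-date problem and its fix can be sketched as follows; `enter_date` is a hypothetical stand-in for whatever UI action the recorded script replays.

```python
from datetime import date

def recorded_script_raw(enter_date):
    # Raw recording: the tool baked in the date of the recording session,
    # so playback on any later day fails the comparison.
    return enter_date("2024-01-15")  # hard-coded value from record time

def recorded_script_edited(enter_date, today=None):
    # Hand-edited script: the literal is replaced with a variable computed
    # at playback time, so the script stays valid on any day.
    run_date = (today or date.today()).isoformat()
    return enter_date(run_date)

entered = []
recorded_script_edited(entered.append, today=date(2025, 6, 1))
print(entered)  # ['2025-06-01']
```

    The same parameterization applies to any recorded literal, e.g. user names, window titles, or record counts.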

    Open-source testing tools[8] come in various flavors: they are based on various technologies, come with different levels of capabilities and features, and can be applied to various phases of the software testing lifecycle. Many are now mature and stable enough to be safely implemented in an enterprise test environment. Implementing open-source testing tool solutions can be a viable option, especially when vendor-provided tools don't support the software engineering environment under test while an open-source tool provides the required compatible features.

    In-house developed software test automation efforts are still common and are often necessary when vendor-provided or open-source tools don't meet the automated testing needs. Developing automated test scripts is a software development effort and requires a mini software-development lifecycle of its own.

    The most successful automated testing environments develop a framework of automated tests with reusable components, one that is continuously maintained and to which new capability is regularly added.

    While 73% of survey respondents believe Automated Testing is beneficial, 54% of the software testing survey respondents listed "lack of time" or "lack of budget" as the major reason for not automating their software testing efforts. Yet there never seems to be a lack of time or budget when a regression test has to be rerun manually yet again because another showstopper has been uncovered and fixed, no matter how long the manual regression cycle takes, how many testers it requires, or how often it has already been run. Isn't it time to automate?

    The second highest percentage of survey respondents listed "lack of expertise" as the reason for not automating their software testing efforts. There are various companies that provide automated testing services, plus a vast pool of automated test expertise exists that could be drawn from.

    30% of survey respondents listed the regression testing phase as the most time-consuming.

    The automated testing payoff is highest during regression testing, because by that time the application area-under-test has generally stabilized, initial tests have been run, and defects have been removed. Automated test scripts can then be rerun with minimal maintenance or other involvement.

    Too much time is spent on software testing. Automated hardware testing and its associated standards are prevalent in the commercial sector and have been employed successfully at various DoD organizations for many years. We need to bring software testing up to par with hardware testing, including its quick turnaround times. Implementing efficient and effective Automated Software Testing is a major step in that direction.

        3. Hailpern and Santhanam, "Software Debugging, Testing and Verification," 2002, see journal/sj/411/hailpern.pdf
        4. For a detailed explanation of how Automated Testing parallels the engineering lifecycle, see "Automated Software Testing," Dustin et al., Addison-Wesley, 1999
        5. D. Hendrick, IDC, July 2006
        6. "Lessons in Test Automation," Dustin, 2001, see http:// MAGAZINE_62
        7. "Automated Software Testing," Dustin et al., Addison-Wesley, 1999
        8. See for information on various open-source testing tools

    About the Authors

    Elfriede Dustin works at Innovative Defense Technologies (IDT), an Arlington-based software testing consulting company, currently working on an effort to bring automated software testing to a branch of the DoD. Elfriede is lead author of the book "Automated Software Testing," which describes the Automated Testing Lifecycle Methodology (ATLM), a process that has been implemented at numerous companies. She is also the author of various white papers and of the book "Effective Software Testing," and co-author of "The Art of Software Security Testing" and "Quality Web Systems," books which have been translated into many languages and are available worldwide. Dustin has been responsible for implementing automated testing, and has served as the lead consultant, manager, or director guiding automated and manual software testing efforts at various commercial and Government agencies.

    Bernie Gauf is President and Chief Technologist of IDT. Mr. Gauf has twenty years of experience in leading the design, development, and delivery of innovative solutions for the DoD. His experience includes the development and production of systems for passive and active sonar, electronic warfare, command and control, and computer-based training and simulation for these systems. Mr. Gauf is currently leading IDT's efforts to develop automated testing strategies and an automated testing framework suitable for DoD systems. He has been invited to participate in numerous DoD panels associated with the use of COTS technology, middleware technology, and Open Architecture.

    Prior to his employment at IDT, Mr. Gauf was one of the founding employees at Digital System Resources, Inc., a system integration and software company specializing in technology critical to national security and a recognized leader in providing state of the art, high quality products. DSR became one of the top 100 largest prime Department of Defense contractors for Research, Development, Test, and Evaluation through the successful transition of transformational technologies for the DoD. 
