    Manual and automated tests together are challenging



I am talking about the fact that, even though you invested the time to automate something, you still choose to run it manually.

Notice that I am asking about any tests: the ones you run manually only once in a while, the ones you give to your junior testers to be 100% sure all is OK; in fact, any tests you are still running both automatically and manually.


Surprisingly enough, even organizations with relatively mature automation processes still run a significant number of their automated scenarios as part of their manual tests on a regular basis, even when this doesn't make any sense (at least at the theoretical level).

After realizing this was the case, I sat down with a number of QA managers (many of them PractiTest users) and asked them about the reasons for this seemingly illogical behavior.

They provided a number of interesting reasons, and I will go over some of them now:

    We only run manually the tests that are really important

The answer I got most often was that some teams choose to run tests both automatically and manually only when they are "really important or critical".

This may sound logical at first, but when you ask what their criteria are for selecting the tests that should be automated, most companies say they select cases based on how many times they will need to run them, and on the criticality or importance of the business scenario. In plain English, they automate the important test cases.


So if you choose to automate the test cases that are important, why do you still run them manually under the same excuse of them being "really important"? Am I the only one confused here?


    We don't trust our automation 100%

The question I asked above, of why teams run the important tests manually even though they are already automated, has an even more interesting (and simpler) answer: "We don't really trust our test automation."

So this basically means they are investing 10 or even 50 man-months of work, and in most cases thousands of dollars in software and hardware, to automate something, and then they don't really trust the results? Where is the logic in this?

OK, I've worked enough with tools such as QTP and Selenium to know that writing good, robust automation is not trivial. But if you are going to invest in automation, you might as well do it seriously and write scripts that you can trust. In the end it is a matter of deciding to invest in the platform and being serious about the work in order to get results you can trust (and I don't mean buying expensive tools; Selenium will work fine if you have a good infrastructure and write your scripts professionally).
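To make that concrete, here is a minimal sketch of the defensive style that makes a Selenium script trustworthy: explicit waits instead of fixed sleeps, and assertions that fail with actionable context. It uses the standard Selenium Python bindings; the URL, element IDs, and expected text are hypothetical placeholders, not anything from a real application.

```python
# A minimal sketch of a "trustworthy" Selenium check: explicit waits
# instead of time.sleep(), and a clear assertion message on failure.
# The URL and element locators below are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
try:
    driver.get("https://example.com/login")  # hypothetical app under test

    # Waiting for elements to become visible removes the main source of
    # false negatives in UI scripts: timing flakiness.
    wait = WebDriverWait(driver, timeout=10)
    wait.until(EC.visibility_of_element_located((By.ID, "username"))).send_keys("demo")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()

    # Assert on a concrete post-condition, with a message a human can act on.
    banner = wait.until(EC.visibility_of_element_located((By.ID, "welcome")))
    assert "Welcome" in banner.text, f"Unexpected banner text: {banner.text!r}"
finally:
    driver.quit()
```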

The alternative is really simple: if you have automated tests you can't trust because they constantly give you wrong results (either false negatives or, even worse, false positives!), you will eventually stop using them and throw all the work and money out the window...

     

    We don't know what is covered and what is not covered by the automated tests

This is another big reason why people waste time running manual tests that are already automated: they are simply not aware of which scenarios are included in their automation suite and which aren't. In this situation they decide, based on their best judgment, to assume that "nothing is automated" and so run their manual test cases as if there were no automation.

    If this is the case, then why do these companies have automation teams in the first place?
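One low-cost way to attack this gap is to tag every automated script with the ID of the manual test case it covers and diff that set against the manual repository. The sketch below assumes a hypothetical convention: automated scripts carry a "covers: TC-123" comment, and manual cases live in a CSV with an "id" column. Neither the file names nor the tag format come from any particular tool.

```python
# A sketch of a coverage diff between a manual test repository and an
# automation suite. The "covers: TC-123" tag convention, the CSV layout,
# and the file/directory names are all assumptions for illustration.
import csv
import re
from pathlib import Path

COVERS_TAG = re.compile(r"covers:\s*(TC-\d+)")

def automated_case_ids(suite_dir: str) -> set[str]:
    """Collect every manual-case ID referenced by an automated script."""
    ids: set[str] = set()
    for script in Path(suite_dir).rglob("*.py"):
        ids.update(COVERS_TAG.findall(script.read_text(encoding="utf-8")))
    return ids

def manual_case_ids(csv_path: str) -> set[str]:
    """Read the manual repository; expects an 'id' column like TC-123."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        return {row["id"] for row in csv.DictReader(f)}

if __name__ == "__main__":
    manual = manual_case_ids("manual_cases.csv")        # hypothetical file
    automated = automated_case_ids("automation_suite")  # hypothetical dir
    print("Automated, can be dropped from the manual run:", sorted(manual & automated))
    print("Not automated, still needs a manual pass:", sorted(manual - automated))
```

Even a crude report like this replaces "assume nothing is automated" with an explicit list of what still needs human eyes.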

     

    The automated tests are the responsibility of another team

Now for the interesting question: how come a test team "doesn't know" which scenarios are automated and which aren't? The most common answer is that the tests are written by a completely different team, a team of automation engineers, entirely separate from the one running the manual tests.

Having two test teams, one manual and one automated, is not a bad thing; in many cases it is the best approach to achieving effective and trustworthy automation. The problem is that these teams can sometimes be completely disconnected, working on the same project without communicating and cooperating as they should.


I will talk about how to communicate and cooperate in a future post, but the point here is that when you have two teams (one automated and one manual) you need to make an extra effort to keep them coordinated; at a minimum, each should know what the other is doing in order to plan accordingly.

     

    We want to have all the results in a single place to give good reports

Finally, I want to mention a reason brought up by a number of test managers. They raised it as a difficulty rather than a show stopper, but it came up often enough to be worth mentioning: they needed to provide a unified testing report for their project, and to get one they either ran part of their tests manually or created manual tests to mirror the results of their automation.

Again, this looks like a simple and "relatively cheap" way of coordinating the process and even producing a unified report, but it is a repetitive manual job that must be done even after you already have an automation infrastructure. Slowly but surely (especially as more and more automation is added), it will run into coordination and maintenance issues that make it more expensive and, in some cases, misleading or even obsolete.
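Merging the two result streams mechanically is usually far cheaper than re-running tests just for the report. Here is a rough sketch under two assumptions: the automation emits standard JUnit-style XML, and the manual results are kept in a CSV with a "status" column. The file names and CSV layout are hypothetical.

```python
# A sketch that merges automated results (JUnit-style XML) with manual
# results (a CSV of case ID + status) into one pass/fail summary.
# File names and the CSV layout are assumptions for illustration.
import csv
import xml.etree.ElementTree as ET
from collections import Counter

totals: Counter[str] = Counter()

# Automated side: JUnit XML has one <testcase> element per test; a nested
# <failure> or <error> element marks it as failed.
for case in ET.parse("automation_results.xml").getroot().iter("testcase"):
    failed = case.find("failure") is not None or case.find("error") is not None
    totals["fail" if failed else "pass"] += 1

# Manual side: one row per executed case, with a "status" column
# containing "pass" or "fail".
with open("manual_results.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        totals[row["status"].strip().lower()] += 1

print(f"Unified report: {totals['pass']} passed, {totals['fail']} failed "
      f"out of {sum(totals.values())} executed tests")
```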

     

    What's your take?

I am actively looking for more issues, experiences, or comments like the ones above that revolve around the challenges of combining manual and automated testing. Do you have something to share? Please add it as a comment or mail me directly at joel-at-practitest-com.

We've been working on a solution for these types of issues, so we are looking for all the input we can get in order to make sure it answers as many of the existing challenges as possible. I will be grateful for any help you can provide.


