OneStopTesting


    Test Script



    Why Scripts?


    We need test scripts for the following reasons:

    1. Acceptance testing is a complex and painstaking process that requires great attention to detail, to ensure both that the software is adequately exercised and that it meets the original requirements.
    2. The objective of testing is to answer a binary question: does the system pass the test or not? To make that decision there must be a defined result that must be achieved for each specified test; for the overall system there is usually an acceptance level, defined in the contract, which states how many individual failures are acceptable.

      To decide whether a test has passed, there must be clear criteria that it must meet. These may be a value or set of values displayed on a screen, results printed out, specific changes in a database, and so on. In some cases it will be easy to check that the necessary result has been achieved (e.g. a picture on a screen); in other cases an extra action may be required (e.g. printing out the before and after state of a database record).
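
      The binary pass/fail decision per test, and the contract-level acceptance threshold across all tests, can be sketched as follows. The threshold value and test names are illustrative assumptions, not taken from any real contract:

```python
# Sketch of the binary pass/fail decision plus an overall acceptance level.
# MAX_ACCEPTABLE_FAILURES is an assumed value; a real figure would come
# from the acceptance contract.

MAX_ACCEPTABLE_FAILURES = 2

# Each specified test has a defined result, so its outcome is strictly binary.
test_results = {
    "welcome banner displayed on screen": True,
    "before/after state of DB record printed": False,
    "cargo capacity form accepts valid input": True,
}

failures = sum(1 for passed in test_results.values() if not passed)
system_accepted = failures <= MAX_ACCEPTABLE_FAILURES

print(f"{failures} failure(s); system accepted: {system_accepted}")
# prints "1 failure(s); system accepted: True"
```

      The point of the threshold is that a single failed test need not reject the whole system; acceptability is judged against the agreed level.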

    3. For overall testing to be carried out accurately, there will usually be a need to set up base data, perhaps by converting some existing computer data. Where this data can be set up and manipulated by the system, it is usual to use the system to do so, in other words to define the set-up process as part of the overall test sequence.

    4. To ensure that the correct steps are carried out, and in the correct sequence, it is important to put the process in writing. Where the sequence of the tests matters, for example where one test sets up data for another, it is particularly important to have a defined "running order".

    5. The function of the test script is to define what input should be made to the system and what output should be received. Usually the output will simply be text, or perhaps a graphic on a workstation screen; sometimes there may be a printed report of some kind. A formal script defines (or identifies) several things:

      • Any data which should pre-exist and on which the test will rely. This may have been input by an earlier step in the script or created by an entirely separate process, and may be held in computer memory, in a simple data file, or as a set of records in a database.
      • The steps that must be taken to run the test, for example keyboard input. These may be quite complex, such as filling in a long form defining a ship and its cargo capacity and facilities.
      • Any data that is created by the test, together with the steps to take to check that it has been created properly. In some instances this check will be the next step in the test sequence; in others it will require a separate piece of software to, say, examine a database.
      • The "pass" criteria that will be applied to the test.

      A most important property of a test script is that it must be repeatable: provided the environment is set up correctly, the test should produce the same result each time it is executed. Note the dependence on the environment, usually on data, but potentially on other factors as well, such as computer or network configuration.

      There is a risk of "false negatives", that is, tests deemed to have failed where the failure is due to extraneous errors. For example, an error in the script may have made the "pass" criteria unobtainable, a mistake may have been made typing in data (say, a wrong selection of "yes" or "no", simple but devastating), or required pre-set data may not have existed because the tests were run out of sequence. It is of course also possible to make a mistake in the "pass" criteria themselves, in which case the script is faulty, not the system.

      All "failed" tests should be investigated to establish the reason for failure. Because of the complexity of testing, in many cases the result is a "false negative", and the root cause needs to be identified so that the test can be re-run properly. Occasionally investigation will reveal a "false positive" from another test: one that had apparently passed, but during which the system failed to carry out some action on which the current test depended. While the result may still be a single failed test (albeit a different one), it may affect the classification of the failure, and hence overall acceptability. On the other hand, it may again simply be that the earlier test's script failed to ensure the action was carried out, i.e. a script error rather than a system error. A string of tests must then be repeated or, if time is not available, discounted.
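
      The elements a formal script defines (pre-existing data, input steps, created data, and pass criteria) can be sketched as a small Python structure. All names here are illustrative, not taken from any particular test tool:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class TestScript:
    """Illustrative record of one scripted test (names are assumptions)."""
    name: str
    preconditions: Dict[str, dict]            # data that must pre-exist
    steps: Callable[[Dict[str, dict]], dict]  # input actions; return created data
    expected: dict                            # the defined "pass" criteria

    def run(self, database: Dict[str, dict]) -> bool:
        # Guard against out-of-sequence runs: required pre-set data must exist.
        for key, value in self.preconditions.items():
            if database.get(key) != value:
                raise RuntimeError(f"{self.name}: missing precondition {key!r}")
        actual = self.steps(database)      # perform the scripted input
        return actual == self.expected     # binary pass/fail decision

# Base data set up by an earlier step in the overall test sequence.
db = {"ship-42": {"cargo_capacity": 0}}

def update_capacity(d):
    d["ship-42"] = {"cargo_capacity": 500}  # e.g. filling in the cargo form
    return d["ship-42"]                     # data created by the test

script = TestScript(
    name="update cargo capacity",
    preconditions={"ship-42": {"cargo_capacity": 0}},
    steps=update_capacity,
    expected={"cargo_capacity": 500},
)

print(script.run(db))  # True: repeatable whenever the environment matches
```

      Running the script a second time without resetting `db` raises the precondition error, which illustrates the dependence on the environment noted above: the same script only yields the same result when the starting data is the same.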

