OneStopTesting

    Methods for Gathering Test Metrics



    Conduct test metrics analysis in several steps:

    • sorting data into categories,

    • performing calculations, and

    • analyzing the results to determine trends.

    Sort Data into Categories

    Sort the gathered data into the applicable categories based on the objectives defined at the beginning of the test metrics evaluation process. For example, if the objective is to determine the most frequent source of defects, sort the data by defect source.
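The sorting step above can be sketched as a simple grouping operation. This is a minimal illustration, not a prescribed implementation; the defect records and their field names (`source`, `hours_to_fix`) are made up for the example.

```python
from collections import defaultdict

# Hypothetical defect records; field names are illustrative only.
defects = [
    {"id": 1, "source": "coding", "hours_to_fix": 2},
    {"id": 2, "source": "functional spec", "hours_to_fix": 8},
    {"id": 3, "source": "coding", "hours_to_fix": 3},
    {"id": 4, "source": "design", "hours_to_fix": 5},
]

# Sort (group) the data by the category relevant to the objective --
# here, the source of each defect.
by_source = defaultdict(list)
for defect in defects:
    by_source[defect["source"]].append(defect)

for source, items in by_source.items():
    print(source, len(items))
```

Once the records are grouped this way, the per-category calculations in the next step fall out directly.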

    Perform Calculations

    Perform calculations, such as percentages and averages, to compare the types of data gathered.

    For example, to determine the percentage for each source-of-defect category, count the defects from each source and the total number of defects for all sources. Then divide each source's count by the total number of defects to find the category with the highest percentage of occurrence.

    Frequently, additional measures need to be calculated. For example, to determine the cost to fix each source-of-defect category, calculate the percentage of effort required to fix the defects in each category: total the hours spent fixing defects in each category and the hours spent fixing all defects, then divide each category's hours by the total hours.
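The two percentage calculations described above (frequency of occurrence and share of fix effort) can be sketched together. The defect data is invented for illustration; only the arithmetic mirrors the text.

```python
from collections import defaultdict

# Hypothetical defect data (sources and hours to fix are illustrative).
defects = [
    {"source": "coding", "hours_to_fix": 2},
    {"source": "coding", "hours_to_fix": 3},
    {"source": "functional spec", "hours_to_fix": 10},
    {"source": "design", "hours_to_fix": 5},
]

counts = defaultdict(int)    # defects per source category
hours = defaultdict(float)   # fix hours per source category
for defect in defects:
    counts[defect["source"]] += 1
    hours[defect["source"]] += defect["hours_to_fix"]

total_defects = sum(counts.values())
total_hours = sum(hours.values())

# Percentage of occurrence and percentage of fix effort per category.
for source in counts:
    pct_count = 100 * counts[source] / total_defects
    pct_hours = 100 * hours[source] / total_hours
    print(f"{source}: {pct_count:.0f}% of defects, {pct_hours:.0f}% of fix effort")
```

With these numbers, coding accounts for half the defects but only a quarter of the fix effort, while the single functional specification defect consumes half the effort, exactly the kind of contrast the trend analysis below looks for.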

    Analyze to Determine Trends

    Review the calculated findings to determine trends. For example, you might find that the majority of defects were coding defects, but that functional specification defects were the most costly to fix.

    Additional calculations may also be necessary. For example, determining the average number of hours to fix a defect in each source-of-defect category may allow better comparison of the cost of defects.
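The averaging step can be sketched as follows; the per-category counts and hours are made-up numbers for illustration.

```python
# Hypothetical totals per source-of-defect category (illustrative numbers).
counts = {"coding": 12, "functional spec": 3, "design": 5}         # defects found
hours = {"coding": 30.0, "functional spec": 24.0, "design": 20.0}  # total fix hours

# Average hours to fix one defect in each category.
averages = {source: hours[source] / counts[source] for source in counts}

for source, avg in sorted(averages.items()):
    print(f"{source}: {avg:.1f} hours per defect")
```

Here coding defects are the most numerous but cheapest per defect (2.5 hours), while functional specification defects average 8.0 hours each, which is what makes them the costliest category despite their low count.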

    Evaluate the causes of the findings. For example, further discussion with the programmers reveals that it takes longer to fix functional specification defects, because the analysis and design processes must occur again.

    The analysis process may also reveal additional factors that should be collected to improve the reliability of the findings. For example, to clarify the causes of the coding errors, it may help to record the program in which each defect occurred, as well as the complexity of those programs. When determining complexity, established metrics such as McCabe's cyclomatic complexity and function point estimating are useful.

    Whenever possible, use automated means, such as a database, to accumulate and analyze test metrics.
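As a sketch of what "automated means, such as a database" can look like, the grouping and totalling steps above collapse into a single aggregate query. This uses Python's standard `sqlite3` module with an in-memory database; the table and column names are illustrative, not from the article.

```python
import sqlite3

# Accumulate hypothetical defect metrics in an in-memory database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE defect (source TEXT, hours_to_fix REAL)")
con.executemany(
    "INSERT INTO defect VALUES (?, ?)",
    [("coding", 2), ("coding", 3), ("functional spec", 10)],
)

# Grouping by category and totalling counts and hours is one query.
rows = list(con.execute(
    "SELECT source, COUNT(*), SUM(hours_to_fix) FROM defect GROUP BY source"
))
for row in rows:
    print(row)
```

Keeping the raw defect records in a database also means new analysis questions can be answered later without re-gathering the data.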

    When to Complete the Analysis

    Depending on the objective, the analysis process can be done on a continual basis or at the end of the data gathering period. When the objective is improvement of the current project, the analysis process should be done on a continual basis during the data collection. When the objective is improvement of a future project, the analysis process can be done at the end of the data gathering period.

    Continuous analysis can be comparative or cumulative. Monitoring system readiness by calculating and comparing the number of defects found each day, to determine whether there is a downward trend, illustrates comparative analysis. In other situations, the analysis needs to be done on a cumulative basis: when determining the most frequent and costly types of defects, for example, analyzing the information cumulatively as it is gathered helps identify areas requiring corrective action early.
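The comparative case, checking daily defect counts for a downward trend, can be sketched with a simple moving-average comparison. The daily counts and the window size are illustrative assumptions, not a method prescribed by the article.

```python
# Hypothetical defects found per day during a test cycle.
daily_defects = [14, 11, 12, 9, 7, 6]

def is_trending_down(counts, window=3):
    """Compare the average of the most recent `window` days against the
    average of the `window` days before them."""
    if len(counts) < 2 * window:
        return False  # not enough data to compare two windows
    recent = sum(counts[-window:]) / window
    earlier = sum(counts[-2 * window:-window]) / window
    return recent < earlier

print(is_trending_down(daily_defects))  # → True
```

A more rigorous check might fit a regression line to the counts, but a windowed comparison is often enough to flag whether the system is converging toward release readiness.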

    An example of analysis at the end of the data gathering period is the collection of work hours used for testing, for the purpose of estimating future testing efforts. For greater reliability, accumulate and analyze data from as many projects as possible.
