    Bug Reporting – Art and Advocacy

    This article highlights the essence of finding bugs and the skills a tester should cultivate while finding them. It enumerates the elements of reporting a bug and also voices advocacy for the bugs that have been reported, the basic duty of a tester being to fight for a bug until it is fixed.

    Introduction:

    As testers, we all agree that the basic aim of a tester is to uncover bugs. Whenever a build arrives for testing, the primary objective is to find as many bugs as possible, from every corner of the application. To accomplish this task to perfection, we test from various perspectives. We pass the application through various filters: boundary value analysis, validation checks, verification checks, GUI checks, interoperability, integration tests, functional and business-concept checks, back-end testing (such as running SQL queries against the database, or injection checks), security tests, and many more. This makes us drill deep into the application as well as the business.
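    As a small illustration of one such filter, below is a minimal sketch of boundary value analysis in Python. The field, its limits (ages 18 to 60) and the validation function are assumptions chosen only to show how boundary cases are picked; they are not part of the article's example application.

    # Boundary value analysis sketch: for a hypothetical input field that
    # accepts ages from 18 to 60 inclusive, test the boundaries themselves
    # plus the values just inside and just outside each boundary.
    MIN_AGE, MAX_AGE = 18, 60

    def is_valid_age(age: int) -> bool:
        """Hypothetical validation rule under test."""
        return MIN_AGE <= age <= MAX_AGE

    boundary_cases = {
        MIN_AGE - 1: False,  # just below lower boundary -> expect rejection
        MIN_AGE:     True,   # lower boundary            -> expect acceptance
        MIN_AGE + 1: True,   # just above lower boundary -> expect acceptance
        MAX_AGE - 1: True,   # just below upper boundary -> expect acceptance
        MAX_AGE:     True,   # upper boundary            -> expect acceptance
        MAX_AGE + 1: False,  # just above upper boundary -> expect rejection
    }

    for value, expected in boundary_cases.items():
        assert is_valid_age(value) == expected, f"Boundary case failed for {value}"
    print("All boundary cases behaved as expected")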

    We would all agree that bug awareness is of no use until it is well documented. Here comes the role of BUG REPORTS. Bug reports are our primary work product; they are what people outside the testing group notice. These reports play an important role throughout the Software Development Life Cycle, in its various phases, as they are referenced by testers, developers, managers, senior management and, not to forget, the clients, who these days demand test reports. So the bug reports are what is remembered the most.

    Once bugs are reported by the testers and submitted to the developers to work on, we often see confrontations: testers sometimes face humiliation, there are cold wars, and discussions can take the shape of mini-quarrels. Yet at times testers and developers are saying the same thing, or both are correct, but they depict their understanding differently, and that makes all the difference. In such situations we come to the realization that the best tester is not the one who finds the most bugs or the one who embarrasses the most programmers, but the one who gets the most bugs fixed.

    Bug Reporting – An Art:

    The first aim of a bug report is to let the programmer see the failure. The bug report gives a description detailed enough that the programmer can make the bug fail for themselves. If the bug report does not accomplish this mission, there will be pushback from the development team: "not a bug", "cannot reproduce", and many other responses.

    Hence it is important that the BUG REPORT be prepared by the tester with utmost proficiency and specificity. It should describe the famous three What's (a minimal sketch of such a report as a structured record is given after the three lists below):

    What we did:

    • Module, page/window names that we navigated to
    • Test data entered and selected
    • Buttons and the order of clicking

    What we saw:

    • GUI Flaws
    • Missing or No Validations
    • Error messages
    • Incorrect Navigations

    What we expected to see:

    • GUI flaw: give screenshots with the flaw highlighted
    • Incorrect message: give the correct language and message
    • Validations: give the correct validations
    • Error messages: justify with screenshots
    • Navigation: mention the actual pages expected
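    To make the three What's concrete, here is a minimal sketch, in Python, of how such a report might be captured as a structured record. The field names, the Add Contact example and the sample data are assumptions used only to illustrate the structure; real trackers such as Bugzilla or JIRA have their own fields.

    # Minimal sketch of a bug report built around the three What's.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class BugReport:
        summary: str                  # short, clearly identifiable description
        steps_what_we_did: List[str]  # numbered reproduction steps
        what_we_saw: str              # actual (incorrect) behaviour
        what_we_expected: str         # expected behaviour
        attachments: List[str] = field(default_factory=list)  # e.g. screenshots

    report = BugReport(
        summary="Server error occurs while saving a new record in the Add Contact window",
        steps_what_we_did=[
            "1. Navigate to Contacts > Add Contact",     # module / window
            "2. Enter the phone number as 012aaa@$%.-",  # exact test data entered
            "3. Click Save",                             # button and order of clicking
        ],
        what_we_saw="An unhandled server error page is displayed",
        what_we_expected="Validation message: 'Please enter a valid phone number'",
        attachments=["add_contact_server_error.png"],
    )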

    Pointers to effective reporting can be derived from the above three What's. These are:

    1. The BUG DESCRIPTION should be clearly identifiable. A bug description is a short statement that briefly describes what exactly the problem is. A problem may take 5-6 steps to reproduce, but this statement should still clearly identify the problem. The problem might be a server error, but the description should say so clearly: "Server error occurs while saving a new record in the Add Contact window."

    2. A bug should be reported after building a proper context. PRE-CONDITIONS for reproducing the bug should be defined so that the reader can reach the exact point where the bug can be reproduced. For example, if a server error appears while editing a record in the contacts list, then the pre-condition should state: create a new contact and save it successfully, double-click this created contact in the contacts list to open the contact details, make changes and hit the Save button.

    3. STEPS should be clear, with short and meaningful sentences. Nobody wants to study an entire paragraph of long, complex sentences. Make your report step-wise by numbering the steps 1, 2, 3, and so on, and keep each sentence small and clear. Write only those findings or observations that are necessary for the bug in question; writing facts that are already known, or details that do not help in reproducing the bug, makes the report unnecessarily complex and lengthy.

    4. Cite examples wherever necessary: combinations of values, test data. Very often a bug can be reproduced only with a specific set of data or values. Hence, instead of writing an ambiguous statement like "enter an invalid phone number and hit Save", mention the exact data or value entered, e.g. "enter the phone number as 012aaa@$%.- and save".

    5. Give references to specifications. If a bug contradicts the SRS or any other functional document of the project, it is always proactive to mention the section and page number for reference, for example: "Refer to page 14 of the SRS, section 2-14."

    6. Report without passing any kind of judgment in the bug description. The bug report should never be judgmental, as this leads to controversy and gives an impression of bossiness. Remember, a tester should always be polite so as to keep the bug alive and meaningful. Being judgmental makes developers feel that testers are claiming to know more than they do, and as a result it gives birth to a psychological adversity. To avoid this, we can use the word "suggestion" and discuss the point with the developers or the team lead. We can also refer to another application, or another module or page in the same application, to strengthen our point.

    7. Assign severity and priority. SEVERITY is the state or quality of being severe; it tells us HOW BAD the bug is and defines the importance of the bug from a FUNCTIONALITY point of view. Severity levels can be defined as follows (a small sketch of how severity and priority might be encoded is given after the level definitions below):

    Urgent/Show-stopper: for example, a system crash or an error message that forces the window to close; the system stops working totally or partially. A major area of the user's system is affected by the incident and it is significant to business processes.

    Medium/Workaround: something required by the specs does not work, but the tester can carry on with testing. It affects a more isolated piece of functionality, occurs at only one or two customers, or is intermittent.

    Low: failures that are unlikely to occur in normal use; problems that do not impact use of the product in any substantive way and have little or no impact on business processes. In all cases, state the exact error messages.

    PRIORITY means something deserves prior attention. It represents the importance of a bug from the customer's point of view: it voices the precedence established by urgency and is associated with scheduling the fix. Priority levels can be defined as follows:

    High: This has a major impact on the customer. This must be fixed immediately.

    Medium: This has a major impact on the customer. The problem should be fixed before release of the current version in development or a patch must be issued if possible.

    Low: This has a minor impact on the customer. The flaw should be fixed if there is time, but it can be deferred until the next release.
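    As a rough sketch of how the severity and priority levels above might be encoded when logging bugs programmatically, the Python fragment below uses assumed names and numeric values for illustration only; the actual scale depends on the bug-tracking tool in use.

    # Hypothetical encoding of the severity and priority levels described above.
    from enum import Enum

    class Severity(Enum):            # how bad the bug is (functionality view)
        URGENT_SHOW_STOPPER = 1      # crash, system stops working totally or partially
        MEDIUM_WORKAROUND = 2        # spec not met, but testing can continue
        LOW = 3                      # unlikely in normal use, negligible impact

    class Priority(Enum):            # how soon to fix it (customer view)
        HIGH = 1                     # must be fixed immediately
        MEDIUM = 2                   # fix before the current release, or issue a patch
        LOW = 3                      # fix if there is time, may defer to the next release

    def triage(severity: Severity, priority: Priority) -> str:
        """Return a short scheduling note from severity and priority."""
        if priority is Priority.HIGH or severity is Severity.URGENT_SHOW_STOPPER:
            return "Fix immediately"
        if priority is Priority.MEDIUM:
            return "Fix before the current release, or issue a patch"
        return "Fix if time permits; otherwise defer to the next release"

    print(triage(Severity.URGENT_SHOW_STOPPER, Priority.HIGH))  # Fix immediately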

    8. Provide screenshots. This is the best approach: any error we can see, such as an object reference error, a server error, a GUI issue or a message prompt, should be saved as a screenshot and attached to the bug as proof. It helps the developers understand the issue more precisely.


