    Thoughts on Test Automation in Agile

    Working in the time box of a Scrum sprint presents some challenges, especially if your team is building a product from scratch. Sprint after sprint you have to add new features and ensure that whatever you built previously continues to work. An automated testing framework that takes care of both system and integration tests adds a lot of firepower to such a team. It not only acts as a safety net against regressions caused by new development but, more importantly, frees up a lot of precious time for the team.

    In this article, I intend to share some of the test automation techniques our team successfully implemented on a recent project, the results of which have proven to be a huge asset. Although it took us quite some time to reach where we are now, it has been well worth the investment, and the effort is paying off in a big way. We are now able to build, integrate, test and deploy our application software with production-like quality in a production-like environment every single day. While we had our share of good days and bad days, we managed to learn something new through the experience and apply that learning to make things better.

    Our team is very happy with the results of applying these test automation techniques to our project. They have allowed us to consistently add new features to the product at a brisk pace every sprint and have proven crucial in finding and fixing the critical issues we faced as a result of regressions.

    Here are a few of the real-life lessons we learned, which should stand you in good stead when you begin adding test automation to your projects.

    Start Small

    The process you follow when creating automated tests is very similar to the process you follow for creating the software being tested. It involves a fair bit of design, coding and testing of its own to get it working correctly. So, just like the application itself, automated tests are best developed incrementally, adding new tests and features to the automation framework over several sprints. It is important not to aim for the perfect "it-can-do-everything" test framework right at the start, as it will never materialize. Weigh cost against ROI and come up with a bare-minimum working solution at the start. Working tests, just like working software, are useful; they build confidence and get everyone excited about the progress being made. Successes, even small ones, make it easier to bring everyone on board, especially once the test automation solution you have created actually runs and proves to be of real value to the team.

    Test Automation Backlog

    Maintain a test automation backlog for your project that contains all needed automation tasks and identified improvements. If you then target a few items from the backlog every sprint, in no time you will start to see the new regression test suite taking shape. Occasionally, stories from the test automation backlog may require dedicated developer time to implement and consequently some buy-in from the product owner in order to proceed. However, it should not be difficult to convince the product owner of the value of such stories if everyone on the team is committed to quality.

    A test automation backlog could contain a prioritized list of items such as:

    • Parameterize the test environment for test execution (see the sketch after this list).
    • Integrate with Continuous Integration.
    • Enhance reporting mechanism.
    • Provide an option to attach error logs in notification emails.
    • Collect performance metrics for workflow scenarios.
    • Add tests to check for concurrent execution of critical test cases.
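
    To make the first backlog item above concrete, here is a minimal sketch of parameterizing the test environment, assuming a pytest-based suite; the TEST_ENV variable, the URLs and the test body are illustrative placeholders, not details from our project.

    ```python
    # Illustrative only: select the target environment at run time instead of
    # hard-coding it into the tests. TEST_ENV and the URLs are placeholders.
    import os

    import pytest

    BASE_URLS = {
        "dev": "http://dev.example.internal",
        "staging": "http://staging.example.internal",
    }

    @pytest.fixture(scope="session")
    def base_url():
        # Defaults to "dev" when TEST_ENV is not set, e.g. on a developer machine.
        return BASE_URLS[os.environ.get("TEST_ENV", "dev")]

    def test_service_is_reachable(base_url):
        # Placeholder check; a real test would call the service at base_url.
        assert base_url.startswith("http")
    ```

    Running `TEST_ENV=staging pytest` then points the same suite at a different environment without touching the test code.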

    Tools Are Just the Means and Not the End

    The tools and frameworks you use to achieve test automation are not the real goal of your testing efforts. If you focus on the big picture, the real goal is to support new development efforts by providing rapid feedback to the team. This helps keep everyone informed about the current state of the project so that interested stakeholders can make informed decisions. Since tools and frameworks are only a means to achieve a much broader end, it is important to not get obsessed with the new tools and lose sight of the ultimate goal.

    It is also important to keep your tests and test data independent of the selected test automation tool as much as possible. Creating tests with hard-coded test data, system configuration, and properties makes them difficult to maintain. And in the long run, strong coupling with the test data makes it difficult to change your test tooling midway through the project if you were to run into any unanticipated issues.
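
    As a small illustration of that decoupling, assuming a Python test suite (the file name and keys are invented for this example), test data can live in an external file rather than inside the tests:

    ```python
    # Test data is read from an external file, so the tests carry no hard-coded
    # values and the data survives a change of test tooling. test_data.json and
    # the "valid_user" key are assumptions for this sketch.
    import json
    from pathlib import Path

    def load_test_data(name):
        data_file = Path(__file__).parent / "test_data.json"
        return json.loads(data_file.read_text())[name]

    def test_login_form_accepts_valid_user():
        user = load_test_data("valid_user")
        # A real test would drive the application with this data; here we only
        # verify the expected fields are present.
        assert "username" in user and "password" in user
    ```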

    Create Meaningful Tests and Don't Try to Automate Everything

    The most important part of your test solution is the "tests". A lot of teams spend most of their time and effort creating a nice framework with a lot of features but bereft of any meaningful tests. Don't let the framework code become more important than the test code. The prospect of creating a state-of-the-art framework is tempting, but you should avoid this trap. The real value of any test automation effort is derived from the tests it produces, so stay focused on creating meaningful tests.

    Additionally, do not automate for the sake of automation. Give due consideration to concerns like maintainability and execution time before adding new tests. Each test the team adds to the automated test suite becomes part of the production code base and therefore must be maintained just like the rest of the code base, for the entire life of the application. Tests that are overly complex or difficult to maintain end up slowing down the feedback cycle to the team and should be avoided.

    Get It Out Of Your Local Machine

    If you are creating automated tests for your project and the only place the automation runs is your local machine, the tests are of little use. Everyone on the team should benefit from the safety net of tests you have created, and to do that you must get the tests off your local machine.

    The automated test suite should be easily accessible to all team members, and everyone should be able to execute it at the push of a button. If someone wants to run the tests before or after committing a large number of changes, they should be able to run the suite and get feedback with minimum fuss. Ideally, the automated test suite should be hosted on an external server and wired to run as part of the build process or the continuous integration environment. Schedule the test suite to run frequently (on every check-in, or at least daily) and keep tabs on the status. Have notification mechanisms in place to tell everyone involved the latest test execution status.
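
    How this is wired up depends entirely on your build or CI server. Purely as an illustration of the notification idea (the pytest command, SMTP host and addresses below are placeholders, not our setup), a scheduled job could be as simple as:

    ```python
    # Run the regression suite and mail the outcome to the team. Everything named
    # here (pytest, smtp.example.internal, the addresses) is a placeholder; a CI
    # server would normally provide the scheduling and notification instead.
    import smtplib
    import subprocess
    from email.message import EmailMessage

    def run_suite_and_notify():
        result = subprocess.run(["pytest", "-q"], capture_output=True, text=True)
        status = "PASSED" if result.returncode == 0 else "FAILED"

        msg = EmailMessage()
        msg["Subject"] = f"Nightly regression suite: {status}"
        msg["From"] = "ci@example.internal"
        msg["To"] = "team@example.internal"
        msg.set_content(result.stdout[-5000:])  # tail of the run log

        with smtplib.SMTP("smtp.example.internal") as server:
            server.send_message(msg)

    if __name__ == "__main__":
        run_suite_and_notify()
    ```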

    Execution Time Matters

    How long the test suite takes to run all of the tests is a critical concern. If the test automation takes too long to execute, it ceases to add value, since the intended feedback is no longer quick (especially for Agile projects with short iterations). In addition, those running the test suite will quickly stop running it, since it is just too painful to wait that long. Use parallelism, production-like infrastructure, or any other trick in the book, but make your tests run fast so that you can maintain quick feedback cycles.

    Additionally, test cases can be tagged and run selectively based on the component/feature being worked on. The ability to run selected tests will save execution time especially if the test suite is fairly large and running the entire test suite doesn't make real sense in the given context.
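
    One concrete way to do this tagging, assuming pytest (the marker names here are made up for the example), is with markers:

    ```python
    # Tag tests so subsets can be run on demand. The marker names "smoke" and
    # "reports" are illustrative; registering them (e.g. in pytest.ini) avoids
    # warnings when the suite runs.
    import pytest

    @pytest.mark.smoke
    def test_login_page_loads():
        # Placeholder body; a real test would drive the application under test.
        assert True

    @pytest.mark.reports
    def test_monthly_report_totals():
        assert True
    ```

    `pytest -m smoke` then runs only the smoke subset, and if the pytest-xdist plugin is installed, `pytest -n 4` spreads the run across four workers for the parallelism mentioned above.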

    Keep It Green

    Keep all the tests in your test suite "green" (i.e. running successfully). Sometimes, however, you may have tests that are failing for known reasons. Perhaps part of the system is unavailable, or perhaps a fix is under development but may be delayed for some time. In this situation, you might choose to tag these tests as "known failures", allowing the test framework to ignore or skip them. Doing this keeps the build from failing and lets the other tests execute, so that new test failures are immediately visible.
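
    If your framework is pytest, for instance, a known failure can be tagged roughly like this (the ticket reference and test are placeholders):

    ```python
    # Mark a test as a known failure so it is reported but does not turn the
    # build red while the fix is pending. PROJ-123 is a placeholder reference.
    import pytest

    @pytest.mark.xfail(reason="PROJ-123: payment gateway stub unavailable in the test environment")
    def test_refund_is_posted_to_ledger():
        # Expected to fail until the fix lands; remove the marker to re-enable.
        assert False
    ```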

    Be disciplined and make it a priority to turn tests green as soon as they turn red (i.e. fix the tests or fix the code). The sooner you address failed tests, the easier it is to correct them, especially if it was a recently checked-in code change that made them fail. Pay attention as well to test suites that never seem to fail, since simply having automated tests in place can create a false sense of security. Conversely, tests that are brittle and fail frequently can create a sense of insecurity. In either case it should provoke some investigation by the team.

    Some amount of testing of the test code is recommended to flush out problems, and create confidence that the automated tests are working reliably. Also, it is a good idea to use data fuzzing so that you don't use the same data for every test execution. This will help to create more robust and meaningful tests.
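
    A minimal sketch of that kind of data variation (the helper and the username format are invented for the example) logs the seed so a failing run can still be reproduced:

    ```python
    # Vary the test data between runs instead of always using the same values,
    # while keeping failures reproducible by printing the seed.
    import random
    import string

    def generate_username(rng):
        # Invented helper for this sketch.
        return "user_" + "".join(rng.choices(string.ascii_lowercase, k=8))

    def test_signup_accepts_generated_username():
        seed = random.randrange(1_000_000)
        print(f"test data seed: {seed}")  # appears in the report when the test fails
        rng = random.Random(seed)
        username = generate_username(rng)
        # A real test would register this user; here we only check the shape.
        assert username.startswith("user_") and len(username) == 13
    ```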

    Precise Reporting

    Make it a point to spend some time on the reporting feature of your test framework. Report failures and errors in a clear and concise manner so that people who are investigating them need not spend too much time figuring out what went wrong and where. Keep the reports absolutely simple and make them visual if you can afford to do so.
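
    Even something as small as a descriptive assertion message pays off here. A sketch, where the order_total function stands in for real application code:

    ```python
    # Say what was expected, what was observed, and where to look, so the report
    # reads clearly without digging through logs. order_total is a stand-in for
    # real application code.
    def order_total(items):
        return sum(price for _, price in items)

    def test_order_total_includes_all_items():
        items = [("book", 12.50), ("pen", 1.25)]
        total = order_total(items)
        assert total == 13.75, (
            f"order_total returned {total} for {items}; expected 13.75 "
            "(check rounding in the pricing module)"
        )
    ```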

    Make It Visible To Everyone

    Last but not least, keep the process simple and accessible and make the results of test automation visible to all stakeholders. Have the test history and trends available online and, if possible, hook them into a code quality analysis tool like Sonar. Let people look at the results from their own perspective. Make it extremely simple to add and update tests and let everyone participate in making the suite even better.

    Summary

    To summarize, the test automation techniques we found most effective, and the ones you should pay particular attention to, include:

    • Start small at the beginning of the project and build the test suite iteratively with each passing sprint.
    • Create a test automation backlog to serve as a prioritized list of automation tasks. This will help you stay focused on the immediate tasks without losing sight of the long-term goals. Have a good look at the available testing tools and their capabilities and don't be afraid to invest a sprint or two in getting your hands dirty with them. This will ensure that you get started with the best available testing options and establish a realistic assessment of their features.
    • Keeping your tests and data loosely bound will help you to switch testing tools with ease in the future if such a need arises.
    • Create meaningful tests and give due consideration to concerns like maintainability and execution time while adding tests to the automation suite.
    • As soon as possible, make every effort to enable the whole team to use the safety net you've created by putting it on a build/CI system.
    • Create meaningful tests and ensure that they don't create a false sense of security.
    • Make every effort to resolve test failures quickly and keep test execution times as short as possible.
    • Last but not least, have an intuitive reporting mechanism in place and give everyone on the team visibility into test results and historical trends. This will help everyone involved in the project monitor the progress and health of development and make more informed decisions.

    Conclusion

    In this article, I shared many of the lessons my team and I learned while implementing test automation on a recent project. The test automation techniques described in this article are by no means a complete list. They are just a compilation of a few little gems I gathered while implementing test automation with a great team.

    It's important to remember that automating software tests lets computers do what they do best: rapidly executing regression test suites to verify software functionality over and over again. This frees the people on the team to focus on what we do best: developing and testing the system in an exploratory fashion, using our cognitive skills.

    If you follow the techniques recommended in this article and invest in test automation, you too will be able to build, integrate, test and deploy your application software with production-like quality, in a production-like environment, every single day.


