The Release Engineering team assists project managers, business analysts and software engineers in ensuring that IS&T's deployed applications, Application Programming Interfaces (APIs) and Software-as-a-Service offerings are production ready.  Our testing engineers offer consultation regarding testing strategies and build automated API tests, performance tests and regression tests.  The Veracode platform is available to IS&T staff for security testing.  See below for details about our testing services.  Questions?  Email release-engineering@mit.edu. 

A note about system requirements 

  • For new projects, system requirements should be written such that they can be used as test criteria.  They will be used as test cases for performance testing and regression testing. 

API Testing

  • Objective: 
    • Validate that GET / POST / PUT requests are processed as expected, based on system requirements and the API RAML specification
    • Validate any business logic encoded in the API, based on business requirements
    • Validate that negative test cases have been adequately covered
       
  • Tools:
    • Python scripts (a minimal example sketch appears at the end of this section)
    • Gazoonta tester: a Java application that uses XML input to make JSON requests to a REST API and to run SQL commands against the database
    • Travis CI: used to build and then run the API tests after deployment to DEV
       
  • Timing:
    • Ideally, the API tests are written as the API becomes available in the Dev and Test environments.  When test scripts are written during API development, software engineers can use the scripts to test that new development does not have unintended consequences for existing code.
    • API tests are automatically executed for each build of the API.
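
To give a concrete sense of what these automated API tests look like, below is a minimal sketch written in Python with the requests library, runnable under pytest.  The base URL, the /widgets resource and the expected fields are illustrative assumptions only, not a real IS&T API; actual tests are derived from the API's RAML specification and business requirements.

    # Minimal API-test sketch (assumed endpoint and fields, for illustration only).
    # Run with pytest; real tests follow the API's RAML specification.
    import requests

    BASE_URL = "https://example-api.mit.edu/v1"  # placeholder, not a real API

    def test_get_widget_returns_expected_fields():
        # Positive case: GET an existing record and validate the response shape.
        resp = requests.get(f"{BASE_URL}/widgets/123", timeout=10)
        assert resp.status_code == 200
        assert {"id", "name", "updated_at"}.issubset(resp.json().keys())

    def test_post_widget_creates_record():
        # Positive case: POST a new record and confirm it is echoed back.
        resp = requests.post(f"{BASE_URL}/widgets", json={"name": "test-widget"}, timeout=10)
        assert resp.status_code == 201
        assert resp.json()["name"] == "test-widget"

    def test_post_widget_rejects_missing_name():
        # Negative case: the API should reject an invalid request body.
        resp = requests.post(f"{BASE_URL}/widgets", json={}, timeout=10)
        assert resp.status_code in (400, 422)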

Performance Testing

    • Objective: 
      • Ensure average response times are consistent with product owner expectations.  
      • Create test scripts based on relevant user scenarios.
      • Work with Business Analysts and Project Managers to develop accurate use cases and test scenarios. 
      • Execute load/stress testing based on current and predicted usage metrics.
      • Calibrate load testing based on system requirements.
         
    • Tools: 
      • Blazemeter, JMeter, New Relic, Selenium WebDriver, Taurus (a simplified response-time sketch appears at the end of this section)
         
    • Timing:
      • Performance testing can usually be done once the application is available in the Test environment.
      • Depending on complexity, performance testing can take from 2 days to 2 weeks.
      • Allow time in the project timeline for mitigation of performance issues.
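
To illustrate what a performance check measures, here is a simplified sketch in Python that issues concurrent requests and compares average and 95th-percentile response times against a target.  The URL, request count, concurrency and 500 ms threshold are illustrative assumptions; real load and stress tests are built with JMeter, Taurus or BlazeMeter and calibrated against product owner expectations and usage metrics.

    # Simplified response-time check (illustrative values only; real load tests
    # are built with JMeter / Taurus / BlazeMeter).
    import statistics
    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    URL = "https://example-app.mit.edu/health"  # placeholder endpoint
    NUM_REQUESTS = 100
    CONCURRENCY = 10
    TARGET_AVG_MS = 500  # assumed target agreed with the product owner

    def timed_request(_):
        # Return the elapsed time of one GET request, in milliseconds.
        start = time.perf_counter()
        requests.get(URL, timeout=30)
        return (time.perf_counter() - start) * 1000

    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        timings = list(pool.map(timed_request, range(NUM_REQUESTS)))

    avg_ms = statistics.mean(timings)
    p95_ms = statistics.quantiles(timings, n=20)[18]  # 95th percentile
    print(f"average: {avg_ms:.1f} ms, 95th percentile: {p95_ms:.1f} ms")
    assert avg_ms < TARGET_AVG_MS, "average response time exceeds the agreed target"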

Functional Testing

    • Objective:
      • Validate that the defined use cases behave as expected.
         
    • Tools:
      • Selenium WebDriver, Python, HP Unified Functional Testing (UFT); a minimal Selenium example appears at the end of this section
         
    • Timing:
      • During requirements definition, system requirements should be written such that they can be used as test case scenarios.
      • Begin executing regression tests when the application is available in the Test environment.
      • We see the highest return on investment when automated regression tests are written during application development.  This allows developers to use the scripts to test that new development does not have unintended consequences for existing code.
         
    • Consult with the Release Engineering team as to whether the project is a good candidate for automated regression tests.  Factors to consider:
      • BA or product owner resources available to help define test cases and validate that automated test scripts cover the needed functionality
      • Stable data set against which to test or ability to seed data for testing
      • Identification and maintenance of authorizations needed in order to run the automated tests
      • Consider the return on investment (ROI) of writing automated regression tests for this application:
        • Will the tests be used during development phase to validate that new builds do not break existing functionality?
        • After release, how often is the application likely to be updated?
        • Will automated regression tests enable the development team to make more frequent releases for bug fixes and enhancement requests?
        • Are there additional factors that necessitate the application being tested on a regular basis? (e.g. SAP Annual Support Packs or another annual system upgrade)
        • Rough estimate of how long it will take to create the automated test scripts. Do objects on the page have unique identifiers which can easily be used in the test scripts? How many test cases are needed?
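
As an illustration of an automated regression test, the minimal sketch below uses Selenium WebDriver with Python.  The URL, element IDs and expected text are placeholder assumptions; real scripts are driven by the use cases defined with the BA or product owner, and unique element identifiers (noted above) make scripts like this much easier to write and maintain.

    # Regression-test sketch using Selenium WebDriver (Python bindings).
    # The URL, element IDs and expected text are placeholders for illustration.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support import expected_conditions as EC
    from selenium.webdriver.support.ui import WebDriverWait

    driver = webdriver.Chrome()
    try:
        driver.get("https://example-app.mit.edu/search")

        # Use case: searching for a known record returns the expected result.
        driver.find_element(By.ID, "search-box").send_keys("Jane Doe")
        driver.find_element(By.ID, "search-button").click()

        first_result = WebDriverWait(driver, 10).until(
            EC.visibility_of_element_located((By.CSS_SELECTOR, ".result-row"))
        )
        assert "Jane Doe" in first_result.text
    finally:
        driver.quit()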

Security Testing

    • Objective:
      • Detect and mitigate security vulnerabilities within the application
    • Veracode:
      • The Release Engineering team can facilitate your use of the Veracode security platform.
      • Veracode enables developers to detect security vulnerabilities and receive suggestions for mitigation
      • IS&T is currently licensed for 50 apps per year.
      • For Java applications, upload the .war file for scanning by Veracode
      • For non-Java applications and APIs, consult with Veracode support team
    • Timing:
      • During development phase
      • For Java applications, Veracode can be integrated into Eclipse IDE

More about the Release Engineering Team

Contact Us

Team email: release-engineering@mit.edu

    ...