
1.0 Document identifiers

1.1 Document Author

The document author is:

||Author||Title||Telephone||Email Address||
|Will Smithee|Senior Practice Manager|336-232-5208|will_smithee@questcon.com|

1.2 Document Revisions

||Issue||Date||Author||Reason for Change||
|0.1|01/27/2008|Will Smithee|Initial draft|

1.3 References

The following documents were used as sources of information for this test plan:

  • Questcon Technologies, The Questcon Test Management Methodology; 01/07/2005; (Test Management Methodology Release 4.0.doc).

    2.0 Introduction

    2.1 Purpose
The objective of this test plan is to outline the test effort to be undertaken for the Touchstone project. The Touchstone application is a rich-content internet application framework for managing images and other digital media. It is part of a long-term content management strategy by MIT's Infrastructure Software Development and Architecture (ISDA) department. It is made up of two parts: a web client and an Image Management Engine (IME) that forms a framework of distinct, reusable components. The web client is built using OpenLaszlo, which ports to Flash and to DHTML; as a result, the user experience will feel more like a desktop application than a traditional web application.
    The intended audience of this document includes all IT personnel involved in the development, testing, and support of Touchstone.

    2.2 Scope

    2.2.1 Items To Be Tested

Each of the following UI components and areas of front-end functionality developed as part of the Touchstone project will be tested:
  • Domains
  • Libraries
  • Collections
  • Slideshows
  • Items
  • Search
  • Sharing
  • Login (for users who do not have certificates)
  • Users
  • Administration/Authorization/Registration
    • User Management (includes registration of new users)
    • Domain Management (includes editing metadata mapping)
    • Tag Management (manage categories and tags)
    • Authorization (validate different users can only see/access things they are authorized to)

      2.2.2 Items Not To Be Tested

The following modules and types of tests are considered outside the scope of this test effort and will not be tested by Questcon; testing of these items will be performed by internal MIT personnel:
  • All backend Touchstone APIs
    • Library
    • Libraries
    • Authz
    • Authzs
    • Bulk
    • Categories
    • Category
    • Item
    • Items
    • Map
    • Maps
    • User
    • Users
  • Backend Authentication
  • Repository
  • Security testing
  • Performance & scalability testing
  • Recovery testing

    2.3 Risks & Contingencies

    The following risks have been identified, which may impact the testing effort.

||Risk||Contingency||
|QA environment not available|Utilize development or production environment|
|Insufficient time to fully test the Touchstone application with all major web browsers|Execute ~80% of application functionality with MIT's standard browser (Mozilla Firefox 2.0) and ~20% of the functionality with other browsers|
|Test time increases due to changes in scope requiring additional test analysis and/or test case creation|If test time cannot be increased, reduce/cut overlap in multi-browser testing and execute highest-priority test cases first, followed by lower-priority tests until test time runs out|
|Excessive defect rate or defect correction time|Execute test cases in unrelated/unblocked functional areas of the application based on designated priority|
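The priority-first contingency above amounts to a simple scheduling rule: run the highest-priority cases first and stop when test time runs out. A minimal sketch follows; the test-case fields, priorities, and hour estimates are illustrative assumptions, not part of this plan.

```python
# Hypothetical sketch of the priority-first contingency: execute the
# highest-priority test cases first until the time budget is exhausted.
# Field names and durations are illustrative assumptions.

def plan_run(test_cases, budget_hours):
    """Return the test case ids that fit in the budget, highest priority first."""
    scheduled, used = [], 0.0
    # Lower number = higher priority (1 is most critical).
    for case in sorted(test_cases, key=lambda c: c["priority"]):
        if used + case["est_hours"] <= budget_hours:
            scheduled.append(case["id"])
            used += case["est_hours"]
    return scheduled

cases = [
    {"id": "TC-01", "priority": 1, "est_hours": 2.0},  # e.g. login
    {"id": "TC-02", "priority": 3, "est_hours": 4.0},  # e.g. slideshow edge cases
    {"id": "TC-03", "priority": 2, "est_hours": 1.5},  # e.g. search
]
print(plan_run(cases, budget_hours=4.0))  # → ['TC-01', 'TC-03']
```

Cases that do not fit the remaining budget are simply skipped, which matches the contingency of dropping lower-priority tests rather than compressing all of them.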

    3.0 Approach

    3.1 Testing Strategy

The overall approach to this test effort will be to validate that the Touchstone UI meets the needs of the user community as a tool for managing images and other digital media. Validation will be performed using test cases derived from the documented use cases and front-end functional designs, as well as exploratory testing heuristics.

MIT has indicated that the user community is largely standardized on Mozilla Firefox v2.0 in a PC or Mac environment. Rather than re-executing all tests with each browser, Questcon will execute approximately 80% of the test cases using Firefox and approximately 20% using IE 6 and 7 (both on a PC/Windows architecture). There will be some overlap in testing and touch points, but not enough to significantly impact the schedule.

Questcon will analyze the test cases to identify the best candidates for execution using IE 6 and 7. These test cases will be chosen based on the amount of functionality traversed in the application; in other words, Questcon will attempt to "touch" as much of the application as possible using IE 6 and 7.

Furthermore, a significant portion of the user community utilizes Mac OS X with the Safari browser. Some duplicate testing (10% or less) will be performed by Questcon using Safari and Firefox for the Mac. MIT should designate a group of users to execute additional tests using the Safari/Mac and Firefox/Mac combinations of browser and operating system. Questcon will assist the users in identifying the best tests to execute.
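Choosing IE test cases that traverse as much of the application as possible is, in effect, a greedy coverage problem. The sketch below illustrates one way to make that selection; the feature-area names and case ids are hypothetical, not taken from the plan.

```python
# Greedy heuristic for picking a small set of test cases that "touch"
# as many functional areas as possible (hypothetical sketch; feature
# names are illustrative, not from the plan).

def pick_cases(cases, quota):
    """Greedily pick up to `quota` cases, maximizing newly covered areas."""
    covered, picked = set(), []
    for _ in range(quota):
        # Choose the case adding the most not-yet-covered areas.
        best = max(cases, key=lambda c: len(c["areas"] - covered))
        if not (best["areas"] - covered):
            break  # nothing new left to cover
        picked.append(best["id"])
        covered |= best["areas"]
    return picked, covered

cases = [
    {"id": "TC-10", "areas": {"login", "search"}},
    {"id": "TC-11", "areas": {"collections", "slideshows", "items"}},
    {"id": "TC-12", "areas": {"search", "sharing"}},
]
picked, covered = pick_cases(cases, quota=2)
print(picked)           # → ['TC-11', 'TC-10']
print(sorted(covered))
```

The same heuristic could help MIT's designated users pick the best tests for the Safari/Mac and Firefox/Mac passes.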
    The following table outlines the various types of testing considered for this test effort, any additional comments about the testing, and the individual or group responsible for completing the testing.

||Type of Testing||Included (Y/N)||Comments||Team Responsible||
|Automation|Y|MIT personnel will utilize JMeter to automate portions of the backend testing; no test automation tools will be used for the UI testing (the use of Flash prevents this).|MIT - Tester|
|Conversion|N|There is no pre-existing system, therefore no data conversion is necessary.|N/A|
|Exploratory|Y|Some level of exploratory testing will be conducted based on heuristics for typical rich-content internet applications.|Questcon|
|Functional|Y|Functional testing will be performed based on test cases derived from the documented use cases and front-end functional design.|Questcon|
|Installation / Upgrade|N|Because this is a web application, no installation testing is necessary.|N/A|
|Integration|Y|Some integration testing will naturally occur as the front-end of the Touchstone application interfaces with and utilizes the back-end APIs.|Questcon|
|Parallel|N|There is no existing system that Touchstone is replacing.|N/A|
|Performance|Y|Performance testing will be done on the back-end APIs and servlets. No front-end performance testing will be done.|MIT - Tester|
|Regression|Y|Questcon expects to run at least a minimum regression test set prior to release to production.|Questcon|
|Security|Y|Backend security testing will be done by MIT. Questcon will execute basic security/login testing on the front-end.|MIT - Tester (backend); Questcon (front-end)|
|UAT|Y|The user community will be tasked with performing ad-hoc user acceptance testing, domain-specific metadata testing (metadata titles, tag lists, etc.), as well as previously designated documented functional test cases for multiple browser/OS configurations (primarily Safari/Mac or Firefox/Mac).|MIT - User Community|
|Unit|Y|Questcon expects the MIT developers to perform unit testing prior to releasing code to the test environment.|MIT - Developers|

    3.2 Tools

    The following tools will be used as part of the overall Touchstone testing effort:

||Tool||Purpose||Used By||
|Atlassian Jira|Web-based defect tracking system, accessed at http://mv.ezproxy.com.ezproxyberklee.flo.org/jira|Touchstone Project Team (MIT & Questcon)|
|Apache JMeter|Backend performance testing|MIT - Tester|

    3.3 Environmental Needs

    Questcon anticipates the following server and client configurations for the QA environment:

    3.3.1 Web Server Configuration

The QA environment web server may be accessed via:
  • Secure (certificate) servers: https://mv-ezproxy-com.ezproxyberklee.flo.org/thalia-ime/
  • Non-certificate servers: http://mv.ezproxy.com.ezproxyberklee.flo.org/thalia-ime/

||Hardware||O/S||Other||
|HP G4|Red Hat Enterprise Linux AS release 4|Apache 2.2.4, Tomcat 5.5.23, OpenSSL 0.9.8a, mod_jk 1.2.21, JDK 1.6.0|

    3.3.2 Repository Server Configuration

||Hardware||O/S||Other||
|HP G4|Red Hat Enterprise Linux AS release 4|Alfresco 2.0.1 Enterprise|

    3.3.3 Database Server Configuration

||Hardware||O/S||Other||
|HP G4|Red Hat Enterprise Linux AS release 4|Oracle 10g|

    3.3.4 Client Configuration

||Hardware||O/S||Other||
|PC|Windows XP Professional SP2|Mozilla Firefox v2+, Microsoft IE v6+, Adobe Flash v9+|
|Macintosh PowerPC|Mac OS X|Firefox v2+, IE v6+, Flash v9+, Safari v2+|

    4.0 Schedule of Deliverables and Resources

    4.1 Deliverables

This section identifies each deliverable, its expected delivery date, and the resource responsible for it.

||Key Deliverables||Expected Delivery Date||Resource||
|Functional Test Tree|05/04/2007|Bill Silver|
|Test Plan|05/07/2007|Shaun Bradshaw|
|Test Case Designs|05/14/2007|Sylvia Stanfield|
|Test Cases|05/28/2007|Sylvia Stanfield|
|Status Reports|Weekly|Shaun Bradshaw|
|Test Logs|Ongoing during test execution|Sylvia Stanfield|
|Defect Reports|Ongoing during test execution|Sylvia Stanfield, Shaun Bradshaw|
|Test Summary Report|07/17/2007|Shaun Bradshaw|

    4.2 Test Schedule

The planned test schedule for the Touchstone project has an anticipated start date of 04/25/2007 and a completion date of 07/17/2007. The estimated completion date is based on several assumptions, some of which are identified in section 2.3, Risks & Contingencies.
Develop test strategy / plan (04/25/2007 - 05/07/2007)
  • Analyze existing design documents, notes, and other available materials
  • Develop test plan document

Review test plan (05/07/2007 - 05/08/2007)
  • Review, clarify, correct, and update the test plan

Perform test analysis (04/25/2007 - 05/14/2007)
  • Develop FTT
  • Develop test case design document

Review FTT & test case design (05/14/2007 - 05/18/2007)
  • Review, clarify, correct, and update the test case design

Build functional test cases / scenarios (05/14/2007 - 05/28/2007)
  • Combine test objectives into test cases
  • Document data, procedures, and results
  • Prioritize test cases
  • Determine which test cases will be executed in different browser/OS configurations

Setup test environment (04/25/2007 - 05/14/2007)
  • Setup web server and database server
  • Load application under test
  • Setup logins and authorizations

Setup test data (05/14/2007 - 05/28/2007)
  • Review & analyze test cases to target data to load in test environment
  • Load initial test data set

Execute functional & exploratory tests (05/28/2007 - 06/29/2007)
  • Execute documented test cases, as well as exploratory tests
  • Communicate with the development team when issues are found
  • Maintain a test run log
  • Track test metrics

Investigate / correct defects (05/28/2007 - 07/17/2007)
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Execute regression tests (06/29/2007 - 07/06/2007)
  • Execute a prioritized subset of test cases as a regression of the system once all functional and exploratory testing is complete
  • Validate that no new errors have been introduced as a result of correcting known defects or configuration management / version control issues
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Execute UAT (06/29/2007 - 07/13/2007)
  • Work with the user community to identify and manage the execution of user acceptance tests
  • Communicate with the development team when issues are found
  • Investigate and validate that a defect has been found
  • Log defects in Jira
  • Work with the development team, as necessary, to identify the cause of the defect
  • Accept and retest defect corrections from the development team

Create test summary (07/13/2007 - 07/17/2007)
  • Create and deliver a test summary report to include:
    o Summation of planned/actual test activities
    o Deviation from planned activities
    o Summary of defects (open defects)
    o Summary of test metrics
     
