Massachusetts Institute of Technology Touchstone Performance Test Plan  

Abstract

This test plan prescribes the scope, approach, types of performance testing, resources, and high-level schedule of the testing activities to be performed in the Touchstone project.  It identifies the use cases, data, and related systems to be included in the testing process.

1.0 Document Identifiers

1.1 References

The following documents were used as sources of information for this test plan:

  • Questcon Technologies, The Questcon Test Management Methodology; 01/07/2005; (Test Management Methodology Release 4.0.doc).
  • Questcon Technologies, MIT SOW Testing Touchstone; 04/04/2007; (MIT SOW Testing April 07.doc).

2.0 Introduction

2.1 Purpose

The objective of this test plan is to outline the performance testing effort to be undertaken for the Touchstone project.

2.1.1 Project Description

MIT Touchstone is a new suite of technologies, introduced by IS&T, for authenticating a variety of web applications. MIT Touchstone provides a single sign-on solution for applications that have been coded and configured to use the system. Within the context of Touchstone-enabled applications, users will be able to transition seamlessly between systems without being prompted for additional authentication information.

The intended audience of this document includes all IT personnel involved in the development, testing, and support of Touchstone.

2.1.2 Project Technologies

MIT Touchstone utilizes/integrates with the following technologies:

  • Stanford's WebAuth
  • Internet 2's Shibboleth
  • SAML (the Security Assertion Markup Language)
  • A new account management system for some users outside of the traditional MIT community
  • HTTP/S (extensive redirects)
  • SSL
  • MIT X.509 certificates
  • Kerberos (via the HTTP/SPNEGO protocol)
  • TLS
  • OpenID
  • Web Services
  • MySQL (including replication)
  • Apache
  • Tomcat
  • IDP High Availability Package
  • LDAP
  • KDC
  • DNS load balancing

2.2 Scope

2.2.1 Items To Be Tested

Each of the following business processes (user flows) will be tested under load:

  • CAMS Account Creation
  • CAMS Account Authentication
  • CAMS Account Association (OpenID)
  • Authenticated Kerberos user access
  • Kerberos user id and password authentication
  • Authenticated OpenID user access

2.2.2 Items Not To Be Tested

The following modules and types of tests are considered to be outside the scope of this test effort and will not be tested by Questcon:

  • MIT X.509 certificate access
  • Kerberos (HTTP/SPNEGO) access
  • CAMS Account Association (Kerberos (HTTP/SPNEGO))

2.3 Risks & Contingencies

The success of this testing effort is dependent on the following:

  • Production-like test environment
  • Production-like setup and settings
  • Fully operational test tools
  • Involvement of subject matter experts (SMEs) for all stages of the testing effort
  • Adequate Non-functional Requirements
  • Sufficient access to systems in order to monitor them (this includes any necessary server-side scripts which may need to be developed in order to capture desired metrics)

If any of the above items are not available, the testing effort will not reach all of its goals.  If these items are delayed, the testing effort will take longer than expected.

If we encounter any substantial issue that requires significant modification of the application or re-configuration of the system, some testing may need to be redone, possibly including re-scripting.

3.0 Approach

3.1 Testing Strategy

The overall strategy for performance testing the Touchstone project is goal-based.  There are four main goals we hope to achieve:

  1. Performance - Benchmark the system to ensure it meets all non-functional requirements related to performance.
  2. Stress - Push the system to its breaking point and beyond to identify how and under what level of load the system fails, as well as the ramifications of such a failure (see the ramp sketch following this list).
  3. Endurance - Place the system under a heavy, yet manageable, load for a protracted period of time to identify any performance degradation and/or memory leaks.
  4. Fail-over - Place the system under a heavy, yet manageable, load, wait for it to stabilize, and then disconnect the servers from their network connections to identify how the system handles the sudden loss of a server.  This will help satisfy any up-time SLAs or non-functional requirements.
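
For the stress goal in particular, load will be increased in steps until the system fails. Since neither the tool nor the actual step sizes have been agreed, the following is only a minimal sketch (in Python, with placeholder values) of the kind of stepped ramp profile we have in mind:

```python
"""Stepped ramp profile for the stress goal (sketch only; the step size, step
duration, and ceiling below are hypothetical placeholders, not agreed values)."""

def stepped_ramp(step_users=25, step_minutes=10, max_users=500):
    """Yield (start_minute, concurrent_users) pairs for a stepped stress ramp.

    Load increases by step_users every step_minutes until max_users is reached
    (or until the system under test fails, whichever comes first).
    """
    users, minute = 0, 0
    while users < max_users:
        users += step_users
        yield minute, users
        minute += step_minutes

if __name__ == "__main__":
    for start, users in stepped_ramp():
        print(f"t+{start:3d} min: {users} concurrent users")
```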

Scripts will be designed to model various user interactions with the system.  While most of the user interactions will be scripted, some may be omitted according to the 80/20 rule and/or any time constraints which may exist.
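
As an illustration of what such a script might look like, the sketch below drives a single user flow with only the Python standard library and reports a 95th-percentile response time. The URL, user count, and iteration count are hypothetical placeholders, not agreed values; the real scripts will be built in whatever tool is selected (section 3.2).

```python
"""Minimal load-generation sketch using only the Python standard library.
Assumptions: the tool has not been selected (section 3.2), so this is
illustrative only; the URL, user count, and iteration count are placeholders."""
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET_URL = "https://idp.example.mit.edu/protected/"  # hypothetical Touchstone-protected page
VIRTUAL_USERS = 25        # concurrent simulated users
ITERATIONS_PER_USER = 40  # requests issued by each virtual user

def one_transaction(_):
    """Issue one request and return (response time in seconds, success flag)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(TARGET_URL, timeout=30) as resp:
            resp.read()                    # drain the body, as a browser would
            ok = resp.status == 200
    except Exception:
        ok = False
    return time.monotonic() - start, ok

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        results = list(pool.map(one_transaction, range(VIRTUAL_USERS * ITERATIONS_PER_USER)))

    times = sorted(t for t, ok in results if ok)
    errors = sum(1 for _, ok in results if not ok)
    p95 = times[int(0.95 * len(times)) - 1] if times else float("nan")
    print(f"transactions: {len(results)}  errors: {errors}  95th percentile: {p95:.3f}s")
```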

3.2 Tools

The tools we will employ are yet to be determined.

3.3 Environmental Needs

We will need the following:

  • Stable, production-like system to test against.
  • Stable hardware and software with which to generate load.
  • Adequate rights and privileges to capture server-side metrics (monitoring), as well as any server-side scripts necessary to accomplish the needed monitoring (a minimal example of such a script is sketched below).
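
Where existing monitoring is insufficient, the server-side scripts mentioned above can be very simple. The following is a minimal sketch, assuming a Linux host and only standard-library Python, that samples load average and available memory from /proc and writes them to a CSV file; the output path and sampling interval are placeholders.

```python
"""Minimal server-side monitoring sketch (assumptions: Linux host, Python
standard library only; the output file and sampling interval are placeholders)."""
import csv
import time

INTERVAL_SECONDS = 10
OUTPUT_FILE = "touchstone_metrics.csv"  # hypothetical output location

def read_loadavg():
    """Return the 1-minute load average from /proc/loadavg."""
    with open("/proc/loadavg") as f:
        return float(f.read().split()[0])

def read_mem_available_kb():
    """Return MemAvailable (kB) from /proc/meminfo, or -1 if not found."""
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith("MemAvailable:"):
                return int(line.split()[1])
    return -1

if __name__ == "__main__":
    with open(OUTPUT_FILE, "w", newline="") as out:  # overwrites any previous run
        writer = csv.writer(out)
        writer.writerow(["timestamp", "loadavg_1m", "mem_available_kb"])
        while True:
            writer.writerow([time.strftime("%Y-%m-%dT%H:%M:%S"),
                             read_loadavg(), read_mem_available_kb()])
            out.flush()
            time.sleep(INTERVAL_SECONDS)
```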

4.0 Scripts

4.1 CAMS Account Creation

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.2 CAMS Association - OpenID

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.3 CAMS Association - Kerberos

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.4 Site Access - Kerberos w/ticket

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.5 Site Access - Web Auth

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.6 Site Access - CAMS Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.7 Site Access - OpenID

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.8 Password Reset

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.9 Admin - Password Reset

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.10 Admin - De-Activate Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.11 Admin - Delete Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.12 Admin - Activate Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

5.0 Scenarios

5.1 Performance Testing Scenarios

A performance test is designed to benchmark the system under test under a realistic load scenario that mimics anticipated real-world usage at its peak.

5.1.1 IDPi Only

The objective of this scenario is to benchmark just the internal IDP. 

5.1.1.1 Load Model

Desired Transaction Rate: ???

Script | % of Load
Site Access - Kerberos w/ticket | 50%
Site Access - Web Auth | 50%

5.1.2 IDPe Only

The objective of this scenario is to benchmark just the external IDP.

5.1.2.1 Load Model

Desired Transaction Rate: ???

Script | % of Load
CAMS Account Creation | 20%
CAMS Association - OpenID | 20%
CAMS Association - Kerberos | 20%
Site Access - CAMS Account | 20%
Site Access - OpenID | 20%

5.1.3 Integrated IDP External & Internal

The objective of this scenario is to benchmark both IDPs concurrently.

5.1.3.1 Load Model

Desired Transaction Rate: ???

Script | % of Load
CAMS Account Creation | 10%
CAMS Association - OpenID | 10%
CAMS Association - Kerberos | 10%
Site Access - CAMS Account | 10%
Site Access - OpenID | 10%
Site Access - Kerberos w/ticket | 25%
Site Access - Web Auth | 25%
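
Whatever tool is ultimately used, the percentage mix above reduces to a weighted random choice of which script each virtual user executes on each iteration. A minimal sketch of that selection logic follows; the script names come from the table above, and the sampling loop at the end merely demonstrates that the observed mix converges to the intended percentages.

```python
"""Weighted selection of scripts for the integrated IDP scenario (sketch only;
in a real run each name would dispatch to the corresponding scripted flow)."""
import random
from collections import Counter

# Load mix from section 5.1.3.1
LOAD_MODEL = [
    ("CAMS Account Creation",           0.10),
    ("CAMS Association - OpenID",       0.10),
    ("CAMS Association - Kerberos",     0.10),
    ("Site Access - CAMS Account",      0.10),
    ("Site Access - OpenID",            0.10),
    ("Site Access - Kerberos w/ticket", 0.25),
    ("Site Access - Web Auth",          0.25),
]

def pick_script():
    """Return a script name with probability equal to its share of the load."""
    names = [name for name, _ in LOAD_MODEL]
    weights = [weight for _, weight in LOAD_MODEL]
    return random.choices(names, weights=weights, k=1)[0]

if __name__ == "__main__":
    # Each virtual user would call pick_script() once per iteration; over many
    # iterations the observed mix converges to the percentages in the table.
    sample = Counter(pick_script() for _ in range(10_000))
    for name, count in sample.most_common():
        print(f"{name}: {count / 10_000:.1%}")
```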

5.2 Stress Testing Scenarios

5.2.1 IDPi Only

5.2.2 IDPe Only

5.2.3 Integrated IDP External & Internal

5.3 Endurance Testing Scenarios

5.4 Fail-over Testing Scenarios

6.0 Monitoring

7.0 Non-functional Requirements

8.0 Architectures

8.1 Physical

8.2 IDPi Logical

8.3 IDPe Logical
