
Massachusetts Institute of Technology Touchstone Performance Test Plan  

Abstract

This test plan prescribes the scope, approach, types of performance testing, resources, and high-level schedule of the testing activities to be performed in the Touchstone project. It identifies the use cases, data, and related systems to be included in the testing process.

1.0 Document identifiers

1.1 References

The following documents were used as sources of information for this test plan:

  • Questcon Technologies, The Questcon Test Management Methodology; 01/07/2005; (Test Management Methodology Release 4.0.doc).
  • Questcon Technologies, MIT SOW Testing Touchstone; 04/04/2007; (MIT SOW Testing April 07.doc).

2.0 INTRODUCTION

2.1 Purpose

The objective of this test plan is to outline the performance testing effort to be undertaken for the Touchstone project.

2.1.1 Project Description

MIT Touchstone is a new suite of technologies, introduced by IS&T, for authenticating a variety of web applications. MIT Touchstone provides a single sign-on solution for applications that have been coded and configured to use the system. Within the context of Touchstone-enabled applications, users can transition seamlessly between systems without being prompted for additional authentication information.

The intended audience of this document includes all IT personnel involved in the development, testing, and support of Touchstone.

2.1.2 Project Technologies

MIT Touchstone utilizes/integrates with the following technologies:

  • Stanford's WebAuth
  • Internet 2's Shibboleth
  • SAML (the Security Assertion Markup Language)
  • A new account management system for some users outside of the traditional MIT community
  • HTTP/S (extensive redirects)
  • SSL
  • MIT X.509 certificates
  • Kerberos (via the HTTP/SPNEGO protocol)
  • TLS
  • OpenID
  • Web Services
  • MySQL (including replication)
  • Apache
  • Tomcat
  • IDP High Availability Package
  • LDAP
  • KDC
  • DNS load balancing

2.2 Scope

2.2.1 Items To Be Tested

Each of the following business processes (user flows) will be tested under load:

  • CAMS Account Creation
  • CAMS Account Authentication
  • CAMS Account Association (OpenID)
  • Authenticated Kerberos user access
  • Kerberos user id and password authentication
  • Authenticated OpenID user access
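Each of these flows will be scripted as one or more named, timed transactions. As a minimal sketch of how a transaction's response time can be captured (the `transaction` helper and `results` store here are illustrative assumptions, not the actual test tool):

```python
import time
from contextlib import contextmanager

# Illustrative sketch only: the real test tool provides its own timing.
# Transaction names mirror the business processes listed above.
results = {}

@contextmanager
def transaction(name):
    """Record the elapsed wall-clock time of one named transaction."""
    start = time.perf_counter()
    try:
        yield
    finally:
        results.setdefault(name, []).append(time.perf_counter() - start)

with transaction("CAMS Account Authentication"):
    time.sleep(0.01)  # stand-in for the real request/response exchange
```

Collecting per-transaction samples this way is what makes the 95th-percentile response times in Section 4 computable afterwards.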

2.2.2 Items Not To Be Tested

The following modules and types of tests are considered to be outside the scope of this test effort and will not be tested by Questcon:

  • MIT X.509 certificate access
  • Kerberos (HTTP/SPNEGO) access
  • CAMS Account Association (Kerberos (HTTP/SPNEGO))

2.3 Risks & Contingencies

3.0 Approach

3.1 Testing Strategy

3.2 Tools

3.3 Environmental Needs

4.0 Scripts

4.1 CAMS Account Creation

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.2 CAMS Association - OpenID

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.3 CAMS Association - Kerberos

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.4 Site Access - Kerberos w/ticket

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.5 Site Access - Web Auth

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.6 Site Access - CAMS Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.7 Site Access - OpenID

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.8 Password Reset

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.9 Admin - Password Reset

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.10 Admin - De-Activate Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.11 Admin - Delete Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

4.12 Admin - Activate Account

Precondition:

Data Needed:

Transaction Name | Step(s) | Expected Result | 95th % Response Time

5.0 Scenarios

5.1 Performance Testing Scenarios

A performance test benchmarks the system under test against a realistic load scenario that mimics anticipated real-world usage at its peak.
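Each script in Section 4 reports a 95th-percentile response time. As a minimal sketch of how that statistic can be computed from raw timing samples (the function name and sample data are illustrative, not part of the test tooling):

```python
import math

# Illustrative sketch: 95th-percentile response time via the
# nearest-rank method. Sample data below is made up for the example.
def percentile_95(samples):
    """Return the value at or below which 95% of samples fall."""
    ordered = sorted(samples)
    rank = math.ceil(0.95 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

response_times = [0.8, 1.1, 0.9, 2.4, 1.0, 1.3, 0.7, 1.2, 3.1, 1.0]
print(percentile_95(response_times))  # with only 10 samples, p95 is the max: 3.1
```

In practice the load-test tool computes this directly, but the nearest-rank definition makes clear why a handful of slow outliers dominates the reported figure.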

5.1.1 IDPi Only

The objective of this scenario is to benchmark just the internal IDP. 

5.1.1.1 Load Model

Desired Transaction Rate: ???

Script | % of Load
Site Access - Kerberos w/ticket | 50%
Site Access - Web Auth | 50%
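A load model like this can be realized by weighted random selection of a script for each simulated user iteration. A minimal sketch, assuming the test tool allows programmatic script selection (the mechanism shown is an illustrative assumption):

```python
import random

# Illustrative sketch: weighted script selection per the load model.
# Script names come from the plan; the selection code is not the
# actual test-tool configuration.
load_model = {
    "Site Access - Kerberos w/ticket": 50,
    "Site Access - Web Auth": 50,
}

def pick_script(model):
    """Choose one script, weighted by its share of the load."""
    scripts = list(model)
    weights = list(model.values())
    return random.choices(scripts, weights=weights, k=1)[0]

# Over many iterations the observed mix converges on the configured one.
counts = {name: 0 for name in load_model}
for _ in range(10_000):
    counts[pick_script(load_model)] += 1
```

The same mechanism covers the 5.1.2 and 5.1.3 mixes by swapping in their weight tables.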

5.1.2 IDPe Only

The objective of this scenario is to benchmark just the external IDP.

5.1.2.1 Load Model

Desired Transaction Rate: ???

Script | % of Load
CAMS Account Creation | 20%
CAMS Association - OpenID | 20%
CAMS Association - Kerberos | 20%
Site Access - CAMS Account | 20%
Site Access - OpenID | 20%

5.1.3 Integrated IDP External & Internal

The objective of this scenario is to benchmark both IDPs concurrently.

5.1.3.1 Load Model

Desired Transaction Rate: ???

Script | % of Load
CAMS Account Creation | 10%
CAMS Association - OpenID | 10%
CAMS Association - Kerberos | 10%
Site Access - CAMS Account | 10%
Site Access - OpenID | 10%
Site Access - Kerberos w/ticket | 25%
Site Access - Web Auth | 25%
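The desired transaction rates in these scenarios are still to be determined. Once a target rate is chosen, Little's Law (N = X × R) can size the virtual-user population. A minimal sketch with purely illustrative numbers (none of the values below come from this plan):

```python
# Illustrative sketch: sizing concurrent virtual users with Little's
# Law, N = X * R, where X is throughput (transactions/sec) and R is
# the time each user spends per iteration (response + think time).
def virtual_users_needed(target_tps, response_time_s, think_time_s):
    """Concurrent users needed to sustain target_tps when each user
    iteration takes response_time_s + think_time_s seconds."""
    return target_tps * (response_time_s + think_time_s)

# e.g. 20 tx/s, 2 s response time, 8 s think time -> 200 users
print(virtual_users_needed(20, 2, 8))
```

Working the formula in reverse also gives the think time needed to hit a target rate with a fixed user count, which is useful when the load generator's capacity is the constraint.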

5.2 Stress Testing Scenarios

5.2.1 IDPi Only

5.2.2 IDPe Only

5.2.3 Integrated IDP External & Internal

5.3 Endurance Testing Scenarios

5.4 Fail-over Testing Scenarios

6.0 Monitoring

7.0 Non-functional Requirements

8.0 Architectures

8.1 Physical

8.2 IDPi Logical

8.3 IDPe Logical
