System Integration Test Plan Pretty Good Project team (PGPt)

System Integration Test Plan

Video Streaming Scheduling System

Client

Mike Dixon

School of Information Technology

Murdoch University

Pretty Good Project Team

Supervisor: Shri Rai

Version 1.4

5/11/2004
Table of Contents

Member Contributions

1 Introduction

1.1 Purpose

1.2 Description and Scope

1.2.1 Scope

1.2.2 Baseline

1.2.3 Test Phases and Test Types

1.2.4 Out of Scope

1.3 Definitions, Acronyms, and Abbreviations

1.4 References

1.5 Overview of Test Plan

1.6 Items to be tested

2 Test Environment

2.1 Test Integration Environment

2.2 Test Data

2.3 Backup, Recovery and Maintenance Procedures

2.4 Security/Access Control

2.5 Test Tools

3 Management Procedures

3.1 Human Resources

3.2 Standards and Procedures

3.2.1 Pass/Fail Descriptions

3.2.2 Defect Priorities

3.2.3 Defect Workflow

3.2.4 Defect Reporting and Removal

3.2.5 Test Reviews

3.3 Entrance and Completion Criteria

3.3.1 Test Entrance Criteria

3.3.2 Test Completion Criteria

3.4 Suspension and Resumption of Testing

3.5 Assumptions and Constraints

3.6 Change Management Procedures

4 Integration Strategies

4.1 Description of Integration Build Process

4.2 Integration Order and Dependencies

4.3 Integration Verification Approach

5 Test Design

5.1 Test Types (Test Cases)

Configuration Management:

Member Contributions
This document was created by Michelle Lister.

Formatted for submission by Tim Radbone and David Patullo.
SHRI – Please note that due to time constraints, we have been unable to complete System Integration testing. However, each system component has been thoroughly tested (see Unit Test Plan/Test Results) and all unit test results have been documented. Please note that this document has been included by the development team despite the fact that it is NOT specifically required. We hope that future groups may use this document as a basis for their own System Integration testing.
1 Introduction

1.1 Purpose
This system integration test plan applies to the Video Scheduling System described in the Software Test Plan document.
This SIT Plan provides guidance for integration, development and execution of system-level functional tests. Supporting documentation of test results will be in files captured from test runs and test cases. These documents will form the basis of evidence needed to provide assurance that the Video Scheduling System is ready for User Acceptance Testing and Production Deployment.
1.2 Description and Scope

1.2.1 Scope

This test plan describes the SIT of the Video Scheduling System. Unit level tests, early-adopted acceptance tests, and operational test and evaluation tests are out of the scope for these tests. In addition, the testing of subsequent releases of the Video Scheduling System is out of scope for this test plan.

1.2.2 Baseline

The baseline, upon which the Video Scheduling System (VSS) will be built, includes the deliverables defined in the User Requirements and Design Documents, as well as any changes up to and including the unit-testing phase. The integration test will be run against this baseline.

Three points are identified at which application development baselines will be created.
1. Baseline of all relevant applications as they are before any additional functionality is added.

· A short regression test or dry run will be performed before the baseline is created, to ensure the unchanged environment is fully functioning as expected.

· If successful, a baseline will be established; otherwise, fixes and retesting will continue until it is.

2. After successful migration of the newly developed application, i.e. the VSS deliverables have been successfully migrated and full regression testing has been completed and accepted.

· This new baseline will then become the main baseline on which the formal SIT can start.

3. Specific checkpoints can be identified within the SIT plan, as testing continues, at which a new baseline may be established. (Caution must be taken when a new baseline is identified, to ensure that all functions are working as expected.)

1.2.3 Test Phases and Test Types

This is the integration phase of the software development life cycle. It is initiated upon completion of development and unit level testing. The criteria for migration of a deliverable from development into SIT are successful completion and a readiness review by the PGPt of the required developed application. During the System Integration Test, the following types of tests will be conducted.

1.2.3.1 Integration Testing

Integration Testing verifies that the testing environment has been established per the test plan and that applications have been properly migrated to the testing environment. Integration Testing will occur at several points prior to the formal Test. The first test will verify that the test environment has the proper configuration as the baseline for the VSS. The second test will involve migrating and configuring each VSS deliverable to the test bed and verification of the load instructions. The final test will involve an inspection of the test bed environment to verify that the test bed is ready to support the formal testing to be conducted on the Video Scheduling System deliverables.

If at any time during testing there is a major update or change of any of the applications, applicable regression tests will be used to validate the change.

1.2.3.2 Interface Testing

Interface Tests determine whether pairs of applications or major subsystems can send and receive interface data correctly. Complete interfaces between applications often cannot be tested until integration testing.
Application Interface testing includes the following:

· Interfaces to be tested

· Interface type: simulated or fully integrated

· Interface functions to be tested, including any of the following:

· Format

· Timing

· Data synchrony

· Initiation triggers

· Receipt acknowledgment

· Processing of interface data.

· Expected results
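The checklist items above can be recorded as a structured test entry. The sketch below is an illustration only: the class and field names are assumptions, not part of the VSS design, and the example interface shown is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class InterfaceTest:
    """One entry in the application-interface test checklist."""
    interface: str       # e.g. "Admin GUI -> MySQL database"
    interface_type: str  # "simulated" or "fully integrated"
    functions: list      # which functions are exercised (format, timing, ...)
    expected: str        # expected result
    actual: str = ""     # filled in during execution

    def status(self) -> str:
        # A step passes only when the actual result matches the expected one.
        return "Pass" if self.actual == self.expected else "Fail"

# Hypothetical test for the scheduler -> VLC interface
test = InterfaceTest(
    interface="Scheduler -> VLC",
    interface_type="fully integrated",
    functions=["initiation triggers", "processing of interface data"],
    expected="VLC process started for each scheduled stream",
)
test.actual = "VLC process started for each scheduled stream"
print(test.status())  # -> Pass
```
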

1.2.3.3 Security Test

Security Tests are used to determine if security requirements are implemented as specified. This test phase will test all security functionality, including user access/authentication testing and penetration testing.
1.2.3.4 Regression Testing

Regression Testing is used to verify that the baseline upon which VSS is built continues to function properly as the new functionality is added to the test environment. Where possible, Regression Testing will be conducted through the execution of baseline test cases. Regression Testing of the baseline will occur at several points during the testing cycle.

1.2.3.5 Performance Testing

The purpose of Performance Testing is to test the VSS deliverables against accepted performance requirements established for and agreed upon by the customer for the VSS. Stress testing will be out of scope due to limited testing tools being available to the test team.

1.2.4 Out of Scope

The above listed test types are the only tests within the scope of this SIT; all other test types are out of scope for this project.

The VLC will be tested within the scope of this project only in that the scheduler will start the VLC; no other VLC functionality will be tested. This piece of software is already being utilised, and any testing problems not related to the video scheduler software are outside the scope of this project.

The listed hardware and software are the only available testing platforms for this project; any other browsers, operating systems or hardware are outside the capacity of the project.
1.3 Definitions, Acronyms, and Abbreviations
Definitions, acronyms, and abbreviations used within this System Integration Test Plan are listed below. Other definitions, acronyms, and abbreviations can be located in the related Test Plan Strategy document.

Baseline
A specification, or product, that has been formally agreed upon which serves as the starting point against which progress will be judged.

PGPt
Pretty Good Project Team – the team name; our group number is 4

VLC
Video LAN Client is a complete software solution for video streaming. Video LAN is designed to stream MPEG videos on high bandwidth networks.

GUI
Graphical User Interface – a user interface (displayed as part of a program) that uses graphical images (pictures, icons, etc.) to interact with the end user

MySQL
A popular open source DBMS

DBMS
Database Management System – a software application to manage databases

SIT
System Integration Testing

System Integration Testing
Testing that verifies each individual work product (such as application software, technical infrastructure, facility, documentation, or training material) still meets requirements when integrated with the rest of the release and the business system, that the system as a whole meets its requirements, and that the system will pass the subsequent acceptance testing.

Test Cases
A test case is a collection of test sequences with expected outcomes for testing the identified functionality. Test sequences include sample test data to be used to uncover defects. The test case also lists the expected outcome for every test sequence, enabling the developer to verify that test cases execute as planned.

TIE
Test Integration Environment

VSS
Video Scheduling System

1.4 References
Refer to the following document(s).

User Requirements Document (URD)

Design Document

Unit Test Plan/Test Results

Software Test Plan
1.5 Overview of Test Plan
The following high-level plan indicates a path that may be followed to provide a structured process for ensuring that the SIT is properly executed.

1. Develop integration testing scenarios.

· Used to test deliverables as defined for Video Scheduling System only

· See the “Integration Test Cases”

2. Develop Configuration scenarios.

· Identify scenarios.

· Develop scenarios.

3. Develop Data Entry scenarios.

· Identify scenarios.

· Develop scenarios.

4. Develop Interface scenarios

· Identify scenarios.

· Develop scenarios.

5. Finalize integration testing plan.

· Determine and document any additional required steps.

· Verify time requirements.

6. Finalize acceptance criteria.

· Determine the testing results that will lead to acceptance.

7. Identify and create baseline.

8. Phase I – Module Testing

· Ensure that all Video Scheduling System modules fit together correctly (i.e. “everything’s working”).

· The testing will be performed by executing scenarios selected from the test cases and documented in the test case tables. If any cyclical testing is required then:

· Each cycle will be comprised of the following steps:

· Execute the scenario

· Resolve problems, if necessary

· Re-execute the scenario, if necessary

· Verify and document results

9. Phase II – Detailed Testing

· This is a more detailed examination of the Video Scheduling System deliverables and functional components. The purpose is to ensure data passes correctly between modules, verify data was converted correctly, and to confirm the system set-up.

· The testing will be performed by executing scenarios selected from the test cases and documented in the test case tables. If any cyclical testing is required then:

· Each cycle will be comprised of the following steps:

· Execute the scenario

· Resolve problems, if necessary

· Re-execute the scenario, if necessary

· Verify and document results

10. Phase III – Interface Testing

· The goal of this phase is to verify external interfaces, both automated and manual, and to verify that data passes correctly between all components of the system and external interfaces.

· The testing will be performed by executing scenarios selected from the test cases and documented in the test case tables. If any cyclical testing is required then:

· Each cycle will be comprised of the following steps:

· Execute the scenario

· Resolve problems, if necessary

· Re-execute the scenario, if necessary

· Verify and document results

11. Phase IV – System Performance Testing

· A final step will be to examine the performance of the system. The testing will be performed by executing scenarios selected from the test cases and documented in the test case tables. If any cyclical testing is required then:

· This cycle will be comprised of the following steps:

· Execute the scenario

· Resolve problems, if necessary

· Re-execute the scenario, if necessary

· Verify and document results

12. Acceptance
· At this point the application will have been successfully tested and is ready for acceptance by the client, after which the next phase, User Acceptance Testing, can be started.

· By accepting the results from the System Integration test, the client will acknowledge that the testing was successfully completed.

· Verify that test results meet the acceptance criteria outlined in step 6, “Finalize acceptance criteria”.

· Document any new issues

1.6 Items to be tested
Module
Type
Title/Description

Database
Database
A MySQL relational database will act as the central repository of data for the system. It shall hold information regarding videos that can be streamed, users, streaming channels and other data. It shall reside on a single server in the system and will be used by the Administrator GUI, dynamic Web Page and Scheduler daemon described below. This database shall provide the underlying data structure of the system and will perform an integral role in the operation of all applications the project team will create.

Scheduler
Program
The scheduler is to initiate VLC on the server end whenever a stream is scheduled to take place. For each time slot the database is queried, and for each stream (on separate channels) scheduled for that slot, a separate VLC process is created/forked to deliver a multicast stream to the channel's network destination.

Administrator Graphical User Interface
Program

Screens
The Administrator GUI connects to a MySQL database via the menu and maintains this connection until the program is exited or the user chooses to disconnect. The Administration GUI sends SQL queries via database APIs and processes the results.

Dynamic Web Page
Program

Screens
The dynamic web page connects to the MySQL database, sends SQL queries and processes the results to present scheduling information to users via the Apache web server.
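The scheduler behaviour described above (query the database for each time slot, then fork one VLC process per scheduled stream) can be sketched as follows. The schema, table and column names, and the VLC command line are assumptions for illustration only, not the actual VSS design; SQLite stands in for the MySQL database, and the command string is illustrative rather than real VLC syntax.

```python
import sqlite3

# Stand-in schema (the real system uses MySQL; these names are hypothetical).
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE schedule (
    slot TEXT, channel TEXT, video_file TEXT, multicast_addr TEXT)""")
db.executemany("INSERT INTO schedule VALUES (?,?,?,?)", [
    ("2004-11-05 10:00", "channel1", "lecture1.mpg", "239.255.0.1"),
    ("2004-11-05 10:00", "channel2", "lecture2.mpg", "239.255.0.2"),
])

def commands_for_slot(slot):
    """Build one streaming command per stream scheduled in the slot.
    The real scheduler would fork a VLC process per row; here we only
    build (illustrative, not actual VLC syntax) command lines."""
    rows = db.execute(
        "SELECT channel, video_file, multicast_addr FROM schedule WHERE slot=?",
        (slot,)).fetchall()
    return [f"vlc {video} --dst udp:{addr}" for _, video, addr in rows]

cmds = commands_for_slot("2004-11-05 10:00")
print(len(cmds))  # -> 2, one command per scheduled stream in the slot
```
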

2 Test Environment

2.1 Test Integration Environment

The development environment has been set up in an office, and the following computers are available for all system integration testing:

Windows XP – Client – 134.115.65.33 (AKA it3051-01.murdoch.edu.au)

RedHat 9 – SERVER – 134.115.65.36 (AKA it3051-02.murdoch.edu.au)

RedHat 9 – Client – 134.115.65.112

The three computers will have access to the following software:
Computer

IP Address & Hostname

Users
Software

Server
IP: 134.115.65.36

Hostname: it3051-02.murdoch.edu.au
User: PGPt
Pass: PGPt04

Super user: root

Pass: PGPt04
RedHat 9

VLC

MySQL

Apache

Opera

Perl

System Services (incl. mail, sftp, SSH etc)

Windows Client
IP: 134.115.65.33

Hostname: it3051-01.murdoch.edu.au
Super user: PGPt
Pass: “”
Windows XP

VLC

Firefox

WinSCP3

Linux Client
IP: 134.115.65.112

User: PGPt
Pass: PGPt04

Super user: root

Pass: PGPt04
RedHat 9

VLC

The database, administration GUI, scheduler and dynamic web pages (i.e. all components of the Video Scheduling System) will be loaded onto the computers or servers as required to perform system integration testing.

2.2 Test Data
Test Data developed for Video Scheduling System unit testing will be reused for SIT. Additional test data will be generated through the execution of test cases and procedures. The test cases will need to be tested in the correct order to ensure that data is available in the test system for procedures used later in the sequence.

Configuration data and files for the Scheduler and Administration GUI will need to be checked for correctness before the SIT can be performed. Video files will also need to be made available on the appropriate servers and entered in the database so that the system can be tested.
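The ordering requirement above (test cases must run in a sequence that guarantees their input data already exists) amounts to a topological sort over data dependencies. The test case names and dependencies below are hypothetical examples, not the project's actual test cases.

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical data dependencies between SIT test cases: each key can
# only run once the cases in its value set have created its data.
deps = {
    "TC-03 schedule stream":   {"TC-01 add video", "TC-02 add channel"},
    "TC-04 view web schedule": {"TC-03 schedule stream"},
}

# static_order() yields every case with its prerequisites first.
order = list(TopologicalSorter(deps).static_order())
print(order.index("TC-01 add video") < order.index("TC-03 schedule stream"))  # -> True
```
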

2.3 Backup, Recovery and Maintenance Procedures
Backups of all programs have been periodically performed and there are copies of the programs on Gryphon as well as home computers.
Version Control has been implemented to manage all versions of the programs. If any program becomes corrupted or non-recoverable, either a backup copy or the last version of the program will be used. The tester(s) will cater for any special backup or recovery requirements as required.

2.4 Security/Access Control
The test environment is located in a secure room within the University and access is restricted to group members and authorised University Staff. The computers as per above have specified logins for the team. Documents and programs will be held on a secure directory on Gryphon.

2.5 Test Tools
No specific testing tools have been made available for this project. Testing was carried out directly on the web page or admin application as required, with the results tabulated in a text editor and screenshots added.
3 Management Procedures

3.1 Human Resources

This is a list of resources required to conduct the System Integration testing activities related to this document and their responsibilities within the team.

Role
Available
Name
Responsibility

Test Manager

100%
Cliffe Schreuders
Test Management

Conduct Reviews

Defect Tracking and Reporting

Issue Reporting and Tracking

Suspension and Escalation of testing

Test Review Meetings

Conduct of Test Completion Review

Test Coordinator
100%
Michelle Lister
Test Planning

Test Documentation

Test Case Creation

Test Coverage Analysis

Test Result Analysis

Defect Tracking and Reporting

Conduct Reviews and Test Completion Review

Tester
100%
David Patullo
Install & Configure

Test Case Creation

Test Case Execution

Defect Reporting to the group

Resolution of the problems and regression testing

Tester
100%
Martin Sulzynski
Install & Configure

Test Case Creation

Test Case Execution

Defect Reporting to the group

Resolution of the problems and regression testing

Tester
100%
Tim Radbone
Install & Configure

Test Case Creation

Test Case Execution

Defect Reporting to the group

Resolution of the problems and regression testing

3.2 Standards and Procedures
This section of the System Integration Test Plan describes the standards and procedures used during testing.

3.2.1 Pass/Fail Descriptions

As Tests, Test Procedures, and Test Steps are completed, they are graded on a Pass/Fail Basis. The table below lists the criteria for determining the Pass/Fail status of Testing.

Status
Level
Description

Pass
Step
Expected result achieved

Pass
Procedure
Successful completion of all test steps

Pass
Test
Successful completion of all procedures comprising the test

Fail
Step
Expected result not achieved

Fail
Procedure
Failure of one or more test steps

Fail
Test
Failure of one or more procedures comprising the test
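The table's roll-up rule (a procedure passes only if every step passes; a test passes only if every procedure passes) can be expressed as a single aggregation. This is a minimal sketch of that rule; the function name is an illustration, not part of the plan.

```python
def roll_up(results):
    """Aggregate child results per the pass/fail table: one Fail at any
    level fails the level above it."""
    return "Pass" if all(r == "Pass" for r in results) else "Fail"

# A procedure with one failed step fails...
procedure = roll_up(["Pass", "Pass", "Fail"])
print(procedure)  # -> Fail
# ...and that failed procedure in turn fails the whole test.
print(roll_up(["Pass", procedure]))  # -> Fail
```
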

3.2.2 Defect Priorities

The Testers assign priorities to each defect, and each defect must have a priority. Once assigned, only the Test Manager and the Test Coordinator have the authority to change the priority. The table below lists the defect priorities to be used during the SIT Testing.

Priority
Description

1
A defect with this priority prevents the tester progressing further with the test cases and therefore jeopardizes the project schedule. Immediate action to resolve this defect is required.

2
The Test Case is executed, but the expected result is not obtained. The system does not meet the specified requirement/design.

3
Test Case is executed, and expected result is obtained with an exception. The system meets the specified requirement/design.

4
Change request. The system meets the specified requirement/design, however a change is requested.

3.2.3 Defect Workflow

Each defect has to follow a workflow that begins with the open status and ends with one of the final states. Throughout the life of a defect, its status may change several times. The table details the possible states.
Status
Description

New
Every new defect starts with this status of New. If a retest of a fixed defect fails, it will be set to New again.

Assigned
Every defect with the status New has to be Assigned to a developer (usually the component developer unless they are unavailable).

Fixed
After resolving and successfully SIT testing an Assigned defect, the developer will set the status to Fixed.

Closed
After successful retesting of a resolved defect the status will be set to Closed by the tester.

Rejected
The Test Manager/ Test Coordinator may reject defects that are not in accordance with the system requirements.

Out of Scope
Defects which will not be fixed within this phase, e.g. because they are not within the scope or due to other reasons, will be marked as Out of Scope.
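The workflow above is a small state machine. The sketch below uses the status names from the table; the exact set of allowed transitions is an assumption inferred from the descriptions (e.g. a failed retest sends a Fixed defect back to New), not a rule stated in the plan.

```python
# Allowed transitions inferred from the workflow table (an assumption).
TRANSITIONS = {
    "New":          {"Assigned", "Rejected", "Out of Scope"},
    "Assigned":     {"Fixed", "Rejected", "Out of Scope"},
    "Fixed":        {"Closed", "New"},  # retest failure reopens the defect
    "Closed":       set(),
    "Rejected":     set(),
    "Out of Scope": set(),
}

def move(state, new_state):
    """Advance a defect, rejecting transitions the workflow does not allow."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal transition {state} -> {new_state}")
    return new_state

# A defect is assigned, fixed, then fails its retest and returns to New.
s = move(move(move("New", "Assigned"), "Fixed"), "New")
print(s)  # -> New
```
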

3.2.4 Defect Reporting and Removal

If a number of defects are reported during SIT testing, an Excel spreadsheet should be created to keep track of the defects, their priorities and status. The PGPt will then be able to view the status of defects.

If a defect is reported by another team member, it will be forwarded to the appropriate developer for correction in accordance with its assigned severity and priority. The developer will endeavour to fix the problem as soon as possible to keep the SIT testing on schedule. Once the problem is corrected, the developer will inform the rest of the team and the correct delivery procedures will be followed to move the change to the test bed. See section 9 of the Software Test Plan for redelivery procedures for problem fixes.

3.2.5 Test Reviews

During the period of SIT testing, regular debriefings will be held with the PGPt to review progress and report any problems or concerns with the system. All required resources will participate in defect discussions to help determine the validity of each test incident as an actionable item.

3.3 Entrance and Completion Criteria
3.3.1 Test Entrance Criteria

The following describes the entrance criteria for migrating the Video Scheduling System deliverables from development to SIT for formal testing.

A review will be conducted for each Video Scheduling System deliverable prior to the Integration to ensure that all items under test are ready for installation in the Test Integration Environment.
Entry Criteria
Complete (Y/N)

Unit Test Phase is completed
Y

Baseline Requirements Defined
Y

Test Strategy / SIT Test Plan
Y

Test Cases / Test Data
N

Test Environment
Y

Components have been built according to the requirements document
Y

3.3.2 Test Completion Criteria

The following describes the exit criteria for the Video Scheduling System.

All formal tests have passed and there are no outstanding defects. The System Integration Test Plan and Results are complete. A Test Completion Review has been held to ensure that all exit criteria are met and the system is ready for User Acceptance Testing.

Exit Criteria
Complete (Y/N)

All test cases have been executed
N

Outstanding problem records are approved by the project team as acceptable or out of scope
Y

All fixes have been re-tested and regression testing has been performed
Y

Systems Integration Plan has been completed with results
N

All deliverables completed
Y

Testing is baselined.
N

Please note that Unit testing has been performed for each system component and all requirements have been met. The ‘N’ values in the table above are simply to indicate that due to time constraints, we have been unable to perform System Integration Testing.

3.4 Suspension and Resumption of Testing
Please refer to the Software Test Plan for the approach to the Suspension and Resumption of testing in this project.

3.5 Assumptions and Constraints
Please refer to the Software Test Plan documentation.

3.6 Change Management Procedures
Please refer to the Software Test Plan for the approach to the change management procedures.

4 Integration Strategies
4.1 Description of Integration Build Process
The test bed will be checked against the documentation in section 2.1 to ensure that the test environment is as planned. Additional hardware required for the VSS deliverables will be added and checked against that documentation to ensure that the additions to the test environment are as planned. As each deliverable is loaded and configured, a focused subset of the regression test will be run to ensure that there is no unplanned impact on the baseline. Upon completion of integration, the system will be baselined again. The Test Manager will need to back up and document the test environment at each stage of the process, both to permit a restore of the baseline and to allow comparison for changes.
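Documenting the test environment at each stage so it can be compared for changes can be done by snapshotting file digests. This is only a sketch of that idea; the plan does not specify how the team actually records the environment, and the function names here are assumptions.

```python
import hashlib
import os

def snapshot(root):
    """Hash every file under the test-bed directory so a later snapshot
    can be diffed against it to detect unplanned changes to the baseline."""
    digests = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digests[os.path.relpath(path, root)] = hashlib.md5(f.read()).hexdigest()
    return digests

def changed(before, after):
    """List files that were added, removed, or modified between snapshots."""
    return sorted(k for k in before.keys() | after.keys()
                  if before.get(k) != after.get(k))
```

A snapshot taken after establishing the baseline can then be compared with one taken after each deliverable is integrated, making any side effects on the baseline visible.
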

4.2 Integration Order and Dependencies
The Integration order will be based on the sequence in which the deliverables are received and the type of platform they must use. If the review process reveals a sequence dependency for loading the deliverables an amendment will be made to this test plan.
4.3 Integration Verification Approach
Successful completion of the installation and configuration of each deliverable and the associated regression tests will constitute verification of integration for the VSS Baseline.

All the entry criteria will need to be met before the Integration test phase can begin.

The next stage of the SIT process involves verification of functional and interface requirements, user interface procedures and performance requirements.

5 Test Design

5.1 Test Types (Test Cases)

The following test types will be used during VSS testing.

Test Level
Test Type

Development
· Unit Tests

System Integration
· Integration Test

· Security Test

· Regression Test

· Performance Test

Acceptance
· User Acceptance Tests

Deployment
· Operational Test and Evaluation

The following test cases have been modified from the Unit Test Cases to test the procedures and functionality described in the User Requirements and Design documents.

Configuration Management:
Version
Date
Author
Description of change

1.1
30/09/2004
Michelle Lister
Initial Systems Integration Plan

1.2
04/10/2004
Michelle Lister
Integrated Test Results

1.3
05/11/2004
Tim Radbone
Updated

1.4
05/11/2004
David Patullo
Formatted / Final version
