Template note: items intended to remain in the document appeared in bold in the original template; explanatory comments appeared in italic.
Bash! – Test Plan
TEST PLAN
For
PocketMILS Software Project
Version 0.4
Prepared by Widury Bizar, Stephen Keating, Joseph Moore,
Jasmin Nankiville, Simon Thompson
BASH!
DOCUMENT CHANGES TABLE

Title: Document Control List
Type: Document
Author(s): BASH!
Version: 0.1
Date: 7 Nov 2005
Description: Test Plan Documentation
From: BASH!
To: Shri Rai, Chris Klisc
Document Changes

Status: Draft

Version | Date | Description
V0.1 | 25-10-05 | Document Created – Jasmin
V0.2 | 27-10-05 | Document Updated – Jasmin
V0.3 | 6-11-05 | Document Updated – Jasmin
V0.4 | 7-11-05 | Document Completed – Jasmin
TABLE OF CONTENTS

Document Changes Table
1 INTRODUCTION
  1.1 Objectives
  1.2 Testing Strategy
  1.3 Reference Material
  1.4 Definitions and Acronyms
2 TEST ITEMS
  2.1 Program Modules
  2.2 User Procedures
  2.3 Operator Procedures
3 FEATURES TO BE TESTED
4 FEATURES NOT TO BE TESTED
5 APPROACH
  5.1 Component Testing
  5.2 Integration Testing
  5.3 Conversion Testing
  5.4 Recovery Testing
  5.5 Regression Testing
6 PASS / FAIL CRITERIA
  6.1 Approval Criteria
7 TESTING PROCESS
  7.1 Test Deliverables
  7.2 Testing Tasks
  7.3 Responsibilities
8 ENVIRONMENTAL REQUIREMENTS
  8.1 Hardware
  8.2 Software
  8.3 Tools
  8.4 Publications
1. INTRODUCTION
The purpose of this document is to identify the testing requirements that must be carried out, and to describe the techniques and tools used.
This document encompasses the stringent quality control techniques that BASH! incorporates, as per IEEE Standard 829.
To ensure all requirements are satisfied, testing will be undertaken for all possibilities and scenarios. This document comprehensively defines all desired results of the TGS and systematically qualifies every outcome through the BASH! Quality Control Framework. All functionality must be incorporated error free and all necessary tasks must be completed.
Note 1: The software test plan guidelines were derived and developed from IEEE Standard for Software Test Documentation 829-1998.
Note 2: The ordering of Software Test Plan STP elements is not meant to imply that the sections or subsections must be developed or presented in that order. The order of presentation is intended for ease of use, not as a guide to preparing the various elements of the Software Test Plan. If some or all of the content of a section is in another document, then a reference to that material may be listed in place of the corresponding content.
The Introduction section of the Software Test Plan STP provides an overview of the project and the product test strategy, a list of testing deliverables, the plan for development and evolution of the STP, reference material, and agency definitions and acronyms used in the STP.
The Software Test Plan STP is designed to prescribe the scope, approach, resources, and schedule of all testing activities. The plan must identify the items to be tested, the features to be tested, the types of testing to be performed, the personnel responsible for testing, the resources and schedule required to complete testing, and the risks associated with the plan.
1.1 Objectives
For all system components and relationships, functionality testing covers the points listed below:
· Performs necessary action
· No errors present
· All possible outcomes tested for all alternatives
· Test all relationships
White box testing is performed by the developers and programmers systematically throughout coding and program formation. This will be done for every new function and piece of code implemented, which is then tested in relation to the whole program.
Program functionality will be tested by two types of testers (novice and expert). Both types of testers will need a general knowledge of what the system is supposed to do.
Novice testers, who will have little to no programming experience, will be used to test the system's usability: how easy it is to use, and whether they can perform the necessary tasks.
Expert testers, who will have some programming experience, will also test all components together and test all code relationships to exercise all necessary components.
All testing will be conducted systematically and continuously throughout the project lifecycle. Developers will test all new functionality as it is implemented, and specialised testers will be engaged near completion to confirm that all requirements are satisfied.
The system must provide PDA portability for the existing Uni-u TGS, allowing users to utilise all system functionality remotely. The necessary software must be delivered to upload (input all necessary data) and download all necessary data for the TGS experience from a PDA ‘in the field’.
Testing verifies that our system meets these portability and functionality requirements. This has been achieved using VB .NET and by incorporating wireless functionality.
Describe, at a high level, the scope, approach, resources, and schedule of the testing activities. Provide a concise summary of the test plan objectives, the products to be delivered, major work activities, major work products, major milestones, required resources, and master high-level schedules, budget, and effort requirements.
1.2 Testing Strategy
As per the stringent quality control practices implemented by BASH!, iterative updating of testing procedures will be enforced continuously.
Most importantly, a clear perspective of system functionality and purpose is vital before testing the system. Throughout our testing, a detailed knowledge of all system components (classes, variables, functions, objects, programs, operating systems, etc.) is mandatory before testing anything at an internal level. This includes how each component interacts with the others and what purpose each has in the system.
Programmers and developers systematically test all functionality throughout the project lifecycle, as they have the best understanding of all components as a single entity.
Testing is the process of analyzing a software item to detect the differences between existing and required conditions and to evaluate the features of the software item. This may appear as a specific document such as a Test Specification, or it may be part of the organization’s standard test approach. For each level of testing, there should be a test plan and an appropriate set of deliverables. The test strategy should be clearly defined and the Software Test Plan acts as the high-level test plan. Specific testing activities will have their own test plan. Refer to section 5 of this document for a detailed list of specific test plans.
Specific test plan components include:
Purpose for this level of test, Items to be tested, Features to be tested, Features not to be tested, Management and technical approach, Pass / Fail criteria, Individual roles and responsibilities, Milestones, Schedules, and Risk assumptions and constraints.
1.3 Reference Material
· Software Requirements Specification Documentation
· Architecture and Design Document
· Software Project Management Plan
· Correspondence
· Relevant Standards IEEE
Provide a complete list of all documents and other sources referenced in the Software Test Plan. Reference to the following documents when they exist is required for the high-level test plan: Project authorization, Project plan, Quality assurance plan, Configuration management plan, Organization policies and procedures, and Relevant standards.
1.4 Definitions and Acronyms
Term/Acronym | Definition Relevant to This Document
Client | Uni-u International
DCL | Document Control List
FTP Server | BASH! internal FTP server, administered by the FTP Administrator. Used for uploading project files for centralisation.
Game | The Travel Game referred to in 1.1 (above) that Uni-u International has been contracted to deliver to a Hong Kong university.
PDA | Personal Digital Assistant – includes any device which can be used as a portable computer.
XML | eXtensible Markup Language
XSLT | eXtensible Stylesheet Language Transformations
XHTML | eXtensible HyperText Markup Language
SQL | Structured Query Language – used for interrogating/updating a database.
ASP | Active Server Pages – Microsoft scripting language interpreted on the server side.
PHP | PHP: Hypertext Preprocessor – an open-source scripting language interpreted on the server side.
VB .NET | Visual Basic .NET – programming environment that allows users to create .NET applications, including Windows applications and web servers/applications.
MSDN | Microsoft Developer Network – set of online and offline services designed to help developers write applications using Microsoft products and technologies.
WiFi | Wireless networking technology based on the IEEE 802.11 standards.
Specify definitions of all terms and agency acronyms required to properly interpret the Software Test Plan. Reference may be made to the Glossary of Terms on the IRMC web page.
2. TEST ITEMS
Specify the test items included in the plan. Supply references to the following item documentation: Requirements specification, Design specification, Users guide, Operations guide, Installation guide, Features availability, response time, Defect removal procedures, and Verification and validation plans.
2.1 Program Modules
Connection Module
· Test if there is a connection to the database/server
· Test if there is a connection to the database/server over wireless connection using WiFi
· Test if there is a connection to the database/server from a different computer
· Test if there is a connection to the database/server from a computer in a different lab
· Test if you can view entries in the database
· Test what happens when the database does not exist
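The connection checks above can be sketched as a small automated harness. The real client is VB .NET talking to SQL Server, so the `FakeServer` class, its `connect` method and the `DatabaseMissingError` exception below are purely illustrative stand-ins for that connection layer.

```python
# Illustrative sketch only: the real PocketMILS client is VB .NET talking to
# SQL Server; FakeServer stands in for that connection layer.

class DatabaseMissingError(Exception):
    """Raised when the requested database does not exist on the server."""

class FakeServer:
    def __init__(self, databases):
        self.databases = databases          # name -> list of rows

    def connect(self, database):
        # "Test what happens when the database does not exist"
        if database not in self.databases:
            raise DatabaseMissingError(database)
        return self.databases[database]     # "Test if you can view entries"

server = FakeServer({"MILS": [("activity", 1)]})

rows = server.connect("MILS")               # connection succeeds
missing_db_detected = False
try:
    server.connect("NoSuchDB")
except DatabaseMissingError:
    missing_db_detected = True              # failure is reported, not a crash
```

The wireless, other-computer and other-lab cases in the list exercise the same `connect` path over different network routes, so only the success and missing-database outcomes are modelled here.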
Login Module
· Add user (userName, password) *userID created automatically
· Test user can login successfully
· Remove user
· Test user is removed successfully
· Add user to database without password, see what happens
· Add user to database without userName, see what happens
· Modify users details
· Test user details are updated successfully
Activity Module
· Remove activity from database
· Test activity is removed successfully
· Add new activity to database
· Test if activity is created successfully
· Test what happens if you add activity to database when database is read only
· Test what happens if blank activity is added
· Test what happens if you add an activity that already exists in database
· Modify activity in database
· Test activity is updated successfully (should only be one copy)
· Add audio file to database
· Test audio file is added successfully
· Test audio file plays and sound is heard successfully
· Test audio file format is recognised
· Remove audio
· Test audio file is removed successfully
· Add picture to database
· Test picture is added successfully
· Test picture file is displayed successfully
· Test picture is displayed in correct size
· Test picture format is recognised
· Remove picture
· Test picture is removed successfully
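The activity checks above (blank rejection, duplicate rejection, read-only database, single-copy updates, removal) can be sketched like this; `ActivityDB` is a hypothetical stand-in for the real database layer.

```python
# Illustrative stand-in for the Activity Module's database behaviour.

class ActivityDB:
    def __init__(self, read_only=False):
        self.read_only = read_only
        self.activities = {}

    def add(self, title, description=""):
        if self.read_only:
            raise PermissionError("database is read only")
        if not title:
            raise ValueError("blank activity rejected")
        if title in self.activities:
            raise ValueError("activity already exists")
        self.activities[title] = description

    def modify(self, title, description):
        self.activities[title] = description   # updates in place: one copy only

    def remove(self, title):
        del self.activities[title]

db = ActivityDB()
db.add("Visit temple", "Cultural")
db.modify("Visit temple", "Cultural, half day")
one_copy = list(db.activities) == ["Visit temple"]   # "should only be one copy"

dup_rejected = blank_rejected = False
try:
    db.add("Visit temple")        # activity that already exists
except ValueError:
    dup_rejected = True
try:
    db.add("")                    # blank activity
except ValueError:
    blank_rejected = True

ro_db = ActivityDB(read_only=True)
ro_rejected = False
try:
    ro_db.add("Anything")         # database is read only
except PermissionError:
    ro_rejected = True

db.remove("Visit temple")
removed = "Visit temple" not in db.activities
```

The audio and picture checks follow the same add/verify/remove pattern, with the extra format-recognition and playback/display steps done by hand on the device.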
Main Module
· Test ‘New Activity’ button works (Activity form should open)
· Test ‘Edit Activity’ button works (Edit Activity form should open)
· Test ‘Options’ button works (Options form should open)
· Test ‘Logout’ button works (should log off server – remote)
· Test ‘Exit Program’ button (program should terminate)
Options Module
Outline testing to be performed by the developer for each module being built.
2.2 User Procedures
Reporting tests carried out
When testing, several aspects need to be recorded onto the Test Report document, such as:
· Type of test
· Modules used
· Defects found
· Defects solved and how
· Retest results
· Course of Action
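One Test Report entry capturing the aspects above might look like the following sketch; the field names are assumptions, not the actual Test Report layout.

```python
# Illustrative Test Report entry; field names are assumed, not the real layout.
from dataclasses import dataclass, field

@dataclass
class TestReportEntry:
    test_type: str
    modules_used: list
    defects_found: list = field(default_factory=list)
    defects_solved: dict = field(default_factory=dict)   # defect -> how solved
    retest_result: str = ""
    course_of_action: str = ""

entry = TestReportEntry(
    test_type="Component",
    modules_used=["Login Module"],
    defects_found=["login accepted blank password"],
    defects_solved={"login accepted blank password": "added validation check"},
    retest_result="Pass",
    course_of_action="None required",
)
```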
Describe the testing to be performed on all user documentation to ensure that it is correct, complete, and comprehensive.
2.3 Operator Procedures
Help Desk Software is powerful management software that automates many features of a company’s help desk environment. Typical functionality includes call management, call tracking, knowledge management, problem resolution, and self-help capabilities. The software is shared by all members of a support area, including the first point of contact for the helpdesk, and the staff that receive job requests for later resolution.
Describe the testing procedures used to ensure that the application can be run and supported in a production environment, including Help Desk procedures.
3. Features to be Tested
All requirements should pass all tests when utilised on the PDA with ActiveSync installed, using the VB .NET Compact Framework. Both the PDA and the PC involved should run Windows XP or Windows Pocket PC operating systems. The VB .NET Compact Framework should be implemented through the Microsoft Visual Studio development environment.
Identify all software features and combinations of software features to be tested. Identify the test design specifications associated with each feature and each combination of features.
4. Features Not To Be Tested
The following features will not be tested for:
· GPS (Global Positioning System)
GPS will not be tested for because our PDA does not have GPS capabilities built in.
· MILS database
We do not have sufficient user privileges to alter the existing MILS database schema.
Identify all features and specific combinations of features that will not be tested along with the reasons.
5. APPROACH
As per the stringent quality control practices implemented by BASH!, iterative updating of testing procedures will be enforced continuously.
Most importantly, a clear perspective of system functionality and purpose is vital before testing the system. Throughout our testing, a detailed knowledge of all system components (classes, variables, functions, objects, programs, operating systems, etc.) is mandatory before testing anything at an internal level. This includes how each component interacts with the others and what purpose each has in the system.
Programmers and developers systematically test all functionality throughout the project lifecycle, as they have the best understanding of all components as a single entity.
Describe the overall approaches to testing. The approach should be described in sufficient detail to permit identification of the major testing tasks and estimation of the time required to do each task. Identify the types of testing to be performed along with the methods and criteria to be used in performing test activities. Describe the specific methods and procedures for each type of testing. Define the detailed criteria for evaluating the test results.
For each level of testing there should be a test plan and the appropriate set of deliverables. Identify the inputs required for each type of test and specify their source. Also identify the outputs from each type of testing, and specify the purpose and format of each test output. Specify the minimum degree of comprehensiveness desired, and identify the techniques that will be used to judge the comprehensiveness of the testing effort. Specify any additional completion criteria (e.g., error frequency). The techniques to be used to trace requirements should also be specified.
5.1 Component Testing
ACTIVITY MODULE (Activity Form)

Software Element | Data Type (Size) | Description | Pass/Fail

Description TAB
txtShortTitle | varchar (50) | Can be saved and uploaded to database. | Pass
cboCategory | int (4) | Displays list of categories (Cultural, Travel). The selected category is saved and uploaded to database. | Pass
txtDescription | varchar (4000) | Can be saved and uploaded to database. | Pass
txtActivityID | int (4) | Textbox cannot be modified. Created automatically by the database. Not displayed to user. | Pass

Data TAB
txtStarRating | int (4) | Textbox cannot be modified. Value is taken from server, otherwise initialised to 0. | Pass
txtStatus | int (4) | Textbox cannot be modified. Value is taken from server. | Pass
txtQuantity | int (4) | ??????????????????????? | Pass
txtAuthorID | int (4) | Textbox cannot be modified. Dependent upon the selected user: if the selected user exists in the database, the txtAuthorID value is displayed; if the user doesn’t exist, neither does txtAuthorID. | Pass
txtCreated | datetime (8) | Textbox cannot be modified. Created by the database when the activity was created. | Pass
txtLastMod | datetime (8) | Textbox cannot be modified. Created by the database when the activity was last modified. | Pass
txtLastModAuthor | int (4) | Textbox cannot be modified. Displays the userID of the user who last modified the current activity’s details. | Pass
ACTIVITY MODULE (Activity Form Continued)

Software Element | Data Type (Size) | Description | Pass/Fail

Resources TAB
txtResource1 | varchar (100) | Value can be saved and uploaded to database. | Pass
txtResource2 | varchar (100) | Value can be saved and uploaded to database. | Pass
txtResource3 | varchar (100) | Value can be saved and uploaded to database. | Pass

Picture TAB
cboInsertPic | combobox | User can select ‘Insert Picture’ or ‘Insert URL’. | Pass
pbxPicture | .jpg | Displays picture if user decided to ‘Insert Picture’ from the combobox. | Pass
lblPictureFilePath | label | Displays URL if user decided to ‘Insert URL’ from the combobox. | Pass

Details TAB
txtDuration | int (4) | Value can be saved and uploaded to database. | Pass
cboDurationType | int (4) | Combo box cannot be modified. Displays day, month or year for duration type. Value can be saved and uploaded to database. | Pass
txtCost | money (8) | Value can be saved and uploaded to database. | Pass
txtLatitude | int (4) | Value can be saved and uploaded to database. | Pass
txtLongitude | int (4) | Value can be saved and uploaded to database. | Pass
cmdGetGPSData | button | Retrieves value from a file that emulates GPS data. | Pass
ACTIVITY MODULE (Activity Form Continued)

Software Element | Data Type (Size) | Description | Pass/Fail

General (the items below appear on every TAB)
tabActivity | button | Allows user to view different sections of an activity by selecting the different tabs (Description, Data, Resources, Picture and Details). | Pass
mnuMILS | context menu | Displays items mnuOpen…, mnuSave, mnuSaveAs…, mnuClose. | Pass
– mnuOpen | context menu | |
– mnuSave | context menu | Saves data from the Description, Data, Resources, Picture and Details tabs to the PDA (local). | Pass
– mnuSaveAs… | context menu | Saves data from the Description, Data, Resources, Picture and Details tabs to the PDA (local) under a different file name or to a different directory, etc. | Pass
– mnuClose | context menu | Closes the current form and returns the user to the Main Menu. | Pass
mnuRemote | context menu | Displays items mnuLogin, mnuLogout, mnuUpload, mnuEdit. | Pass
– mnuLogin | context menu | Opens the Login Form. | Pass
– mnuLogout | context menu | Logs user out of the server (remote). | Pass
– mnuUpload | context menu | Saves data from the Description, Data, Resources, Picture and Details tabs to the database (remote). | Pass
– mnuEdit | context menu | | Pass
LOGIN (Login Form)

Software Element | Data Type (Size) | Description | Pass/Fail

General
txtUserName | varchar (50) | User enters their name to log in to the system. Won’t work unless txtPassword has been filled out correctly. | Pass
txtPassword | varchar (50) | User enters their password to log in to the system. Won’t work unless txtUserName has been filled out correctly. | Pass
txtLoginStatus | textbox | Once the user has clicked the Login button, this textbox displays the status of the user (server details and user name). | Pass
cmdLogin | button | User clicks this button to log in to the system. Refers to the value in txtUserName. If the user exists, the user has access to the program; otherwise the user is denied access. | Pass
cmdBack | button | Closes the Login Form and returns to the Main Menu. | Pass
OPTIONS MODULE (Options Form)

Software Element | Data Type (Size) | Description | Pass/Fail

General (all fields must be correct in order to connect to the database)
txtServer | textbox | A valid IP address for the server must be supplied or there will not be a connection to the server. | Pass
txtDatabase | textbox | A valid database name must be supplied or there will not be a connection to the database. | Pass
txtUser | textbox | A valid user name must be supplied or there will not be a connection to the database or server. | Pass
txtHost | textbox | A valid IP address for the FTP server must be supplied or there will not be a connection to the database or server. | Pass
cmdSave | button | Saves the details on the form (txtServer, txtDatabase, txtUser, txtHost) to the options.conf file (XML) on the PDA. | Pass
cmdCancel | button | Closes the Options Form and returns to the Main Menu. | Pass
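The plan states that cmdSave writes the server, database, user and FTP host details to an XML options.conf file on the PDA. A round-trip of such a file can be sketched as below; the element names are assumptions, since the real file layout is defined by the VB .NET client.

```python
# Illustrative options.conf round-trip; element names are assumed, not the
# real layout written by the VB .NET client.
import xml.etree.ElementTree as ET

def save_options(options):
    root = ET.Element("options")
    for key, value in options.items():
        ET.SubElement(root, key).text = value
    return ET.tostring(root, encoding="unicode")

def load_options(xml_text):
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

options = {
    "server": "192.168.0.10",    # txtServer
    "database": "MILS",          # txtDatabase
    "user": "jasmin",            # txtUser
    "host": "192.168.0.11",      # txtHost (FTP server)
}
restored = load_options(save_options(options))
```

A save-then-load test like this confirms that cmdSave followed by a restart leaves the connection details unchanged.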
MAIN MODULE (Main Menu Form)

Software Element | Data Type (Size) | Description | Pass/Fail

General (all fields must be correct in order to connect to the database)
cmdNewActivity | button | Opens the Activity Form. All fields are blank. | Pass
cmdOptions | button | Opens the Options Form. | Pass
cmdEditActivity | button | Opens the EditActivityMain Form. | Pass
cmdExit | button | Terminates the program. | Fail
txtLoginStatus | text box | Displays login details once the user has selected the ‘Login to MILs’ button. If the user has not logged in, “Not currently logged in” is displayed. | Pass
REMOTE ACTIVITY MODULE (RemoteActivityList Form)

Software Element | Data Type (Size) | Description | Pass/Fail

General (all fields must be correct in order to connect to the database)
lvwActivities | listview | List view cannot be modified. All activities for a particular user are listed here. The user can select an activity to open, which opens the Activity Form for the selected activity. | Pass
cmdOpen | button | Opens the Activity Form for the selected activity. | Pass
cmdCancel | button | Closes the RemoteActivityList Form and cancels the operation. | Pass
EDIT ACTIVITY MODULE (Edit Activity Form)

Software Element | Data Type (Size) | Description | Pass/Fail

General (all fields must be correct in order to connect to the database)
cmdEditRemote | button | Opens the editRemoteActivityList Form. | Pass
cmdEditLocal | button | Opens the Activity Form. | Pass
cmdBack | button | Closes the Edit Activity Form and returns you to the Main Menu. | Pass
cmdLogin | button | Logs the user into the server (remote). | Pass
txtLoginStatus | textbox | Displays login details once the user has selected the ‘Login to MILs’ button. If the user has not logged in, “Not currently logged in” is displayed. | Pass
Testing conducted to verify the implementation of the design for one software element (e.g., a unit or module) or a collection of software elements; sometimes called unit testing. The purpose of component testing is to ensure that the program logic is complete and correct and that the component works as designed.
5.2 Integration Testing
1. Create a new activity
Steps:
– Click ‘New Activity’ button
Result: Activity created successfully

2. Save local activity through edit activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘File/Save’ from menu
– Verify details
– Select ‘OK’ to save
Result: Activity saved successfully

3. Save local activity through new activity
Steps:
– Click ‘New Activity’ button
– Add details
– Select ‘File/Save’ from menu
– Select ‘OK’ to save
Result: Activity saved successfully

4. Modify settings
Steps:
– Click ‘Options’ button
– Select ‘Save’ button
Result: Settings modified successfully

5. View settings
Steps:
– Click ‘Options’ button
– Click ‘Cancel’ button
Result: Settings viewed successfully

6. Save remote activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Edit/view details
– Select ‘File/Save’ from menu
– Verify details
– Select ‘OK’ to save
– Click ‘Yes’ button (to replace the existing file)
Result: Activity saved successfully

7. Upload local activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘Remote/Upload’ from menu
Result: Activity uploaded successfully

8. View about details
Steps:
– Click ‘About’ button
Result: About details displayed successfully

9. View contents through new activity
Steps:
– Click ‘New Activity’ button
– Select ‘Help/Contents’ from menu
Result: Contents displayed successfully

10. View contents through local activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘Help/Contents’ from menu
Result: Contents displayed successfully

11. View contents through remote activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Select ‘Help/Contents’ from menu
Result: Contents displayed successfully

12. Exit program
Steps:
– Click ‘Exit Program’ button
Result: Program exits successfully

13. Login through edit activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
Result: Login successful

14. Login through new activity
Steps:
– Click ‘New Activity’ button
– Select ‘Remote/Login’ from menu
– Enter details
– Click ‘Login’ button
Result: Login successful

15. Login through local activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘Remote/Login’ from menu
– Enter details
– Select ‘Login’ button
Result: Login successful

16. Logout through main menu
Steps:
– Click ‘Logout’ button
Result: Logout successful

17. Logout through local activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘Remote/Logout’ from menu
– Click ‘Logout’ button
Result: Logout successful

18. Logout through remote activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Select ‘Remote/Logout’ from menu
– Click ‘Logout’ button
Result: Logout successful

19. Edit remote activity through edit activity
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Select ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
Result: Edit remote activity successful

20. Edit remote activity through activity screen
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Select ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Select ‘Remote/Edit (Download)’ from menu
Result: Edit remote activity successful

21. Save As activity through new activity
Steps:
– Click ‘New Activity’ button
– Add details
– Select ‘File/Save As…’ from menu
– Change details
– Click ‘OK’ button to save
Result: Activity saved with different details/location etc. successfully

22. Save As activity through edit activity (remote)
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Select ‘File/Save As…’ from menu
Result: Activity saved with different details/location etc. successfully

23. Save As activity through edit activity (local)
Steps:
– Click ‘Edit Activity’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘File/Save As…’ from menu
Result: Activity saved with different details/location etc. successfully

24. Close activity through new activity
Steps:
– Click ‘New Activity’ button
– Select ‘File/Close’ from menu
Result: Activity closed successfully

25. Close activity through edit activity (remote)
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Select ‘File/Close’ from menu
Result: Activity closed successfully

26. Close activity through edit activity (local)
Steps:
– Click ‘Edit Activity’ button
– Click ‘Local Activity’ button
– Select activity
– Select ‘File/Close’ from menu
Result: Activity closed successfully

27. Download from server, save to PDA and upload back to server
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Select ‘File/Save’ from menu
– Click ‘Yes’ button
– Select ‘File/Open’ from menu
– Select activity
– Select ‘Remote/Upload’ from menu
Result: Activity downloaded from server and saved to PDA, then uploaded back to server successfully

28. Download from server, edit activity and upload back to server
Steps:
– Click ‘Edit Activity’ button
– Click ‘Login’ button
– Enter details
– Click ‘Login’ button
– Click ‘Remote Activity’ button
– Select activity
– Click ‘Open’ button
– Edit details
– Select ‘Remote/Upload’ from menu
Result: Activity downloaded, edited and uploaded back to server successfully
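The menu navigation these tasks walk through can be modelled as a small state machine, which makes it easy to check that each scripted step sequence ends on the expected form. The transition table below covers only a few of the forms and buttons and is purely illustrative.

```python
# Illustrative state machine for a few of the menu transitions exercised by
# the integration tasks; the real steps drive a VB .NET GUI by hand.
TRANSITIONS = {
    ("Main Menu", "New Activity"): "Activity Form",
    ("Main Menu", "Edit Activity"): "Edit Activity Form",
    ("Main Menu", "Options"): "Options Form",
    ("Options Form", "Cancel"): "Main Menu",
    ("Edit Activity Form", "Local Activity"): "Activity Form",
    ("Activity Form", "File/Close"): "Main Menu",
}

def run_steps(steps, start="Main Menu"):
    state = start
    for step in steps:
        state = TRANSITIONS[(state, step)]  # KeyError = invalid step sequence
    return state

# Task 24: close activity through new activity
final = run_steps(["New Activity", "File/Close"])
# Task 5: view settings, then cancel back to the main menu
settings_final = run_steps(["Options", "Cancel"])
```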
Testing conducted in which software elements, hardware elements, or both are combined and tested until the entire system has been integrated. The purpose of integration testing is to ensure that design objectives are met and that the software, as a complete entity, complies with operational requirements. Integration testing carried out on the complete system is also referred to as system testing.
5.3 Conversion Testing
As our software is an extension of the current system already implemented by Uni-u, no conversion of formats is required.
Nevertheless, the new data entry characteristics should be easily convertible from the current TGS interface. Any forms that are utilised on the PDA must ensure that accurate, complete data is always submitted into the database. The integrity of the system should be maintained and upheld at all times, retaining all data from the TGS.
Testing to ensure that all data elements and historical data is converted from an old system format to the new system format.
5.4 Recovery Testing
We believe that testing for failure and testing for success are both critical requirements. Once everything appeared to be working with our software, we tried to make the procedures fail by introducing abnormal conditions such as high-volume loads, missing links, etc.
Testing done to ensure that application restart and backup and recovery facilities operate as designed.
5.5 Regression Testing
Our main method for regression testing was re-running previously run tests and checking whether previously fixed faults re-emerged. If old bugs reappeared, we retrieved the last saved working version and compared the differences between the two pieces of code.
Each time we found an error, we recorded the error details and also the solution for the error.
Testing done to ensure that applied changes to the application have not adversely affected previously tested functionality.
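The regression strategy above, re-running a check for every previously fixed fault and flagging any that fail, can be sketched as follows; the fault names, checks and recorded solutions are illustrative.

```python
# Illustrative regression sketch: each previously fixed fault keeps a re-test
# that returns True while the fix still holds; a failing re-test means the
# fault has reappeared, and the recorded solution points back to the fix.

def blank_title_rejected():
    # Fixed behaviour under test: blank activity titles must be rejected.
    return add_activity_ok("") is False

def add_activity_ok(title):
    return bool(title)          # stand-in for the real (VB .NET) validation

error_log = {
    "blank activity accepted": {
        "solution": "added title validation before saving",
        "retest": blank_title_rejected,
    },
}

reappeared = [name for name, rec in error_log.items() if not rec["retest"]()]
```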
6. PASS / FAIL CRITERIA
CRITERIA | PASS / FAIL
User can capture and upload/save pictures | Pass
User can interface easily with software on PDA | Pass
User can upload data from PDA to SQL Server | Pass
User can enter URL into PDA | Pass
User can only edit their own activities | Pass
User must supply username and password for logon | Pass
Software can be implemented over a wireless network | Pass
User can edit/open/add local activity | Pass
User can edit/open/add remote activity | Pass
User can modify Server/Database/FTP Hostname details | Pass
User can exit program | Pass
User is able to enter URLs into the PDA | Pass
User is able to update their own activities with data from the PDA | Pass
User is able to upload data from PDA to SQL Server | Pass
User is able to download data from SQL Server to PDA | Pass
Specify the criteria to be used to determine whether each item has passed or failed testing.
6.1 Approval Criteria
For test results to be approved our software must meet the following criteria:
· User is able to interface easily and intuitively with the software on the PDA
· User is able to update their own activities with data from the PDA
· Other people cannot edit another user’s activity
The following must be approved for each of the above:
· Accurate description
· Test number
· Test type
· Desired result
· Actual result
· Recommendation
For each of the above requirements we ran several different scenarios to see whether any errors were found during testing. A test is required for each system component, as listed earlier. A component passes only when its actual result matches the desired result.
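The approval rule above (the actual result must match the desired result, with all descriptive fields supplied) can be sketched as a simple check; the record layout is an assumption.

```python
# Illustrative approval check: a test is approved only when its actual result
# matches the desired result and every descriptive field is filled in.
REQUIRED_FIELDS = ("description", "test_number", "test_type",
                   "desired_result", "actual_result", "recommendation")

def approved(record):
    if any(not record.get(f) for f in REQUIRED_FIELDS):
        return False
    return record["actual_result"] == record["desired_result"]

record = {
    "description": "User edits only their own activities",
    "test_number": 17,
    "test_type": "Integration",
    "desired_result": "Edit denied for other users' activities",
    "actual_result": "Edit denied for other users' activities",
    "recommendation": "Approve",
}
ok = approved(record)
bad = approved({**record, "actual_result": "Edit allowed"})
```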
Specify the conditions that need to be met to approve test results. Define the formal testing approval process.
7. Testing Process
Identify the methods and criteria used in performing test activities. Define the specific methods and procedures for each type of test. Define the detailed criteria for evaluating test results.
7.1 Test Deliverables
· Integration Testing.doc
· Test Errors.doc
Identify the deliverable documents from the test process. Test input and output data should be identified as deliverables. Testing report logs, test incident reports, test summary reports, and metrics reports must be considered testing deliverables.
7.2 Testing Tasks
· Know what is happening
· Understand flow of system
· Understand User requirements
· Understand system requirements
· Understand individual components in larger model (abstraction)
· Discover a problem
· Discover possible action
· Decide on preferred action
· Recommend and undertake action
· Re-verify system effect (does the system work as intended)
· Iterate process
· Team consensus
Identify the set of tasks necessary to prepare for and perform testing activities. Identify all intertask dependencies and any specific skills required.
7.3 Responsibilities

Activities: Managing, Designing, Preparing, Executing, Witnessing, Checking, Resolving

Developers | X X X
Testers | X X X
Operation Staff | X X X
Technical Support Staff | X
User | X X X X
Identify the groups responsible for managing, designing, preparing, executing, witnessing, checking, and resolving test activities. These groups may include the developers, testers, operations staff, technical support staff, data administration staff, and the user staff.
8. Environmental Requirements
Specify both the necessary and desired properties of the test environment, including the physical characteristics, communications, mode of usage, and testing supplies. Also specify the levels of security required to perform test activities. Identify special test tools needed and other testing needs (space, machine time, stationery supplies). Identify the source of any needs not currently available to the test group.
8.1 Hardware
Hardware required for testing includes:
· PDA with built-in camera and audio capabilities. The PDA should run an Intel XScale processor with Windows Mobile 2003 for Pocket PC as its operating system.
· Router
· Network Card and WiFi Card
Identify the computer hardware and network requirements needed to complete test activities.
8.2 Software
Software required for testing includes:
· Microsoft SQL Server 2000 Desktop Edition
· VB .NET Studio Compact Framework 1.1
· Smart Device Application
· ActiveSync 3.8.0
Identify the software requirements needed to complete testing activities.
8.3 Tools
· Microsoft Project
· Microsoft Word
Identify the special software tools, techniques, and methodologies employed in the testing efforts. The purpose and use of each tool shall be described, along with plans for the acquisition, training, support, and qualification of each tool or technique.
8.4 Publications
· This Test Plan
· Requirements & Analysis Form
· Deliverable Breakdown Statement
· Project Management Plan
· Design & Architecture Form
· Final System and User Manual
· Integration Testing.doc
Identify the documents and publications that are required to support testing activities.