Chapter 8: Testing & Evaluation Flashcards
What is Usability Testing?
A technique used to improve the functionality and ease of use of a solution. A group of users is given access to the current version of the software and asked to perform certain tasks. The users then give feedback on how easy those tasks were to perform, along with feedback on the user interface and other possible improvements.
What is involved in testing a program using a testing table?
First, select appropriate functions to check; then select a range of test data that covers all boundaries of the solution and calculate the expected results; then run the tests on the program to obtain actual results and compare those with the expected results.
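As a concrete illustration, below is a minimal sketch of a testing table in Python. The function under test (calculate_grade), the boundary values, and the expected results are all invented for this example.

```python
# Hypothetical function under test: maps a score out of 100 to a pass/fail result.
def calculate_grade(score):
    return "Pass" if score >= 50 else "Fail"

# Testing table: boundary test data with pre-calculated expected results.
test_table = [
    {"test_data": 0,   "expected": "Fail"},   # lower boundary
    {"test_data": 49,  "expected": "Fail"},   # just below the pass mark
    {"test_data": 50,  "expected": "Pass"},   # on the pass mark
    {"test_data": 100, "expected": "Pass"},   # upper boundary
]

# Run the tests to obtain actual results and compare them with the expected results.
for row in test_table:
    actual = calculate_grade(row["test_data"])
    status = "OK" if actual == row["expected"] else "MISMATCH"
    print(f"data={row['test_data']:>3}  expected={row['expected']}  actual={actual}  {status}")
```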
What is accessibility?
Allowing the solution to be used by a wide range of users
What is benchmarking?
Commonly performed in the analysis stage,
benchmarking is documenting the capabilities of the software solution and comparing these with the design specifications.
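As a rough sketch of how such a comparison might be recorded in Python: the capability names, measurements, and design-specification targets below are all invented for illustration.

```python
# Hypothetical design-specification targets (what the design stage promised).
design_spec = {
    "report_time_seconds": 2.0,     # reports must generate in at most 2 s
    "records_supported": 10_000,    # must handle at least 10,000 records
}

# Documented capabilities of the software solution (invented measurements).
measured = {
    "report_time_seconds": 1.4,
    "records_supported": 12_500,
}

# Compare each documented capability against its design specification.
for capability, target in design_spec.items():
    actual = measured[capability]
    # For times, smaller is better; for capacities, larger is better.
    meets = actual <= target if capability.endswith("seconds") else actual >= target
    print(f"{capability}: spec={target}, measured={actual}, meets spec: {meets}")
```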
What is involved in documentation testing and why is it important?
Creating a record of what has been carried out, how successful it was, and what measures were taken to correct any errors that were found.
This information is useful for management reporting and for being accountable to clients.
What is included in documenting test results?
1) Tester’s name and department
2) Software package and version
3) Date and time tests were performed
4) Full description of results
5) Resolution of any problems
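The five items above could be captured in a simple record structure. This is one possible sketch in Python; all field values are invented.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TestResult:
    tester_name: str        # 1) Tester's name and department
    department: str
    package: str            # 2) Software package and version
    version: str
    performed_at: datetime  # 3) Date and time tests were performed
    description: str        # 4) Full description of results
    resolution: str         # 5) Resolution of any problems

# Example record (all values invented for illustration).
result = TestResult(
    tester_name="A. Tester",
    department="Quality Assurance",
    package="Payroll Solution",
    version="1.2.0",
    performed_at=datetime(2024, 3, 5, 14, 30),
    description="Boundary tests on pay calculation passed; rounding error at $0.005.",
    resolution="Rounding rule corrected to round half up; retest passed.",
)
print(result)
```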
In an evaluation strategy, what does the first row include?
Time frame:
Prior to signing off on the finished solution
Description:
1) Acceptance or usability testing by the users
Efficiency measures:
- Is the software solution easy to use and understand?
- Does the solution produce results in a timely manner?
Effectiveness measures:
- Does the solution produce accurate results?
In an evaluation strategy, what does the second row include?
Time frame:
- Immediately after the solution has been installed
Description:
- Network or technical staff ensure that the solution is operational and is ready to be used
Effectiveness measures:
- Is the software solution fully operational?
In an evaluation strategy, what does the third row include?
Time frame:
3-6 months after the introduction of the software
Description:
Feedback is gathered from the clients using the following methods:
1) Surveys
2) Interviews with select staff/managers
3) System error logs
4) Feedback forms
Efficiency measures:
- Is the software as easy to use as it was initially?
- Does the solution continue to produce results in a timely manner, now that more users are using the solution at the same time?
Effectiveness measures:
- Is the correct information being produced?
What is quality assurance?
The process by which developers ensure that a new software solution is operating correctly and meets the required standards.
- For this acceptance testing criterion, identify a technique to evaluate it:
Does the software solution perform the tasks set out in the SRS?
Identify the main tasks required of the software solution as set out in the SRS, place them in a testing table, and check whether the actual results match the expected results.
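One way to mechanise this technique is a parametrised test in which each case is one row of the testing table. The SRS task and the function below are hypothetical.

```python
import pytest

# Hypothetical function standing in for one SRS task:
# "The solution shall calculate GST-inclusive totals."
def total_with_gst(amount):
    return round(amount * 1.10, 2)

# Each case pairs test data with its expected result, as in a testing table.
@pytest.mark.parametrize("amount, expected", [
    (0.00, 0.00),
    (10.00, 11.00),
    (99.99, 109.99),
])
def test_srs_gst_total(amount, expected):
    assert total_with_gst(amount) == expected
```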
- For this acceptance testing criterion, identify a technique to evaluate it:
Is the software solution effective in producing the correct/expected results?
Compare the information produced by the previous information system with that produced by the new software solution, or manually verify the results.
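A minimal sketch of that comparison, assuming both systems can export their results as comparable records; the invoice data below is invented.

```python
# Results produced by the previous information system (invented sample).
previous_system = {"INV-001": 110.00, "INV-002": 55.50, "INV-003": 220.00}

# Results produced for the same inputs by the new software solution (invented sample).
new_solution = {"INV-001": 110.00, "INV-002": 55.50, "INV-003": 222.00}

# Flag any record where the two systems disagree, for manual verification.
for key in previous_system:
    old, new = previous_system[key], new_solution.get(key)
    if old != new:
        print(f"{key}: previous={old}, new={new} -> verify manually")
```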
- For this acceptance testing criterion, identify a technique to evaluate it:
Is the software solution efficient in producing results in a timely manner?
Compare the time taken to perform tasks using the software solution to the previous system.
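A sketch of timing the same task on both systems, assuming each can be driven from Python; the task functions below are placeholders that simulate processing time.

```python
import time

# Placeholder tasks standing in for the same operation on each system.
def task_on_previous_system():
    time.sleep(0.05)  # simulated processing time

def task_on_new_solution():
    time.sleep(0.02)  # simulated processing time

# Average the elapsed time over several runs of a task.
def time_task(task, runs=5):
    start = time.perf_counter()
    for _ in range(runs):
        task()
    return (time.perf_counter() - start) / runs

old_t = time_task(task_on_previous_system)
new_t = time_task(task_on_new_solution)
print(f"previous: {old_t:.3f}s per run, new: {new_t:.3f}s per run")
```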
- For this acceptance testing criterion, identify a technique to evaluate it:
Is the software solution's user interface clear and intuitive?
Select a number of users to provide feedback on the functions, the ease of understanding, and the consistency of navigation in the solution.
What are some factors that influence the effectiveness of project plans?
1) Clear scope
2) Specification creep
3) Changes in staff
4) Communication issues
5) Inadequate time for testing
6) Budget constraints
7) Technological changes