Outcome 3 Flashcards
Archiving
A long-term record of events.
Often compressed and placed in medium- to long-term storage.
Often required by local authorities.
Backups
A copy of a file or files, kept in case the original is lost or damaged
Backups - Types
Differential Backups - takes a backup of all items that were changed since the last full backup
Incremental Backups - takes a backup of all items that were changed since the last backup of any type (full or incremental)
Full Backups - complete backup of all data regardless of what has been backed up beforehand
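The difference between the three schemes can be sketched with hypothetical file timestamps (file names and dates below are illustrative only):

```python
from datetime import datetime

# Hypothetical files with their last-modified times
files = {
    "a.txt": datetime(2024, 1, 1),
    "b.txt": datetime(2024, 1, 5),
    "c.txt": datetime(2024, 1, 9),
}

last_full = datetime(2024, 1, 3)  # when the last FULL backup ran
last_any = datetime(2024, 1, 7)   # when the last backup of ANY type ran

full = set(files)                                              # everything, always
differential = {f for f, t in files.items() if t > last_full}  # changed since last full
incremental = {f for f, t in files.items() if t > last_any}    # changed since last backup

print(sorted(full))          # ['a.txt', 'b.txt', 'c.txt']
print(sorted(differential))  # ['b.txt', 'c.txt']
print(sorted(incremental))   # ['c.txt']
```

This shows why differential backups grow larger over time (they always reach back to the last full backup) while incrementals stay small but need a longer chain to restore.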
Reasons to store backups separately from the main database
Regular backups will require less physical space
Regular backups will take less time
Retrieval of backups will take less time
Frees up space on the main system
No confusion between old and new data
Disposing of Data
Process of deleting data so that it can't be accessed again.
Requires data to be scrubbed, meaning the data is overwritten with 0s or 1s three or more times.
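A minimal sketch of scrubbing: overwrite a file's bytes with zeros for several passes, then delete it (the function name and pass count here are illustrative):

```python
import os

def scrub(path: str, passes: int = 3) -> None:
    """Overwrite a file's contents with zero bytes several times, then delete it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)
            f.flush()
            os.fsync(f.fileno())  # force each overwrite pass onto the disk
    os.remove(path)
```

Note that real secure-erase tools have to do more than this: journaling file systems and SSD wear-levelling can silently keep older copies of the data elsewhere on the medium.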
Factors affecting the access of data (x5)
- Latency
- Reliability
- Cost
- Capacity
- Ease of Use
Unit Tests
low-level tests, close to the source code, that check the individual modules of the source code
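For example, a unit test for a hypothetical `add` function, written with Python's built-in `unittest` module:

```python
import unittest

def add(a, b):
    """The module under test (a hypothetical example)."""
    return a + b

class TestAdd(unittest.TestCase):
    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-1, -1), -2)

if __name__ == "__main__":
    unittest.main()
```

Each test exercises one small piece of the code in isolation, so a failure points directly at the module responsible.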
Integration Tests
verifies that different modules used by the application work together
Functional Tests
tests whether the solution meets the Software Requirements Specification (SRS); only the output is checked, not the internal workings
Acceptance Tests
formal test that verifies whether the SRS has been satisfied
Performance Tests
observes response times and system behaviour under expected workloads
User Acceptance Testing
checks if the end users are satisfied. Uses a combination of interviews, questionnaires and observations
Examples of Appropriate Test Data (x5)
Valid Data
Valid but unusual data
Invalid data
Boundary condition data
Wrong Data
Usability Testing
Testing that focuses on the user experience and any issues that arise with the client's use of the solution (i.e. can they use the solution easily?)
Ways to document test results
- Testing Tables
- Subjective Reports
- Capturing screenshots of features that are not normally visible
- Making handwritten calculations to verify outputs
- Capturing screenshots of the solution's validation rules
Annotations to Gantt Charts
handwritten or added as notes; they give the reasons for task scheduling decisions or resourcing priorities
Evaluation
Final stage of the problem-solving methodology - checks how well the solution is satisfying the needs of the user it was created for.
Is not Testing - by the time evaluation has begun, the solution has been proven to work (in the testing phase) and its functionality is not in question
Efficiency
A measure of how much time, cost and effort is applied to achieve intended results
Effectiveness
A measure of how well a solution, information management strategy or network functions and whether each achieves its intended result
Examples of Quantitative Evaluation Methods
- task completion times
- survey
- number of successfully completed tasks
When to Evaluate
after the solution has been ‘bedded in’ and the users are familiar and comfortable with it.
A few months of regular use is typical
Purpose of Evaluating project Plans
helps you judge how well your plan and the techniques you used enabled you to manage the project.
Also helps the organisation avoid repeating the same mistakes in future projects.
Evaluation Criteria for Project Plans
Completeness - were any significant tasks omitted from the project management plan? Were resources included? Was it annotated when required?
Maintainability - how easy was it to modify the Gantt chart to keep it up to date with reality?
Accuracy - were tasks correctly identified, marked as dependent or concurrent, and placed in the right sequence?
Readability - was it easy to see the tasks and their dependencies? Were the Gantt chart and its text a readable size? Were the colour choices appropriate?