Outcome 3 Flashcards

1
Q

Archiving

A

A long-term record of events.
Archives are often compressed and placed in medium-to-long-term storage.
Archiving is often required by local authorities.

2
Q

Backups

A

A copy of a file or files, kept in case the original is lost or damaged.

3
Q

Backups - Types

A

Differential backup - backs up all items changed since the last full backup.
Incremental backup - backs up all items changed since the last incremental backup.
Full backup - a complete backup of all data, regardless of what has been backed up beforehand.

4
Q

Reasons to store backups separately from the main database

A

Regular backups will require less physical space
Regular backups will take less time
Retrieval of backups will take less time
Frees up space on the main system
No confusion between the old and new data

5
Q

Disposing of Data

A

The process of deleting data so that it cannot be accessed again.

Requires the data to be scrubbed, meaning it is overwritten with 0s or 1s three or more times.

6
Q

Factors affecting access to data (x5)

A
  • Latency
  • Reliability
  • Cost
  • Capacity
  • Ease of Use
7
Q

Unit Tests

A

Low-level tests, close to the source code, that check individual modules of the source code.
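For example, a unit test exercises one module in isolation. The function below is a made-up example, tested with Python's built-in unittest framework:

```python
import unittest

def add_gst(price: float, rate: float = 0.10) -> float:
    """Hypothetical module under test: return the price including GST."""
    return round(price * (1 + rate), 2)

class TestAddGst(unittest.TestCase):
    """Low-level tests that call add_gst directly, close to the source code."""

    def test_standard_rate(self):
        self.assertEqual(add_gst(100.0), 110.0)

    def test_zero_rate(self):
        self.assertEqual(add_gst(50.0, rate=0.0), 50.0)
```

Run with `python -m unittest` to execute every test method.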

8
Q

Integration Tests

A

Verify that the different modules used by the application work together.

9
Q

Functional Tests

A

Tests whether the solution meets the Software Requirements Specification (SRS); only the output is checked.

10
Q

Acceptance Tests

A

A formal test that verifies whether the SRS has been satisfied.

11
Q

Performance Tests

A

Observes response times and user behaviour.
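A very small sketch of the idea: time an operation so its response time can be compared against a target (the helper name is illustrative, not from the source):

```python
import time

def average_response_time(fn, repeats: int = 5) -> float:
    """Call `fn` several times and return the mean wall-clock time in seconds."""
    start = time.perf_counter()
    for _ in range(repeats):
        fn()
    return (time.perf_counter() - start) / repeats
```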

12
Q

User Acceptance Testing

A

Checks whether end users are satisfied with the solution, using a combination of interviews, questionnaires and observations.

13
Q

Examples of Appropriate Test Data (x5)

A

Valid data
Valid but unusual data
Invalid data
Boundary condition data
Wrong data
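Applied to a hypothetical validation rule (an integer age from 0 to 120), one test value per category might look like:

```python
def validate_age(age) -> bool:
    """Hypothetical validation rule: age must be an integer from 0 to 120."""
    return isinstance(age, int) and not isinstance(age, bool) and 0 <= age <= 120

# One test value for each category of appropriate test data:
assert validate_age(35)          # valid data
assert validate_age(110)         # valid but unusual data
assert not validate_age(-5)      # invalid data
assert validate_age(0)           # boundary condition data (lower bound)
assert not validate_age(121)     # boundary condition data (just past the upper bound)
assert not validate_age("35")    # wrong data (wrong type entirely)
```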

14
Q

Usability Testing

A

Testing that focuses on the user experience and any issues that arise with the client's use of the solution (i.e. can they use the solution easily?).

15
Q

Ways to document test results

A
  • Testing tables
  • Subjective reports
  • Capturing screenshots of features that are not normally visible
  • Making handwritten calculations to verify results
  • Capturing screenshots of the solution's validation rules
16
Q

Annotations to Gantt Charts

A

Handwritten, or added as notes, giving reasons for task schedules or resourcing priorities.

17
Q

Evaluation

A

The final stage of the problem-solving methodology; it checks how well the solution satisfies the needs of the user it was created for.

Evaluation is not testing: by the time evaluation begins, the solution has been proven to work (in the testing stage) and its functionality is not in question.

18
Q

Efficiency

A

A measure of how much time, cost and effort is applied to achieve the intended results.

19
Q

Effectiveness

A

A measure of how well a solution, information management strategy or network functions and whether each achieves its intended result

20
Q

Examples of Quantitative Evaluation Methods

A
  • Task completion times
  • Surveys
  • Number of successfully completed tasks
21
Q

When to Evaluate

A

after the solution has been ‘bedded in’ and the users are familiar and comfortable with it.
A few months of regular use is typical

22
Q

Purpose of Evaluating Project Plans

A

Helps you judge how well your plan and the techniques used helped you manage your project.

It also helps the company avoid making the same mistakes again.

23
Q

Evaluation Criteria for Project Plans

A

Completeness - were any significant tasks omitted from the project management plan? Were resources included? Was it annotated when required?

Maintainability - how easy was it to modify the Gantt chart to keep it up to date with reality?

Accuracy - were tasks correctly identified and marked as dependent or concurrent, in the right sequence?

Readability - was it easy to see the tasks and their dependencies? Were the Gantt chart and its text a readable size? Were the colour choices appropriate?