QA Flashcards

1
Q

[QA] Test Case

A

A set of conditions or variables under which a tester will determine whether a system under test satisfies requirements.
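
For example, a minimal automated test case sketch in Python (pytest-style; add() is a hypothetical function under test):

def add(a, b):
    # Hypothetical function under test
    return a + b

def test_add_returns_sum():
    # Condition: inputs 2 and 3; expected result: 5
    assert add(2, 3) == 5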

2
Q

[QA] Test Plan

A

A document outlining the scope, approach, resources, and schedule of testing activities.

3
Q

[QA] Defect or Bug

A

Any variance between expected and actual results in the software under test.

4
Q

[QA] Regression Testing

A

Testing conducted to ensure that a recent code change has not adversely affected existing features. This is usually done before a release.

5
Q

[QA] Black Box Testing

A

Testing where the tester is not concerned with the internal workings of the system but focuses on the inputs and outputs.

6
Q

[QA] White Box Testing

A

Testing where the tester has knowledge of the internal workings of the system and can design test cases accordingly.

7
Q

[QA] Smoke Testing

A

Preliminary testing to reveal simple failures that may lead to rejecting a software build.
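
A minimal smoke-test sketch in Python, assuming a hypothetical base URL and the third-party requests library; it only checks that key pages respond at all:

import requests  # third-party HTTP library

BASE_URL = "https://example.com"  # hypothetical application under test

# Quick sanity checks: if any key page fails to load, reject the build
for path in ["/", "/login", "/api/health"]:
    response = requests.get(BASE_URL + path, timeout=10)
    assert response.status_code == 200, f"Smoke test failed for {path}"
print("Smoke test passed: build is worth deeper testing")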

8
Q

[QA] User Acceptance Testing (UAT)

A

Testing conducted to determine whether a system satisfies the acceptance criteria and is acceptable for delivery to end users.

9
Q

[QA] Automated Testing

A

Testing using tools and scripts to automate repetitive, but necessary, testing tasks.
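
For instance, a hedged Selenium sketch in Python automating a repetitive login check (the URL and element IDs are made-up placeholders):

from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # requires a local Chrome installation
driver.get("https://example.com/login")  # hypothetical login page
driver.find_element(By.ID, "username").send_keys("qa_user")
driver.find_element(By.ID, "password").send_keys("secret123")
driver.find_element(By.ID, "submit").click()
assert "Dashboard" in driver.title  # assumed post-login page title
driver.quit()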

10
Q

[QA] Load Testing

A

Testing the system’s ability to handle a specified amount of load or traffic.
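
A simple load-test sketch in Python, assuming a hypothetical endpoint and the third-party requests library; real load tests usually use a dedicated tool such as JMeter or Locust:

import concurrent.futures
import time

import requests  # third-party HTTP library

URL = "https://example.com/api/health"  # hypothetical endpoint under load

def hit_endpoint(_):
    # Time a single request and return its status code and duration
    start = time.perf_counter()
    status = requests.get(URL, timeout=10).status_code
    return status, time.perf_counter() - start

# Fire 50 concurrent requests and summarize the results
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    results = list(pool.map(hit_endpoint, range(50)))

ok = sum(1 for status, _ in results if status == 200)
slowest = max(elapsed for _, elapsed in results)
print(f"{ok}/50 requests succeeded; slowest response took {slowest:.2f}s")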

11
Q

[QA] Performance Testing

A

Testing the system’s performance characteristics, such as response time, speed, and stability.

12
Q

[QA] Scalability Testing

A

Testing to ensure that the application can handle an increase in the number of users or load.

13
Q

[QA] Boundary Testing

A

Testing at the edges or boundaries of the input domain to ensure system stability. Example: if a number field accepts only the numbers 1-50, you would test -5, 0, 1, 25, 50, and 100, expecting -5, 0, and 100 to be rejected.
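
A minimal sketch of those boundary checks, assuming a hypothetical validate_quantity() that should accept only whole numbers 1-50:

def validate_quantity(value):
    # Hypothetical validator for the 1-50 number field
    return isinstance(value, int) and 1 <= value <= 50

# Values at and around the boundaries, with the expected outcome for each
cases = [(-5, False), (0, False), (1, True), (25, True), (50, True), (100, False)]
for value, expected in cases:
    assert validate_quantity(value) == expected, f"Boundary check failed for {value}"
print("All boundary checks behaved as expected")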

14
Q

[QA] Ad Hoc Testing

A

Informal testing without predefined test cases; often exploratory in nature. In practice, this is just poking around the application.

15
Q

[QA] Bug Tracking System

A

A tool used to track defects, issues, or bugs throughout their lifecycle (e.g., Jira, TFS).

16
Q

[QA] Traceability Matrix

A

A document or table that maps requirements to the test cases that cover them, used to verify that every requirement is tested and to plan the most efficient test coverage.
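
A toy sketch of a traceability matrix as a requirement-to-test-case mapping (all IDs are made up):

# Each requirement maps to the test cases that cover it
traceability = {
    "REQ-001 Valid login": ["TC-01", "TC-02"],
    "REQ-002 Lock account after 3 failed logins": ["TC-03"],
    "REQ-003 Password reset email": [],  # coverage gap
}

for requirement, test_cases in traceability.items():
    coverage = ", ".join(test_cases) if test_cases else "NOT COVERED"
    print(f"{requirement}: {coverage}")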

17
Q

[QA] Dynamic Testing

A

Testing technique where the code is executed to demonstrate the actual behavior of the system.

18
Q

[QA] Static Testing

A

Testing technique where the code is not executed; it can include reviews and inspections.

19
Q

[QA] Test Environment

A

The setup of software and hardware used for testing, including servers, databases, and network configurations. Note that there are usually Dev, Test, Stage, and Prod environments.
The Prod environment runs the live code that users use; everything else is internal.

20
Q

[QA] Test Suite

A

A test suite is a logical grouping or collection of test cases that run together as a single job to cover different test scenarios.
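
A minimal unittest sketch grouping two hypothetical test cases into one suite and running them as a single job:

import unittest

class LoginTests(unittest.TestCase):
    # Placeholder test cases; real ones would exercise the application
    def test_valid_login(self):
        self.assertTrue(True)

    def test_invalid_password(self):
        self.assertTrue(True)

suite = unittest.TestSuite()
suite.addTest(LoginTests("test_valid_login"))
suite.addTest(LoginTests("test_invalid_password"))
unittest.TextTestRunner().run(suite)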

21
Q

[QA] Negative Testing

A

A test strategy that ensures that your application can gracefully handle invalid input or unexpected user behavior. For example, if a user tries to type a letter in a numeric field, the correct behavior in this case would be to display the “Incorrect data type, please enter a number” message.
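
A minimal negative-test sketch in Python, assuming a hypothetical parse_age() handler for that numeric field:

def parse_age(text):
    # Hypothetical handler: returns an int, or an error message for bad input
    if not text.isdigit():
        return "Incorrect data type, please enter a number"
    return int(text)

# Negative test: a letter in the numeric field should yield the error message
assert parse_age("abc") == "Incorrect data type, please enter a number"
# Positive counterpart for contrast
assert parse_age("42") == 42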

22
Q

[QA] Severity

A

How badly the defect has affected the application’s functionality.

Ranked 0-5 (0 being the highest)

A severity of 0 means this needs to be fixed now.

23
Q

[QA] Priority

A

The order in which developers will fix defects; priority reflects business importance.

Ranked 0-5 (0 being the highest)

24
Q

[QA] Repro Rate (Reproduction rate)

A

The rate at which the bug/defect can be reproduced. Typically, when a bug is found, I retest it 5 times to establish the repro rate (it is rare that I test it more than 10 times).

Example rate: 1/5 means it can be reproduced 20% of the time; 5/5 means it can be reproduced 100% of the time.

25
Q

[Agile] Agile

A

A set of principles for software development that prioritizes flexibility, collaboration, and customer satisfaction.

26
Q

[Agile] Scrum

A

An Agile framework that defines roles, events, and artifacts for effective project management.

27
Q

[Agile] Sprint

A

A time-boxed iteration of work in Scrum, usually 2-4 weeks long, during which a potentially shippable product increment is created.

28
Q

[Agile] Product Owner

A

The person responsible for defining and prioritizing the product backlog, representing the customer’s needs, and making decisions about what to build.

29
Q

[Agile] Scrum Master

A

A facilitator and coach who helps the Scrum team follow the Agile principles and practices and removes impediments to progress.

30
Q

[Agile] Scrum Team

A

A cross-functional team consisting of developers, a product owner, and a Scrum Master, responsible for delivering a potentially shippable product increment at the end of each sprint.

31
Q

[Agile] Product Backlog

A

A prioritized list of features, enhancements, and bug fixes that make up the work to be done on a project.

32
Q

[Agile] User Story

A

A brief, user-centric description of a feature or functionality, often written from the perspective of an end user.

33
Q

[Agile] Daily Standup (Daily Scrum)

A

A short, daily meeting where team members discuss their progress, plans, and any impediments they are facing.

34
Q

[Agile] Velocity

A

A measure of the amount of work completed by a team in a sprint, used for estimating future work and improving planning accuracy.
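
A tiny worked example (the story-point numbers are made up):

# Story points completed in the last three sprints
completed_points = [23, 27, 25]
velocity = sum(completed_points) / len(completed_points)  # 25.0
print(f"Average velocity: {velocity:.1f} points per sprint")

# Rough planning: a 75-point slice of backlog at velocity 25 is about 3 sprints
print(f"Estimated sprints for 75 points: {75 / velocity:.1f}")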

35
Q

[Agile] Burndown Chart

A

A visual representation of the amount of work remaining in a sprint or project, helping teams track their progress.

36
Q

[Agile] Acceptance Criteria (AC)

A

A set of criteria that must be met for a product increment to be considered complete and potentially shippable.

37
Q

[Agile] Backlog Grooming (Refinement)

A

The process of reviewing, prioritizing, and refining items in the product backlog to ensure they are ready for inclusion in upcoming sprints.

38
Q

[Agile] Sprint Retrospective (Retro)

A

A meeting held at the end of a sprint for the team to reflect on their process and identify areas for improvement.

39
Q

[Agile] Continuous Integration

A

The practice of frequently merging code changes into a shared repository, with automated builds and tests to catch integration issues early.

40
Q

[Agile] Story points

A

Units of measure for expressing an estimate of the overall effort required to fully implement a product backlog item or any other piece of work. The estimate reflects the combined judgment of the Scrum team and is usually drawn from a Fibonacci-like sequence:
0, 1, 2, 3, 5, 8, 13