Application Development and Deployment Flashcards
Which of the following methodologies progresses through a series of phases, with each phase being completed before progressing to the next phase?
Scrum
Waterfall
Agile
Waterfall
The waterfall model is a development model based on simple manufacturing design. The work process begins with the requirements analysis phase and progresses through a series of four more phases, with each phase being completed before progressing to the next phase. The Scrum programming methodology is built around a 30-day release cycle. The Agile model is not a single development methodology, but a whole group of related methods. Designed to increase innovation and efficiency of small programming teams, Agile methods rely on quick turns involving small increases in functionality. Extreme Programming is a structured process that is built around user stories. These stories are used to architect requirements in an iterative process that uses acceptance testing to create incremental advances
Which of the following methodologies is a structured process that is built around user stories that are used to architect requirements in an iterative process that uses acceptance testing to create incremental advances?
Scrum
Extreme Programming (XP)
Waterfall
Extreme Programming (XP)
Extreme programming (XP) is a structured process that is built around user stories. These stories are used to architect requirements in an iterative process that uses acceptance testing to create incremental advances. Agile methods are not a single development methodology, but a whole group of related methods. Designed to increase innovation and efficiency of small programming teams, Agile methods rely on quick turns involving small increases in functionality. The waterfall model is a development model based on simple manufacturing design. The work process begins with the requirements analysis phase and progresses through a series of four more phases, with each phase being completed before progressing to the next phase. The Scrum programming methodology is built around a 30-day release cycle
Which of the following are elements of software development that will help to improve the security of code? (Choose all that apply.)
Input validation
Proper error and exception handling
Cross-site scripting mitigations
Patch management
Input validation
Proper error and exception handling
Cross-site scripting mitigations
Input validation, proper error and exception handling, and cross-site scripting mitigations are all elements of software development that will help to improve the security of code. While patch management is an important aspect of security, it occurs after code development and delivery and is considered a process element and not a part of the software development lifecycle
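To make the idea concrete, here is a minimal, hypothetical Python sketch (not taken from the flashcards) showing input validation of untrusted data and an HTML-escaping step that mitigates cross-site scripting:

```python
# Hypothetical sketch: validate untrusted input and escape it on output.
import html

def parse_age(raw: str) -> int:
    """Accept only a whole number in a plausible range."""
    if not raw.strip().isdigit():
        raise ValueError("age must be a whole number")
    age = int(raw)
    if age > 130:
        raise ValueError("age out of range")
    return age

def render_name(raw_name: str) -> str:
    """Mitigate cross-site scripting by HTML-escaping user-supplied text."""
    return html.escape(raw_name)
```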
Where should all errors/exceptions be trapped and handled?
In the main program or routine that called the routine that generated the error/exception
In the generating routine itself
In a special routine designed to handle all errors/exceptions
In the generating routine itself
All errors/exceptions should be trapped and handled in the generating routine
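A minimal, hypothetical sketch of this rule, assuming a routine that parses a timeout value: the exception is trapped and handled inside the routine that generates it rather than being left for the caller:

```python
def read_timeout(raw: str) -> int:
    """Generating routine: trap and handle the error where it occurs."""
    try:
        return int(raw)
    except ValueError:
        # Handle the exception in this routine rather than letting it
        # propagate; fall back to a safe default and note the problem.
        print(f"invalid timeout {raw!r}; using default of 30 seconds")
        return 30
```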
Which of the following is a system that, once deployed, is never modified, patched, or upgraded?
Baseline
Immutable system
Frozen system
Immutable system
An immutable system is a system that, once deployed, is never modified, patched, or upgraded. If a patch or update is required, the system is merely replaced with a new system that is patched and updated. Baselining is the process of determining a standard set of functionality and performance. This is a metrics-driven process, where later changes can be compared to the baseline to gauge their impact on performance and other variables. If a change improves the baseline elements in a positive fashion, a new baseline can be established. The other term, frozen system, is not commonly used in industry
What is the term used to describe removing users’ permissions or authorities to objects?
Provisioning
Version control
Deprovisioning
Deprovisioning
Deprovisioning is the removal of users’ permissions or authorities to access objects. Provisioning is the process of assigning to users permissions or authorities to access objects. Version control is as simple as tracking which version of a program is being worked on, whether in development, testing, or production. Change management addresses how an organization manages which versions are currently being used, and how it coordinates changes as they are released by a manufacturer
The process describing how an organization manages which versions are currently being used, and how it coordinates updates or new versions as they are released by a manufacturer, is known as which of the following?
Version control
Provisioning
Change management
Change management
Change management addresses how an organization manages which versions are currently being used, and how it coordinates changes as they are released by a manufacturer. Version control is as simple as tracking which version of a program is being worked on, whether in development, testing, or production. Provisioning is the process of assigning permissions or authorities to objects for users. Deprovisioning is the removal of permissions or authorities to objects for users
Which of the following is an initial step in the input validation process that creates the canonical form, or simplest form, of a string before processing?
Implementing stored procedures
Code reuse
Normalization
Normalization
Normalization is an initial step in the input validation process. Specifically, it is the step of creating the canonical form, or simplest form, of a string before processing. Stored procedures are precompiled methods implemented within a database engine. Stored procedures act as a secure coding mechanism because they offer an isolation of user input from the actual SQL statements being executed. Code signing involves applying a digital signature to code, providing a mechanism where the end user can verify the code integrity. Code reuse is reusing code from one application to another
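As a hedged illustration (the normalization call is standard Python, but the example itself is hypothetical), reducing a string to its canonical form before validation might look like this:

```python
# Hypothetical sketch: canonicalize input before any validation checks.
import unicodedata

def canonicalize(raw: str) -> str:
    """Return the NFKC canonical form of the string, trimmed and case-folded."""
    return unicodedata.normalize("NFKC", raw).strip().casefold()

# Full-width "ＡＤＭＩＮ" and plain "admin" reduce to the same canonical form,
# so checks against the canonical string are harder to bypass with encoding tricks.
assert canonicalize("ＡＤＭＩＮ") == canonicalize("admin")
```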
Which of the following is true about what is known as dead code?
Dead code is code that is never executed and thus can be removed from the program without a negative impact.
Dead code is code that is never executed but should remain in the program because removing it may have unintended consequences.
Dead code is code that, while it may be executed, produces results that are never used elsewhere in the program. There are compiler options that can remove dead code, which is called dead code elimination, but these must be used with care because dead code elimination may have unintended consequences.
Dead code is code that, while it may be executed, produces results that are never used elsewhere in the program. There are compiler options that can remove dead code, which is called dead code elimination, but these must be used with care because dead code elimination may have unintended consequences.
Dead code is code that, while it may be executed, produces results that are never used elsewhere in the program. There are compiler options that can remove dead code, called dead code elimination, but these options must be used with care because dead code elimination may have unintended consequences
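A small, hypothetical illustration of the distinction: the discounted value below is computed when the function runs, but it is never used afterward, so a dead code elimination pass could remove that line:

```python
def total_price(prices: list[float]) -> float:
    subtotal = sum(prices)
    discounted = subtotal * 0.9  # dead code: executed, but the result is never used
    return subtotal              # an optimizer may safely eliminate the line above
```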
What is the term used to describe the loss of control over data from a system during operations?
Sandboxing
Data exposure
Data breach
Data exposure
Data exposure is the loss of control over data from a system during operations. Sandboxing refers to the execution of computer code in an environment designed to isolate the code from direct contact with the target system. A data breach occurs when an unauthorized user gains access to your system and its data. Runtime release is not a term used in the industry
What term is used to refer to testing a system under a controlled speed environment?
Load testing
Stress testing
Sandboxing
Load testing
Load testing involves running the system under a controlled speed environment. Stress testing takes the system past this operating point to see how it responds to overload conditions. Sandboxing refers to the execution of computer code in an environment designed to isolate the code from direct contact with the target system. Static code analysis is when the code is examined without being executed
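As a rough, hypothetical sketch, a load test drives the system at a fixed, controlled rate and records response times; raising the rate past the design point turns the same harness into a stress test. The send_request callable is an assumed stand-in for whatever operation is being exercised:

```python
# Hypothetical load-test harness: issue requests at a controlled rate.
import time

def load_test(send_request, requests_per_second: float, duration_s: float) -> list[float]:
    """Call send_request() at a steady rate and record each response time."""
    latencies = []
    interval = 1.0 / requests_per_second
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        start = time.monotonic()
        send_request()
        latencies.append(time.monotonic() - start)
        time.sleep(max(0.0, interval - (time.monotonic() - start)))
    return latencies
```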
Fuzz testing works best in which of the following testing environments?
White box testing
Gray box testing
Black box testing
Fuzz testing works equally well in all of the above.
Fuzz testing works equally well in all of the above.
Fuzz testing works well in white, black, or gray box testing, as it can be performed without knowledge of the specifics of the application under test
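A minimal, hypothetical sketch of the idea: random inputs are thrown at the routine under test with no knowledge of its internals, and anything other than an expected rejection is flagged for investigation. The parse argument is an assumed stand-in for the function being fuzzed:

```python
# Hypothetical black-box fuzzing loop.
import random
import string

def fuzz(parse, iterations: int = 1000) -> None:
    """Feed random strings to parse() and report unexpected exceptions."""
    for _ in range(iterations):
        data = "".join(random.choices(string.printable, k=random.randint(0, 200)))
        try:
            parse(data)
        except ValueError:
            pass  # rejecting malformed input is the expected behavior
        except Exception as exc:
            print(f"possible bug: {type(exc).__name__} on input {data!r}")
```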
Code analysis can be performed at which of the following levels of development? (Choose all that apply.)
Unit level
Subsystem level
System level
Complete application
Unit level
Subsystem level
System level
Complete application
Code analysis can be performed at virtually any level of development, from unit level to subsystem to system to complete application
Which code analysis method is performed while the software is executed, either on a target system or an emulated system?
Runtime analysis
Sandbox analysis
Dynamic analysis
Dynamic analysis
Dynamic analysis is performed while the software is executed, either on a target system or an emulated system. Static code analysis is when the code is examined without being executed. Sandboxing refers to the execution of computer code in an environment designed to isolate the code from direct contact with the target system. Runtime analysis is descriptive of the type of analysis but is not the term used in industry
Which of the following is true concerning verification? (Choose all that apply.)
Ensuring the code does what the code is supposed to do, verification, is more complex than just running the program and looking for runtime errors.
Verification also checks whether the program specification captures the requirements from the customer.
Verification is simple on a case-by-case basis, but when a program has many interdependent calculations, verifying that the results match the desired design model can be a fairly complex task.
Verification is the process of checking that the software developed meets the model specification.
Ensuring the code does what the code is supposed to do, verification, is more complex than just running the program and looking for runtime errors.
Verification is simple on a case-by-case basis, but when a program has many interdependent calculations, verifying that the results match the desired design model can be a fairly complex task.
Verification is the process of checking that the software developed meets the model specification.
Ensuring the code does what the code is supposed to do, verification, is more complex than just running the program and looking for runtime errors. The program results for a given set of inputs need to match the expected results per the system model. For instance, if applying a simple mathematical operation, is the calculation correct? This is simple to verify on a case-by-case basis, but when a program has many interdependent calculations, verifying that the result matches the desired design model can be a fairly complex task. Verification is the process of checking that the software developed meets the model specification. Validation is the process of checking whether the program specification captures the requirements from the customer
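As a hedged illustration, verification of an interdependent calculation usually takes the form of comparing program output against values the design model predicts for known inputs; the loan payment function and expected figure here are hypothetical:

```python
# Hypothetical verification check: does the implementation match the model?
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Standard amortization formula used as the design model."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

# Expected result computed independently from the model for a known input set.
assert abs(monthly_payment(10_000, 0.06, 12) - 860.66) < 0.01
```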