Domain 8: Software Development Security Flashcards
Change management process has three basic components:
Request Control/Change Control/Release Control
Request Control
Provides an organized framework within which users can request modifications,
managers can conduct cost/benefit analysis, and developers can prioritize tasks.
Change Control
Provides an organized framework within which multiple developers can create and
test a solution prior to rolling it out into a production environment. Change control includes
conforming to quality control restrictions, developing tools for update or change deployment, properly
documenting any coded changes, and restricting the effects of new code to minimize diminishment of
security.
Release Control
Once the changes are finalized, they must be approved for release through the release control procedure.
Configuration Identification
Administrators document the configuration of covered software products throughout the organization.
Configuration Control
Ensures that changes to software versions are made in accordance with the
change control and configuration management policies. Updates can be made only from authorized
distributions in accordance with those policies.
Configuration Status Accounting
Formalized procedures are used to keep track of all authorized changes that take place
Configuration Audit
Periodic configuration audit should be conducted to ensure that the actual
production environment is consistent with the accounting records and that no unauthorized
configuration changes have taken place
5 levels of SW-CMM
Initiating, Repeatable, Defined, Managed, Optimizing
Initiating
Competent people, informal processes, ad hoc; absence of formal process
Repeatable
Project management processes, basic life-cycle management processes
Defined
Engineering processes, presence of basic life-cycle management processes and reuse
of code; use of requirements management, software project planning, quality assurance,
and configuration management practices
Managed
Product and process improvement; quantitatively controlled
Optimizing
Continuous process improvement; works with the IDEAL model.
IDEAL Model
Initiate, Diagnose, Establish (develop an action plan), Act (implement improvements), Leverage (reassess and continuously improve)
PERT
Program Evaluation Review Technique is a project-scheduling tool used to judge the size of a
software product in development and calculate the standard deviation (SD) for risk assessment. PERT
relates the estimated lowest possible size, the most likely size, and the highest possible size of each
component. PERT is used to direct improvements to project management and software coding in order
to produce more efficient software
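The standard PERT three-point formulas behind these estimates can be sketched in a few lines of Python (the sizes in the example are invented for illustration):

```python
# PERT three-point estimate: expected size E = (O + 4M + P) / 6 and
# standard deviation SD = (P - O) / 6, where O = optimistic (lowest),
# M = most likely, and P = pessimistic (highest) size.

def pert_estimate(optimistic: float, most_likely: float, pessimistic: float) -> tuple[float, float]:
    """Return the PERT expected size and its standard deviation."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Example: a component estimated at 2, 5, and 14 person-weeks.
e, sd = pert_estimate(2, 5, 14)
print(e, sd)  # 6.0 2.0
```

The standard deviation is what feeds the risk assessment: a wide gap between the optimistic and pessimistic estimates signals an uncertain, higher-risk component.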
DevOps
The word DevOps is a combination of Development and Operations, symbolizing
that these functions must merge and cooperate to meet business requirements.
Integrates:
• Software Development,
• Quality Assurance
• IT Operations
DBMS
Refers to a suite of software programs that maintains and provides controlled access to data
components stored in rows and columns of tables
Relational
Stores data in two-dimensional tables of TUPLES and ATTRIBUTES (rows and columns);
tables can be related through one-to-one, one-to-many, and many-to-many relationships.
Managed via DDL and DML.
Key-Value Store
key-value database, is a data storage paradigm designed for storing,
retrieving, and managing associative arrays, a data structure more commonly known today as a
dictionary or hash.
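A plain Python dict behaves exactly like this associative-array model, so a minimal in-memory sketch of the put/get/delete operations (the key names are made up for illustration) looks like:

```python
# A key-value store maps opaque keys to values, like a dictionary/hash.
store: dict[str, str] = {}

store["user:42"] = "alice"    # put
name = store.get("user:42")   # get -> "alice"
store.pop("user:42", None)    # delete
print(store.get("user:42"))   # None - the key is gone
```

Real key-value databases add persistence and distribution on top of this same interface.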
DDL – Data definition language
defines structure and schema
DML – Data manipulation language
View, add, modify, sort, and delete data in the database; in SQL these map to
commands such as SELECT, INSERT, UPDATE, and DELETE.
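The DDL/DML split can be illustrated with an in-memory SQLite database via Python's standard sqlite3 module; the table and data below are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# DDL: defines structure and schema.
cur.execute("CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT)")

# DML: views, manipulates, and uses the data itself.
cur.execute("INSERT INTO employees (id, name) VALUES (1, 'Alice')")
cur.execute("UPDATE employees SET name = 'Alicia' WHERE id = 1")
rows = cur.execute("SELECT name FROM employees").fetchall()
print(rows)  # [('Alicia',)]
cur.execute("DELETE FROM employees WHERE id = 1")
```

(DCL statements such as GRANT/REVOKE are not shown because SQLite has no user accounts; a server DBMS would accept them alongside the DDL and DML.)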
Degree of Db
Number of attributes (columns) in table
Cardinality
Number of rows (records) in a table
Tuple
Row or record
DDE – Dynamic data exchange
Enables applications to work in a client/server model by providing an
inter-process communication (IPC) mechanism
DCL – Data control language
Subset of SQL used to control access to data in a database, using GRANT and REVOKE statements
Semantic integrity
Ensures that structural and semantic rules are enforced on all data types and
logical values, so that no value can adversely affect the structure of the database
Referential integrity
All foreign keys reference existing primary keys.
Candidate Key
A subset of attributes that can be used to uniquely identify any record in a table; no two
records in the same table will ever contain the same values for all attributes composing a
candidate key. Each table may have one or more candidate keys, chosen from the column
headings. One candidate key is chosen to be the primary key, and the others are alternate keys.
Primary Key
Provides the sole tuple-level addressing mechanism within the relational model; uniquely
identifies a record in a database. A primary key cannot contain a null value and cannot change
or become null during the life of the entity. When the primary key of one relation is used as an
attribute in another relation, it is a foreign key in that relation.
Foreign Key
Represents a reference to an entry in some other table that is a primary key there. Link
between the foreign and primary keys represents the relationship between the tuples. Enforces
referential integrity
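A short sketch of how a DBMS enforces referential integrity through foreign keys, using Python's sqlite3 (note that SQLite only enforces foreign keys when the pragma is enabled); the tables and values are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
conn.execute("CREATE TABLE departments (dept_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE employees (
    emp_id  INTEGER PRIMARY KEY,
    dept_id INTEGER REFERENCES departments(dept_id))""")

conn.execute("INSERT INTO departments VALUES (10, 'Security')")
conn.execute("INSERT INTO employees VALUES (1, 10)")  # valid: dept 10 exists

try:
    conn.execute("INSERT INTO employees VALUES (2, 99)")  # 99 is no primary key
except sqlite3.IntegrityError as err:
    print("rejected:", err)  # the orphaned foreign key is refused
```

The second insert fails precisely because its foreign key would not reference an existing primary key.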
Main Components of a DBMS
- Schemas; blueprints
- tables
- views
Incorrect Summaries
When one transaction uses an aggregate function to summarize data stored
in a Db while a second transaction is making modifications to the Db, causing the summary to
include incorrect information
Dirty Reads
When one transaction reads a value from a Db that was written by another transaction
that did not commit; a Db concurrency issue
Lost Updates
When one transaction writes a value to the Db that overwrites a value needed by
transactions that have earlier precedence
Dynamic Lifetime Objects
Objects created on the fly by software in an Object Oriented
Programming environment. An object is preassembled code that is a self-contained module
ODBC
Open Database Connectivity is a database feature that allows applications to communicate
with different types of databases without having to be directly programmed for interaction with each
type. ODBC acts as a proxy.
Database contamination
Mixing data with different classification levels and/ or need-to-know
requirements and is a significant security challenge. Often, administrators will deploy a trusted front
end to add multilevel security to a legacy or insecure DBMS.
Database partitioning
Is the process of splitting a single database into multiple parts, each with a
unique and distinct security level or type of content
Polyinstantiation
Occurs when two or more rows in the same relational database table appear to have
identical primary key elements but contain different data for use at differing classification levels. It is
often used as a defense against inference attacks
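Polyinstantiation can be sketched by making the classification level part of the real primary key, so two rows with the same visible identifier coexist; the cover-story data below is invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# The "visible" key (ship) repeats, but the actual primary key includes
# the classification level, so both rows can be stored.
conn.execute("""CREATE TABLE cargo (
    ship TEXT, level TEXT, destination TEXT,
    PRIMARY KEY (ship, level))""")

# Uncleared users see the cover story; cleared users see the real row.
conn.execute("INSERT INTO cargo VALUES ('Ohio', 'UNCLASSIFIED', 'Supplies to Norfolk')")
conn.execute("INSERT INTO cargo VALUES ('Ohio', 'TOP SECRET', 'Weapons to theater')")

rows = conn.execute("SELECT level, destination FROM cargo WHERE ship='Ohio'").fetchall()
print(rows)  # both rows returned, same apparent key, different data
```

Because an uncleared user gets a plausible row rather than an error or a gap, they cannot infer that classified data exists.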
Database transactions
Four required characteristics
atomicity, consistency, isolation, and durability.
Together, these attributes are known as the ACID model, which is a critical concept in the development of database
management systems.
Atomicity
Database transactions must be atomic—that is, they must be an “all-or-nothing” affair. If
any part of the transaction fails, the entire transaction must be rolled back as if it never occurred
Consistency
All transactions must begin operating in an environment that is consistent with all of the
database’s rules (for example, all records have a unique primary key). When the transaction is
complete, the database must again be consistent with the rules, regardless of whether those rules were
violated during the processing of the transaction itself. No other transaction should ever be able to use
any inconsistent data that might be generated during the execution of another transaction.
Isolation
Principle requires that transactions operate separately from each other. If a database
receives two SQL transactions that modify the same data, one transaction must be completed in its
entirety before the other transaction is allowed to modify the same data. This prevents one transaction
from working with invalid data generated as an intermediate step by another transaction.
Durability
Database transactions must be durable. That is, once they are committed to the database,
they must be preserved. Databases ensure durability through the use of backup mechanisms, such as
transaction logs
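Atomicity can be demonstrated with Python's sqlite3: wrapping both updates of a transfer in one transaction means that a constraint failure rolls both back (the account data is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE accounts (
    name TEXT PRIMARY KEY,
    balance INTEGER CHECK (balance >= 0))""")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 0)")
conn.commit()

try:
    with conn:  # one transaction; rolls back automatically on exception
        conn.execute("UPDATE accounts SET balance = balance - 150 WHERE name = 'alice'")
        conn.execute("UPDATE accounts SET balance = balance + 150 WHERE name = 'bob'")
except sqlite3.IntegrityError:
    pass  # alice would go negative, so the whole transfer is undone

print(conn.execute("SELECT balance FROM accounts ORDER BY name").fetchall())
# [(100,), (0,)]
```

Neither account changed: the failed debit rolled back the entire transaction, the all-or-nothing behavior atomicity requires.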
Expert Systems
Expert systems seek to embody the accumulated knowledge of experts on a
particular subject and apply it in a consistent fashion to future decisions.
Every expert system has two main components: the knowledge base and the inference engine.
Expert Systems Two modes
- Forward chaining: acquires information and works toward a conclusion
- Backward chaining: backtracks to determine if a hypothesis is correct
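A toy forward-chaining loop illustrates the first mode: start from known facts in the knowledge base and keep firing inference rules until no new conclusions appear (the rules here are invented for illustration):

```python
# Forward chaining: derive new facts from known facts until a fixed point.
facts = {"smoke"}                    # what the system currently knows
rules = [
    ({"smoke"}, "fire"),             # IF smoke THEN fire
    ({"fire"}, "evacuate"),          # IF fire THEN evacuate
]

changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        # Fire the rule when all its conditions are known facts.
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))  # ['evacuate', 'fire', 'smoke']
```

Backward chaining would run the same rules in reverse: start from the hypothesis "evacuate" and work backward to check whether the supporting facts hold.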