Data Security & Encryption Flashcards

1
Q

Data Corruption

A

Refers to any unintended or undesirable alteration or distortion of data, rendering it inaccurate, unreadable, incomplete, or unusable. It occurs when the actual data content no longer matches the expected or intended data structure or format

2
Q

“Erasing” Data

A

Performing a ‘delete’ operation against a file
-Data is usually recoverable

3
Q

“Clearing” Data

A

Preparing the media for reuse
Data cannot be recovered using traditional methods

4
Q

“Purging” Data

A

A more intense form of “clearing” data for media reuse
Prepares the media for reuse in less secure environments

5
Q

Degaussing

A

Erasing data using a strong magnetic field

6
Q

“Destruction” of Data

A

Most secure means of sanitizing media
This is the final stage in the lifecycle of data
Means more than just “deleted”

Needs to be completely unrecoverable

7
Q

Record retention

A

Policy used for data that is a liability.
Sensitive data is destroyed after a defined retention period (often about one year)
This applies especially to PII
Retention periods are sometimes imposed by law
-In that case, you MUST delete the data when it ages out or face severe fines

8
Q

Tape Backup Security

A

Policy used for data that is critical to business/government operations
Offline tape backups help an organization recover from ransomware attacks, since media disconnected from the network cannot be encrypted by the attacker

9
Q

Data-Security Baseline Control

A

List of controls that an organization can apply as a baseline (the bare minimum, which typically depends on the kind of data they are responsible for)

The baseline can only be changed by:
1) Scoping
2) Tailoring

10
Q

Definition of Controls in Data Security

A

Things you can do to prevent or mitigate a loss of data

11
Q

Role of the Data Owner in Data Security

A

Responsible for collecting the PII.
Usually a member of senior management
Can delegate the maintenance tasks
But cannot delegate total responsibility

12
Q

Role of the Data Custodian in Data Security

A

Responsible for the day-to-day management of the data (for the exam the keyword here is day-to-day)

13
Q

Role of the Data Administrator in Data Security

A

In role-based access control, they are responsible for granting appropriate access to personnel.

14
Q

Role of the “Data User” in Data Security

A

Any person who accesses the data via a computing system is a “data user”

15
Q

Zero Trust Security

A

A framework in which no user/system (inside or outside the network) should be trusted by default. Represents a paradigm shift in cybersecurity that challenges the traditional approach of relying primarily on perimeter defenses to protect an organization’s network.

Trust is never assumed, and access to data is granted based on a strict need-to-know and “least privilege” basis

Even when you do grant access, you are watching the user’s every move and are ready to respond to anomalous behavior at all times.

Four main principles:
1) Secure Defaults
2) Fail Securely
3) Trust but Verify
4) Principle of Least Privilege

16
Q

Principle of “least privilege”

A

Users and systems are granted the minimum level of data access necessary to perform their specific tasks. Access rights are continuously reviewed and adjusted based on roles, responsibilities, and changes in requirements.

17
Q

General Data Protection Regulation (GDPR)

A

Places stringent requirements on data containing PII moving in and out of the European Union (EU)

Protects individuals’ rights over their personal data:

1) Users can request all the data an organization has collected on them

2) If a user asks an organization to delete all of the data collected on them, the organization must do it

3) Requires organizations to be completely transparent about their privacy policy

4) If an organization has a data breach, it must be disclosed within 72 hours

Makes it more difficult to do business with companies overseas

Compliance is enforced by very large fines (up to €20 million or 4% of annual global turnover, whichever is higher)

Two ways to reduce GDPR requirements:
1) Anonymization
2) Pseudonymization

18
Q

Anonymization in Data Security

A

Removing any actual PII from the data, so that (when done properly) the data object can no longer be identified
Can reduce the GDPR restrictions

19
Q

Pseudonymization in Data Security

A

Using aliases to represent data to reduce the exposure of PII.
Can reduce the GDPR restrictions

20
Q

“Data Processor” in GDPR

A

Person/authority/agency that processes personal data on behalf of the “data controller” (another GDPR term)

21
Q

“Data Controller” in GDPR

A

The person or entity that determines the purposes and means of processing the data

22
Q

Four levels of data classification

A

Class 0: Public; no damage would occur if it gets out

Class 1: Some damage would occur if it gets out. This data gets the basic level of protection

Class 2: Serious damage would occur if it gets out

Class 3: The greatest amount of damage would occur if it gets out

23
Q

Class 3 data in Data Security

A

Greatest amount of damage would occur if it gets out

Government Side: “Top Secret”

Civilian Side:
“Confidential/Proprietary”

24
Q

Class 2 Data in Data Security

A

Serious Damage would Occur if it gets out

Government Side: “Secret”

Civilian Side:
“Private”

25
Q

Class 1 Data in Data Security

A

Some Damage would Occur if it gets out

Government Side: “Confidential” (CUI)

Civilian Side:
“Sensitive”

26
Q

Class 0 Data in Data Security

A

No Damage would Occur if it gets out

Government Side: “Unclassified”

Civilian Side:
“Public”

27
Q

Personally Identifiable Information (PII)

A

Any information that can be used to identify/locate a “data object”

28
Q

A “data object” in data security

A

Any person who can be identified by their PII (GDPR calls this person the “data subject”)

29
Q

The Data Lifecycle

A

1) Create
2) Store
3) Use
4) Share
5) Archive
6) Destroy

Mnemonic Device:
Cyber Security Unifies Software And Data

30
Q

Six Data Related Roles

A
  1. Data Owner
  2. Data Custodian
  3. Data Administrator
  4. Data User
  5. Business/Mission Owners
  6. Asset Owners
31
Q

SQL Injection (SQLi)

A

Malicious SQL code is injected into an application’s input fields, so that the attacker can send queries to the database. Attackers can modify, delete, or steal data, and in severe cases, gain control over the entire database.
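A minimal Python sketch of the difference between string-built and parameterized queries, using the standard sqlite3 module and a made-up users table:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "nobody' OR '1'='1"   # attacker-controlled value

# Vulnerable: the input is concatenated into the SQL string, so the quote
# characters rewrite the query itself and every row comes back.
vulnerable = "SELECT * FROM users WHERE name = '%s'" % user_input
print(conn.execute(vulnerable).fetchall())           # returns all rows

# Safer: a parameterized query treats the input purely as data.
safe = "SELECT * FROM users WHERE name = ?"
print(conn.execute(safe, (user_input,)).fetchall())  # returns no rows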

32
Q

Cross-Site Scripting (XSS)

A

Injecting malicious scripts into a website, which can then be executed by other users. These scripts can potentially steal authentication tokens or credentials, leading to unauthorized access to the database.

33
Q

Cross-Site Request Forgery (CSRF)

A

CSRF attacks trick authenticated users into unknowingly executing unwanted actions on a website they are logged into. This can lead to unauthorized changes in the database or other malicious actions.

34
Q

Brute Force Attacks

A

Attackers attempt to gain unauthorized access to the RDBMS by repeatedly trying different username and password combinations until they find the correct credentials. Brute force attacks can be mitigated by implementing account lockout policies and using strong, unique passwords.

35
Q

Credential Sniffing

A

Attackers use various techniques to capture plaintext usernames and passwords as they traverse the network. Once obtained, these credentials can be used to gain unauthorized access to the RDBMS.

36
Q

Denial of Service (DoS) and Distributed Denial of Service (DDoS)

A

DoS and DDoS attacks overwhelm the RDBMS with a flood of traffic, causing it to become unavailable to legitimate users. This disrupts database access and can lead to data unavailability during the attack.

37
Q

Zero-Day Exploits

A

Exploits targeting vulnerabilities unknown to the vendor (zero-days) can be used to compromise the RDBMS. Attackers may gain unauthorized access, escalate privileges, or exfiltrate data using these unpatched vulnerabilities.

38
Q

Time of Check (TOC) attack

A

In a TOC attack, an attacker exploits a timing window between the check (verification of permissions, privileges, etc.) and the actual use of the resource or action. The attacker gains unauthorized access by manipulating the conditions after the check is performed but before the action is executed.

For example, in a file access scenario, the application checks if a user has permission to read a file at a specific location. However, before the actual reading occurs, an attacker could replace or modify the file, changing its contents.
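A rough Python sketch of that race window (the file path is hypothetical): the separate check leaves a gap between check and use, while attempting the operation directly and handling failure does not.

import os

path = "/tmp/report.txt"   # hypothetical path

# Check-then-use: the file can be swapped (e.g. replaced with a symlink to a
# sensitive file) between the access() check and the open() call.
if os.access(path, os.R_OK):        # time of check
    with open(path) as f:           # time of use -- race window here
        data = f.read()

# Safer pattern: skip the separate check, just attempt the operation and
# handle failure, so there is no window to exploit.
try:
    with open(path) as f:
        data = f.read()
except OSError as exc:
    print("cannot read file:", exc)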

39
Q

Time of Use (TOU) Attack

A

A TOU attack involves an attacker modifying or manipulating a resource after a check is performed, but before the resource is used. The attacker exploits the delay between the time of the check and the time the resource is used to maliciously alter the resource or conditions, potentially leading to unauthorized actions.

For instance, consider a financial transaction system where the balance is checked before a transfer. An attacker could modify the balance after the check is performed but before the funds are transferred, allowing them to manipulate the amount transferred.

40
Q

Two Modes of Data Creation

A

1) Data created by users (i.e., deliberately entering/uploading information)
2) Data created by the system (i.e., the system logging/monitoring user behavior)

41
Q

Data Classification

A

(technically part of the data lifecycle that comes right before data storage)
Data needs to be given a classification as soon as possible to ensure that it is handled and stored properly

42
Q

Data Archival

A

Penultimate step in the data lifecycle
This is often required to comply with “data retention” laws

43
Q

Defensible Data Destruction

A

Destroying Data in a way that complies with standards/regulations

44
Q

Scoping a data-security baseline

A

Removal of baseline recommendations that do not apply

45
Q

Three primary types of security controls

A

1) Preventative
2) Detective
3) Corrective

46
Q

Preventive Security Controls

A

Stop or deter an attack before it happens

47
Q

Detective Security Controls

A

Help you identify, as quickly as possible, when the Confidentiality, Integrity, or Availability of your data has been compromised

48
Q

Corrective Security Controls

A

Help you mitigate the effects of an attack

49
Q

International Association of Privacy Professionals (IAPP)

A

Global organization dedicated to supporting professionals in the field of privacy and data protection. It provides resources, education, certification, networking opportunities, and advocacy for individuals and organizations involved in managing privacy risks and compliance with privacy laws and regulations.

50
Q

Symmetric Cryptography v. Asymmetric Cryptography

A

Symmetric Cryptography:
- Uses a single secret key for both encryption and decryption.
- Efficient and faster for processing large amounts of data.
- Key management is critical, and secure distribution of the key is essential.
- If the key is compromised, all encrypted data is at risk of decryption.
- Primarily used for encrypting bulk data like files and securing network communication.
- Speed and efficiency make it suitable for high-volume data encryption.

Asymmetric Cryptography:
- Uses a pair of keys: a public key for encryption and a private key for decryption.
- Slower and computationally more intensive compared to symmetric cryptography.
- Public key can be openly shared, but the private key must remain secret.
- Even with the public key, it is computationally infeasible to derive the private key.
- Key exchange, digital signatures, and secure communication are primary applications.
- Security relies on the complexity of mathematical problems and the secrecy of the private key.

51
Q

Cryptography schemes that can stand up to quantum-driven code breakers

A

Symmetric (shared key) cryptography schemes

The main exception is lattice-based cryptography (“Lattice”), an asymmetric scheme believed to resist attacks by quantum computers

52
Q

Keys in Cryptography

A

Just a piece of information that is used to control the transformation of plaintext (original, readable data) into ciphertext (encrypted, unintelligible data) during encryption, and vice versa during decryption. Keys are essential to ensuring the security and confidentiality of the data.

53
Q

Keys in Symmetric Cryptography

A
  • The key in symmetric cryptography serves both for encryption and decryption of data.
  • It controls the transformation of plaintext to ciphertext and vice versa.
  • The same secret key is used by both parties involved in the communication.
  • The security of the system heavily relies on keeping the key secret from unauthorized entities.
54
Q

Keys in Asymmetric cryptography

A
  • In asymmetric cryptography, keys have two distinct purposes:
    Public Key: Used for encryption, allowing anyone to encrypt data intended for the key’s owner.
    Private Key: Used for decryption, allowing only the key’s owner to decrypt data encrypted with their public key.
  • It enables secure key exchange and secure communication between parties without sharing the private key.
  • Public keys are shared openly, and private keys are kept secret by their respective owners.
  • The security of the system relies on the computational difficulty of deriving the private key from the public key.
55
Q

Results of Grover’s Algorithm

A

Demonstrates that a quantum computer can effectively halve the strength of a symmetric key: an n-bit key provides only about n/2 bits of security against a quantum attacker (e.g., a 128-bit key offers roughly 64-bit security), making brute-force attacks significantly easier

56
Q

Key Handling in Symmetric Cryptography

A
  • Key distribution is a significant challenge. Both parties need the same secret key.
  • Secure methods are required to distribute and manage the secret key securely.
  • Any compromise or unauthorized access to the key compromises the security of the entire communication.
57
Q

Key Handling in Asymmetric Cryptography

A
  • Public keys can be openly shared or published in directories.
  • Private keys must be kept confidential and securely stored by their respective owners.
  • There is no need for secure key distribution since the public key can be openly shared.
  • The private key remains with the owner and is used for decryption and digital signatures, ensuring security and integrity.
58
Q

Factors in the popularity of Lattice-based cryptography

A

1) Quantum-Resistant Properties:
Lattice-based cryptography relies on problems that are believed to remain hard even for quantum computers. This makes it a strong candidate for post-quantum cryptography, providing a potential solution to security concerns posed by quantum computing.

2) Versatility:
Lattice-based cryptography provides a versatile framework for constructing a wide range of cryptographic primitives, including public-key encryption, digital signatures, key exchange, and more. This versatility makes it applicable to various use cases and cryptographic applications.

3) Mathematical Foundation:
Lattice theory is a well-established and rich area of mathematics, providing a solid theoretical foundation for lattice-based cryptography. Researchers have been able to develop various cryptographic algorithms and protocols based on the hardness of lattice problems.

4) Efficiency and Performance:
Over time, researchers have improved the efficiency and practicality of lattice-based cryptographic schemes. Efforts are ongoing to optimize these schemes, making them more suitable for real-world applications in terms of speed, key sizes, and computational resources.

5) NIST Standardization Process:
The National Institute of Standards and Technology (NIST) initiated a public competition to standardize post-quantum cryptographic algorithms, including lattice-based schemes. This competition has further increased the visibility and scrutiny of lattice-based cryptography within the cryptography community.

6) Security Analysis:
Lattice-based schemes have been extensively studied, and their security is based on well-defined mathematical problems in lattice theory. The security analysis provides confidence in their resilience against both classical and potential future quantum attacks.

59
Q

A “code” in cryptography

A

A system where words, phrases, or entire sentences are substituted with arbitrary symbols, typically for the purpose of secret communication. The substitutions are predetermined, agreed upon by both the sender and receiver, and recorded in a “codebook.” In a code, the meaning of each symbol or combination of symbols is predefined in the codebook. Codes can be as simple as replacing words with numbers or as involved as using complex symbol combinations to represent words or phrases.

60
Q

A “cipher” in cryptography

A

A cipher is a systematic technique or algorithm used to encrypt or encode messages. It involves the transformation of the original message (plaintext) into a form that is unintelligible (ciphertext) using a secret key. The key is critical to the encryption and decryption processes. Unlike codes, ciphers use mathematical transformations and operations to scramble the data.

61
Q

Codes v. Ciphers

A

1) Representation: Codes use predefined substitutions for words or phrases, while ciphers use algorithms and mathematical transformations to scramble data.

2) Symbol Mapping: In a code, symbols directly correspond to words or phrases based on a predetermined codebook. In a cipher, the relationship between symbols (ciphertext) and the original message (plaintext) is determined by the encryption algorithm and a secret key.

3) Complexity: Codes are relatively simple and may involve direct word-to-symbol substitutions. Ciphers can be more complex, involving various cryptographic techniques such as substitution, transposition, and more.

4) Key Dependency: Ciphers depend on a secret key for encryption and decryption, while codes rely on a pre-established codebook.

62
Q

Substitution Ciphers

A

1) Definition: Substitution ciphers are a type of encryption algorithm that replaces each unit of plaintext (e.g., letters or groups of letters) with another unit or symbol based on a predetermined substitution rule.

2) Operation: Substitution ciphers perform a one-to-one mapping of characters from the plaintext to ciphertext based on a substitution table, which can be as simple as a Caesar cipher (shifting the alphabet by a fixed number; see the sketch below) or the Atbash cipher (reversing the alphabet), or considerably more complex.

3) Characteristics:
- Generally simple and easy to implement.
- May suffer from vulnerabilities like frequency analysis, especially in monoalphabetic substitution ciphers.
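A minimal Python sketch of the Caesar cipher referenced in point 2, shifting each letter by a fixed amount:

import string

def caesar(text: str, shift: int) -> str:
    # Monoalphabetic substitution: every letter is shifted by the same amount.
    alphabet = string.ascii_lowercase
    table = str.maketrans(alphabet, alphabet[shift:] + alphabet[:shift])
    return text.lower().translate(table)

ciphertext = caesar("attack at dawn", 3)   # 'dwwdfn dw gdzq'
plaintext = caesar(ciphertext, -3)         # shifting back decrypts
print(ciphertext, "->", plaintext)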

63
Q

Block Ciphers

A

1) Definition: Block ciphers are symmetric encryption algorithms that encrypt fixed-size blocks (typically 64 or 128 bits) of plaintext at a time. The blocks are encrypted independently and combined to form the final ciphertext.

2) Operation: Block ciphers use a fixed-size key to perform encryption and decryption on individual blocks of plaintext, transforming them into ciphertext and vice versa using a reversible algorithm. Common block cipher algorithms include AES (Advanced Encryption Standard) and DES (Data Encryption Standard).

3) Characteristics:
- Operate on fixed-size blocks of data.
- High security due to the complexity of the algorithm and key size.
- Often used in modes of operation (e.g., ECB, CBC, GCM) to handle multiple blocks and provide additional security.
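A short sketch of AES in CBC mode. It assumes the third-party Python cryptography package is installed (pip install cryptography); the key, IV, and message are made up for illustration.

import os
# Third-party package: pip install cryptography
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives import padding

key = os.urandom(32)    # 256-bit AES key
iv = os.urandom(16)     # one 128-bit block, fresh for every message

# Pad the plaintext up to a multiple of the 128-bit block size (PKCS#7).
padder = padding.PKCS7(128).padder()
padded = padder.update(b"classified records") + padder.finalize()

encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ciphertext = encryptor.update(padded) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
unpadder = padding.PKCS7(128).unpadder()
recovered = unpadder.update(decryptor.update(ciphertext) + decryptor.finalize()) + unpadder.finalize()
assert recovered == b"classified records"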

64
Q

Stream Ciphers

A

1) Definition: Stream ciphers are symmetric encryption algorithms that encrypt and decrypt data one bit or one byte at a time. They generate a stream of pseudorandom bits (keystream) based on a key and use bitwise operations (like XOR) to combine the keystream with the plaintext to produce ciphertext.

2) Operation: Stream ciphers generate a keystream that is combined with the plaintext in a bitwise manner. The key, combined with a nonce or initialization vector, is used to generate this keystream.

3) Characteristics:
- Operate on a continuous stream of data, making them suitable for real-time communication.
- Typically faster than block ciphers for encrypting streams of data.
- Vulnerable to key reuse, and care must be taken to ensure unique key-IV pairs.
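A toy Python illustration of the keystream-XOR structure. The keystream generator here is an ad-hoc hash construction for demonstration only, not a vetted stream cipher such as ChaCha20.

import hashlib, os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: hash key + nonce + counter blocks (illustration only).
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

key, nonce = os.urandom(32), os.urandom(12)
plaintext = b"stream ciphers encrypt data one byte at a time"
ks = keystream(key, nonce, len(plaintext))
ciphertext = bytes(p ^ k for p, k in zip(plaintext, ks))
# XORing with the same keystream again recovers the plaintext.
assert bytes(c ^ k for c, k in zip(ciphertext, ks)) == plaintext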

65
Q

Substitution Ciphers v. Block Ciphers v. Stream Ciphers

A
  • Substitution ciphers substitute units of plaintext with other units based on a fixed rule, while both block and stream ciphers use reversible algorithms for encryption and decryption.
  • Block ciphers process fixed-size blocks of data, whereas stream ciphers process data bit by bit or byte by byte in a continuous stream.
  • Stream ciphers are typically faster for encrypting streams of data and are often used in real-time communication scenarios, while block ciphers provide stronger security for fixed-size blocks of data.
  • Block ciphers are generally more secure due to their structure and resistance to frequency analysis, making them suitable for a wide range of applications, including securing stored data.
  • Substitution ciphers are usually simpler but less secure, making them less suitable for modern cryptography compared to block and stream ciphers.
66
Q

Transposition in Cryptography

A

Transposition is a technique used to rearrange the characters or elements of plaintext to create ciphertext. Unlike substitution, which replaces each character with another according to a specific rule, transposition involves rearranging the positions of characters without changing their identities.

67
Q

Transposition Ciphers

A

1) Operation: Transposition ciphers rearrange the positions or order of characters in the plaintext to create the ciphertext, but they do not change the actual characters themselves.
2) Encryption Process: The plaintext characters are rearranged based on a specific algorithm or pattern, often following a matrix, rail fence, or route-based arrangement. The characters are then read off in a certain order to produce the ciphertext.
3) Key Usage: A transposition cipher may use a key to determine the specific rearrangement pattern or the rules for how the characters are shuffled.
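A minimal Python sketch of a columnar transposition (key and message are arbitrary examples). Note that the characters never change -- only their positions do.

def columnar_encrypt(plaintext: str, key: str) -> str:
    # Write the message in rows under the key, then read the columns
    # in the alphabetical order of the key letters.
    cols = len(key)
    padded = plaintext.ljust(-(-len(plaintext) // cols) * cols, "_")
    rows = [padded[i:i + cols] for i in range(0, len(padded), cols)]
    order = sorted(range(cols), key=lambda i: key[i])
    return "".join("".join(row[i] for row in rows) for i in order)

print(columnar_encrypt("meetmeatmidnight", "ZEBRA"))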

68
Q

Initialization Vector (IV)

A

A random or non-predictable value used in cryptography to initialize certain cryptographic algorithms, particularly in block ciphers operating in modes that require an IV. The primary purpose of an IV is to ensure that even if the same plaintext is encrypted multiple times, the resulting ciphertext will be different each time, enhancing security.

69
Q

One Time Pad (OTP) in cryptography

A

An encryption technique and symmetric key algorithm in cryptography that provides perfect secrecy when used correctly. It’s an unbreakable encryption scheme, assuming certain conditions are met, and is characterized by the use of a random, secret key that is at least as long as the plaintext being encrypted.
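A minimal Python sketch: with a truly random key that is as long as the message and used only once, encryption and decryption are both a simple XOR.

import os

message = b"attack at dawn"
key = os.urandom(len(message))   # truly random, same length as the message

ciphertext = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == message
# The scheme is only unbreakable if the key is random, kept secret,
# at least as long as the message, and never reused.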

70
Q

Minimum length of a cryptographic key

A

128 bits (for modern symmetric keys); anything shorter can be brute-forced relatively easily with modern hardware

71
Q

Zero Knowledge Proof in Cryptography

A

A cryptographic concept and protocol that allows one party (the prover) to prove to another party (the verifier) that they possess certain knowledge or information without revealing the actual knowledge itself. In other words, it allows for the verification of a statement without revealing the details that make the statement true.

The core idea of a zero-knowledge proof is to demonstrate the validity of a claim or statement without revealing any unnecessary information, preserving privacy and confidentiality while ensuring trust and integrity in the interaction.

Applications:
1) Cryptography and Security:
- Authentication: Proving knowledge of a password without revealing the password itself.
- Electronic cash systems: Demonstrating the validity of a transaction without revealing the details.
- Digital signatures: Proving knowledge of a private key without revealing the key itself.
2) Blockchain and Cryptocurrencies:
- In blockchain, ZKPs can be used to prove ownership of assets or validate transactions without revealing specific details.
3) Privacy-Preserving Technologies:
- Authentication protocols: Zero-knowledge proofs are used to authenticate without revealing identities.
- Data sharing: Enabling the sharing of data for analysis without revealing the actual data.

72
Q

Work Function in Cryptography

A

A measure of the computational effort (cost and/or time) required to defeat a cryptosystem, typically by brute force. A system is considered adequately protected when its work function exceeds the value of the protected data for as long as that data must remain protected. The same idea of “required effort” also appears in proof-of-work mechanisms and key derivation functions.

*This concept specifically applies to brute-force attacks

73
Q

Hash Function in Cryptography

A

A mathematical algorithm that takes an input (or ‘message’) and transforms it into a fixed-length string of characters, typically a sequence of digits and letters. This fixed-length output is called the “hash value,” “hash code,” or simply “hash.” Hash functions play a fundamental role in cryptography and have various applications in ensuring data integrity, digital signatures, password hashing, and more.
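A quick illustration with Python's standard hashlib module: the output length is fixed regardless of input size, and a one-character change produces a completely different digest.

import hashlib

h1 = hashlib.sha256(b"data security").hexdigest()
h2 = hashlib.sha256(b"data securitY").hexdigest()   # one character changed

print(len(h1), h1)   # always 64 hex characters (256 bits)
print(len(h2), h2)   # entirely different value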

74
Q

Five Properties of a good Hash Function

A

1) They must allow an input of any length
2) They must provide a fixed-length output
3) They must make it relatively easy to compute the hash for any input
4) They must provide one-way functionality
5) They must be collision-free

75
Q

Collisions in Cryptography

A

A collision occurs when two distinct inputs (messages or pieces of data) produce the same hash value (output) when processed through a hash function. In other words, a collision happens when two different inputs result in identical hash values. Avoiding collisions is essential for the reliability and security of various cryptographic applications.

76
Q

Cryptographic Salts

A

A random value that is generated and used to enhance the security of cryptographic operations, particularly in password hashing and key derivation functions (KDFs). The primary purpose of a salt is to introduce randomness and uniqueness into the hashing process, making it more difficult for attackers to use precomputed tables (rainbow tables) or dictionary attacks.
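A minimal Python sketch of salted password hashing with the standard library's PBKDF2 (the iteration count here is an arbitrary illustrative choice).

import hashlib, hmac, os

password = b"correct horse battery staple"
salt = os.urandom(16)   # unique random salt per user, stored alongside the hash

# PBKDF2 applies the hash many times; the salt defeats rainbow tables.
stored = hashlib.pbkdf2_hmac("sha256", password, salt, 600_000)

def verify(candidate: bytes, salt: bytes, stored: bytes) -> bool:
    attempt = hashlib.pbkdf2_hmac("sha256", candidate, salt, 600_000)
    return hmac.compare_digest(attempt, stored)   # constant-time comparison

print(verify(password, salt, stored))        # True
print(verify(b"wrong guess", salt, stored))  # False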

77
Q

RSA public key cryptographic system

A

1) Mathematical Foundation:
- Based on properties of large prime numbers and modular arithmetic.
- Difficulty of factoring the product of large primes (RSA assumption).

2) Key Generation:
- Select two large prime numbers.
- Generate public and private keys based on modular arithmetic operations involving the primes.

3) Key Size and Efficiency:
- Typically requires larger key sizes for equivalent security.
- Larger key sizes may result in slower computations and increased memory requirements.

4) Security:
- Security based on the difficulty of factoring large numbers (factorization problem).
- Well-established security relying on the factorization assumption.
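As a toy illustration of the key-generation and encryption steps above, here is “textbook” RSA in Python with tiny primes -- for intuition only, never for real use (real RSA needs large primes and padding).

p, q = 61, 53              # two (unrealistically small) primes
n = p * q                  # 3233, the public modulus
phi = (p - 1) * (q - 1)    # 3120
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e (2753)

m = 65                     # message encoded as a number smaller than n
c = pow(m, e, n)           # encrypt with the public key (e, n)
assert pow(c, d, n) == m   # decrypt with the private key (d, n)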

78
Q

El Gamal public key cryptographic system

A

1) Mathematical Foundation:
- Based on the discrete logarithm problem in a finite field.
- Involves finding the exponent (logarithm) in a finite field.

2) Key Generation:
- Select a finite field.
- Generate a public and private key pair based on the discrete logarithm problem.

3) Key Size and Efficiency:
- Key sizes are generally larger than ECC but smaller than RSA for equivalent security.
- Encryption and decryption can be computationally intensive compared to RSA.

4) Security:
- Security based on the difficulty of solving the discrete logarithm problem.
- Well-established security relying on the discrete logarithm assumption.

79
Q

Elliptic Curve Cryptography (ECC)

A

1) Mathematical Foundation:
- Based on the algebraic properties of elliptic curves over finite fields.
- Difficulty of solving the elliptic curve discrete logarithm problem.

2) Key Generation:
- Select an elliptic curve.
- Generate a public and private key pair based on the elliptic curve discrete logarithm problem.

3) Key Size and Efficiency:
- Provides strong security with smaller key sizes compared to RSA and ElGamal.
- Smaller key sizes lead to faster computations and more efficient resource utilization.

4) Security:
- Security based on the difficulty of solving the elliptic curve discrete logarithm problem.
- Considered very secure and offers strong security with smaller key sizes.

80
Q

Digital Signatures in Cryptography

A

Used to verify the authenticity, integrity, and non-repudiation of digital messages or documents. A digital signature is a unique, fixed-size value produced by hashing the message and encrypting that hash with the signer’s private key. When the signature is verified with the corresponding public key and compared against a fresh hash of the message, it confirms the signer’s identity and that the message has not been altered.
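A short sketch of signing and verifying, assuming the third-party Python cryptography package (pip install cryptography); the message is arbitrary.

# Third-party package: pip install cryptography
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.exceptions import InvalidSignature

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()
message = b"quarterly report v3"

# Sign: hash the message and produce a signature with the private key.
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Verify: anyone holding the public key can check authenticity and integrity.
try:
    public_key.verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
    print("signature valid")
except InvalidSignature:
    print("signature invalid")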

81
Q

Digital Signature Standard (DSS)

A

Federal Information Processing Standard (FIPS) that specifies the use of digital signatures for authentication and integrity verification of digital messages or documents. DSS was developed by the National Institute of Standards and Technology (NIST) and adopted by the U.S. government for secure communications and transactions.

82
Q

Public Key Infrastructure (PKI)

A

A comprehensive framework that manages digital keys (public and private keys) and digital certificates to provide secure communication, authentication, data integrity, and confidentiality in various online applications and systems. PKI establishes a trust model for digital transactions, enabling entities to securely exchange information over potentially unsecured networks like the internet.

83
Q

Key Derivation Functions (KDFs)

A
  • Cryptographic algorithms designed to derive one or more secret keys from a single, often low-entropy input, such as a password, passphrase, or other initial keying material. The purpose of KDFs is to enhance the security and strength of these derived keys by incorporating additional parameters and complexity.
  • KDFs significantly strengthen the initial keying material (e.g., passwords) to produce stronger and more secure keys suitable for cryptographic applications.
  • KDFs are crucial in deriving encryption keys from passwords in a secure manner. They make it computationally intensive and time-consuming to derive the original key, significantly increasing the difficulty of password cracking.
  • KDFs often take additional parameters like salt (a random value) and iteration count to enhance the security of the key derivation process.
84
Q

One Time Pad v. Initialization Vector

A

One-Time Pad (OTP):
- OTP is a specific encryption scheme that uses a truly random and secret key that is at least as long as the plaintext. The key is combined with the plaintext using bitwise XOR to produce the ciphertext. Each bit of the key is used exactly once.
- OTP provides perfect secrecy and is theoretically unbreakable when used correctly. The key in OTP is essentially as long as the message and is never reused.

Initialization Vector (IV):
- An IV is used in various encryption schemes, including block ciphers operating in certain modes like CBC (Cipher Block Chaining). The IV’s purpose is to introduce randomness and uniqueness into the encryption process, ensuring that even if the same plaintext is encrypted multiple times, the resulting ciphertexts are different.
- IVs are typically used in block cipher modes where blocks of plaintext are encrypted independently, and the IV is XORed with the first block of plaintext to provide the necessary randomness for encryption.

While both OTP and IVs are tools used in encryption, they serve different purposes and are not directly related. OTP ensures perfect secrecy by using a unique and random key for each encryption, while IVs are used to prevent patterns in the ciphertext when encrypting multiple blocks of data using certain block cipher modes.

85
Q

Hash

A

A hash is not a type of ciphertext; rather, it’s a fixed-size alphanumeric string generated from an input (often called a “message”) using a hash function. A hash function is a mathematical algorithm that transforms an input (of any size) into a fixed-length output, typically represented as a sequence of characters or numbers.

86
Q

Methods of Preventing Collisions

A

1) Cryptographic Hash Functions: Use cryptographic hash functions (e.g., SHA-256, SHA-3) that are designed to minimize the possibility of collisions. These functions undergo rigorous testing and analysis to ensure that it is computationally infeasible to find two different inputs that produce the same hash.

2) Sufficient Hash Length: Choose a hash function with a sufficiently long output length to reduce the probability of collisions. Longer hash lengths provide a larger output space, making it computationally harder to find collisions.

3) Salting: When using hash functions, incorporate a random value called a “salt” into the input before hashing. Salting ensures that even if the same input is hashed multiple times, the resulting hash values will be different, preventing collisions.

4) Use of Unique Identifiers: Ensure that the data being hashed has a unique identifier or piece of information that distinguishes it from other data. This helps in avoiding unintended collisions.

5) Message Authentication Codes (MACs): When dealing with authentication or integrity verification, use Message Authentication Codes (MACs) which involve a secret key. The key helps ensure that the MAC is unique for each unique input, mitigating the risk of collisions.

6) HMAC (Hash-based Message Authentication Code): Use HMAC, a specific construction for creating a MAC, which is based on a cryptographic hash function. It provides an added layer of security and helps prevent collisions when creating MACs.

7) Digital Signatures: Use digital signatures, which involve hashing the message and signing the hash with a private key. Properly implemented digital signatures are designed to prevent collisions.

8) Avoid Using Weak Hash Functions: Do not use weak or deprecated hash functions, as they might have known vulnerabilities or collision risks. Choose hash functions that are currently considered secure and recommended by cryptographic standards.

9) Regularly Update Algorithms: Stay updated with the latest developments and recommendations in cryptography. As new collision-resistant hash functions or protocols are introduced, consider transitioning to them to maintain a higher level of security.
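Building on items 5 and 6 above, a minimal Python sketch of an HMAC using the standard hmac module (key and message are made up).

import hashlib, hmac, os

key = os.urandom(32)   # shared secret key
message = b"transfer $100 to account 42"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

# The receiver recomputes the tag with the same key; a mismatch means the
# message was altered or the sender does not hold the key.
def check(key: bytes, message: bytes, tag: str) -> bool:
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(check(key, message, tag))                          # True
print(check(key, b"transfer $9999 to account 13", tag))  # False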

87
Q

Methods of Secure Key Distribution

A

1) Key Exchange Protocols: Utilize key exchange protocols like Diffie-Hellman (DH) or its variants (e.g., ECDH for elliptic curve cryptography). These protocols enable secure communication and key agreement between parties, allowing them to establish a shared secret without revealing it over the communication channel.

2) Key Encapsulation: Use key encapsulation mechanisms, such as the Key Encapsulation Mechanism (KEM), which combines asymmetric and symmetric cryptography. In this approach, a symmetric key is securely exchanged using asymmetric encryption.

3) Pre-shared Keys (PSKs): When possible, establish pre-shared symmetric keys using secure, out-of-band methods. This can involve manual configuration or physically secure distribution of keys.

4) Key Derivation Functions (KDFs): Use secure key derivation functions (KDFs) to derive encryption keys from a shared secret or a master key. KDFs ensure that the derived keys are secure and have the desired properties.

5) Key Distribution Centers (KDCs): Use a centralized entity like a Key Distribution Center (KDC) to securely distribute and manage keys. The KDC can authenticate parties and distribute session keys securely.

6) Quantum Key Distribution (QKD): For highly secure and future-proof key distribution, consider quantum key distribution, a method that leverages principles of quantum mechanics to securely exchange keys. Quantum key distribution is resistant to certain types of attacks.

7) Secure Multi-Party Computation (SMPC): Use secure multi-party computation techniques that allow parties to jointly compute cryptographic functions without revealing their private inputs. This can be used to generate keys securely.

8) Physical Delivery of Keys: In highly secure environments, use physically secure methods to deliver keys, such as using a trusted courier or a secure hardware token that physically contains the key.

9) Public Key Infrastructure (PKI): Combine symmetric and asymmetric cryptography within a PKI. Asymmetric algorithms can be used to securely distribute and exchange symmetric keys.

10) Post-Quantum Key Exchange: Considering the potential threat of quantum computing, explore post-quantum key exchange algorithms that resist attacks from quantum computers.

11) Certificate-Based Key Distribution: Utilize digital certificates and a trusted Certificate Authority (CA) to securely bind public keys to entities, facilitating secure symmetric key distribution.

88
Q

Popular Stream Cipher Algorithms

A

1) RC4 (Rivest Cipher 4): RC4 is one of the most well-known and widely used stream ciphers. It was initially developed by Ron Rivest in 1987. Despite its historical prevalence, RC4 has faced security vulnerabilities and is not recommended for use in new applications.

2) Salsa20: Salsa20 is a stream cipher designed by Daniel Bernstein. It is known for its simplicity, efficiency, and high performance across a wide range of platforms. Salsa20 has been extensively analyzed and is considered secure.

3) ChaCha20: ChaCha20 is another stream cipher designed by Daniel Bernstein, based on the Salsa20 core. It offers high security and speed, making it a popular choice for applications like TLS/SSL encryption and various security protocols.

4) Grain: Grain is a lightweight stream cipher designed for constrained environments, such as low-power devices and wireless sensors. It offers a good balance between security and efficiency.

5) Trivium: Trivium is a synchronous stream cipher that was a candidate in the eSTREAM project. It’s known for its security and high performance.

6) A5/1 and A5/2: A5/1 and A5/2 are stream ciphers used in the GSM cellular standard. A5/1 is more widely used, while A5/2 is a weaker version.

7) Rabbit: Rabbit is a high-speed stream cipher with a built-in MAC function. It’s designed for high-speed software implementations.

8) HC-128 and HC-256: HC-128 and HC-256 are stream ciphers designed for fast software-based implementations. They are part of the eSTREAM portfolio.

9) ISAAC: ISAAC (Indirection, Shift, Accumulate, Add, and Count) is a fast cryptographic random number generator that can be used as a stream cipher. It’s known for its speed and security.

89
Q

Digital Certificate

A

A digital document that verifies the authenticity of a person, organization, device, or service in the digital world. It contains information about the entity it represents and their associated public key. Digital certificates are a fundamental component of Public Key Infrastructure (PKI) and play a crucial role in securing online communications and transactions.

90
Q

Under what conditions is one time pad scheme unbreakable?

A

1) Key Length and Randomness: The key used in the OTP must be truly random, at least as long as the message being encrypted. Each bit of the key should be statistically independent and have an equal probability of being 0 or 1. Any pattern or predictability in the key compromises the security.

2) Key Usage: Each key in the OTP must be used only once and must never be reused for encrypting another message. Reusing a key significantly weakens the security and makes the encryption vulnerable to attacks.

3) Secrecy and Distribution: The keys must be kept secret and shared securely between the sender and the receiver. If an adversary gains access to the key, they can decrypt the corresponding message.

4) No Key Sharing: The same key should not be used for encrypting multiple messages. Each key should be unique to a particular message.

5) Perfect Secrecy Property: The one-time pad scheme relies on the perfect secrecy property, which ensures that given any ciphertext, there exists a key such that the ciphertext corresponds to any plaintext of the same length. This means that any plaintext can map to any ciphertext, providing perfect secrecy.

6) Message Length: The length of the message must be less than or equal to the length of the key. If the message is longer than the key, the key cannot be used for encryption.

91
Q

Frequency Analysis in Cryptanalysis

A

Frequency analysis involves studying the frequency of letters, characters, or patterns within a piece of encrypted text to gain insights into the underlying structure of the plaintext and potentially decrypt the message.

Here’s how frequency analysis is used in decrypting messages:

1) Frequency of Letters: In many languages, certain letters occur more frequently than others. For example, in English, ‘e’ is the most common letter. Cryptanalysts analyze the frequency of letters in the encrypted text.

2) Letter Patterns: Cryptanalysts look for recurring patterns of letters in the encrypted text. Common patterns could correspond to common words or combinations of letters in the plaintext.

3) Mapping Frequencies: They create a frequency distribution table, mapping the frequency of each encrypted character or pattern to its potential plaintext counterpart.

4) Comparison with Language Characteristics: They compare the frequency distribution of the encrypted text with known frequency distributions in the target language (e.g., English).

5) Guessing and Substitution: Based on the analysis, they make educated guesses and substitutions, trying to map the encrypted characters to their most likely plaintext characters based on frequency.

6) Iterative Process: This process is iterative, where analysts gradually build a partial decryption of the message. As more patterns are recognized, more substitutions can be made, and the decryption process continues.
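A tiny Python sketch of step 1, counting letter frequencies in a Caesar-shifted sample:

from collections import Counter

ciphertext = "WKH TXLFN EURZQ IRA MXPSV RYHU WKH ODCB GRJ"  # Caesar shift of 3

counts = Counter(c for c in ciphertext if c.isalpha())
print(counts.most_common(5))
# The repeated trigram 'WKH' likely maps to 'THE', and frequent letters such
# as 'H' and 'R' line up with plaintext 'E' and 'O' -- consistent with a
# shift of 3.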

92
Q

Cryptanalysis

A

The study of analyzing and breaking encryption or cryptographic systems

93
Q

Cryptographic Salt v. Initialization Vector

A

Cryptographic Salt:
1) A cryptographic salt is a random value generated and used in processes such as password hashing and key derivation functions (KDFs).
2) The purpose of a salt is to add randomness and uniqueness to the process of creating a hash or a derived key.
3) Salts are used in password hashing to prevent the same passwords from generating the same hash. This makes it difficult for attackers to use precomputed tables (rainbow tables) to crack passwords.
4) Salts are also used in key derivation to enhance security by making the derived keys unique even when using the same input and function.
5) For example, when hashing passwords, each password is combined with a unique salt before hashing, ensuring that even if two users have the same password, their hash values will be different due to the unique salts.

Initialization Vector (IV):
1) An initialization vector (IV) is a random value used in symmetric encryption algorithms.
2) The purpose of an IV is to introduce randomness and prevent patterns in encrypted data, especially when encrypting multiple pieces of data with the same key.
3) IVs are crucial in achieving semantic security, ensuring that encrypting the same plaintext multiple times with the same key does not result in the same ciphertext.
4) IVs are used in block cipher modes of operation like CBC (Cipher Block Chaining) to mix the effects of encryption and add uniqueness to each block of ciphertext.
5) For example, when encrypting data using a block cipher like AES in CBC mode, the IV is combined with the first block of plaintext before encryption to ensure that even if the same plaintext is encrypted multiple times, the resulting ciphertext will be different.

94
Q

Federal Information Processing Standard (FIPS)

A

A set of standards and guidelines established by the National Institute of Standards and Technology (NIST), which is a non-regulatory federal agency within the United States Department of Commerce. FIPS provides a framework for federal agencies and contractors to ensure the security and interoperability of computer systems, software, and data.

95
Q

The strongest of the symmetric cryptography algorithms

A

RC5, mostly because the key length can get up to 2040 bits long

It is a block cipher designed by Ron Rivest at RSA (with block sizes of 32, 64, or 128 bits)

96
Q

The weakest of the symmetric cryptography algorithms

A

DES, mostly because the key length is only 56 bits

It is a Block cipher (with block sizes of 64 bits)

97
Q

Most popular Block Ciphers

A

1) AES (Advanced Encryption Standard): AES is one of the most widely used and accepted symmetric encryption algorithms. It uses block sizes of 128 bits and supports key sizes of 128, 192, or 256 bits. AES is considered secure and efficient, making it the standard for symmetric encryption.

2) DES (Data Encryption Standard): Although aging and considered weak by modern standards due to its small 56-bit key size, DES was one of the earliest and most widely used block ciphers. It played a significant role in the history of cryptography and still has some applications.

3) Triple DES (3DES): 3DES is an extension of DES that applies the DES algorithm three times with independent keys. While considered more secure than DES, 3DES is now considered relatively slow and has been largely replaced by AES.

4) Blowfish: Blowfish is a symmetric key block cipher known for its simple and efficient design. It supports variable key sizes and is still used in various applications.

5) Twofish: Twofish is a symmetric key block cipher that was a finalist in the AES competition. It is known for its security and flexibility, supporting block sizes of 128 bits and key sizes of up to 256 bits.

6) Serpent: Serpent is another symmetric key block cipher that was a finalist in the AES competition. It is known for its strong security and is often used in applications where security is a top priority.

7) Camellia: Camellia is a symmetric key block cipher designed for efficient hardware and software implementations. It is often used in various security protocols and applications.

8) CAST-128 and CAST-256: CAST (Carlisle Adams and Stafford Tavares) is a family of symmetric key block ciphers known for their simplicity and efficiency. CAST-128 and CAST-256 are two popular variants.

9) IDEA (International Data Encryption Algorithm): IDEA is a symmetric key block cipher known for its security and efficiency. It is used in various applications, particularly in older systems.

10) KASUMI: KASUMI, also known as A5/3, is a block cipher used in the 3GPP mobile telecommunications standards. It is employed in the confidentiality and integrity protection of mobile communications.

98
Q

Most highly regarded stream ciphers

A

1) ChaCha20: ChaCha20 is considered to be highly secure and efficient. It is designed to be fast and secure across various platforms. ChaCha20 is often used in various applications, including TLS (Transport Layer Security).

2) Salsa20: Salsa20, similar to ChaCha20, is highly regarded for its security and efficiency. It’s known for its simplicity and excellent performance across a range of devices.

3) AES-CTR (Counter Mode): While AES is widely known as a block cipher, it can also be used in a streaming mode like CTR. AES-CTR with appropriate key sizes is considered secure and efficient for stream cipher applications.

4) Rabbit: Rabbit is known for its high speed and strong security. It is often used in applications where both speed and security are essential.

5) HC-128 and HC-256: HC-128 and HC-256 are stream ciphers that are part of the eSTREAM portfolio. They are designed to be fast and secure, suitable for software implementations.

99
Q

AES algorithm v. DES algorithm

A

Both are symmetric block ciphers

1) Algorithm and Design:
- AES: AES is a block cipher that operates on fixed-size blocks of data (128 bits or 16 bytes). It uses a symmetric key for encryption and decryption. AES operates on a fixed number of rounds (10, 12, or 14 rounds depending on the key size), each consisting of specific mathematical operations (e.g., substitution, permutation).
- DES: DES is also a block cipher that operates on fixed-size blocks (64 bits or 8 bytes). It uses a symmetric key and employs a Feistel network structure. DES operates with 16 rounds, each involving key mixing, substitution, and permutation operations.

2) Key Size:
- AES: AES supports key sizes of 128, 192, or 256 bits. The key size directly affects the encryption strength and resistance to brute-force attacks.
- DES: DES uses a fixed key size of 56 bits, which is relatively small compared to modern standards. This small key size has led to security concerns regarding its vulnerability to brute-force attacks.

3) Security:
- AES: AES is considered highly secure and robust against various cryptographic attacks, including brute-force attacks, when used with adequate key sizes (128 bits or higher).
- DES: DES is no longer considered secure for sensitive applications due to its small key size and vulnerability to modern computing capabilities, making brute-force attacks feasible.

4) Usage and Applications:
- AES: AES is widely used and accepted as the standard symmetric encryption algorithm in a variety of applications, including secure communications, file encryption, secure messaging, and more. It is utilized by various protocols and security standards.
- DES: DES is largely deprecated and not recommended for use in new applications due to its security vulnerabilities. It is mostly used in legacy systems and not suitable for modern security requirements.

5) Speed and Efficiency:
- AES: AES is generally faster and more efficient than DES due to its improved algorithm design and streamlined operations, making it a preferred choice for high-speed and low-latency applications.
- DES: DES is comparatively slower and less efficient, partly due to its older design and larger number of rounds.

100
Q

Most popular asymmetric algorithms

A

*In order from weakest to strongest

1) RSA - Based on primes and factorization
2) El Gamal - Based on the discrete logarithm problem
3) Elliptic Curve Cryptography - Based on elliptic curves over finite fields
4) Lattice - Based on lattice theory

101
Q

Rivest Cipher 2 (RC2)

A
  • Key Size: RC2 supports variable key sizes, ranging from 8 to 1024 bits. The key size significantly impacts the number of rounds used in the algorithm.
  • Block Size: The block size of RC2 is fixed at 64 bits.
  • Design Principle: RC2 uses a combination of rotations, modular additions, and look-up tables based on the user-supplied key and certain constants. It employs a series of modular additions and non-linear transformations.
  • Historical Context: RC2 was designed in 1987 by Ron Rivest. Initially, it was a proprietary algorithm, but its description was published later, and it became a widely used symmetric encryption algorithm.
102
Q

Rivest Cipher 4 (RC4)

A
  • Key Size: RC4 supports variable key sizes, often ranging from 40 to 2048 bits. It’s important to note that while RC4 can accept a wide range of key sizes, it is most commonly used with 128 or 256 bits.
  • Stream Cipher: RC4 is a stream cipher, meaning it generates a pseudorandom keystream that is XORed with the plaintext to produce the ciphertext.
  • Design Principle: RC4 is based on a variable-length key schedule and the generation of a pseudorandom permutation. The permutation is typically initialized based on the provided key.
  • Historical Context: RC4 was designed in 1987 by Ron Rivest. It gained widespread popularity and was extensively used in various applications, especially in SSL/TLS protocols. However, security vulnerabilities were discovered over time, leading to its deprecation in many applications.
103
Q

Rivest Cipher 5 (RC5)

A
  • Key Size: RC5 supports variable key sizes, often ranging from 0 to 2040 bits, making it highly flexible.
  • Block Size: The block size of RC5 can vary, but it is typically chosen to be 32, 64, or 128 bits.
  • Design Principle: RC5 is based on the concept of a Feistel network, where the plaintext is divided into two halves, and operations like addition and rotation are performed based on the user-supplied key and other parameters.
  • Historical Context: RC5 was designed by Ronald Rivest in 1994. It was intended to be a successor to RC4 and RC2, providing a more secure and flexible alternative
104
Q

Blowfish Algorithm

A
  • Key Size: Blowfish supports variable key sizes, typically between 32 and 448 bits. The key expansion process allows for the generation of subkeys based on the user-supplied key.
  • Block Size: The block size of Blowfish is fixed at 64 bits.
  • Design Principle: Blowfish is a Feistel network-based block cipher. It uses a series of rounds (typically 16) to perform encryption and decryption. Each round consists of operations like substitutions and permutations based on subkeys derived from the original key.
  • Security: Blowfish was designed to be a fast, secure alternative to existing block ciphers at the time of its creation.
  • Usage: Blowfish has been widely used in various applications, including secure communication protocols, encryption of files, and data storage.
105
Q

Skipjack Algorithm

A
  • Key Size: Skipjack uses a fixed key size of 80 bits. The key is divided into 32 key bits and 48 key-dependent round subkey bits.
  • Block Size: The block size of Skipjack is fixed at 64 bits.
  • Design Principle: Skipjack is designed as a symmetric key block cipher with a focus on government and military use. It was developed by the NSA and was initially intended for the Clipper chip, which was proposed to provide encryption with government-accessible (key-escrow) backdoors.
  • Security: Skipjack was originally a classified design; the algorithm was declassified and published in 1998. Its fixed 80-bit key is considered weak by modern standards.
  • Usage: Skipjack has not seen widespread adoption outside specific government applications, largely due to its originally classified nature and the controversy surrounding backdoor/key-escrow access.
106
Q

Twofish v. Threefish

A

1) Purpose and Design:
- Twofish: Twofish is a symmetric key block cipher and is a successor to the Blowfish cipher. It is designed for encryption and decryption of data. Twofish is a Feistel network-based block cipher and operates on fixed-size blocks of 128 bits. It uses a variable key size, typically 128, 192, or 256 bits.
- Threefish:
Threefish is a symmetric key block cipher that is designed specifically for encryption purposes within the Skein hash function family. It’s part of the Skein cryptographic hash function, not a standalone cipher for general encryption purposes. Threefish operates on large blocks, such as 256, 512, or 1024 bits. It is primarily designed to provide encryption in hash functions like Skein.

2) Block Size and Key Size:
- Twofish: Twofish operates on a fixed block size of 128 bits and supports key sizes of 128, 192, or 256 bits.
- Threefish: Threefish operates on variable block sizes, typically 256, 512, or 1024 bits, providing flexibility for various applications. However, it is primarily designed for larger blocks, suitable for hash function operations.

3) Use Cases:
- Twofish: Twofish is designed for general-purpose encryption and is used in various security applications where symmetric key encryption is required, such as secure communication, file encryption, etc.
- Threefish: Threefish is designed specifically for encryption within cryptographic hash functions like Skein. It is not intended for standalone use in general-purpose encryption.

4) Relation to Skein:
- Twofish: Twofish is a standalone block cipher and is not directly related to the Skein hash function.
- Threefish: Threefish is a component of the Skein cryptographic hash function family. It provides the encryption component within Skein.

107
Q

Twofish Algorithm

A
  • Key Size: Twofish supports three possible key sizes: 128, 192, and 256 bits. The key size affects the number of encryption rounds applied during the encryption process. Despite supporting different key sizes, Twofish ensures security and efficiency across the key spectrum.
  • Block Size: Twofish operates on a fixed block size of 128 bits. The block size remains constant throughout the encryption and decryption processes.
  • Design Principle: Twofish is based on a Feistel network structure. A Feistel network divides the block into two halves and processes each half through multiple rounds, incorporating various mathematical operations such as substitutions and permutations. The design employs a mix of substitution-permutation network and key whitening techniques.
  • Security: Twofish has been extensively analyzed for its security properties. It is designed to resist known cryptanalytic attacks, including linear and differential cryptanalysis. Additionally, the design’s confusion and diffusion properties, achieved through the S-boxes and pseudo-Hadamard transform, contribute to its security.
  • Usage: Twofish has been used in various applications requiring symmetric key encryption, including secure communication protocols, file and data encryption, secure messaging, and more. Its flexibility in supporting different key sizes makes it adaptable to a wide range of security requirements.
108
Q

Threefish Algorithm

A
  • Key Size: Threefish supports variable key sizes, providing flexibility for different security requirements. Common key sizes include 256, 512, and 1024 bits. The key size influences the number of encryption rounds and the algorithm’s strength.
  • Block Size: Threefish operates on variable block sizes, typically 256, 512, or 1024 bits. It allows for flexibility in block size to suit specific applications.
  • Design Principle: Threefish is a tweakable block cipher. A tweak is an additional input that alters the encryption process, providing further customization and security. The design is an ARX construction, built entirely from modular addition, rotation, and XOR operations applied in repeated mix-and-permute rounds.
  • Security: Threefish, as part of the Skein hash function family, is designed with a focus on security. The use of a tweakable design and the incorporation of cryptographic principles make Threefish resistant to various attacks. Its security is closely tied to the overall security of the Skein hash function.
  • Usage: Threefish is primarily used as the encryption component within the Skein hash function. The Skein hash function, in turn, finds applications in secure hashing, digital signatures, password hashing, and other cryptographic protocols. The encryption provided by Threefish is crucial for ensuring data integrity and security within the Skein hash function.
109
Q

Electronic Codebook Mode (ECB) Operational Mode

A

One of the five basic block cipher modes of operation (originally defined for the Data Encryption Standard)

  • Description: Each block of plaintext is independently encrypted using the same encryption key.
  • Characteristics:
    • Suitable for parallel processing since each block is encrypted independently.
    • Vulnerable to pattern analysis and identical plaintext blocks result in identical ciphertext blocks.
  • Usage:
    • Not recommended for securing sensitive data due to vulnerability to pattern-based attacks.
    • May be used when parallelization is necessary and each block is independent.
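A small sketch of the pattern-leak weakness noted above, assuming the third-party Python package “cryptography” is installed; AES stands in here for any block cipher.

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)
plaintext = b"SIXTEEN BYTE BLK" * 2          # two identical 128-bit blocks

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# Identical plaintext blocks produce identical ciphertext blocks, leaking structure
assert ciphertext[:16] == ciphertext[16:32]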
110
Q

Cipher Block Chaining (CBC) Operational Mode

A

One of the five basic block cipher modes of operation (originally defined for the Data Encryption Standard)

  • Description: Each plaintext block is XORed with the previous ciphertext block before encryption.
  • Characteristics:
    • Provides more security compared to ECB because each ciphertext block depends on all preceding plaintext blocks.
    • Initialization Vector (IV) is used to modify the first block.
  • Usage:
    • Commonly used and recommended for securing sensitive data.
    • Suitable for transmitting data where sequential processing is acceptable.
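A hand-rolled sketch of the chaining step (XOR each plaintext block with the previous ciphertext block before encrypting), using raw AES from the “cryptography” package as the block primitive; illustrative only, real code should use the library’s built-in CBC mode with proper padding.

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> bytes:
    # Single-block ECB here is just the bare block cipher E_k applied once per block
    raw = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    ciphertext, previous = b"", iv
    for i in range(0, len(plaintext), 16):
        block = raw.update(xor(plaintext[i:i + 16], previous))   # E_k(P_i XOR C_{i-1})
        ciphertext, previous = ciphertext + block, block
    return ciphertext

key, iv = os.urandom(16), os.urandom(16)
ct = cbc_encrypt(key, iv, b"SIXTEEN BYTE BLK" * 2)
assert ct[:16] != ct[16:32]    # identical plaintext blocks no longer encrypt identically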
111
Q

Cipher Feedback (CFB) Operational Mode

A

One of the five basic block cipher modes of operation (originally defined for the Data Encryption Standard)

  • Description: Previous ciphertext block is encrypted and XORed with the current plaintext block to produce the ciphertext.
  • Characteristics:
    • Allows encryption of individual bits or bytes, offering streaming encryption.
    • Resynchronization can be achieved if bits are lost or corrupted during transmission.
  • Usage:
    • Suitable for applications that require real-time encryption, such as voice communication.
112
Q

Output Feedback (OFB) Operational Mode

A

One of the five basic block cipher modes of operation (originally defined for the Data Encryption Standard)

  • Description: A key stream is generated by encrypting an initialization vector (IV) with the encryption key. The key stream is then XORed with the plaintext to generate ciphertext.
  • Characteristics:
    • Provides the ability to encrypt individual bits or bytes, similar to CFB.
    • Decryption uses the same encryption process, simplifying the implementation.
  • Usage:
    • Suitable for real-time encryption and situations where error propagation should be avoided.
113
Q

Counter (CTR) Operational Mode

A

One of the five basic block cipher modes of operation (originally defined for the Data Encryption Standard)

  • Description: Uses a counter (unique nonce combined with a counter value) to generate a key stream, which is then XORed with the plaintext to produce the ciphertext.
  • Characteristics:
    • Allows for parallel encryption and decryption.
    • Decryption process is similar to encryption.
  • Usage:
    • Suitable for parallelizable encryption and real-time encryption requirements.
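A sketch of the counter-mode idea described above: encrypt nonce||counter blocks to produce a keystream, then XOR it with the data; built on raw AES from the “cryptography” package, purely for illustration.

import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def ctr_crypt(key: bytes, nonce8: bytes, data: bytes) -> bytes:
    raw = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    out = bytearray()
    for offset in range(0, len(data), 16):
        counter_block = nonce8 + (offset // 16).to_bytes(8, "big")   # nonce || counter
        keystream = raw.update(counter_block)                        # E_k(nonce || counter)
        out.extend(d ^ k for d, k in zip(data[offset:offset + 16], keystream))
    return bytes(out)

key, nonce = os.urandom(16), os.urandom(8)
message = b"counter mode works on data of any length"
ct = ctr_crypt(key, nonce, message)
assert ctr_crypt(key, nonce, ct) == message    # the same operation decrypts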
114
Q

Data Security models that enforce confidentiality

A

1) Bell-LaPadula (popular)
2) Brewer and Nash
3) Take-Grant

115
Q

Data Security models that enforce integrity

A

1) Biba
2) Clark-Wilson
3) Goguen-Meseguer
4) Sutherland

116
Q

Three Properties of data-security models

A

1) Simple security property (describes rules for read)
2) Star (*) security property (describes rules for write)
3) Invocation property (describes rules for invocations (calls), such as calls to other subjects)

*These are the three things each of the security models will handle differently

117
Q

Three goals of data-security models

A

Used to Determine three things:

1) How security will be implemented
2) What subjects can access the system
3) What objects (resources) they will have access to

118
Q

Data-Security Model

A

These are intended to formalize security policy
They lay out broad guidelines (top level, not specific)
It is up to the developer to decide how these models will be used and integrated into specific things

119
Q

Bell-LaPadula data-security model

A

“No Read Up”-“No Write Down” (helps prevent spillage in government contracting)

A state machine model that enforces confidentiality
Uses mandatory access control (MAC) to enforce the DoD multilevel security policy
Simple security property: “No read up” (Subject cannot read data at a higher level of classification)
Star * security property: “No write down” (subject cannot write into lower level of classification)
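A toy Python sketch of the two rules above under an assumed four-level classification scheme (the labels and numeric levels are illustrative, not part of the model itself).

LEVELS = {"public": 0, "confidential": 1, "secret": 2, "top_secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    # Simple security property: "no read up"
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    # Star (*) security property: "no write down"
    return LEVELS[subject_level] <= LEVELS[object_level]

assert can_read("secret", "confidential") and not can_read("confidential", "secret")
assert can_write("confidential", "secret") and not can_write("secret", "confidential")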

120
Q

Biba data-security model

A
  • “No read down”-“No write up”
  • A lattice-based model developed to address concerns of integrity
  • Simple integrity property: “no read down” (subject at one level of integrity is not permitted to read an object of lower integrity)
  • Star (*) integrity property: “no write up” (a subject at one level of integrity is not allowed to write to an object at a higher level of integrity)
  • Invocation property: Prohibits a subject at one level of integrity from invoking a subject at a higher level of integrity
121
Q

Clark-Wilson data-security model

A
  • Mediates access through the access control triple (subject / transformation procedure / object) rather than security labels; key terms:
    • constrained data item (CDI): refers to any data whose integrity is protected by the security model
    • unconstrained data item (UDI): refers to any data item that is not controlled by the security model
    • integrity verification procedure (IVP): refers to a procedure that scans data items and confirms their integrity
    • Transformation procedures (TPs) -are the only procedures that are allowed to modify a CDI
  • Focuses on:
    • Controlled access
    • Separation of duties
    • Well-defined data transformation procedures
    • Thorough auditing
  • The model emphasizes the need for certification and enforcement of well-formed transaction and data transformation processes to guarantee data integrity.
  • Data is accessed and modified through “well-formed transactions” that adhere to predefined rules and constraints. These transactions maintain the integrity of data by ensuring that certain conditions are met.
  • The model encourages the separation of duties, meaning that different individuals or roles are responsible for different aspects of data management, including its creation, modification, and validation.
  • Users are granted the minimum necessary privileges or permissions to perform their roles and responsibilities. This reduces the risk of unauthorized access and potential malicious activities.
  • Data transformation procedures are established and enforced to ensure that data is manipulated in a controlled and approved manner, preserving its integrity.
  • Regular auditing and analysis of the system’s operations, data modifications, and transaction logs are conducted to identify any anomalies, potential security breaches, or policy violations.
  • Access control is expressed as a triplet (subject, transformation procedure, object); see the next card.
122
Q

Access Control Triplet

A

“Access triplets” refer to a fundamental concept related to the access control mechanisms used to enforce security policies and ensure data integrity. Access triplets play a role in specifying and controlling access to data and processes within the framework of the Clark-Wilson model.

Consists of three components:
1) Subject: the user or principal requesting access.
2) Transformation Procedure (TP): the certified program through which the subject is allowed to act. TPs implement well-formed transactions, ensuring that access is controlled and complies with the defined security policy and integrity constraints.
3) Object: the constrained data item (CDI) being accessed or modified.

Access is permitted only when that specific (subject, TP, object) triple has been certified, as in the sketch below.
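A toy sketch of how certified access triples mediate access; every user, procedure, and data item named here is hypothetical.

# Each certified triple says: this subject may act on this CDI, but only through this TP
ALLOWED_TRIPLES = {
    ("alice", "post_journal_entry", "general_ledger"),
    ("bob",   "reconcile_accounts", "general_ledger"),
}

def authorized(subject: str, tp: str, cdi: str) -> bool:
    # Access is granted only if the exact (subject, TP, CDI) triple has been certified
    return (subject, tp, cdi) in ALLOWED_TRIPLES

assert authorized("alice", "post_journal_entry", "general_ledger")
assert not authorized("alice", "reconcile_accounts", "general_ledger")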

123
Q

Take-Grant Model of Data Security

A
  • Confidentiality-based model that supports four basic operations: take, grant, create, and revoke
  • Focus: The Take-Grant model focuses on the transfer of access rights (permissions) between entities within a system.
  • Basic Elements:
    • Entities: Subjects (e.g., users, programs) and objects (e.g., files, resources).
    • Rights: Permissions or access rights associated with entities.
    • Rules: Rules for transferring rights from one entity to another.
  • Operation: Transfers rights based on predefined rules, allowing the granting and taking of rights between entities.
  • Usage: Used to model and analyze access control and permissions within a system, particularly in distributed systems.
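A toy sketch of the take and grant rules operating on a rights graph (create and revoke are omitted); all subjects, objects, and rights here are hypothetical.

from collections import defaultdict

rights = defaultdict(set)                      # rights[(x, y)] = rights x holds over y
rights[("alice", "payroll.db")] = {"read", "write"}
rights[("bob", "alice")] = {"take"}            # bob may take rights that alice holds
rights[("alice", "carol")] = {"grant"}         # alice may grant her rights to carol

def take(taker, victim, target, right):
    # take rule: the taker copies a right the victim holds over the target
    if "take" in rights[(taker, victim)] and right in rights[(victim, target)]:
        rights[(taker, target)].add(right)

def grant(granter, grantee, target, right):
    # grant rule: the granter passes a right it holds over the target to the grantee
    if "grant" in rights[(granter, grantee)] and right in rights[(granter, target)]:
        rights[(grantee, target)].add(right)

take("bob", "alice", "payroll.db", "read")
grant("alice", "carol", "payroll.db", "write")
assert "read" in rights[("bob", "payroll.db")]
assert "write" in rights[("carol", "payroll.db")]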
124
Q

Brewer and Nash Model of Data Security

A
  • Also known as the “Chinese Wall” model: a confidentiality model designed to prevent conflicts of interest. (It is unrelated to Eric Brewer’s “CAP theorem” about distributed systems, with which it is sometimes confused.)
  • Focus: Access controls change dynamically based on a subject’s prior activity; once a user accesses data belonging to one company, data belonging to that company’s competitors becomes off-limits.
  • Basic Elements:
    • Objects: individual data items, each belonging to a company dataset.
    • Company datasets: all objects belonging to a single company.
    • Conflict-of-interest classes: groups of company datasets whose owners compete with one another.
  • Rule: A subject may access an object only if it belongs to a company dataset the subject has already accessed, or to a conflict-of-interest class the subject has not yet touched.
  • Usage: Common in consulting, legal, accounting, and financial firms that serve competing clients (see the sketch below).
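A toy sketch of the dynamic conflict-of-interest rule; the companies and conflict classes below are made up.

CONFLICT_CLASSES = {
    "banking": {"bank_a", "bank_b"},
    "energy":  {"oil_x", "oil_y"},
}
history = {}                                   # user -> set of companies already accessed

def can_access(user: str, company: str) -> bool:
    accessed = history.get(user, set())
    for members in CONFLICT_CLASSES.values():
        # blocked if the user already touched a *different* company in the same class
        if company in members and accessed & (members - {company}):
            return False
    return True

def access(user: str, company: str) -> bool:
    if can_access(user, company):
        history.setdefault(user, set()).add(company)
        return True
    return False

assert access("dana", "bank_a")
assert not access("dana", "bank_b")            # conflict: same class as bank_a
assert access("dana", "oil_x")                 # a different conflict class is fine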
125
Q

Graham-Denning Model of Data-security

A
  • Uses a formal set of protection rules in which each object has an owner and a controller. It is based on eight rules:
    1) Securely create an object
    2) Securely create a subject
    3) Securely delete an object
    4) Securely delete a subject
    5) Securely provide the read access right
    6) Securely provide the grant access right
    7) Securely provide the delete access right
    8) Securely provide the transfer access right
  • Focus: The Graham-Denning model is designed to address secure information flow and access control in a secure operating system environment (specifically, the secure creation and deletion of both subjects and objects).
  • Basic Elements:
    • Subjects: Entities that can access objects.
    • Objects: Resources that need protection and access control.
  • Rules: Rules for accessing and managing subjects’ access to objects.
  • Operations: Defines rules and mechanisms for creating, deleting, and granting access to subjects and objects securely.
  • Usage: Primarily used to model secure systems, access control policies, and analyze potential security violations and their prevention mechanisms.
126
Q

Subjects in data-security

A
  • Entities that can access objects.
  • Definition: A “subject” refers to an active entity or component within a system. It could be a user, a process, a program, or any other element that initiates actions or operations within the system.
  • Role: Subjects are the entities that perform actions or operations on objects. They may request access to objects, execute operations on them, or perform other activities within the system.
  • Access Initiator: Subjects initiate access requests to objects and seek permissions or rights to perform specific actions.
  • Example: In a computer system, a user (a subject) might request access to a file (an object) to read or modify its contents.
127
Q

Objects in data-security

A

Non-tangible assets.
Resources that need protection and access control.

  • Definition: An “object” refers to a passive entity or resource within a system. It could be a file, a database record, a piece of data, or any other element that is acted upon or accessed by subjects.
  • Role: Objects are the entities or resources that are acted upon or accessed by subjects. They are the targets of actions or operations initiated by subjects.
  • Access Target: Objects are the entities to which access rights or permissions are granted or denied. The permissions dictate what actions subjects can perform on the objects.
  • Example: In a computer system, a file (an object) might have associated permissions that dictate whether a user (a subject) can read, write, or execute the file.
128
Q

“Read” in data security

A
  • Read is about viewing the content of a data object.
  • Action: “Read” refers to the action of accessing and viewing the content or information stored in a data object without modifying or altering it.
  • Permission: Having the “read” permission allows a subject (user, program, etc.) to view the content of the data object.
129
Q

“Grant” in data security

A
  • Grant is about giving permissions to other subjects.
  • Action: “Grant” refers to the action of giving or providing permissions to other subjects to access or perform specific operations on a data object.
  • Permission: Having the “grant” permission allows a subject to assign or delegate permissions (such as read, write, execute) to other subjects for a particular data object.
130
Q

“Delete” in data security

A
  • Delete is about removing or erasing a data object.
  • Action: “Delete” refers to the action of removing or erasing a data object, making it inaccessible or no longer present in the system.
  • Permission: Having the “delete” permission allows a subject to remove or delete a data object.
131
Q

“Transfer” in data security

A
  • Transfer is about moving ownership or control of a data object to another subject.
  • Action: “Transfer” refers to the action of moving or transferring ownership or control of a data object from one subject to another.
  • Permission: Having the “transfer” permission allows a subject to transfer ownership or control of a data object to another subject.
132
Q

Read v. Delete v. Grant v. Transfer within Data-Security

A
  • Read is about viewing the content of a data object.
  • Grant is about giving permissions to other subjects.
  • Delete is about removing or erasing a data object.
  • Transfer is about moving ownership or control of a data object to another subject.
  • These actions and permissions are fundamental in designing access control policies and security mechanisms to ensure that data is accessed, shared, and managed securely and in accordance with the defined policies and requirements.