CIPT Flashcards
Risk is defined as a potential threat or issue, along with the impact the threat or issue could cause, and the likelihood that it will occur.
Analysts make use of privacy risk models to help them identify and align threats with the system’s vulnerabilities to mitigate and plan for these risks. Risk options can include:
1. accepting the risk as is;
2. transferring the risk to another entity;
3. mitigating the risk by applying an appropriate control or design change;
4. avoiding the risk by abandoning a functionality, data or the system itself.
What are some privacy risk models?
Risk models define the risk factors to be assessed and the relationships among those factors.
Risk factors are inputs to determining levels of risk.
a. Nissenbaum’s Contextual Integrity
b. Calo’s Harms Dimensions
c. Legal Compliance
d. FIPPs
e. NIST/NICE frameworks
f. FAIR (Factor Analysis of Information Risk)
Explain Nissenbaum's Contextual Integrity and provide an example.
Privacy can be expressed as norms that should govern information access. Norms are domain specific; for example, the norms governing banking information will differ from the norms governing medical information. In addition, norms are context specific, such that each individual can have their own reasons for controlling access to their information in specific situations based on their own expectations, which govern those situations. This viewpoint presents a challenge to IT professionals: how do they identify relevant norms and preserve them when introducing new or changing technology?
The following elements illustrate the concept of contextual integrity.
Actors: The senders and receivers of personal information
Attributes: The types of information being shared
Transmission principles: Those that govern the flow of information
Example: A patient visits a doctor with complaints (actors) and an x-ray is taken to determine the cause of their discomfort (attribute). The doctor shares results with a specialist to determine a course of action (transmission).
When disruptions to the informational norms occur, privacy problems arise. If the specialist were to communicate treatment options via postal mail, to either the patient's home or work address, it could create risks to privacy and to the norms that govern the patient-doctor relationship, since mail from the specialist could give away information about the type of ailment the individual may have (for example, if the envelope showed a return address for a cancer center).
One of the challenges for privacy technologists when considering context is that these norms do not generally have a preexisting reference point for privacy risks. Privacy technologists must work with organizations to identify relevant, existing norms and then determine how a system may disrupt those norms. Interpreting and designing for vulnerabilities is particularly crucial when new technology is introduced or when existing programs and practices are modified.
Explain: Calo’s Harms Dimensions
Two dimensions of privacy harm: objective and subjective. Objective harm occurs when privacy has been violated and direct harm is known to exist. It involves the forced or unanticipated use of personal information and is generally measurable and observable.
Subjective harm exists when an individual expects or perceives harm, even if the harm is not observable or measurable. An individual’s perception of privacy invasion can cause fear, anxiety and even embarrassment.
Subjective privacy harms amount to discomfort and other negative feelings, while objective privacy harms involve actual adverse consequences.
Provide an example of these dimensions.
Consider a hypothetical situation where there was a large breach of personal financial information. Those individuals whose identities were stolen or whose credit was damaged by hackers are victims of objective harm (direct harm is known to exist). However, the individuals who did not experience a direct harm (there is no evidence that their personal information was lost or used by hackers) might still experience subjective harm due to their concern that they might have been impacted by the breach or because of the amount of time and money spent freezing their credit accounts and paying for credit monitoring.
To assess the potential for subjective and objective harm, a privacy technologist may examine elements of the system that relate to individuals’ expectations of how their information may be used, actual usage—including surveillance or tracking—and consent or lack thereof to the collection and use of that information. Clear privacy notices and controls can and should be used to build and retain individuals’ trust.
Explain Legal Compliance
Statutory and regulatory mandates prescribe aspects of systems that handle personal information. This includes the type of data collected, what the system does with that data, and how the data is protected, stored, and disposed of. To ensure compliance, both business process and system owners must understand the specific obligations and prohibitions their organizations are subject to and must work with their system design teams to relay those requirements, as well as identify and address any threats and vulnerabilities associated with the technologies that will be used.
What are the FIPPs?
Fair Information Practice Principles (also referred to as FIPPs) are a set of long-standing privacy values that exist in various forms globally. FIPPs work alongside compliance models to mandate: notice, choice, and consent; access to information; controls on information; and how information is managed. Many organizations around the world have adopted the FIPPs in their privacy risk management recommendations.
FIPPs are a high-level abstraction of privacy compared to legal and policy structures that are more specific. How the FIPPs are addressed varies based on the nature of the system, product, service, or process. Interpretation is necessary to determine how they should be applied when designing, building and operating a system. To illustrate this, one common principle is to restrict the collection, use and sharing of information to only that which is necessary to meet the purpose of a system. For example, when a medical provider and a payment processor need to share information for billing purposes, they may need to share an individual’s name and mailing address, but not the doctor’s notes from the patient visit.
(1) The Collection Limitation Principle. There should be limits to the collection of personal data and any such data should be obtained by lawful and fair means and, where appropriate, with the knowledge or consent of the data subject.
(2) The Data Quality Principle. Personal data should be relevant to the purposes for which they are to be used and, to the extent necessary for those purposes, should be accurate, complete and kept up-to-date.
(3) The Purpose Specification Principle. The purposes for which personal data are collected should be specified not later than at the time of data collection and the subsequent use limited to the fulfillment of those purposes or such others as are not incompatible with those purposes and as are specified on each occasion of change of purpose.
(4) The Use Limitation Principle. Personal data should not be disclosed, made available or otherwise used for purposes other than those specified, except a) with the consent of the data subject, or b) by the authority of law.
(5) The Security Safeguards Principle. Personal data should be protected by reasonable security safeguards against such risks as loss or unauthorized access, destruction, use, modification or disclosure of data.
(6) The Openness Principle. There should be a general policy of openness about developments, practices and policies with respect to personal data. Means should be readily available of establishing the existence and nature of personal data and the main purposes of their use, as well as the identity and usual residence of the data controller.
(7) The Individual Participation Principle. An individual should have the right:
a) to obtain from a data controller, or otherwise, confirmation of whether or not the data controller has data relating to him;
b) to have data relating to him communicated to him, within a reasonable time, at a charge, if any, that is not excessive; in a reasonable manner, and in a form that is readily intelligible to him;
c) to be given reasons if a request made under subparagraphs (a) and (b) is denied and to be able to challenge such denial; and
d) to challenge data relating to him and, if the challenge is successful, to have the data erased, rectified, completed or amended;
(8) The Accountability Principle. A data controller should be accountable for complying with measures which give effect to the principles stated above.
What are the NIST/NICE frameworks?
The National Institute of Standards and Technology (NIST) provides standards, guidelines and best practices for managing cybersecurity-related risks, including the Risk Management Framework, the Cybersecurity Framework, and the Privacy Framework. The NIST Privacy Framework is a voluntary risk management tool alongside the NIST Cybersecurity Framework. The NIST Privacy Framework is intended to assist organizations in communicating and organizing privacy risk, as well as rationalizing privacy to build or evaluate a privacy governance program.
The National Initiative for Cybersecurity Education’s Cybersecurity Workforce Framework (NICE Framework) is a nationally-focused resource published by NIST, which categorizes and describes cybersecurity work. The NICE Framework establishes common terminology to describe cybersecurity work and is intended to be applied in all sectors (public, private and academic).
What is FAIR (Factor Analysis of Information Risk)?
The Factor Analysis of Information Risk (FAIR) model breaks down risk into its constituent parts, then further breaks down those parts to find factors that estimate the overall risk. The goal is not to completely eliminate risk, but rather to build a logical and defensible range of potential risk. FAIR constructs a basic framework that breaks risk into the frequency of action and the magnitude of the violations. It asks: how often will a violation occur, and over what period of time? And what impact will that violation have?
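As a rough, hypothetical illustration of this decomposition (not the full FAIR taxonomy), the Python sketch below combines estimated event frequency and loss magnitude into a low-to-high range of annualized exposure; the function name and numbers are invented for the example.

```python
# Minimal illustrative sketch of a FAIR-style estimate (hypothetical numbers).
def loss_exposure_range(freq_min, freq_max, mag_min, mag_max):
    """Combine estimated event frequency (events per year) and loss magnitude
    (impact per event) into a low/high range of annualized loss exposure."""
    return freq_min * mag_min, freq_max * mag_max

# Example: a violation estimated to occur 2-6 times per year, with an impact
# of $5,000-$40,000 per occurrence.
low, high = loss_exposure_range(2, 6, 5_000, 40_000)
print(f"Estimated annualized loss exposure: ${low:,} - ${high:,}")
```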
What is Privacy by Design and what are its principles?
Principle 1: Proactive, Not Reactive; Preventative, Not Remedial
Privacy protection must be a forethought in any technology system, product, process or service development. Making privacy a consideration in the design phase—instead of reacting to privacy harms as they arise in the future—helps to mitigate potential privacy risks and violations. Thinking about privacy when designing a system, product, service or process lets practitioners build privacy considerations in from the start, rather than trying to retrofit them into a design that may be far less flexible once privacy is considered later.
Principle 2: Privacy as the Default Setting
When personal information is used beyond or outside of the scope of what an individual expects, their privacy is in danger of being violated. Individuals should not be solely responsible for protecting their privacy; the default of a technology ecosystem should be to preserve individuals' privacy, and privacy should be achieved automatically without the individual having to take explicit action. For example, many systems incorporate an opt-in feature for users to consent to future contact by an organization before the user provides any personal information. This is considered a privacy-friendly alternative to the opt-out selection, which assumes an intent to intrude unless the user takes action, such as unchecking a box.
Principle 3: Privacy Embedded into Design
Privacy should be embedded into the design and architecture of technology systems and business practices such that a system cannot operate without privacy-preserving functionality. This principle suggests that privacy is not only included in the design of a program but is integral to the design. Privacy technologists may employ mechanisms such as designing online forms to collect data in a structured format to prevent the collection of irrelevant personal information, using system logging capabilities to record access and changes to personal information, or encryption for instant messenger programs—all examples of privacy embedded into design.
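As a minimal, hypothetical sketch of the logging example above, the following Python code records read access to personal-information fields; the field names, record structure and logger configuration are assumptions made for illustration, not a prescribed mechanism.

```python
# Hypothetical sketch: log read access to personal-information fields, as one
# way of embedding privacy (auditability) into a design. Names are illustrative.
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("pi_access")

def read_personal_field(record: dict, field: str, accessed_by: str):
    """Return one personal-information field and record who accessed it and when."""
    audit_log.info(
        "user=%s field=%s record_id=%s time=%s",
        accessed_by, field, record.get("id"),
        datetime.now(timezone.utc).isoformat(),
    )
    return record.get(field)

# Example usage with made-up data:
customer = {"id": "C-1001", "name": "Alice", "mailing_address": "1 Main St"}
read_personal_field(customer, "mailing_address", accessed_by="billing_clerk_7")
```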
Principle 4: Full Functionality — Positive Sum, Not Zero Sum
Privacy-enhancing technologies are not a trade-off for other parts of a system, but rather a synergistic win-win relationship.
Principle 5: End-to-End Security — Full Life Cycle Protection
Consideration of personal information at every stage in the data life cycle—collecting, processing, storing, sharing and destroying—is essential in any system design. By assessing the potential privacy risks associated with each stage of the information life cycle, appropriate security measures can be evaluated and implemented to mitigate these risks.
Principle 6: Visibility and Transparency — Keep it Open
Since the 1970s, providing notice to individuals regarding the use of their personal information has been a cornerstone of privacy. Information that communicates how the organization uses, shares, stores and deletes personal information should not be misleading, confusing or obscured.
Principle 7: Respect for User Privacy; Keep it User Centric
The individual is the principal beneficiary of privacy and the one affected when it is violated. Privacy technologists and organizations should keep individuals’ needs, and the risks to them, at the forefront when developing data ecosystems. Designing for privacy while respecting the best interest of the individual is imperative in maintaining a balance of power between the individual and the organization that holds their personal information.
What is value-sensitive design and how does design affect users?
Value-sensitive design is a design approach that accounts for moral and ethical values and should be considered when assessing the overall “value” of a design.
In addition to privacy, these values might include things such as trust, fairness, informed consent, courtesy or freedom from bias.
Value-sensitive design methods help to systematically assess the values at play in relation to specific technologies and respective stakeholders. It then assesses how the technology might meet or violate those values and strives to iteratively develop designs that are sensitive to and respectful of those values. The goal of value-sensitive design is that stakeholders should see their values reflected in the final design.
How design affects users
Value-sensitive design emphasizes the ethical values of both direct and indirect stakeholders.
Direct stakeholders are those who directly interact with a system.
Indirect stakeholders are any others who are affected by the system.
For example, a mail order company’s database system might be used by its customer service representatives and the inventory control, billing, and packing and shipping departments, all of whom would be considered direct stakeholders. The customers would be indirect stakeholders, even though it is their personal information that is contained in the database records.
Value-sensitive design is an iterative process which involves conceptual, empirical and technical investigations. Please elaborate.
Conceptual. The conceptual investigation identifies the direct and indirect stakeholders, attempts to establish what those stakeholders might value, and determines how those stakeholders may be affected by the design.
Empirical. The empirical investigation focuses on how stakeholders configure, use, or are otherwise affected by the technology.
Technical. The technical investigation examines how the existing technology supports or hinders human values and how the technology might be designed to support the values identified in the conceptual investigation.
Value-sensitive design methods
Value-sensitive design focuses not just on the design of technology but also on the co-evolution of technologies and social structures. In the case of privacy, this means considering the interplay of technological solutions, regulatory solutions, and organizational solutions when trying to resolve identified value tensions.
In their book, A Survey of Value-sensitive Design Methods, Batya Friedman, David Hendry and Alan Borning have identified 14 targeted design methods for engaging values in the context of technology, including:
Direct and indirect stakeholder analysis, during which direct and indirect stakeholders, as well as any potential benefits, harms or tensions that may affect them, are identified;
Value source analysis, wherein project, designer and stakeholder values are assessed and the ways in which each group’s values may be in conflict are considered;
The co-evolution of technology and social structure, which strives to engage both technology and social structure in the design space with a goal of identifying new solutions that might not be apparent when considering either alone;
Value scenarios, which are used to generate narratives, or scenarios, to identify, communicate or illustrate the impact of design choices on stakeholders and their values;
Value sketches, which make use of sketches, collages or other visual aids to elicit values from stakeholders;
Value-oriented semi-structured interviews, which use interview questions to elicit information about values and value tensions;
Scalable information dimensions, which is a values-elicitation method that uses questions to determine the scalable dimensions of information such as proximity, pervasiveness or granularity of information;
Value-oriented coding manuals, which are used to code and then analyze qualitative information gathered through one of the other methods;
Value-oriented mock-ups, prototypes, or field deployments, which can be used to elicit feedback on potential solutions or features of new technologies or systems that are still in development;
Ethnographically-informed inquiries regarding values and technology, which examine the relationships between values, technology and social structures as they evolve over time;
The model of informed consent online, which provides design principles and a value analysis method for considering informed consent in online contexts;
Value dams and flows, which are ways of both identifying design options that are unacceptable to most stakeholders (the value “dams”), and removing them from the design space, while also identifying value “flows,” which are those design options that are liked by most stakeholders;
The value-sensitive action reflection model, which uses prompts to encourage stakeholders to generate or reflect on design ideas; and,
Envisioning Cards™, a set of cards developed by Friedman and her colleagues that can be used to facilitate many of the other methods.
What is the Design Thinking process?
When considering value-sensitive design methods, it is important to note their relevance to the Design Thinking process. The Design Thinking process has five phases: Empathize, Define, Ideate, Prototype and Test. It also follows an iterative approach. Combining value-sensitive design methods with a process such as this is important to understanding how values can be integrated with current system design methodologies.
What is the data life cycle?
The data life cycle refers to how data flows through an organization, including business processes and technology systems.
The components of the data life cycle—collection, use, disclosure, retention and destruction— are intended to be generic and adaptable to different situations.
The data life cycle is shaped by the privacy objectives and business practices of an organization.
The organization must specify the purpose for which information will be collected and used, and must keep its actual practices consistent with its stated practices throughout the data life cycle.
The challenge for privacy technologists is in helping their organization develop a data ecosystem that has the capability to evolve with an organization’s shifting purposes and business needs and which is designed to maximize how information is utilized while minimizing privacy risk.
Another challenge for IT professionals is that the users of the data determine the purposes, and these purposes will evolve as the organization evolves their business practices.
4 types of data collection
(1) first-party collection, when the data subject provides data about themselves directly to the collector, e.g., in a web-based form that is only submitted when the data subject clicks a button;
(2) surveillance, when the collector observes data streams produced by the data subject without interfering with the subject’s normal behavior;
(3) repurposing, which occurs when the previously collected data is now assigned to be used for a different purpose, e.g., reusing a customer’s shipping address for marketing and
(4) third-party collection, when previously collected information is transferred to a third-party to enable a new data collection.
Explain active and passive collection and give examples.
active, which occurs when a data subject is aware of the collection, or
passive, when the data subject is unaware.
Examples of explicit consent include the following:
Clicking a checkbox that appears alongside the collection disclosure statement in a web-based or other data-entry form, e.g., “By clicking this checkbox, you agree to allow us to collect…”
Clicking a button to acknowledge the receipt of a privacy notice, which may be displayed above the button or made obtainable through an additional step, such as a hyperlink or file download, e.g., “By clicking this button, you agree to the terms and conditions stated in…”
Responding to an automatically generated email or other type of private communication to indicate receipt of the privacy notice, e.g., “This notice was sent to you by email because…” and “Please click the link in this email to acknowledge…”
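As a minimal, hypothetical sketch of how explicit consent events like those above might be recorded for later accountability, the Python code below appends a consent record to a log; the field names, notice version and storage format are illustrative assumptions, not a prescribed approach.

```python
# Hypothetical sketch: append an explicit-consent event (who, which notice
# version, how, and when) to a log so it can be demonstrated later.
import json
from datetime import datetime, timezone

def record_consent(user_id: str, notice_version: str, method: str,
                   path: str = "consent_log.jsonl") -> None:
    """Write one consent record as a JSON line."""
    event = {
        "user_id": user_id,
        "notice_version": notice_version,
        "method": method,  # e.g., "checkbox", "button", "email_link"
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")

# Example: the user ticked the checkbox next to the collection disclosure statement.
record_consent("user-42", notice_version="2024-01", method="checkbox")
```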
Passive or implied consent is generally obtained by including a conspicuous link to a privacy notice that describes the collection activities. These links may appear at the foot of a web page, for example, or embedded in installation instructions or a user manual. However, no actions are taken by the IT system to engage the individual with the notice; instead, use of the system is assumed to imply consent.
The extent to which the data subject obtains the privacy notice and infers the specific types of collections taking place determines whether the collection is overt.
For example, a privacy notice may state that collections are performed for marketing purposes (e.g., to enable third-party collection by a marketing service). Such general purposes may not lead a data subject to believe they would be the subject of online behavioral advertising through this type of collection. If collection disclosure statements are intentionally or unintentionally vague, the collection may reasonably be viewed as covert when the data subject cannot anticipate the scope of collection.
IT professionals should ensure that the purposes for which data is collected trace to appropriate uses and disclosures of that data throughout their information system.
What is repurposing? Give an example.
The act of repurposing occurs when data is collected for one purpose and then used for an entirely different purpose.
This can be a source of privacy harms to the individual and may be illegal under some regulatory frameworks.
Examples include collecting airline passenger data directly from passengers to schedule airline travel and then reusing this information to develop a terrorist threat detection system or, alternatively, collecting a mobile user’s location to provide a route between two locations and then reusing repeated location samples to develop general profiles of traffic patterns.
Explain disclosure and privacy notices.
An organization that collects personal information should have a privacy notice in place.
A privacy notice is a statement made to data subjects that describes how an organization collects, uses, retains and discloses personal information. Notices should also indicate what information will be collected.
A privacy notice may also be referred to as a privacy statement, a fair processing statement, or, sometimes, a privacy policy, although the term privacy policy is more commonly used to refer to the internal statement that governs an organization or entity’s handling of personal information.
Retention and offline storage
Data stored online can take up valuable network resources, so offline storage may make sense. Storing data off premises can guard against organizational data loss should a building be destroyed or a persistent power outage occur. Although there are advantages to storing data offline or off premises, these choices are not without risks, especially when sensitive data is involved. Risks and benefits should be weighed when deciding whether, and when, to move data off-network or off-site. Once the decision has been made to move data to offline storage, the privacy risks associated with it may change and should be assessed to determine whether and how protections should change. For example, sensitive personal information may require encryption during transfer and offline storage.
Explain retention and give an example.
Data should be retained only as long as it is reasonably necessary and in compliance with legal and regulatory requirements as well as applicable standards. If new uses for collected information arise and thus require longer retention periods, some jurisdictions require data subjects to be notified, issued a new privacy notice, or in some cases, given an opportunity to update their consent. Regardless of whether or not this is mandated by law, it is good practice to ensure that individuals are aware of any changes to original policy notices or privacy expectations.
IT professionals should consider how long data is retained by their system and, when retention is no longer needed, how they choose to destroy the data.
Data may be retained to fulfill ongoing business needs or legal obligations.
For example, an airline must retain consumer travel information for their bookings at least until the traveler has completed their flight; however, this information may be retained longer to comply with government regulations, to fulfill the internal requirements of customer loyalty programs, to profile their customers for marketing purposes or to offer their customers improved services. However, this data may eventually have limited value to any of the company’s existing practices, at which point, the company may consider destroying the data to reduce the risks associated with retaining the data.
Explain destruction.
Privacy technologists should work with their organization to determine when and how data will be destroyed, as there are risks with retaining unnecessary data or keeping data longer than permitted, as well as risks in deleting information prematurely.
The sensitivity of information informs the strength of destruction method that should be used.
A destruction plan should be applied to an organization’s records management plan to ensure the proper removal of data.
Simply stating that the data should be destroyed is not always sufficient.
There should be clear guidelines on how to destroy the data based on its type. To aid in the destruction of expired files, a custom attribute such as “Retention Period” can be added to the Properties dialog of the files.
Once the custom attribute has been added, it is easier to retrieve the file to determine when it needs to be destroyed.
It is also possible to automate enforcement of retention schedules, such as by periodically running a program that reads the “Retention Period” value from the file and deletes the file once the retention period has passed.
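As a minimal sketch of that kind of automation, the Python example below assumes the retention expiry date for each file is kept in a simple JSON manifest rather than in a file-property attribute; the manifest name, fields and paths are hypothetical.

```python
# Hypothetical sketch: enforce retention schedules by deleting files whose
# retention period has passed. Assumes a JSON manifest mapping file paths to
# ISO-format expiry dates, e.g., {"reports/q1.pdf": "2025-06-30"}.
import json
import os
from datetime import date

def purge_expired(manifest_path: str = "retention_manifest.json") -> None:
    """Delete files listed in the manifest once their retention date has passed."""
    if not os.path.exists(manifest_path):
        return  # nothing to enforce
    with open(manifest_path, encoding="utf-8") as f:
        manifest = json.load(f)

    today = date.today()
    for file_path, expiry in manifest.items():
        if date.fromisoformat(expiry) < today and os.path.exists(file_path):
            # A production system would also log the deletion and handle
            # derivatives, backups and secure-erase requirements.
            os.remove(file_path)
            print(f"Deleted expired file: {file_path}")

# Run periodically (e.g., from a scheduled job) to enforce the schedule.
purge_expired()
```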
Digital content: potential issues that impact data destruction.
Disks should be appropriately formatted before use to ensure that all data placed on them can eventually be deleted. Hard drives, tapes and other magnetic media will need to be degaussed.
Hard copy: potential issues that impact data destruction.
The primary challenge with hard copy documents, such as paper records, lies in determining what documents need to be destroyed and when. Established policies and guidelines should be put in place that also specify who will be responsible for the documents' destruction and how that destruction will be carried out.
Portable media: potential issues that impact data destruction.
Portable media, such as CDs, DVDs and flash drives, have unique challenges precisely because they are portable and therefore harder to regulate, monitor and track. It may be more difficult to enforce deletion policies, and employees need to be trained on their appropriate use, including receiving regular reminders about established use and deletion policies. ROMs, CDs, DVDs and other “WORM” (write once, read many) media will need to be physically, and possibly professionally, destroyed.
Explain privacy notices and provide examples.
A privacy notice is an external instrument that informs consumers, suppliers, business partners and individuals about the organization's information privacy practices, values and commitments.
Organizations must determine when to present the notice and obtain users' agreement, for example, as soon as a user enters the website or prior to the collection of any personal information.
Additionally, organizations can communicate these notices using different methods depending on the type of information or services they are providing.
Examples include requiring users to check a box indicating agreement to the privacy notice before entering the site or purchasing a product, or simply posting a conspicuous link to the privacy notice on the website.
Prior to design, organizations must be aware of any legal and industry requirements regarding privacy notices as well as consumers’ expectations of the handling of their personal information.
Explain an organization's internal privacy policies.
Privacy policies are internal statements designed to communicate privacy best practices and which information handling guidelines those within an organization should follow, and when.
Policies address privacy and security, data management and data loss prevention.
Privacy policies should be documented, easily accessible, and kept up-to-date, and all employees should be familiar with them.
It is also important that these policies are endorsed and enforced by management and executives of the company.
Designing internal policies is an integral part of preventing the loss or misuse of sensitive data.
Explain an organization's security policies.
Adequate privacy protection of personal information is contingent on the quality of an internal security policy.
A well-functioning internal security policy prevents unauthorized or unnecessary access to corporate data or resources, including intellectual property, financial data and personal information.
Physical security measures, such as locks, safes, cameras and fences, offer further protections from both internal and external threats.
Organizations should consider going beyond their minimal requirements for security, as consumer expectations dictate.
Ways in which measures are put in place to secure data:
Data classification policies:
Data schema
Data retention
Data deletion
What is a data schema?
A data schema is used to separate customer information. It formulates all the constraints to be applied to the data and defines its entities and the relationships among them. Access to database schemas is available only to those who need to see the information. For example, purchase history can be separated from personal information, and access to personal information may require a specific customer ID.
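As a minimal, hypothetical sketch of this kind of separation, the Python/SQLite example below keeps personal details and purchase history in separate tables linked only by a customer ID; the table and column names are illustrative assumptions.

```python
# Hypothetical sketch: separate personal information from purchase history so
# purchase data can be queried without exposing personal details.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE personal_info (
        customer_id TEXT PRIMARY KEY,
        name        TEXT,
        address     TEXT
    );
    CREATE TABLE purchase_history (
        order_id    INTEGER PRIMARY KEY,
        customer_id TEXT REFERENCES personal_info(customer_id),
        item        TEXT,
        amount      REAL
    );
""")
conn.execute("INSERT INTO personal_info VALUES (?, ?, ?)", ("C-1001", "Alice", "1 Main St"))
conn.execute("INSERT INTO purchase_history (customer_id, item, amount) VALUES (?, ?, ?)",
             ("C-1001", "book", 12.50))

# Analysts working with purchase data see only the customer ID, not the personal
# details; joining to personal_info would require separate access rights.
for row in conn.execute("SELECT customer_id, item, amount FROM purchase_history"):
    print(row)
```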
What is data retention?
Laws and regulations may require data to be stored for a specific amount of time. Establish data retention schedules early in the system development life cycle. Remove data on a periodic basis when older data is no longer of use toward a business’s objectives.
What is data deletion?
When data is no longer needed, remove data and any derivatives from the system, ensuring that recovery methods are also removed.
Explain the need for contracts and agreements.
When collected data is shared with third-party vendors, it should be handled in accordance with the commitments made to the data subject and data owner regardless of where their personal information is located or how it is used.
Third-party contracts should be detailed with clear expectations of how data is to be managed while in their possession as well as the roles and responsibilities of vendors.
Often organizations have obligations to specific compliance regulations that must be included in third-party contracts.
It should also be made clear that the organization can perform audits on third-party vendors to ensure compliance.
Penalties for breach of contract by a third-party vendor or contractor should sufficiently compensate the organization for any negative repercussions that a breach would cause.
Risk analysis can assess the vulnerabilities of personal information that is in the hands of third parties and can inform privacy technologists on what actions need to be performed in an effort to mitigate these vulnerabilities and threats.
Controls can be implemented such as separating collected data according to who is processing it, using data schemas, or requiring acceptance of enforcement policies when data is located in the cloud.
Explain common IT frameworks (COBIT, ITIL, etc.)
Security is about protecting data against unauthorized access and malicious action, whereas privacy is about enforcing the appropriate use of the data within a secure environment. Privacy addresses all the ways that data is handled, including collection, use, sharing, maintenance and retention. Privacy professionals also address risk management.
Security and privacy both rely on similar controls and technological capabilities.
Technology frameworks such as ITIL (Information Technology Infrastructure Library) and COBIT (Control Objectives for Information and Related Technology) provide service, process and program management to an organization's technology environment. Because the information organizations collect is stored within technology systems, it is important that organizations can demonstrate compliance with any laws or regulations that govern them.
ITIL: Governed and owned by AXELOS. Provides an overall measurable view of a technology system, service and functionality. ITIL reports on services provided by the technology system and helps organizations use technology to support change and growth. It has a limited view of risk management.
COBIT: A more comprehensive program that helps with management of a technology system which allows for technology governance. Technology governance focuses on the systems, application and support personnel that manage data within a company.
What are data inventories?
Keeping an inventory of data helps to protect privacy adequately.
This means knowing what data is collected, how it is handled, where it is stored, and how it is classified.
Knowledge of data and its characteristics is a key part of the privacy technologist’s job.
Data should be regularly monitored and inventoried, and device upgrades and updates must also be performed as necessary. This includes software updates, security patches or even replacing obsolete technology.
Analyzing and interpreting data so that it can be classified and organized into information categories is an essential step.
Common categories take the form of information assets, physical assets and intellectual property.
Assets are then classified as confidential, internal use or public.
Classifying and categorizing data enables an organization to properly manage and protect the assets in its possession. It can then assign owners to specific classifications of assets.
Information assets:
Physical assets:
Intellectual property:
What are information assets?
Customer and employee data as well as backup copies of data stored either on-site or off-site
What are physical assets?
Servers, workstations, laptops, portable storage devices, backup media, paper files
What is intellectual property?
Software code, trade secrets