ITEC79 (Ma'am Karen) Flashcards

1
Q

are powerful tools with which you can easily view,
add, or modify data stored in an Access database.

A

Forms

2
Q

can be used as an organizational tool for the form, for
example to create different areas on the form.

A

Lines

3
Q

focused on generating a business benefit.

The intent is to generate new or additional revenue, where it is
relatively easy to calculate the ROI (Return on Investment).

A

Business-Centric

4
Q

often seek to
improve a process or provide an analytical or data
capability that previously didn’t exist.

These projects provide infrastructure cost savings and an
indirect business benefit, and their ROI is more difficult to
calculate.

A

IT-centric

5
Q

Ingest (obtain, import, and process) data from different sources
at various latencies (including, for example, real time), in an
efficient, usable manner, leveraging pre-built connectors to
simplify the data ingestion process.

A

Data Ingestion

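The card above describes pulling data from different sources into a common, usable form. A minimal sketch in Python, using stdlib parsers as stand-ins for pre-built connectors (the two sources and all field names are hypothetical):

```python
import csv
import io
import json

def ingest_csv(text):
    """Parse CSV text into a list of dicts (one per row)."""
    return list(csv.DictReader(io.StringIO(text)))

def ingest_json(text):
    """Parse a JSON array of objects into a list of dicts."""
    return json.loads(text)

# Two hypothetical sources delivering the same kind of records.
csv_source = "id,name\n1,Ana\n2,Ben\n"
json_source = '[{"id": "3", "name": "Cara"}]'

# Ingest both feeds into one uniform representation.
records = ingest_csv(csv_source) + ingest_json(json_source)
print(records)
```

Each connector hides the source's format behind the same output shape (a list of dicts), which is what lets downstream steps stay source-agnostic.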
6
Q

The end-to-end tools and processes used to integrate, govern,
secure, and administer the transformation of source data into
data that is “fit-for-purpose” and in compliance with corporate
and regulatory policies.

A

Data Management

7
Q

Combine data from different sources using a variety of
transformations such as filtering, joining, sorting, and
aggregating while establishing relationships within the datasets
to provide a unified view.

A

Data Integration

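The transformations this card names (filtering, joining, sorting, aggregating) can be sketched in plain Python; the two datasets and their fields are invented for illustration:

```python
# Two hypothetical source datasets keyed by customer id.
customers = [
    {"id": 1, "name": "Ana", "region": "NA"},
    {"id": 2, "name": "Ben", "region": "EU"},
]
orders = [
    {"customer_id": 1, "amount": 50},
    {"customer_id": 1, "amount": 30},
    {"customer_id": 2, "amount": 20},
]

# Join: attach customer details to each order.
by_id = {c["id"]: c for c in customers}
joined = [{**o, **by_id[o["customer_id"]]} for o in orders]

# Aggregate and sort: total spend per region, largest first.
totals = {}
for row in joined:
    totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
unified = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
print(unified)  # [('NA', 80), ('EU', 20)]
```

The join establishes the relationship between the two datasets, and the final sorted summary is the "unified view" the card refers to.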
8
Q

Clean up data to ensure it is fit for its intended purpose:
address incomplete or irrelevant entries, eliminate
duplicates, standardize and normalize data, and ensure data
exceptions are handled properly, preferably in an automated
and repeatable fashion.

A

Data Quality

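A tiny repeatable cleanup routine illustrating the steps this card lists (the `email` field and sample records are hypothetical):

```python
def clean(records):
    """Standardize, drop incomplete rows, and de-duplicate -- repeatably."""
    seen = set()
    out = []
    for r in records:
        email = (r.get("email") or "").strip().lower()  # normalize
        if not email:          # incomplete entry: handled as an exception
            continue
        if email in seen:      # eliminate duplicates
            continue
        seen.add(email)
        out.append({"email": email})
    return out

raw = [{"email": " Ana@X.com "}, {"email": "ana@x.com"}, {"email": None}]
print(clean(raw))  # [{'email': 'ana@x.com'}]
```

Because the whole cleanup is one pure function, it can be re-run on every new batch of data, which is what "automated and repeatable" means in practice.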
9
Q

A dedicated repository to collect, manage, and report on data
assets, their relationships, and the processes used to integrate,
govern, and secure those assets.

A

Metadata Catalog

10
Q

Enforce and ensure the accuracy and accountability of the
critical data in an organization to provide a common point of
reference and truth.

A

Master Data Management

11
Q

De-identify, obfuscate, or otherwise obscure sensitive data,
such as credit card numbers, so relational integrity is
maintained, yet key sensitive values aren’t accessible.

A

Data Masking

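One common masking technique is tokenization via a keyed hash: the same input always yields the same token, so joins across tables on the masked column still work (relational integrity), but the real value is hidden. A sketch, assuming a salted SHA-256 token scheme (the salt and token format are placeholders, not a production design):

```python
import hashlib

def mask(value, salt="demo-salt"):  # placeholder salt, not a real secret
    """Replace a sensitive value with a stable, non-reversible token."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "tok_" + digest[:12]

card = "4111111111111111"
assert mask(card) == mask(card)  # stable: rows still join on the token
assert mask(card) != card        # the sensitive value is no longer accessible
print(mask(card))
```

A real deployment would manage the salt/key securely and consider format-preserving encryption where the masked value must still look like a card number.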
12
Q

Analyze and assess the risk of a data security breach by
identifying the location and proliferation, and tracking the usage
of sensitive data.

A

Data Security Analytics

13
Q

Collect, process, and analyze multi-latency data (including real-
time) to provide event-based insights and alerts within the time
interval of maximum business impact (often in real time or near real time).

A

Streaming Analytics

14
Q

Apply analytical formulas and algorithms to datasets to answer
questions based on big data; these algorithms are employed by
data experts to test hypotheses and validate analytic models
used to improve business outcomes.

A

Big Data Analytics

15
Q

Collect and store all types of data as originally sourced, for use
as a live archive, for data exploration, and as an operational data
store for pre-processing and preparing data for big data
analytics.

A

Data Lakes

16
Q

Collect and store structured data into a large repository for the
purpose of applying analytics and generating reports.

A

Data Warehouses

17
Q

A collection of organized data, information, and
records.

  • Documents are usually organized in chronological
    order for easier access, retrieval, and use.

A

Database
18
Q

defines how data is organized in a
database. It also determines the set of operations that can be
performed on the data.

A

Structural model

19
Q

commonly used structural database
model.

A

Relational model

20
Q

Also called structured data, in which records of
information are arranged in a uniform format.
These databases usually store information about
similar entities.

A

Structured Database

21
Q

A loose collection of information.
A collection of documents on the computer from
several programs.

A

Free-form Database

22
Q

A Dynamic Database that is used by any
organization in its day-to-day operation.
Used to collect, maintain, modify and delete data.

A

Operational Database

23
Q

A Static Database where data is rarely modified.
Used to store and track historical data to make long
term projections and analysis.

A

Analytical Database

24
Q

is designed to create, maintain, manipulate, modify,
and delete information in a relational database.

A

Relational Database Management System (RDBMS)
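The operations this card lists map directly onto SQL statements. A minimal sketch using Python's built-in `sqlite3` as a stand-in RDBMS (the `student` table and its rows are invented for illustration):

```python
import sqlite3

# In-memory relational database; sqlite3 stands in for any RDBMS.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")  # create
con.execute("INSERT INTO student (name) VALUES (?)", ("Ana",))           # maintain
con.execute("UPDATE student SET name = ? WHERE id = ?", ("Anna", 1))     # modify
rows = con.execute("SELECT name FROM student").fetchall()                # manipulate
con.execute("DELETE FROM student WHERE id = ?", (1,))                    # delete
remaining = con.execute("SELECT COUNT(*) FROM student").fetchone()[0]
print(rows, remaining)  # [('Anna',)] 0
```

The parameterized `?` placeholders are the idiomatic way to pass values to an RDBMS; they also prevent SQL injection.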

25
Q

a table that establishes a connection between two or more
tables.

A

Linking Table
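A linking (junction) table is how a relational database represents a many-to-many relationship. A sketch with `sqlite3`, where the `student`, `course`, and `enrollment` tables are invented examples:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE course  (id INTEGER PRIMARY KEY, title TEXT);
-- The linking table: one row per student-course pairing.
CREATE TABLE enrollment (
    student_id INTEGER REFERENCES student(id),
    course_id  INTEGER REFERENCES course(id)
);
INSERT INTO student VALUES (1, 'Ana'), (2, 'Ben');
INSERT INTO course  VALUES (10, 'ITEC79');
INSERT INTO enrollment VALUES (1, 10), (2, 10);
""")

# Follow the links to see which students take which course.
rows = con.execute("""
    SELECT s.name, c.title
    FROM enrollment e
    JOIN student s ON s.id = e.student_id
    JOIN course  c ON c.id = e.course_id
    ORDER BY s.name
""").fetchall()
print(rows)  # [('Ana', 'ITEC79'), ('Ben', 'ITEC79')]
```

Each row of `enrollment` holds only the two foreign keys, so one student can link to many courses and one course to many students.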
26
Q

is all about obtaining the artifacts that contain the input data from a variety of sources, extracting the data from the artifacts, and converting it into representations suitable for further processing.

A

DATA ACQUISITION
27
Q

The three main sources of data:

A

  • Internet (namely, the World Wide Web)
  • Databases
  • Local files (possibly previously downloaded by hand or using additional software)
28
Q

is the process of searching and analyzing a large batch of raw data in order to identify patterns and extract useful information. Companies use this software to learn more about their customers. It can help them to develop more effective marketing strategies, increase sales, and decrease costs.

A

DATA MINING
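One classic pattern-finding task is spotting items that are frequently bought together. A toy sketch with stdlib tools only (the baskets are invented sample data, and real data mining would use far larger datasets and algorithms such as Apriori):

```python
from collections import Counter
from itertools import combinations

# Hypothetical raw transaction data: items bought together.
baskets = [
    {"bread", "butter"},
    {"bread", "butter", "jam"},
    {"milk"},
]

# Count item pairs across baskets to surface a simple purchase pattern.
pairs = Counter()
for basket in baskets:
    pairs.update(combinations(sorted(basket), 2))

top_pair, count = pairs.most_common(1)[0]
print(top_pair, count)  # ('bread', 'butter') 2
```

The frequent pair is the "pattern" a retailer might act on, e.g. by shelving the two items together.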
29
Q

refers to the process of combining data from multiple sources to provide a unified view. In modern systems, understanding this concept is crucial as organizations strive to make the most of their data resources.

A

Data integration
30
Q

is vital for organizations as it enhances decision-making, boosts productivity, and fosters collaboration. By integrating data from various sources, businesses can generate comprehensive reports, identify trends, and make informed decisions. Additionally, seamless data workflows prevent siloed information, allowing organizations to operate more efficiently. Companies investing in data integration see noticeable improvements in operational agility and customer satisfaction.

A

Data integration
31
Q

include databases, APIs, flat files, and cloud services. Identifying all relevant data sources is crucial for comprehensive integration that meets organizational needs.

A

Data Sources
32
Q

is the process of converting data into a suitable format for analysis. This can involve cleansing, enriching, and structuring data to ensure consistency and usability.

A

Data Transformation
33
Q

solutions can include data warehouses or lakes, where integrated data is consolidated. Such environments must be scalable and secure, allowing for efficient retrieval and analysis.

A

Data Storage
34
Q

refers to the ability of different systems, devices, and applications to work together seamlessly. This compatibility is essential for efficient data exchange and process synchronization.

A

Interoperability
35
Q

is a traditional method that extracts data from various sources, transforms it into a desired format, and loads it into a target database. This process is crucial for data warehousing.

A

ETL (Extract, Transform, Load)
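The three ETL stages can be sketched end to end in a few lines; the CSV source, the `sales` table, and its columns are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: pull raw rows from a source (a CSV string here, for the sketch).
source = "name,amount\nana,50\nben,20\n"
rows = list(csv.DictReader(io.StringIO(source)))

# Transform: reshape into the target's desired format.
rows = [(r["name"].title(), int(r["amount"])) for r in rows]

# Load: write the transformed rows into the target database.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (name TEXT, amount INTEGER)")
con.executemany("INSERT INTO sales VALUES (?, ?)", rows)
loaded = con.execute("SELECT * FROM sales").fetchall()
print(loaded)  # [('Ana', 50), ('Ben', 20)]
```

Keeping the three stages separate is what makes ETL pipelines testable: each stage can be validated on its own before data reaches the warehouse.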
36
allows real-time access to integrated data without moving it. This technique is beneficial for quick decision-making and reducing data duplication across systems.
Data virtualization
37
38
Q

on the other hand, describes the ability of different systems to work together and share data effectively.

A

Interoperability