Exam 1 Flashcards

1
Q

What is the approach of accounting data analytics?

A

Data analytics uses software to uncover unseen patterns and knowledge in vast quantities of data. Accounting data analytics is a subset of data analytics that focuses on transactions, examines data for fraud and errors, and facilitates the audit function

2
Q

Fraud

A

The material, purposeful misrepresentation of information or its characteristics in order to gain assets, leverage, or anything of value from another

3
Q

Occupational Fraud

A

Fraud in the workplace

4
Q

Abuse

A

Inappropriate workplace behavior or events that do not fit the definition of fraud

5
Q

Anomalies

A

Unusual occurrences or data values

6
Q

Broad Categories of fraud types

A
  1. Misappropriation of assets
  2. Financial statement fraud
  3. Corruption fraud
7
Q
What are the approaches by which organizations measure and address risk?
A
  1. Avoiding
  2. Bearing
  3. Mitigating
  4. Sharing
8
Q

Traditional Fraud Detection approach

A

=> Do not directly observe occupational fraud
=> Observe indicators, symptoms, or red flags of occupational fraud
=> These observations are referred to as anomalies
=> Investigate the anomalies to see if fraud actually occurred
The standard method is to identify a group of anomalies, then examine a few transactions in the group to clear the entire group
A significant number of false positives occur among these anomalies
Because of the false positives, anomalies often do not get the attention they deserve

9
Q

Accounting Data Analytics approach

A

=> Use computers and specialized software to examine each transaction
=> Screen all the transactions to identify anomalies
Using specialized algorithms
=> Investigate the anomalies for fraud and errors
Can examine all or most of the anomalies

10
Q

Describe the Data Analysis Cycle

A

Three stages of the data analysis cycle

  1. Evaluation and analysis
  2. Software and technology
  3. Audit and investigation
11
Q

Types of Data

A

Categorical data
Characteristics or qualities of the data
Can use pivot tables to examine two-way categorization of the data

Quantitative Data
Numeric data
Numbers, values, counts
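
As a rough illustration of the pivot-table point above, this sketch uses pandas to build a two-way categorization of categorical data; the department and payment_type fields and the values are invented for the example.

import pandas as pd

# Hypothetical transaction data with two categorical fields and one quantitative field
df = pd.DataFrame({
    "department":   ["Sales", "Sales", "IT", "IT", "HR"],
    "payment_type": ["Check", "Wire",  "Wire", "Check", "Wire"],
    "amount":       [1200.00, 540.50, 875.25, 60.00, 310.10],
})

# Two-way categorization: count of transactions by department and payment type
counts = pd.crosstab(df["department"], df["payment_type"])

# Same layout with a quantitative field summarized in each cell
totals = pd.pivot_table(df, values="amount", index="department",
                        columns="payment_type", aggfunc="sum", fill_value=0)

print(counts)
print(totals)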

12
Q

Types of numeric data

A

Cardinal data: numeric data with a natural zero

Score data: numeric data without a natural zero

Ordinal data: ranks or categories
Arithmetic operations may not be performed on ordinal data

13
Q

Concepts underlying inferential statistics

A

=>The population is too large or costly to enumerate
=>Draw a sample that represents the population
Representation is the key
=>Calculate statistics of interest in the sample
Infer these sample values to the corresponding population statistics
=>With data analytics the entire population may be examined
Search for anomalies
Inspect the anomalies (like a sample from the population)

14
Q

Measures of central tendency

A

Mean
Median
Mode

15
Q

Measures of Dispersion

A
=>Range 
=>Deviations from the mean 
=>Variance 
=>Standard Deviation 
=>Standard Deviation of a sample and a population
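
A minimal sketch of the dispersion measures above in Python/NumPy, on a made-up list of amounts; the ddof argument is what separates the population calculation from the sample calculation.

import numpy as np

amounts = np.array([120.0, 135.0, 128.0, 410.0, 131.0])  # hypothetical values

rng = amounts.max() - amounts.min()        # range
deviations = amounts - amounts.mean()      # deviations from the mean (sum to ~0)
pop_var = np.var(amounts, ddof=0)          # population variance (divide by n)
samp_var = np.var(amounts, ddof=1)         # sample variance (divide by n-1)
pop_sd = np.std(amounts, ddof=0)           # population standard deviation
samp_sd = np.std(amounts, ddof=1)          # sample standard deviation

print(rng, pop_var, samp_var, pop_sd, samp_sd)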
16
Q

Sampling

A

=>Sampling is the act of selecting items from a population
=>Use the sample to represent the population
=>Draw inferences about the population based on the sample

     Sampling is identifying the anomalies in the data set of transactions
     Study the anomalies for errors and fraud
=>In an audit context
     Sample transactions across a variety of transaction types
     Stratified sampling
17
Q

Statistical Sampling

A

=>Use of mathematical calculations for selecting and analyzing a data sample
Selection of items in the sample
Calculation of sample parameters
Determination of precision levels

18
Q

Non-Statistical Sampling & Methods

A

=>Often referred to as judgment sampling
=>The researcher knows the population and the data set
=>Select the transactions in the sample based on the researcher’s knowledge

Methods
=>Systematic
=>Stratified
=>Cluster or block

=>Use these methods if they provide a more representative sample
Requires the researcher or analyst to have special knowledge
=>Can make into a random sample by
Selecting a random starting point in a systematic sample
Selecting a random sample within each stratum for a stratified sample
Randomly selecting a cluster or block and including all items in the cluster or block
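
A rough sketch of the three methods named above, including the randomization steps, on hypothetical transaction data; the field names, block size, and sample sizes are assumptions for illustration.

import random
import pandas as pd

df = pd.DataFrame({"txn_id": range(1, 101),
                   "type": ["AP", "AR", "Payroll", "Other"] * 25})  # hypothetical

# Systematic: every k-th record, with a random starting point
k = 10
start = random.randrange(k)
systematic = df.iloc[start::k]

# Stratified: a random sample within each stratum (here, transaction type)
stratified = df.groupby("type").sample(n=3, random_state=1)

# Cluster or block: randomly pick one block of 25 records and keep all of it
blocks = [df.iloc[i:i + 25] for i in range(0, len(df), 25)]
cluster = random.choice(blocks)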

19
Q

Statistical Sampling Methods of Auditors

A

=>Probability Proportional-to-Size Sampling
The sampling unit is dollars
Monetary Unit Sampling (MUS)
=>Transactions with higher monetary value are more likely to be selected in the sample
Rather than by frequency of occurrence
=>Used to determine the accuracy of financial transactions when the size of the transaction is most important
=>Errors are expected to be few
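
A rough sketch of the probability proportional-to-size idea: each dollar is a sampling unit, so larger transactions are more likely to be hit. Fixed-interval selection with a random start is shown here as one common way MUS can be done; the data and field names are hypothetical.

import random
import pandas as pd

df = pd.DataFrame({"txn_id": [1, 2, 3, 4, 5],
                   "amount": [100.0, 2500.0, 40.0, 900.0, 7300.0]})  # hypothetical

sample_size = 3
total = df["amount"].sum()
interval = total / sample_size            # sampling interval in dollars
start = random.uniform(0, interval)       # random start within the first interval
hits = [start + i * interval for i in range(sample_size)]

# A transaction is selected if any dollar "hit" falls inside its cumulative range
df["cum_upper"] = df["amount"].cumsum()
df["cum_lower"] = df["cum_upper"] - df["amount"]
selected = df[[any(lo < h <= hi for h in hits)
               for lo, hi in zip(df["cum_lower"], df["cum_upper"])]]
print(selected)   # the $7,300 transaction is almost always selected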

20
Q

What is the general process of accounting data analytics tests?

A

=>Start with general or initial tests to identify anomalies
=>Once anomalies (or outliers or suspicious patterns) are identified
Partition the data to these anomalies (AKA data reduction)
Perform additional tests on the data partition or anomalies for further investigation

21
Q

Describe Benford’s Law in concept and its use in accounting data analytics

A

=> Benford’s Law analyzes the digits in numerical data, helps identify anomalies, and detects systematic manipulation of data based on the digit distribution in a natural population.
=>Based on observation of the frequency of leading digits in numbers
An empirical rule
Examines the first or first two digits in numbers
=>The lower the digit, the more frequently it occurs as a leading digit

=>Creates a monotonic, downward-sloping curve when graphed with frequency on the vertical axis and the leading digit or digits on the horizontal axis
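
A small sketch of the Benford frequencies described above: the expected proportion of leading digit d is log10(1 + 1/d), which produces the downward-sloping curve. The list of amounts is made up and only illustrates the comparison.

import math
from collections import Counter

# Expected Benford frequency for each leading digit 1-9
# e.g., digit 1 ~ 30.1%, digit 2 ~ 17.6%, ..., digit 9 ~ 4.6%
expected = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    s = str(abs(x)).lstrip("0.")   # drop sign, leading zeros, and decimal point
    return int(s[0])

amounts = [123.45, 187.20, 905.00, 14.99, 1320.00, 2750.10, 19.95]  # hypothetical
observed = Counter(first_digit(a) for a in amounts)
n = len(amounts)

for d in range(1, 10):
    print(d, round(observed.get(d, 0) / n, 3), round(expected[d], 3))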

22
Q

What are the “standard” Benford’s Law tests and what are the objectives of each?

A

=>First digit test
Suitable for use on data sets of fewer than 300 transactions
=>First two digits test
The most practical and most used of the Benford’s Law tests
=>First three digits test
Very detailed and tends to produce many anomalies
May identify too many anomalies to be useful

23
Q

Last Two Digits

A

Apply Benford’s Law to the last two digits of numbers in a data set

24
Q

Number Duplication

A

=>See if some numbers occur too often in a data set

=>Too frequently occurring numbers may indicate fraud or error
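
A minimal sketch of a number duplication screen using pandas; the amounts and the "occurs at least 3 times" threshold are arbitrary choices for illustration.

import pandas as pd

# Hypothetical payment amounts; real data would come from the transaction file
amounts = pd.Series([49.99, 120.00, 49.99, 49.99, 875.25, 120.00, 49.99])

# Count how often each exact amount appears, ranked by frequency
dupes = amounts.value_counts()

# Flag amounts that repeat "too often" (threshold is an arbitrary assumption)
suspicious = dupes[dupes >= 3]
print(suspicious)   # 49.99 appears 4 times -> investigate further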

25
Q

Distortion Factor Model

A

=>Tests to see if there are excess high numbers or low numbers in a data set
=>Assumes that if a true number is changed the false number is in the same range or same percentage as the true number
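
A sketch of one common formulation of the distortion factor model (Nigrini’s): collapse each amount of $10 or more into the 10-99.99 range, compare the actual mean of the collapsed values with an expected mean under Benford’s Law, and read a positive result as excess high numbers and a negative result as excess low numbers. The expected-mean formula used here is an assumption of that formulation, not something stated on the card.

import math

def distortion_factor(amounts):
    # Keep amounts of at least 10, then collapse each into the range [10, 100)
    vals = [x for x in amounts if x >= 10]
    collapsed = [x / 10 ** (int(math.floor(math.log10(x))) - 1) for x in vals]

    n = len(collapsed)
    actual_mean = sum(collapsed) / n
    # Expected mean of collapsed Benford-distributed numbers (assumed formula)
    expected_mean = 90 / (n * (10 ** (1 / n) - 1))

    # Percentage distortion: positive -> excess high numbers, negative -> excess low
    return 100 * (actual_mean - expected_mean) / expected_mean

print(distortion_factor([12.50, 870.00, 45.10, 99.99, 1230.00, 67.45]))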

26
Q

Last Two Digits Test

A

=>Order the transactions by their last two digits
=>Count the frequencies for each value of last two digits
=>This distribution of the order frequencies should follow a uniform (rectangular) distribution
=>Anomalies occur if the actual distribution deviates in a meaningful fashion from a uniform or rectangular distribution
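
A minimal sketch of the last-two-digits test: take the cents of each amount, count the frequencies, and compare them with the uniform expectation of 1/100 each. The chi-square style comparison and the data are illustrative assumptions.

from collections import Counter

amounts = [123.45, 250.00, 99.99, 412.00, 87.50, 60.00, 75.00]  # hypothetical

# Last two digits = the cents portion, as a value from 00 to 99
last_two = [int(round(a * 100)) % 100 for a in amounts]
observed = Counter(last_two)

n = len(amounts)
expected = n / 100          # uniform expectation for each of the 100 values

# Simple chi-square style comparison against the uniform distribution
chi_sq = sum((observed.get(v, 0) - expected) ** 2 / expected for v in range(100))
print(observed.most_common(3), round(chi_sq, 2))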

27
Q

Z-Score

A

=>Z-scores are standardized mean deviations
Dimensionless
Published tables of probabilities
Subject to the empirical rule: the mean +/- 1 STD contains about 68% of the transactions; +/- 2 STD about 95%; +/- 3 STD about 99.7%
=>Anomalies are transactions with z-scores deviating from this distribution
Can calculate statistical significance
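
A small sketch of z-scoring transaction amounts and flagging values more than 3 standard deviations from the mean; the data and the cutoff of 3 are assumptions for illustration.

import numpy as np

amounts = np.array([120.0, 135.0, 128.0, 131.0, 126.0, 133.0, 124.0,
                    129.0, 127.0, 132.0, 125.0, 130.0, 940.0])  # hypothetical

# Standardized mean deviations (sample standard deviation)
z = (amounts - amounts.mean()) / amounts.std(ddof=1)

# Flag anomalies: transactions more than 3 standard deviations from the mean
anomalies = amounts[np.abs(z) > 3]
print(np.round(z, 2), anomalies)   # the 940.00 transaction is flagged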

28
Q

Relative Size Factor Test

A

=>Used to detect errors
=>Detects errors in a data partition (subset) when the largest value in a field is significantly larger than a reference point or value
Misrecorded transaction value: e.g., misplaced decimal point or fraud
Transaction really belongs to another data partition: e.g., incorrectly recorded vendor or customer number or fraud
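
A rough sketch of the relative size factor: within each data partition (a hypothetical vendor field here), divide the largest amount by the second largest and flag partitions where the ratio is unusually large; the threshold of 10 is an arbitrary assumption.

import pandas as pd

df = pd.DataFrame({
    "vendor": ["A", "A", "A", "B", "B", "B"],
    "amount": [210.0, 195.0, 205.0, 50.0, 48.0, 4800.0],  # hypothetical
})

def rsf(amounts):
    top_two = amounts.nlargest(2)
    return top_two.iloc[0] / top_two.iloc[1]   # largest / second largest

factors = df.groupby("vendor")["amount"].apply(rsf)
flagged = factors[factors > 10]     # threshold chosen arbitrarily for illustration
print(factors.round(2))
print(flagged)                      # vendor B: 4800 / 50 = 96 -> investigate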

29
Q

Same-same-same test

A

=>Used to find duplicates in the data on up to eight fields
=>The concept is to identify duplicate transactions and then compare these transactions on other dimensions
Who authorized the transactions, when they occurred, duplicate numbers
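
A minimal sketch of a same-same-same style search in pandas: flag rows that match exactly on several chosen fields so the analyst can then compare them on other dimensions such as approver and date. The fields and data are invented.

import pandas as pd

df = pd.DataFrame({
    "vendor":   ["A", "A", "B", "A"],
    "invoice":  ["1001", "1001", "2002", "1003"],
    "amount":   [500.0, 500.0, 75.0, 500.0],
    "approver": ["jlee", "mkim", "jlee", "jlee"],
    "date":     ["2024-01-05", "2024-01-09", "2024-01-06", "2024-01-07"],
})

# Same vendor, same invoice number, same amount -> potential duplicate payment
key_fields = ["vendor", "invoice", "amount"]
dupes = df[df.duplicated(subset=key_fields, keep=False)]

# Review the duplicates on other dimensions (who approved them, and when)
print(dupes.sort_values(key_fields))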

30
Q

Same-Same-Different Test

A

=>Compares transactions on up to eight fields to identify duplicate or near-duplicate values
Indicates the need for further analyst examination of the near-duplicates
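
A sketch of the same-same-different idea under the same made-up data assumptions: group on the fields expected to match and flag groups where another field differs (for example, the same invoice and amount paid to two different vendors).

import pandas as pd

df = pd.DataFrame({
    "invoice": ["1001", "1001", "2002", "2002"],
    "amount":  [500.0, 500.0, 75.0, 80.0],
    "vendor":  ["A", "B", "C", "C"],   # hypothetical
})

# Same invoice number and amount, but a *different* vendor
groups = df.groupby(["invoice", "amount"])["vendor"].nunique()
suspicious_keys = groups[groups > 1].index

flagged = df.set_index(["invoice", "amount"]).loc[suspicious_keys].reset_index()
print(flagged)   # invoice 1001 for 500.00 was paid to both vendor A and vendor B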

31
Q

Even amounts

A

=>Most transaction amounts do not occur in even dollar amounts
Exceptions are items like rents, consultant fees
=>Screen transactions to identify all even dollar amounts
=>Further investigate these transactions
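
A quick sketch of the even-dollar screen: keep transactions whose amount is an exact multiple of a round figure (here $100, an arbitrary choice) for follow-up.

import pandas as pd

amounts = pd.Series([512.37, 5000.00, 89.99, 1200.00, 403.15, 300.00])  # hypothetical

# Work in cents to avoid floating-point surprises, then test for even $100 multiples
cents = (amounts * 100).round().astype(int)
even_hundreds = amounts[cents % 10000 == 0]
print(even_hundreds)   # 5000.00, 1200.00, 300.00 -> investigate further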

32
Q

Correlation

A

=>Requires two variables or fields
=>Correlation measures the strength of the linear relationship between the two
=>Correlation is not causation
Theory can provide the causation
=>The calculation of correlation is purely mathematical
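
A small sketch of measuring the linear relationship between two fields with the Pearson correlation coefficient; the two series are made up, and, as the card notes, a high value by itself says nothing about causation.

import numpy as np

units_sold   = np.array([10, 15, 12, 20, 18, 25], dtype=float)   # hypothetical
freight_cost = np.array([52, 73, 60, 98, 90, 120], dtype=float)  # hypothetical

# Pearson correlation: strength of the linear relationship, between -1 and +1
r = np.corrcoef(units_sold, freight_cost)[0, 1]
print(round(r, 3))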

33
Q

Trend Analysis

A

Regression analysis
Ordinary Least Squares (OLS)
=>Examines the correlation between two variables x and y
=>Defines one variable as a dependent variable and the other as an explanatory variable
=>Theory provides the
Determination of which variable is the dependent and which the explanatory
Causal explanation
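
A minimal sketch of an OLS trend line with NumPy: y is treated as the dependent variable and x as the explanatory variable, and the least-squares slope and intercept are fitted; which variable plays which role is a theory choice, as noted above.

import numpy as np

x = np.array([1, 2, 3, 4, 5, 6], dtype=float)             # e.g., period number
y = np.array([102.0, 108.5, 115.0, 119.5, 127.0, 131.0])  # hypothetical amounts

slope, intercept = np.polyfit(x, y, deg=1)   # ordinary least squares fit
fitted = intercept + slope * x
residuals = y - fitted                       # large residuals may be anomalies

print(round(slope, 3), round(intercept, 3))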

34
Q

Time Series

A

=>Use time series analysis if the data has an element of seasonality (seasonal variation)
=>Allows the development of correlations for seasonal variations
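
A rough sketch of one simple way to handle seasonality: compute a seasonal index per quarter from a made-up quarterly series and divide it out, so seasonal variation is separated from the underlying level before looking for anomalies.

import pandas as pd

# Hypothetical quarterly sales over three years
sales = pd.Series(
    [100, 140, 90, 160, 105, 150, 95, 170, 110, 155, 100, 180],
    index=pd.period_range("2021Q1", periods=12, freq="Q"),
)

quarter = sales.index.quarter
# Seasonal index: average for each quarter relative to the overall average
seasonal_index = sales.groupby(quarter).mean() / sales.mean()
deseasonalized = sales / seasonal_index.loc[quarter].values

print(seasonal_index.round(2))   # e.g., Q2 and Q4 run seasonally high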

35
Q

Summation Tests & Second order tests

A

=> Summation Tests
Evaluate sums of numbers using Benford’s law
=> Second order tests
The second order test is an analysis of the digit frequencies of the differences between the ordered (ranked) values in a data set.
The digit frequencies of these differences approximate the digit frequencies of the original data
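
A sketch of both tests described above on made-up data: the summation test totals the amounts within each first-two-digits group, and the second order test takes the differences between the ordered values and looks at their first-two-digit frequencies. The digit-extraction helper is an illustrative assumption.

from collections import Counter

amounts = sorted([123.45, 187.20, 905.00, 14.99, 1320.00, 275.10, 19.95])  # hypothetical

def first_two_digits(x):
    s = "".join(ch for ch in str(abs(x)) if ch.isdigit()).lstrip("0")
    return int(s[:2]) if len(s) >= 2 else None

# Second order test: digit frequencies of differences between ordered values
diffs = [b - a for a, b in zip(amounts, amounts[1:]) if b - a > 0]
second_order = Counter(first_two_digits(d) for d in diffs)

# Summation test: total amount falling in each first-two-digits group
summation = Counter()
for a in amounts:
    summation[first_two_digits(a)] += a

print(second_order)
print(summation)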