6 - Using data to create & preserve value Flashcards
Feedback =
where the results of a process are gathered and then used to influence future performance (i.e. exercise control) by adjusting inputs into the process.
Single feedback loop = (1 liner)
Compare actual results against expected results
Double feedback loop = (1 liner)
Also involves making changes to the actual plans or systems as a result of changes in internal/external conditions.
Define sensor (feedback theory)
A measuring or recording device
Define comparator (feedback theory)
Compares actual results obtained by the sensor against the plan/standard
Define effector (feedback theory)
A team or individual who acts upon the comparison by issuing new instructions relating to the input.
Define High level controller (feedback theory)
Performs a role similar to the effector, but from a more senior position in the org.
Big difference between single and double feedback loop
Single = no human intervention (automated & dependable); Double = needs high-level human control
Outline single feedback loop
Outputs → sensor → comparator → effector → adjust inputs → processes → outputs
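To make the loop above concrete, here is a minimal Python sketch; every function name and figure in it is invented for illustration, not taken from the source. The sensor measures the output, the comparator finds the variance against the standard, and the effector adjusts only the inputs. In a double loop a higher-level controller could also revise the standard itself.
```python
# Single feedback loop sketch: sensor -> comparator -> effector -> adjust inputs.
# All names and values are illustrative assumptions.

STANDARD = 1000          # planned output (the expected result)

def sensor(process_output):
    """Measure/record the actual result of the process."""
    return process_output

def comparator(actual, standard):
    """Compare actual results against the plan/standard; return the variance."""
    return standard - actual

def effector(current_input, variance):
    """Issue a new instruction: adjust the input to correct the variance."""
    return current_input + variance

def process(units_in):
    """The process being controlled (here, a trivial pass-through with some loss)."""
    return units_in * 0.95

units_in = 1000
for period in range(3):
    actual = sensor(process(units_in))
    variance = comparator(actual, STANDARD)
    units_in = effector(units_in, variance)   # only inputs change; the plan never does
    print(period, round(actual), round(units_in))
```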
What extra features does double feedback loop have over single?
- Higher level controller
- standard (similar to comparator)
- lower level controller (similar to effector)
What does ‘future data needs’ refer to?
Sometimes feedback or new strategies create questions that finance cannot answer because there is a gap in the data available; the gap needs to be filled by sourcing new data.
2 ways to source new data
- Reconfigure an existing system
- New system
4 stakeholders who provide feedback to finance and have data requirements
Sales, Operations, Marketing, Board of Directors.
HR general data requirements from info system
Appraisal, productivity, training data
Shareholders general data requirements from info system
Impacts and future prospects of the business, e.g. integrated reporting such as carbon footprints
Individual directors general data requirements from info system
Effective data visualisations and key metrics
employees general data requirements from info system
Payroll info, plus systems data so that, as users of these systems, they can suggest and test solutions to inefficiencies
Data extraction, transformation & loading (ETL) systems =
Take data from an existing database, convert it to a different form and place it into a new database (usually a data warehouse). Combine three different system functions into a single system.
Extract ETL function
Data is analysed to understand its content, format & structure (= data profiling).
It is then read from the specified source and what is required is extracted.
Transform ETL function
Convert the extracted data - using rules, lookup tables or by combining data - into a set form so it can be placed into another database.
Load ETL function
Write the transformed data into the target database (data warehouse) where it is held in a systematic and logical way so it is ready for future interrogation by BI systems.
Where is data supplied by an ETL system drawn from?
A number of independent IT infrastructure systems
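A toy extract-transform-load sketch in Python, assuming hypothetical SQLite source/target databases and invented field names (nothing here reflects a specific vendor's ETL tool):
```python
import sqlite3

# Hypothetical source system and target warehouse (in-memory for the sketch).
source = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")
source.execute("CREATE TABLE sales (region TEXT, amount_gbp TEXT)")
source.executemany("INSERT INTO sales VALUES (?, ?)",
                   [("north", "1,200"), ("south", "950")])
warehouse.execute("CREATE TABLE fact_sales (region TEXT, amount REAL)")

# Extract: read what is required from the specified source.
rows = source.execute("SELECT region, amount_gbp FROM sales").fetchall()

# Transform: apply rules / a lookup table so the data fits the target's set form.
region_lookup = {"north": "NORTH", "south": "SOUTH"}   # illustrative lookup table
transformed = [(region_lookup[r], float(a.replace(",", ""))) for r, a in rows]

# Load: write the transformed data into the warehouse, ready for BI interrogation.
warehouse.executemany("INSERT INTO fact_sales VALUES (?, ?)", transformed)
print(warehouse.execute("SELECT * FROM fact_sales").fetchall())
```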
Business intelligence systems =
Describes the technological architecture that extracts, assembles, stores and accesses data to provide reports and analysis. (ETL is one part of this.)
Bottom of BI stack + explain function
IT infrastructure - basic hardware and networking to support info systems.
2nd level of BI stack
Specialist independent IT systems for each business function. Data is often saved in different formats in different locations = inconsistencies.
3rd level BI stack
ETL - takes data from the specialist systems (plus EXTERNAL data), transforms it, and loads it into an org-wide database.
4th level BI stack
Systems that store the transformed data & manage it so it can be used by higher level systems: Metadata, data warehouse & warehouse mgmt. Provide storage capacity for big data.
5th level BI stack
Application level - software that analyses data supplied by the ETL system. Managers without specialist IT or analytical expertise can drill into and view data to find patterns. Basis for big data analytics.
OLAP: interrogate data in real time
Data mining
Query & reporting: drill down & manipulate data to create customised reports
6th level BI stack
Business analytics packages targeted at particular user groups, e.g. customer, finance, operations, supply chain.
7th level BI stack
Company-wide BI tools and apps. Real-time access, reports, dashboards. Data visualisation = presentation/delivery layer.
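As an illustration of the drill-down / pivot interrogation the application level supports, a small pandas sketch with made-up figures (pandas is just one possible tool for this, not the one the syllabus names):
```python
import pandas as pd

# Made-up transactional data as it might arrive from the warehouse.
df = pd.DataFrame({
    "region":  ["North", "North", "South", "South"],
    "product": ["A", "B", "A", "B"],
    "revenue": [120, 80, 95, 110],
})

# Summary view, then 'drill down' from region level into region x product.
print(df.groupby("region")["revenue"].sum())
print(df.pivot_table(values="revenue", index="region",
                     columns="product", aggfunc="sum"))
```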
Definition of Hadoop
a technological solution for BI that can handle the volume, variety & velocity challenges of big data.
How does Hadoop increase speed that tasks can be completed?
Rather than one system processing the task from start to finish, the task is broken down into separate elements and each element is sent to a separate server (node) for processing. This reduces processing time as different task elements can be worked on at once.
Hadoop: the breaking down of tasks and distributing them is performed by…
HDFS = Hadoop Distributed File System
Hadoop: the coordination for the separated task processing is performed by…
MapReduce framework
Hadoop: what happens once the separated tasks are completed?
They are then reassembled
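A pure-Python sketch of the idea (not the real Hadoop/MapReduce API): the task is split into elements, each element is processed separately - a process pool stands in for separate nodes - and the partial results are then reassembled.
```python
from multiprocessing import Pool
from collections import Counter

def process_element(chunk):
    """'Map' step: each chunk could run on a separate node; here, a separate process."""
    return Counter(chunk.split())

def reassemble(partials):
    """'Reduce' step: combine the partial results into one answer."""
    total = Counter()
    for p in partials:
        total += p
    return total

if __name__ == "__main__":
    # Break the overall task (word counting) into separate elements.
    chunks = ["big data big value", "data creates value", "big decisions need data"]
    with Pool(3) as pool:                    # three workers stand in for three nodes
        partials = pool.map(process_element, chunks)
    print(reassemble(partials))
```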
Data modelling =
Analysis of an org's data needs, required to support its business processes.
Data manipulation =
the reorganisation or transformation of data to make it easier to read or more meaningful.
Data analysis =
The overall process of collecting, cleansing, manipulating and modelling data to support decision making. The additional aspect (versus modelling and manipulation alone) is the decision making itself.
State 3 stages of data modelling
- Conceptual model
- Logical model
- Physical model
Data modelling: what is conceptual model & how are finance involved?
A high-level statement of the data the business requires. Finance liaise with internal stakeholders to determine the types of insight required and record the data requirements.
Data modelling: what is logical model & how are finance involved?
Data requirements are developed into formal docs which set out the specific data structures that will be used to organise the database (data flows, sources, storage). Finance then review these to make sure they would meet the data requirements.
Data modelling: what is physical model & how are finance involved?
The physical model manages the data and the relationships between datasets and tables - the hardware is created and the software coded.
Finance test the physical model to ensure it can create the types of insight required.
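A minimal sketch of what a coded physical model might look like, using SQLite and invented table/column names: the tables and the relationship between them are defined, and a test query checks whether the required insight can actually be produced.
```python
import sqlite3

db = sqlite3.connect(":memory:")

# Physical model: concrete tables, data types and the relationship between them.
db.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE invoice (
    invoice_id  INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id),
    amount      REAL,
    issued_on   TEXT
);
""")

# Finance testing the model: can it answer 'revenue per customer'?
db.execute("INSERT INTO customer VALUES (1, 'Acme Ltd')")
db.execute("INSERT INTO invoice VALUES (10, 1, 500.0, '2024-01-31')")
print(db.execute("""
    SELECT c.name, SUM(i.amount)
    FROM customer c JOIN invoice i ON i.customer_id = c.customer_id
    GROUP BY c.name
""").fetchall())
```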
4 pros of data modelling
- Foundation for handling data & facilitates effective use of data
- Helps comply with data regs by enforcing business rules at an early stage
- Enhanced quality of data through well-planned approach
- Consistent naming conventions & values through systematic and reliable database
What is a data manipulation language (DML)?
Something used to automate the process of data manipulation and handle complex databases. DMLs search the parameters of the data being held and give instructions to ensure it is held in a consistent and structured manner.
How are finance involved in data manipulation?
Manipulate data using accounting fields, dates and values to make it easier for users to understand trends, e.g. by putting it into a spreadsheet.
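A short pandas sketch of the kind of manipulation described, with hypothetical accounting fields and values: transactions are reorganised by account and month into a spreadsheet-style view so a trend is easier to read.
```python
import pandas as pd

# Hypothetical ledger extract.
ledger = pd.DataFrame({
    "account": ["Sales", "Sales", "Rent", "Rent"],
    "month":   ["Jan", "Feb", "Jan", "Feb"],
    "value":   [1000, 1200, 300, 300],
})

# Reorganise: accounts down the side, months across the top.
trend = ledger.pivot_table(values="value", index="account",
                           columns="month", aggfunc="sum")
print(trend[["Jan", "Feb"]])
```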
Exploratory data analysis =
finding new relationships or features in an existing dataset.
Confirmatory data analysis =
Confirming or disproving a hypothesis
Predictive data analysis =
Making forecasts based on techniques such as statistical modelling.
Text data analysis =
Extracting and classifying data from textual sources e.g. grouping products into named categories. Minor part of data analysis.
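A tiny example of the predictive flavour above: fitting a straight line to past figures with Python's statistics module and forecasting the next period. The numbers are illustrative only; statistical modelling in practice is far richer than this.
```python
from statistics import linear_regression  # requires Python 3.10+

# Past sales (illustrative): periods 1-4.
periods = [1, 2, 3, 4]
sales   = [100, 110, 125, 135]

# Fit a simple trend line and forecast period 5.
slope, intercept = linear_regression(periods, sales)
forecast = slope * 5 + intercept
print(round(forecast, 1))
```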
How does finance help manage the volume issue with big data?
- Assist mgmt in determining infra & storage needs
- Monitor volume & predict future storage needs
- Internal audit ensure the business doesn't run out of storage
1 solution to volume issue with big data
Cloud storage - can be scaled up or down as needed.
How does finance help manage the velocity issue with big data?
Provide IT with benchmarking insights on how good the networks and comms systems are - data needs to be streamed as fast as the business requires.
How does finance help manage the variety issue with big data?
Ensure data connection and visualisation tools can make sense of different forms of data to meet the org's insight needs.
How does finance help manage the veracity issue with big data?
Finance cleanse data before it can be trusted as accurate.
3 Qs that are asked to ensure effective data visualisations
- Who is the audience - what detail is needed & what is their technical ability?
- How do they want the data - in what form?
- What outcome do they want - what decisions are they trying to make?
4 reasons why finance performs the creation and maintenance of data visualisation roles (even though many diff business functions could do this)
- Finance is responsible for insights = should control the visualisations
- Experts = can spot & amend errors themselves = faster
- Improves comms by finance - being in charge of the visualisations gives finance a deeper understanding of the data behind them, so it can provide richer and higher-level analysis
- Finance knows the org-wide strategy - not just 1 function
___ is a key aspect of finance creating data visualisations
business partnering