Data Cloud Accredited Professional Exam Flashcards
What does Data Cloud do?
Salesforce Data Cloud gives every cloud access to unified customer data to power intelligence and automation at scale
What are some benefits of using Salesforce Data Cloud?
- Reduce costs by using AI-powered predictions, recommendations, and insights
- Increase productivity by connecting real-time data with Salesforce Flow to automate business processes
- Reduce time to market by practicing low-code development
What is the first step involved in Data Cloud?
Discover all of the company’s data sources in their various locations so the data can be brought together
What are the data patterns that Data Cloud consistently supports?
Applications, Real time decisioning, ML processing, Event processing, Analytics, Big data lakes
What are the inputs and outputs of Data Cloud?
Inputs: Data from all across the company’s various data sources
Outputs: A unified customer database that is accessible to other systems and provides advanced platform functionality to power success
What are the methods that allow users to connect their data?
Connectors, APIs, Mulesoft
What is a connector and what do they do?
Data Cloud connectors make the process of integrating data from common sources fast and easy without needing to rely on a data integration team
What are Ingestion APIs and what do they do?
Data Cloud gives users the ability to connect data from any source with its Ingestion APIs. Streaming APIs can be used to send real-time event information from a website or mobile app. They use a fire-and-forget pattern to synchronize micro-batches of updates between source systems and Data Cloud in near real time. Data is processed approximately every 15 minutes
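A minimal Python sketch of what sending one streaming event might look like. The tenant endpoint, connector name, object name, field names, and token below are hypothetical placeholders; check the Ingestion API documentation for your org's actual values and payload schema.

```python
import requests

# Hypothetical values: replace with your org's Ingestion API endpoint,
# connector/object names, and a valid OAuth access token.
TENANT_ENDPOINT = "https://<tenant>.c360a.salesforce.com"
CONNECTOR = "MyIngestionConnector"   # assumed connector API name
OBJECT_NAME = "cart_events"          # assumed object in the connector's schema

def send_streaming_event(access_token: str, record: dict) -> None:
    """Fire-and-forget: POST one event; Data Cloud micro-batches accepted
    records and processes them roughly every 15 minutes."""
    url = f"{TENANT_ENDPOINT}/api/v1/ingest/sources/{CONNECTOR}/{OBJECT_NAME}"
    resp = requests.post(
        url,
        json={"data": [record]},  # records are wrapped in a "data" array
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()  # an accepted response means the event was queued

send_streaming_event("<token>", {
    "event_id": "evt-001",
    "event_datetime": "2024-05-01T12:00:00Z",
    "action": "add_to_cart",
})
```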
What is MuleSoft and what does it do?
The MuleSoft Anypoint Platform contains dozens of preconfigured connectors for common platforms, making it easy and quick to transfer data into Data Cloud from systems outside of Salesforce
What is the Customer 360 Data Model?
The Customer 360 Data Model is the standard Data Cloud data model that supports the interoperability of data across applications. It reduces complexity by providing a standardized data interoperability guideline
What is a DMO and how is it created?
A DMO is a Data Model Object. It is created when data ingested into Data Cloud is mapped to objects based on the Customer 360 Data Model, resulting in a normalized entity within the customer data model
What is identity resolution in Data Cloud?
Identity resolution is the process of identifying and linking all the different data records that refer to the same real-world object or entity
What is householding in Data Cloud?
Data Cloud has the ability to group together individuals who are likely members of the same household, family, or other group. This is done by analyzing data such as addresses, phone numbers, or other identifying information
What is a Data Cloud use case for the customer service industry?
Better solve a customer’s case by accessing the customer’s real-time unified profile, with information about past service interruptions, marketing engagements, and even loyalty or lifetime value scores
What is a Data Cloud use case for the sales industry?
Drive better-informed sales conversations by leveraging the customer’s unified profile; engagement data such as web browsing and email opens helps identify what conversation to have with the customer next
What is a Data Cloud use case for the marketing industry?
Sending super-personalized communications across web and email by leveraging the customer’s unified profile
What is a Data Cloud use case for the commerce industry?
Inform automatic, personalized experiences, like tailoring prices and promotions and making product recommendations, using real-time unified profiles
What is a Data Cloud use case for the analytics industry?
Use Tableau across the vast scale of real-time data to better understand the why behind customers’ actions, and share insights across teams to better inform future targeting efforts and inject some magic into analytics
What is considered “using the right data”?
Using the right data means using first party data
What are data ethics?
Data ethics are moral guidelines that govern the gathering, protection, use, and sharing of data and how it affects individuals
What are the principles of data ethics?
- Use and collect individual information appropriately: give customers control of their preferences and hold organizations accountable for honoring those preferences
- Provide clear exchange of value for data: Ensure customers receive clear benefits in exchange for their data
- Treat sensitive data carefully: sensitive data can include protected status, race, health, veteran status, gender identity, sexual orientation, religion, ethnicity, citizenship, and political affiliation
- Collect and use only what is necessary: if it is not going to be used, do not collect it
- Choose partners carefully: If data is going to be shared with third party advertisers, be intentional about selecting partners and understanding the chain of custody for the data
What is Data Cloud built on top of?
Data Cloud is built on top of the Salesforce Platform, combining core Salesforce org capabilities, a data lake that stores ingested data and performs transformations, and API and productized integrations with other Salesforce products
When does it make sense to provision Data Cloud inside of an org currently used by a business?
It makes sense to provision Data Cloud inside a Salesforce org used by a business when: the customer has a single line of business, customer data is housed in a single Salesforce org, and the primary use cases require OOTB Data Cloud LWCs and search capabilities for service agents
When does it make sense to house Data Cloud in a new home org?
It makes sense to use a new home org when: multiple customer orgs exist, a highly complex enterprise architecture exists, the Data Cloud administration users are different from the Salesforce admin users, or the existing data org is highly customized
What is the initial setup process for Data Cloud?
- Set up your Data Cloud account
- Configure additional users by creating profiles
- Set up connectors to connect data sources
What are the four Data Cloud Permission Sets?
- Data Cloud Platform Admin
- Data Cloud Platform Data Aware Specialist
- Data Cloud Platform Marketing Manager
- Data Cloud Platform Marketing Specialist
What permissions are included in the Data Cloud Platform Admin permission set?
The Data Cloud Platform Admin is responsible for the setup of the application, user provisioning, and assigning permission sets within the system. This role has access to Salesforce Sales Cloud and Salesforce Service Cloud, as well as other integrated systems within the core cloud platform
What permissions are included in the Data Cloud Platform Data Aware Specialist permission set?
The Data Cloud Data Aware Specialist permission set manages permissions related to creating data streams, mapping data to the data model, creating identity resolution rulesets for unified profiles, and creating calculated insights
What is included in the Data Cloud Marketing Manager permission set?
Permissions related to the overall segmentation strategy, including creating activation targets and activations; it also includes the permissions of the Data Cloud Marketing Specialist permission set
What is included in the Data Cloud Platform Marketing Specialist Permission Set?
Responsible for creating segments in Data Cloud
What are the different ingestion patterns that Data Cloud connectors can have?
Batch - the CRM connector and the Marketing Cloud connector ingest updates hourly, so they follow the batch pattern
Near real time - the Ingestion API processes small micro-batches of records every 15 minutes, so it can be considered near real time
Real time - the Web and Mobile connectors can process engagement data every 2 minutes, so they follow a real-time ingestion pattern
Which Permission Set manages the overall segmentation strategy and identifies the target campaigns?
A. IT manager
B. Marketing Manager
C. Data Aware Specialist
D. Marketing Specialist
B. The Marketing Manager Permission set manages the overall segmentation strategy and identifies the target campaigns
Which tab in the navigation manages the data coming into Data Cloud?
A. Segments
B. Data Streams
C. Activation
D. Data Model
B. Data Streams tab manages the data coming into Data Cloud
What must the first Admin user do first when setting up users in Data Cloud?
A. Assign permission sets
B. Setup each user as an admin
C. Create profiles for each user
D. Configure Data Sources
C. Create profiles for each user role
Which of the following reflects the correct order of the Data Cloud Setup process flow?
A. Configure Admin user, provision and complete Data Cloud setup, configure additional users & permissions, and connect to relevant Salesforce Clouds
B. Connect to relevant Salesforce Clouds, provision and complete Data Cloud setup, configure Admin user, and configure additional users and permissions
C. Provision and complete Data Cloud setup, connect to relevant Salesforce Clouds, and configure Admin user
D. Configure additional users and permissions, configure Admin user, provision and complete Data Cloud setup, and Connect to relevant Salesforce Clouds
A. Configure Admin user, provision and complete Data Cloud setup, configure additional users & permissions, and connect to relevant Salesforce Clouds
Which connection can a Data Aware Specialist set up to ingest data from without needing the Admin to explicitly set up the connection?
A. Google Cloud Storage
B. B2C Commerce
C. Amazon S3
D. Salesforce CRM
C. Amazon S3
When using the GCS Connector, how frequently is data from Google Cloud Storage synchronized with Data Cloud?
A. Every 15 minutes
B. Every 1 hour
C. Every 12 hours
D. Every 24 hours
B. Every hour
What two scenarios would you recommend when provisioning Data Cloud in an existing CRM Data Org?
A. Existing CRM Data Org has been highly customized
B. Customer Data is housed in a single Salesforce Org
C. Customer is using Loyalty Management and Promotions
D. Customer has a need to connect multiple CRM orgs
B. Customer Data is housed in a single Salesforce org
C. Customer is using Loyalty Management and Promotions
Which permission set is required to set up an External Activation Platform?
A. Data Cloud Platform Admin
B. Data Cloud Data Aware Specialist
C. Data Cloud Marketing Manager
D. Data Cloud Marketing Specialist
A. Data Cloud Platform Admin
What is normalized data?
Normalized data is divided into multiple tables, with established relationships to reduce redundancy and inconsistency
What is denormalized data?
Denormalized data is combined into a single table to make data retrieval faster. The rows contain relational data as column attributes; this is commonly known as a spreadsheet view
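To make the distinction concrete, a small Python illustration using invented customer and order data:

```python
# Normalized: separate tables related by a key; each fact is stored once.
customers = {"C1": {"name": "Ada", "email": "ada@example.com"}}
orders = [
    {"order_id": "O1", "customer_id": "C1", "total": 40},
    {"order_id": "O2", "customer_id": "C1", "total": 25},
]

# Denormalized: one flat table ("spreadsheet view"); customer attributes
# are repeated on every order row, which speeds reads but duplicates data.
orders_flat = [
    {"order_id": "O1", "name": "Ada", "email": "ada@example.com", "total": 40},
    {"order_id": "O2", "name": "Ada", "email": "ada@example.com", "total": 25},
]
```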
What is the order of steps to configure a data source in Data Cloud?
Step 1: Select the data source
Step 2: Select the data source object
Step 3: Define the data source’s properties
Step 4: Confirm the data source object schema
Step 5: Apply the necessary row-level transformations
Step 6: Configure updates to the data source object
When do you have the opportunity to use the connectors that were set up during the setup process?
On the New Data Stream page you have the opportunity to use those connections, as well as potentially new additional options. For connections not configured in Data Cloud, such as Amazon S3, you may have to specify authorization credentials
For Data Sources configured via connectors, how are you able to select the Data Source object?
For Data Sources connected via connectors, you are presented with a dialog box to select a specific object
When configuring a new S3 data stream, when can the directory attribute be left blank?
The directory attribute can be left blank if the data stream file is located in the root directory of the S3 bucket
What compression standards can be used for Data Stream files?
Data Stream files can be compressed with Zip and GZ compression standards
What are the two key fields when defining the data stream properties?
The source name and category are the two key fields when defining data stream properties
What should you keep in mind when providing a name during configuration of a data stream?
Choose a name that identifies the source system where the data originated, for example POS Terminal or eCommerce Store
What are the three options when specifying a category during a data stream’s configuration?
Profile data, engagement data, and other data
When should the profile category be used for data sources?
Use the profile data category for data sources that provide information about individuals, with their identifiers, demographic information, and profile attributes, as well as contact points such as email and phone number
When should the engagement data category be used for data sources?
The engagement category should be used for data sources that provide time-series data points; these could include customer transactions, engagement activities, and web browsing history
How do you avoid duplicate records in Data Cloud and ensure that events are associated with the actual DateTime when they happened?
To prevent duplicate records within Data Cloud and to ensure that events are associated with the DateTime when they took place, it is important to specify the field within the data source that contains immutable values
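As an illustration of why the field must be immutable, a Python sketch with invented field names; the anti-pattern stamps processing time instead of the original event time:

```python
from datetime import datetime, timezone

# Illustrative engagement records (field names invented). The event
# timestamp is captured once at the source and never recalculated, so
# re-ingesting the same file keeps the event anchored to when it happened.
good_event = {
    "event_id": "evt-001",                     # stable primary key
    "event_datetime": "2024-05-01T12:00:00Z",  # immutable: set when the event occurred
    "action": "email_open",
}

# Anti-pattern: stamping processing time instead of event time. Every
# reprocessing run produces a different timestamp, so the same event can
# show up as a duplicate tied to the wrong moment.
bad_event = {
    "event_id": "evt-001",
    "event_datetime": datetime.now(timezone.utc).isoformat(),  # changes each run
    "action": "email_open",
}
```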
When should the other data category be used?
The other data category should be used for all other data sources that do not fit in the engagement or profile categories, such as engagement data with mutable fields, or data about products, store locations, etc.
How are suggested data types created in Data Cloud?
Once a data source is created, the Data Cloud Platform evaluates the data set and presents a list of fields with their suggested data types for the Data Source Object (DSO)
What are the data types supported in Data Cloud?
Data Cloud supports Text, Number, Date, and DateTime data types
Text Data Type description
Stores any kind of text data. It can contain both single-byte and multibyte characters that the locale supports. Zero length strings (“”) and no value are treated as empty strings.
Number Data Type description
Stores numbers with a fixed scale of 18 and precision of 38. Scale represents the number of fractional digits. Precision represents the count of digits, regardless of the location of the decimal point.
If the data record has a number that’s out of range or a non-numerical value, the value is null.
Date Data Type description
Holds the calendar date without a time part or time zone.
Example: yyyy-MM-dd
DateTime Data Type description
Stores an instant in time expressed as a calendar date and time of day. A valid datetime must include the time part and time zone. If they’re not included, it’s inferred as 00:00:00 UTC.
What is a Primary Key in Data Cloud?
This value uniquely identifies a given record within the data set and establishes whether a new record from the data source should be added to the DSO or if an existing one should be updated
What is a record modified field?
This attribute acts as a reference point when the system is deciding whether to update a record, continuously calibrating the latest version of the record. It is also useful when data might be received out of order, helping prevent overwriting of the information with an older version
What is an Organization Unit Identifier?
If your data set includes an attribute that provides a reference to an organization unit, such as Marketing Cloud business unit ID (MID), you can specify that attribute in the Organization Unit Identifier configuration field of the data stream
What is a header label?
References the raw source data, cross-referencing between Data Cloud and the source data
What is a Field Label?
Displays editable values within the Data Cloud user interface
What is a Field API Name?
References the field when you need to interact with the object via API
What is a composite key?
The value that is produced by combining values from more than one field together
With a formula field, can you reference a field from a different record than the one being evaluated?
No. For any given record processed, the formula context only enables access to the fields of that single record; no other records from the same data stream, or other objects already configured in Data Cloud, can be exposed to the formula execution context
What are some occasions to create a formula field at the time of ingestion?
- Primary keys and missing attributes
- Normalization
- Standardization
If source data is missing a primary key, or a composite key is needed, what is the solution to create those attributes needed for ingestion?
To create attributes needed for the ingestion or mapping of source data, consider using functions like CONCAT() or NOW() in a formula field
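A sketch of the composite-key idea, expressed in Python rather than Data Cloud's formula syntax; the field names (store_id, receipt_no) and key format are invented for illustration:

```python
# The idea behind a composite-key formula field, expressed in Python.
# In Data Cloud the equivalent would use the CONCAT() formula function.
def composite_key(record: dict) -> str:
    # Deterministic: the same source record always yields the same key,
    # so re-ingestion updates (upserts) rather than duplicates the row.
    return f"{record['store_id']}-{record['receipt_no']}"

row = {"store_id": "S042", "receipt_no": "98311", "total": 19.99}
row["pos_transaction_key"] = composite_key(row)  # e.g. "S042-98311"
```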
Why can the UUID() function not be used as a means to generate a primary key value in a majority of use cases?
UUID() cannot be used to generate a primary key value in a majority of use cases because, as a formula field function, it produces a new value every time the record is processed. This means that instead of an upsert the platform performs an insert, adding a duplicate record in Data Cloud
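A Python sketch of the contrast, using the standard uuid and hashlib modules; the field names are invented. A random key changes on every run, while a key derived from stable source fields does not:

```python
import hashlib
import uuid

record = {"store_id": "S042", "receipt_no": "98311"}

# UUID()-style key: a fresh random value on every processing run, so the
# platform sees a "new" record each refresh and inserts a duplicate.
key_run1 = str(uuid.uuid4())
key_run2 = str(uuid.uuid4())
assert key_run1 != key_run2  # different every time -> insert, not upsert

# Deterministic key derived from stable source fields: identical across
# runs, so the platform matches the existing row and upserts instead.
def stable_key(rec: dict) -> str:
    raw = f"{rec['store_id']}|{rec['receipt_no']}"
    return hashlib.sha256(raw.encode()).hexdigest()

assert stable_key(record) == stable_key(record)  # same every time -> upsert
```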
What is the value that is produced by combining values from more than one field together?
Composite key
What is used to create attributes needed for the ingestion and mapping of source data?
Primary keys and missing attributes
What is used to simplify segmentation and enhance usability by bucketing or grouping source data values?
Normalization
What is the process used to ensure consistent, clean data values and formatting for segmentation and activation?
Standardization
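Two quick Python sketches of these transformation styles; in Data Cloud they would be formula fields evaluated at ingestion, and the bucket thresholds, field formats, and function names here are invented:

```python
def age_bucket(age: int) -> str:
    """Normalization: group raw values into segmentation-friendly buckets."""
    if age < 25:
        return "18-24"
    if age < 45:
        return "25-44"
    return "45+"

def standardize_phone(raw: str) -> str:
    """Standardization: enforce one clean, consistent format."""
    digits = "".join(ch for ch in raw if ch.isdigit())
    return f"+1{digits[-10:]}" if len(digits) >= 10 else ""

print(age_bucket(31))                       # "25-44"
print(standardize_phone("(555) 123-4567"))  # "+15551234567"
```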
What are the available options when selecting the refresh mode in the Data Source Object?
A data stream can be scheduled to refresh hourly, daily, weekly, or monthly, but can also be configured with a None option
When configuring a data stream for an Amazon S3 data source, what additional settings are available on the last step of the configuration?
Authentication Details, Schedule, and Aggregate Node
After files have been retrieved from an Amazon S3 data source, how can you track which files have and have not been picked up, and then pull only the new files accordingly?
By selecting the Refresh only new files checkbox when configuring updates to a data stream
When configuring an update to an Amazon S3 data stream, how do you provide an alert if no files are located in the directory?
You can trigger an alert if files are not found in the directory by selecting the checkbox “Log an Error if no file is found”.
This might be enabled for important data streams where you would want to be informed if the data was unavailable for some reason and the data was expected on a defined schedule.
While configuring an update to an Amazon S3 data source, if the initial file used during setup had headers but future files will not, how can you accommodate this?
Enable headerless file retrieval; do note that in such scenarios it is expected that the order of the data columns does not change over time
When dealing with an Amazon S3 dataset, how do you retrieve a file immediately upon saving the dataset instead of waiting for the first scheduled run?
Following the frequency setting, an additional checkbox, “Run Initial Refresh Immediately,” ensures the file is retrieved immediately upon saving the data stream instead of waiting for the first scheduled run. This is similar to the Refresh Now button on the data stream record home page
What is a good resource to use when troubleshooting data ingestion problems?
Refresh History
What data stream refresh setting is used when the data set contains only new or newly updated records?
Upsert
How do you clear the table and replace the existing records with new records?
Full refresh
How do you instruct a file to be retrieved immediately upon saving the data stream?
Refresh initial file immediately
How do you enable the platform to combine files for optimized processing?
Aggregate Mode
What are the three ways the Salesforce CRM connector enables the ingestion of data?
Starter Data Bundles, Direct Object Ingestion, and Data Kits
What do the Starter Data Bundles provide access to?
They provide access to sales, service, and loyalty data, enabling highly personalized messaging experiences for specific customer segments
What does the Sales Cloud Bundle provide access to?
Installs data streams for the account, contact, and lead objects
What are you able to accomplish from the Schema Review dialog box when configuring a new data starter bundle?
It enables the selection or deselection of the fields for each respective object, allowing the update of the Field Label and Field API Name while preventing any modifications to the field type
Which objects are installed with the Service Cloud Bundle?
Account, Case, Contact
Where are you allowed to select/deselect Account fields and update the Field Label and Field API Name for a Data Stream?
On the Account Schema Review Dialog page
What does the Salesforce Data Cloud Segmentation Engine allow users to do?
- Query all data in the system
- Create granular segments of customers
- Understand the data composition
How can marketers use immediate population results?
Marketers can automatically use attributes from Sales Cloud, Service Cloud, Commerce Cloud, Loyalty Cloud, Enterprise Resource Planning, and modeled data to get immediate population results
How can activation be accomplished easily?
Activation is as easy as clicking a button to send segment data along for activation in messaging, advertising, personalization, and analytics systems
How can breaking down data be accomplished?
Use segmentation to break down your data into useful segments to understand, target, and analyze your customers. Create segments on any entities from your data model and then publish them on a chosen schedule or as needed
How do you get immediate segment populations?
Run unlimited queries and get immediate segment populations; this unlocks the ability to test and learn
Does activation require SQL coding to send segment data?
No, activation does not require SQL coding and is as easy as clicking a button to send segment data along for activation in messaging, advertising, personalization, and analytics systems
Do users use Einstein-calculated attributes to add modeled data?
Yes, users use Einstein calculated attributes to add modeled data
Can publishing segments only be performed on a schedule?
No, segments do not have to be published on a schedule
What is the purpose of segmentation?
Segmentation creates segments to understand, target, and analyze customers
How can users query data in the system?
The segmentation engine lets users query data in the system
What are some use cases for segmentation?
- Basic email engagement - find Unified Individuals with at least 3 emails opened within the last 30 days
- Exclusion of recent purchasers - find Unified Individuals who have not purchased in the last 90 days
- Date-bound purchase aggregation - find Unified Individuals with at least $1,000 in purchases in the last year
- Data lineage, like point-of-sale purchases - look for customers with $500 in purchases last year from point of sale (not online)
- Alternate channel engagement - find Unified Individuals who are active on the mobile channel but not in the last 2 months
- New loyal customers - find loyal customers who made a recent purchase without urgent service cases
- Highly engaged individuals - find highly engaged individuals with large purchase sums; filter for individuals with a high engagement rate who have made at least 5 purchases over $500
- Seasonal spenders - opens by subject line and total purchases by month; filter for seasonal spenders who opened an email with the subject line Holiday Flash Sale and made purchases of more than $1,000 in December
How are segments created?
Segments are created by completing the fields in the new segment window and specifying your criteria in the Segmentation Canvas
What is the segment target?
The segment target defines the target entity used to build your segment. You can choose any entity marked as profile during ingestion.
During segmentation, what determines which attributes are available in the attribute library that you can use as segmentation filters?
The chosen entity target determines which attributes are available
What segmentation target needs to be selected to take advantage of identity resolution?
The Unified Individual entity needs to be selected as segmentation target to take advantage of identity resolution
What determines how often a segment should re-filter for individuals that meet the criteria and notify activation targets that a refreshed segment is available?
The publishing schedule determines how often a segment re-filters
What is the use case for the Segment Canvas in Salesforce Data Cloud?
On the Segment Canvas in Data Cloud, use direct and related attributes to narrow down a created segment to your target audience
Where can you see the direct (1:1) and related (1:Many) segmented target data that has been mapped into the Data Cloud Data Model and marked for use in segmentation?
To see the direct and related segmented target data use the Attribute Library in the Segmentation Canvas
What does Rule Builder allow you to do?
- Define your target audience using your 1:1 and 1:Many data, with features like on-the-fly aggregates, filter frequency, relative date expressions, and nested operators
- Use Calculated Insights that have been created in your segment criteria.
How can you create relationships between your related attributes?
Containers allow you to create relationships between your related attributes
How do you request a count of the segment targets that are in your segment?
Count Segment allows you to request a count of the segment targets that are in your segment, based on your current data ingested and defined segment filters
How do you see the individual-level details on the records within a specific segment?
The segment count shows the overall count of members who fall into a specific segment. To see the individual-level details on the records within a specific segment, it needs to be published and activated
How do you ensure that your segment is available in activation targets like Marketing Cloud or an Amazon S3 bucket?
Publishing your segment, in either a scheduled or an ad hoc fashion, makes the segment available in activation targets such as S3 or Marketing Cloud
True or false: Segments do not publish by default
True
True or False: You should never use the Unified Individual entity as a segment target
False
True or False: Attribute Library contains data that has been mapped and marked for segmentation
True
True or False: The rule builder defines your target audience using only 1:1 data
False
True or false: Count segment can be used to see which individual level record details are in a segment
False
True or False: Segments are always published on a schedule
False
What details can be seen on the segments home page?
Status, population, last publish time, and other relevant details