DA1 Flashcards
Northern Trail Outfitters has decided to build a channel sales portal with the following requirements:
- External resellers are able to authenticate to the portal with a login.
- Lead data, opportunity data and order data are available to authenticated users.
- Authenticated users may need to run reports and dashboards.
- There is no need for more than 10 custom objects or additional file storage.
Which Community Cloud license type should a data architect recommend to meet the portal requirements?
A. Customer Community.
B. Lightning External Apps Starter.
C. Customer Community Plus.
D. Partner Community.
D. Partner Community.
- https://www.salesforceben.com/salesforce-experience-cloud-licences/
- Partner Community: occasionally referred to as Partner Relationship Management. Anything where suppliers and partners need to see Opportunity data.
Universal Containers (UC) has multiple Salesforce orgs that are distributed across regional branches. Each branch stores local customer data inside its org’s Account and Contact objects. This creates a scenario where UC is unable to view customers across all orgs.
UC has an initiative to create a 360-degree view of the customer, as UC would like to see Account and Contact data from all orgs in one place.
What should a data architect suggest to achieve this 360-degree view of the customer?
A. Consolidate the data from each org into a centralized datastore.
B. Use Salesforce Connect’s cross-org adapter.
C. Build a bidirectional integration between all orgs.
D. Use an ETL tool to migrate missing Accounts and Contacts into each org.
A. Consolidate the data from each org into a centralized datastore.
A large retail company has recently chosen Salesforce as its CRM solution. They have the following record counts:
- 2,500,000 accounts
- 25,000,000 contacts
When doing an initial performance test, the data architect noticed an extremely slow response for reports and list views.
What should a data architect do to solve the performance issue?
A. Load only the data that the user is permitted to access.
B. Add custom indexes on frequently searched account and contact objects fields.
C. Limit data loading to the 2000 most recently created records.
D. Create a skinny table to represent account and contact objects.
B. Add custom indexes on frequently searched account and contact objects fields.
A large retail B2C customer wants to build a 360 view of its customers for its call center agents. Customer interactions are currently maintained in the following systems:
- Salesforce CRM
- Custom billing solution
- Customer Master Data Management (MDM)
- Contract Management system
- Marketing solution
What should a data architect recommend to help uniquely identify customers across multiple systems?
A. Store the Salesforce ID in all the solutions to identify the customer.
B. Create a custom object that will serve as a cross-reference for the customer ID.
C. Create a customer database and use this ID in all systems.
D. Create a custom field as an external ID to maintain the customer ID from the MDM solution.
D. Create a custom field as an external ID to maintain the customer ID from the MDM solution.
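The external ID approach works because Salesforce can upsert records keyed on that field, so every integrated system can address a customer by the MDM ID instead of a Salesforce record ID. A minimal sketch of the REST upsert-by-external-ID request, assuming a hypothetical custom field `MDM_Customer_Id__c` and API version v59.0:

```python
import json

API_VERSION = "v59.0"  # assumed API version

def build_upsert_request(sobject: str, ext_id_field: str, ext_id_value: str, fields: dict):
    """Build the REST upsert-by-external-ID request (a PATCH call).

    Upserting on an external ID field lets the MDM customer ID act as the
    cross-system key: Salesforce matches on that field's value instead of
    its own record IDs, creating or updating the record as needed.
    """
    path = f"/services/data/{API_VERSION}/sobjects/{sobject}/{ext_id_field}/{ext_id_value}"
    return "PATCH", path, json.dumps(fields)

# Hypothetical custom external ID field holding the MDM customer ID.
method, path, body = build_upsert_request(
    "Contact", "MDM_Customer_Id__c", "MDM-000123", {"LastName": "Rivera"}
)
print(method, path)
```

The same pattern applies from any system that can call the Salesforce REST API, which is what makes the MDM ID a reliable cross-system identifier.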
Northern Trail Outfitters (NTO) has implemented Salesforce for its sales users. The opportunity management in Salesforce is implemented as follows:
- Sales users enter their opportunities in Salesforce for forecasting and reporting purposes.
- NTO has a product pricing system (PPS) that is used to update the Opportunity Amount field on opportunities on a daily basis.
- PPS is the trusted source within NTO for Opportunity Amount.
- NTO uses Opportunity Forecasts for its sales planning and management.
Sales users have noticed that their updates to the Opportunity Amount field are overwritten when PPS updates their opportunities.
How should a data architect address this overriding issue?
A. Create a custom field for opportunity amount that sales users update, keeping it separate from the field that PPS updates.
B. Create a custom field for opportunity amount that PPS updates, keeping it separate from the field that sales users update.
C. Change the Opportunity Amount field to Read Only for sales users using field-level security.
D. Change the PPS integration to update the Opportunity Amount field only when its value is NULL.
C. Change the Opportunity Amount field to Read Only for sales users using field-level security.
Northern Trail Outfitters would like to retrieve their Salesforce org’s metadata programmatically for backup within a version control system.
Which API is the best fit for accomplishing this task?
A. Metadata API
B. Tooling API
C. Bulk API in serial mode
D. SOAP API
A. Metadata API
The main purpose of Metadata API is to move metadata between Salesforce orgs during the development process. Use Metadata API to deploy, retrieve, create, update, or delete customization information, such as custom object definitions and page layouts. Metadata API doesn’t work directly with business data. To create, retrieve, update, or delete records such as accounts or leads, use SOAP API or REST API.
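A Metadata API retrieve is driven by a `package.xml` manifest listing which metadata types and members to pull. A minimal sketch of building that manifest programmatically (the type names shown are standard; the choice of types and version 59.0 are assumptions for illustration):

```python
# Sketch: build a minimal package.xml manifest that a Metadata API retrieve
# (or the Salesforce CLI) can consume to pull metadata for version-control backup.
from xml.sax.saxutils import escape

def build_package_xml(types: dict, api_version: str = "59.0") -> str:
    """types maps a metadata type name to a list of members ('*' = all)."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<Package xmlns="http://soap.sforce.com/2006/04/metadata">']
    for name, members in sorted(types.items()):
        parts.append("  <types>")
        for m in members:
            parts.append(f"    <members>{escape(m)}</members>")
        parts.append(f"    <name>{escape(name)}</name>")
        parts.append("  </types>")
    parts.append(f"  <version>{api_version}</version>")
    parts.append("</Package>")
    return "\n".join(parts)

manifest = build_package_xml({"CustomObject": ["*"], "Layout": ["*"]})
print(manifest)
```

Passing this manifest to a Metadata API `retrieve()` call (or a CLI retrieve command) returns a zip of metadata files that can be committed to the version control system.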
Universal Containers (UC) has a Salesforce org with multiple automated processes defined for group membership processing. UC also has multiple admins on staff that perform manual adjustments to the role hierarchy. The automated tasks and manual tasks overlap daily and UC is experiencing “lock errors” consistently.
What should a data architect recommend to mitigate these errors?
A. Ask Salesforce Support for additional CPU power.
B. Enable granular locking.
C. Remove SOQL statements from APEX loops.
D. Enable sharing recalculations.
B. Enable granular locking.
- By default, the Lightning Platform locks the entire group membership table to protect data integrity when Salesforce makes changes to roles and groups. This locking makes it impossible to process group changes in multiple threads to increase throughput on updates. When the granular locking feature is enabled, the system employs additional logic to allow multiple updates to proceed simultaneously if there is no hierarchical or other relationship between the roles or groups involved in the updates. Administrators can adjust their maintenance processes and integration code to take advantage of this limited concurrency to process large-scale updates faster, all while still avoiding locking errors.
- The key advantages of granular locking are that:
- Groups that are in separate hierarchies can be manipulated concurrently.
- Public groups and roles that do not include territories are no longer blocked by territory operations.
- Users can be added concurrently to territories and public groups.
- User provisioning can now occur in parallel.
- Portal user creation requires locks only if new portal roles are being created.
- Provisioning new portal users in existing accounts occurs concurrently.
- A single long-running process, such as a role delete, blocks only a small subset of operations.
A large healthcare provider wishes to use Salesforce to track patient care. The following actors are in Salesforce:
- Payment Providers: organizations that pay for the care given to patients.
- Doctors: they provide care plans for patients and need to support multiple patients. They are given access to patient information.
- Patients: individuals who need care.
A data architect needs to map the actor to Salesforce objects.
What should be the optimal selection by the data architect?
A. Patients as Contacts, Payment providers as Accounts, Doctors as Accounts.
B. Patients as Person Accounts, Payment providers as Accounts, Doctors as Contacts.
C. Patients as Person Accounts, Payment providers as Accounts, Doctors as Person Accounts.
D. Patients as Accounts, Payment providers as Accounts, Doctors as Person Accounts.
C. Patients as Person Accounts, Payment providers as Accounts, Doctors as Person Accounts.
A customer is operating in a highly regulated industry and is planning to implement Salesforce. The customer information maintained in Salesforce includes the following:
- Personally identifiable information (PII)
- IP restrictions on profiles organized by geographic location
- Financial records that need to be private and accessible only by the assigned Sales associate.
Enterprise security has mandated access to be restricted to users within a specific geography with detailed monitoring of user activity. Additionally, users should not be allowed to export information from Salesforce.
Which 3 Salesforce Shield capabilities should a data architect recommend?
Choose 3 answers:
A. Event Monitoring to monitor all user activities.
B. Restrict access to Salesforce for users outside a specific geography.
C. Prevent Sales users from accessing customer PII information.
D. Transaction Security policies to prevent export of Salesforce data.
E. Encrypt sensitive customer information maintained in Salesforce.
A. Event Monitoring to monitor all user activities.
B. Restrict access to Salesforce for users outside a specific geography.
D. Transaction Security policies to prevent export of Salesforce data.
- Event Monitoring is one of many tools that Salesforce provides to help keep your data secure. It lets you see the granular details of user activity in your organization. We refer to these user activities as events. You can view information about individual events or track trends in events to swiftly identify abnormal behavior and safeguard your company's data.
- So, what are some of the events that you can track? Event Monitoring provides tracking for many types of events, including:
- Logins
- Logouts
- URI (web clicks in Salesforce Classic)
- Lightning (web clicks, performance, and errors in Lightning Experience and the Salesforce mobile app)
- Visualforce page loads
- Application programming interface (API) calls
- Apex executions
- Report exports
- Fields encrypted with classic encrypted custom fields can still be exported in decrypted form if the user performing the export has the 'View Encrypted Data' permission.
To address different compliance requirements, such as the General Data Protection Regulation (GDPR), personally identifiable information (PII), the Health Insurance Portability and Accountability Act (HIPAA), and others, a Salesforce customer decided to categorize each data element in Salesforce with the following:
- Data owner
- Security level (i.e., confidential)
- Compliance types (GDPR, PII, HIPAA)
A compliance audit would require Salesforce admins to generate reports to manage compliance.
What should a data architect recommend to address this requirement?
A. Use Metadata API to extract field attribute information, and use the extract to classify and build reports.
B. Use field metadata attributes for compliance categorization, data owner, and data sensitivity level.
C. Create a custom object and field to capture necessary compliance information and build custom reports.
D. Build reports for field information, then export the information to classify and report for Audits.
B. Use field metadata attributes for compliance categorization, data owner, and data sensitivity level.
Universal Containers has a legacy client-server app with a relational database that needs to be migrated to Salesforce.
What are the 3 key actions that should be done when data modeling in Salesforce?
Choose 3 answers
A. Identify data elements to be persisted in Salesforce.
B. Map legacy data to Salesforce objects.
C. Map legacy data to Salesforce custom objects.
D. Work with legacy application owner to analyze the legacy data model.
E. Implement legacy data model within Salesforce using custom fields.
A. Identify data elements to be persisted in Salesforce.
B. Map legacy data to Salesforce objects.
E. Implement legacy data model within Salesforce using custom fields.
Universal Containers uses classic encryption for custom fields and is leveraging weekly data export for data backups. During a data validation process, UC discovered that encrypted field values are still being exported as part of the data export.
What should a data architect recommend to make sure decrypted values are exported during data export?
A. Set a standard profile for the data migration user, and assign View Encrypted Data
B. Create another field to copy data from encrypted field and use this field in export
C. Leverage Apex class to decrypt data before exporting it.
D. Set up a custom profile for the data migration user, and assign View Encrypted Data.
D. Set up a custom profile for the data migration user, and assign View Encrypted Data.
A customer wants to maintain geographic location information including latitude and longitude in a custom object.
What would a data architect recommend to satisfy this requirement?
A. Create formula fields with geolocation function for this requirement.
B. Create custom fields to maintain latitude and longitude information.
C. Create a geolocation custom field to maintain this requirement.
D. Recommend app exchange packages to support this requirement.
C. Create a geolocation custom field to maintain this requirement.
The geolocation custom field allows you to identify locations by their latitude and longitude and to calculate distances between locations.
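Those distance calculations are exposed in SOQL through the `DISTANCE` and `GEOLOCATION` functions. A sketch that builds such a query, assuming a hypothetical custom object `Warehouse__c` with a geolocation field `Location__c`:

```python
def nearby_query(obj: str, geo_field: str, lat: float, lon: float, miles: float) -> str:
    """Build a SOQL query that filters and sorts records by distance from a
    point, using the DISTANCE and GEOLOCATION functions on a geolocation field."""
    return (
        f"SELECT Id, Name FROM {obj} "
        f"WHERE DISTANCE({geo_field}, GEOLOCATION({lat}, {lon}), 'mi') < {miles} "
        f"ORDER BY DISTANCE({geo_field}, GEOLOCATION({lat}, {lon}), 'mi')"
    )

# Hypothetical example: warehouses within 20 miles of downtown San Francisco.
soql = nearby_query("Warehouse__c", "Location__c", 37.775, -122.418, 20)
print(soql)
```

The `'mi'` unit can be swapped for `'km'`; the same functions also work against the standard address compound fields that carry latitude and longitude.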
What should a data architect do to provide additional guidance for users when they enter information in a standard field?
A. Provide custom help text under field properties.
B. Create a custom page with help text for user guidance.
C. Add custom help text in default value for the field.
D. Add a label field with help text adjacent to the custom field.
A. Provide custom help text under field properties.
Universal Containers developers have created a new Lightning component that uses an Apex controller using a SOQL query to populate a custom list view. Users are complaining that the component often fails to load and returns a time-out error.
What tool should a data architect use to identify why the query is taking too long?
A. Use Splunk to query the system logs looking for transaction time and CPU usage.
B. Enable and use the Query Plan Tool in the developer console.
C. Use Salesforce’s query optimizer to analyze the query in the developer console.
D. Open a ticket with salesforce support to retrieve transaction logs to be analyzed for processing time.
B. Enable and use the Query Plan Tool in the developer console.
- The Query Plan tool in the Developer Console helps optimize and speed up SOQL queries executed over large data volumes.
- Once enabled in the Developer Console, you can access the Query Plan tool in the ‘Query Editor’ tab of the console.
- Use this tool to check the query plan for any SOQL queries that execute slowly. It provides insight into the available plans and, if some of the filters are indexed, the cost of using the index compared to a full table scan.
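The same plan information is also available programmatically via the REST API's `explain` query parameter, which is handy when the slow query lives in an Apex controller. A sketch that builds such a request (the My Domain host is hypothetical; v59.0 is an assumed API version):

```python
from urllib.parse import urlencode

API_VERSION = "v59.0"  # assumed API version

def explain_url(instance_url: str, soql: str) -> str:
    """Build a REST 'query explain' request URL. The response reports the
    candidate plans with their leading operation type (Index, TableScan, ...)
    and relative cost, mirroring the Developer Console's Query Plan tool."""
    qs = urlencode({"explain": soql})
    return f"{instance_url}/services/data/{API_VERSION}/query/?{qs}"

url = explain_url(
    "https://example.my.salesforce.com",  # hypothetical My Domain instance URL
    "SELECT Id FROM Account WHERE CreatedDate = LAST_N_DAYS:7",
)
print(url)
```

Issuing a GET against this URL with a valid session token returns the plan JSON without executing the query, so it is safe to run against production data.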
Northern Trail Outfitters (NTO) processes orders from its website via an order management system (OMS). The OMS stores over 2 million historical records and is currently not integrated with Salesforce.
The Sales team at NTO is using Sales Cloud and would like visibility into related customer orders, yet they do not want to persist millions of records directly in Salesforce. NTO has asked the data architect to evaluate Salesforce Connect and the concept of data virtualization.
Which three considerations are needed prior to a Salesforce Connect implementation?
Choose 3 answers:
A. Create a 2nd System Admin user for authentication to the external source.
B. Develop an object relationship strategy.
C. Identify the external tables to sync into external objects.
D. Assess whether the external data source is reachable via an OData endpoint.
E. Configure a middleware tool to poll external table data
B. Develop an object relationship strategy.
C. Identify the external tables to sync into external objects.
D. Assess whether the external data source is reachable via an OData endpoint.
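The reachability check in answer D usually means confirming the service exposes an OData `$metadata` document, since Salesforce Connect's OData adapters read it to discover the entity sets that can become external objects. A sketch of building that probe request (the OMS endpoint URL is hypothetical):

```python
# Sketch: a quick pre-implementation check that an OData endpoint is reachable.
from urllib.request import Request

def metadata_request(service_root: str) -> Request:
    """Build a GET request for the OData $metadata document, which lists the
    entity sets Salesforce Connect could surface as external objects."""
    url = service_root.rstrip("/") + "/$metadata"
    return Request(url, headers={"Accept": "application/xml"})

req = metadata_request("https://oms.example.com/odata")  # hypothetical OMS endpoint
print(req.full_url)
# urlopen(req) would fetch the document; a 200 response containing an
# <edmx:Edmx> body indicates the source can back Salesforce Connect.
```

If the source cannot expose OData natively, that is the point where a middleware or custom-adapter (Apex Connector Framework) discussion starts instead.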
Universal Containers is migrating individual customers (B2C) data from legacy systems to Salesforce. There are millions of customers stored as accounts and contacts in the legacy database.
Which object model should a data architect configure within Salesforce?
A. Leverage the person account object in Salesforce.
B. Leverage a custom person account object in Salesforce.
C. Leverage a custom account and contact object in Salesforce.
D. Leverage the standard account and contact object in Salesforce.
A. Leverage the person account object in Salesforce.
As part of addressing General Data Protection Regulation (GDPR) requirements, Universal Containers (UC) plans to implement a data classification policy for all its internal systems that store customer information including Salesforce.
What should a data architect recommend so that UC can easily classify consumer information maintained in Salesforce under both standard and custom objects?
A. Use an AppExchange product to classify fields based on policy.
B. Use Data Classification metadata fields available in Field definition.
C. Create a custom picklist field to capture classification of information on the Contact object.
D. Build reports that contain customer information and classify manually.
B. Use Data Classification metadata fields available in Field definition.
Data Classification Metadata Fields record the data owner, field usage, data sensitivity, and compliance categorization for any standard or custom object field. You can also access data classification metadata in the Salesforce API and Apex.
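Because the classification metadata is queryable, admins can build their audit reports straight from the `FieldDefinition` entity. A sketch of the query (field names such as `SecurityClassification` and `ComplianceGroup` are the standard data classification attributes; the target object here is just an example):

```python
def classification_query(object_api_name: str) -> str:
    """Build a SOQL query against FieldDefinition, which exposes per-field
    data classification metadata: business owner, security classification
    (sensitivity), and compliance categorization."""
    return (
        "SELECT QualifiedApiName, BusinessOwner.Name, SecurityClassification, "
        "ComplianceGroup FROM FieldDefinition "
        f"WHERE EntityDefinition.QualifiedApiName = '{object_api_name}'"
    )

soql = classification_query("Contact")
print(soql)
```

Note that `FieldDefinition` queries must filter on `EntityDefinition.QualifiedApiName` (one object at a time), so a full-org audit loops over the object list.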
Universal Containers (UC) has built a B2C e-commerce site on Heroku that shares customer and order data with a Heroku Postgres database. UC is currently utilizing Postgres as the Single Source of Truth for both customers and orders. UC has asked a data architect to replicate the data into Salesforce so that Salesforce can now act as the System of Record.
What are the 3 considerations that a data architect should weigh before implementing this requirement?
Choose 3 answers:
A. Consider whether the data is required for sales reports, dashboards and KPIs.
B. Determine if the data is a driver of key processes implemented within Salesforce.
C. Ensure there is a tight relationship between order data and an enterprise resource planning (ERP) application.
D. Ensure the data is CRM-centric and able to populate standard or custom objects.
E. Select the tool required to replicate the data.
B. Determine if the data is a driver of key processes implemented within Salesforce.
D. Ensure the data is CRM-centric and able to populate standard or custom objects.
E. Select the tool required to replicate the data.
- A instead of E? If Salesforce is going to be the system of record, A doesn't really come into play, right?
Northern Trail Outfitters (NTO) has a loyalty program to reward repeat customers. The following conditions exist:
- Reward levels are earned based on the amount spent during the previous 12 months.
- The program will track every item a customer has bought and grant them points for discounts.
- The program generates 100 million records each month.
NTO customer support would like to see a summary of a customer’s recent transactions and the reward level(s) they have attained.
Which solution should the data architect use to provide the information within Salesforce for the customer support agents?
A. Create a custom object in Salesforce to capture and store all reward programs. Populate nightly from the point-of-sale system, and present on the customer record.
B. Capture the reward program data in an external data store and present the 12-month trailing summary in Salesforce using Salesforce Connect and an external object.
C. Provide a button so that the agent can quickly open the point-of-sale system that displays the customer history.
D. Create a custom big object to capture the reward program data and display it on the contact record, and update nightly from the point-of-sale system.
D. Create a custom big object to capture the reward program data and display it on the contact record, and update nightly from the point-of-sale system.
- What Are Some Ways I Can Use Custom Big Objects?
- Although you can use big objects to store different kinds of data, big objects were created to tackle a few specific scenarios.
- 360° View of the Customer
- You’ve got a lot of customer information you want to store. From loyalty programs to transactions, order, and billing information, use a custom big object to keep track of every detail.
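Querying a big object differs from querying a standard object: the SOQL filter must use the fields of the big object's index, in the order they are defined. A sketch of the trailing-12-months query this scenario needs, with entirely hypothetical API names (`Customer_Transaction__b`, indexed on `Customer__c` then `Transaction_Date__c`):

```python
def trailing_12mo_query(contact_id: str, since_iso: str) -> str:
    """Build a SOQL query over a hypothetical custom big object. Big object
    queries must filter on the index fields in their defined order: here the
    index is assumed to be (Customer__c, Transaction_Date__c)."""
    return (
        "SELECT Transaction_Date__c, Amount__c, Points__c "
        "FROM Customer_Transaction__b "
        f"WHERE Customer__c = '{contact_id}' "
        f"AND Transaction_Date__c >= {since_iso}"
    )

# Placeholder contact ID and cutoff timestamp for illustration only.
soql = trailing_12mo_query("003XXXXXXXXXXXX", "2024-01-01T00:00:00Z")
print(soql)
```

A Lightning component on the contact record can run this query and render the summary and reward level for the support agent.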
Universal Containers (UC) recently migrated 1 billion customer related records from a legacy data store to Heroku Postgres. A subset of the data needs to be synchronized with Salesforce so that service agents are able to support customers directly within the service console. The remaining non-synchronized set of data will need to be accessed by Salesforce at any point in time, but UC management is concerned about storage limitations.
What should a data architect recommend to meet these requirements with minimal effort?
A. Virtualize the remaining set of data with Salesforce Connect and external objects.
B. Use Heroku Connect to bidirectionally sync all data between systems.
C. As needed, make callouts into Heroku Postgres and persist the data in Salesforce.
D. Migrate the data to big objects and leverage Async SOQL with custom objects.
A. Virtualize the remaining set of data with Salesforce Connect and external objects.
Universal Containers (UC) has released a new Disaster Recovery (DR) policy that states that cloud solutions need a business continuity plan in place separate from the cloud provider’s built-in data recovery solution.
Which solution should a data architect use to comply with the DR policy?
A. Leverage a third-party tool that extracts Salesforce data/metadata and stores the information in an external protected system.
B. Leverage Salesforce weekly exports, and store data in flat files on a protected system.
C. Utilize an ETL tool to migrate data to an on-premise archive solution.
D. Write a custom batch job to extract data changes nightly, and store on an external protected system.
A. Leverage a third-party tool that extracts Salesforce data/metadata and stores the information in an external protected system.
Northern Trail Outfitters (NTO) has one million customer records spanning 25 years. As part of its new Salesforce project, NTO would like to create a Master Data Management strategy to help preserve the history and relevance of its customer data.
Which 3 activities will be required to identify a successful master data management strategy?
Choose 3 answers:
A. Identify data to be replicated.
B. Create a data archive strategy.
C. Define the systems of record for critical data.
D. Install a data warehouse.
E. Choose a Business Intelligence tool.
A. Identify data to be replicated.
B. Create a data archive strategy.
C. Define the systems of record for critical data.
Northern Trail Outfitters (NTO) has outgrown its current Salesforce org and will be migrating to a new org shortly. As part of this process NTO will be migrating all of its metadata and data. NTO’s data model in the source org has a complex relationship hierarchy with several master detail and lookup relationships across objects, which should be maintained in the target org.
What 3 things should a data architect do to maintain the relationship hierarchy during migration?
Choose 3 answers:
A. Use Data Loader to export the data from the source org and then import or upsert into the target org in sequential order.
B. Create an external ID field for each object in the target org and map source record IDs to this field.
C. Redefine the master detail relationship fields to lookup relationship fields in the target org.
D. Replace source record IDs with new record IDs from the target org in the import file.
E. Keep the relationship fields populated with the source record IDs in the import file.
A. Use Data Loader to export the data from the source org and then import or upsert into the target org in sequential order.
B. Create an external ID field for each object in the target org and map source record IDs to this field.
D. Replace source record IDs with new record IDs from the target org in the import file.
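The mechanics behind answers B and D: after loading parents into the target org (carrying each source ID in an external ID field), you export the new target IDs, build a source-to-target ID map, and rewrite the child import files before loading them. A minimal sketch with made-up IDs and an in-memory CSV:

```python
import csv
import io

# Source-to-target ID map, built by exporting target-org records that carry
# the original source ID in an external ID field (IDs here are fabricated).
source_to_target = {
    "001S0000SRCPARENT": "001T0000TGTPARENT",
}

# A child-object export referencing the parent by its old source-org ID.
child_export = io.StringIO("Name,AccountId\nAcme Contact,001S0000SRCPARENT\n")

out = io.StringIO()
reader = csv.DictReader(child_export)
writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
writer.writeheader()
for row in reader:
    # Swap the source parent ID for the new target-org ID before import.
    row["AccountId"] = source_to_target[row["AccountId"]]
    writer.writerow(row)

print(out.getvalue())
```

Alternatively, upserting children on the parent's external ID field lets Salesforce resolve the relationship itself, which is why the external ID fields from answer B pair naturally with the ID replacement in answer D.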