DA2 Flashcards
A large multinational B2C Salesforce customer is looking to implement their distributor management application in Salesforce. The application has the following capabilities:
- Distributors create sales orders in Salesforce
- Sales orders are based on product prices applicable to their region
- Sales orders are closed once they are fulfilled
- It has been decided to maintain orders in the Opportunity object
How should the data architect model this requirement?
A. Create a lookup to the Custom Price object and share with distributors.
B. Configure price books for each region and share with distributors.
C. Manually update Opportunities with prices applicable to distributors.
D. Add custom fields in Opportunity and use triggers to update prices.
B. Configure price books for each region and share with distributors.
Universal Containers (UC) has adopted Salesforce as its primary sales automation tool. UC has 100,000 customers with a growth rate of 10% a year. UC uses an on-premises, web-based billing and invoicing system that generates over 1 million invoices a year, supporting a monthly billing cycle.
The UC sales team needs to be able to pull up a customer record and view their account status, invoice history, and opportunities without navigating outside of Salesforce.
What should a data architect use to provide the sales team with the required functionality?
A. Create a custom object and migrate the last 12 months of Invoice data into Salesforce so it can be displayed on the Account layout.
B. Write an Apex callout and populate a related list to display on the account record.
C. Create a mashup page that will present the billing system records within Salesforce.
D. Create a Visualforce tab with the billing system encapsulated within an iframe.
C. Create a mashup page that will present the billing system records within Salesforce.
Northern Trail Outfitters (NTO) operates a majority of its business from a central Salesforce org. NTO also owns several secondary orgs that the service, finance, and marketing teams work out of. At the moment, there is no integration between the central and secondary orgs, leading to data-visibility issues.
Moving forward, NTO has identified that a hub-and-spoke model is the proper architecture to manage its data, where the central org is the hub and the secondary orgs are the spokes.
Which tool should a data architect use to orchestrate data between the hub org and spoke orgs?
A. A middleware solution that extracts and distributes data across both the hub and spokes.
B. Develop custom APIs to poll the hub org for change data and push into the spoke orgs.
C. Develop custom APIs to poll the spoke orgs for change data and push it into the hub org.
D. A backup and archive solution that extracts and restores data across orgs.
A. A middleware solution that extracts and distributes data across both the hub and spokes.
Universal Containers (UC) is transitioning from Classic to Lightning Experience.
What does UC need to do to ensure users have access to its notes and attachments in Lightning Experience?
A. Add the Notes and Attachments related list to page layouts in Lightning Experience.
B. Manually upload Notes in Lightning Experience.
C. Migrate Notes and Attachments to Enhanced Notes and Files using a migration tool.
D. Manually upload Attachments in Lightning Experience.
C. Migrate Notes and Attachments to Enhanced Notes and Files using a migration tool.
Northern Trail Outfitters has these simple requirements for a data export process:
- File format should be in CSV.
- Process should be scheduled and run once per week.
- The export should be configurable through the Salesforce UI.
Which tool should a data architect leverage to accomplish these requirements?
A. Bulk API
B. Data export wizard
C. Third-party ETL tool
D. Data loader
B. Data export wizard
Universal Containers (UC) has a very large and complex Salesforce org with hundreds of validation rules and triggers. The triggers are responsible for system updates and data manipulation as records are created or updated by users. A majority of the automation tools within UC’s org were not designed to run during a data load. UC is importing 100,000 records into Salesforce across several objects over the weekend.
What should a data architect do to mitigate any unwanted results during the import?
A. Ensure validation rules, triggers and other automation tools are disabled.
B. Ensure duplication and matching rules are defined.
C. Import the data in smaller batches over a 24-hour period.
D. Bulkify the triggers to handle import loads.
A. Ensure validation rules, triggers and other automation tools are disabled.
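Separately from disabling automation, very large loads are usually split into Bulk API-sized batches. A minimal Python sketch of the chunking step (the record data and the 10,000-record batch size are illustrative; actual Bulk API batch limits depend on the API version used):

```python
def chunk(records, batch_size=10_000):
    """Split a record list into fixed-size batches, Bulk API style."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# 100,000 invented records, matching the scenario above
records = [{"Id": i} for i in range(100_000)]
batches = list(chunk(records))
print(len(batches))  # 10 batches of 10,000 records each
```

Loading in uniform batches also makes it straightforward to retry a single failed batch rather than the whole import.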
Universal Containers (UC) is going through a major reorganization of their sales team. This would require changes to a large number of group members and sharing rules. UC’s administrator is concerned about long processing time and failure during the process.
What should a data architect implement to make these changes efficiently?
A. Log a case with salesforce to make sharing rule changes.
B. Enable Defer Sharing Calculation prior to making sharing rule changes.
C. Delete old sharing rules and build new sharing rules
D. Log out all users and make changes to sharing rules.
B. Enable Defer Sharing Calculation prior to making sharing rule changes.
Universal Containers has implemented Sales Cloud to manage patient and related health records. During a recent security audit of the system, it was discovered that some standard and custom fields need to be encrypted.
Which solution should a data architect recommend to encrypt existing fields?
A. Use the Apex Crypto class to encrypt custom and standard fields.
B. Implement Classic Encryption to encrypt custom and standard fields.
C. Implement Shield Platform Encryption to encrypt custom and standard fields.
D. Export data out of Salesforce and encrypt custom and standard fields.
C. Implement Shield Platform Encryption to encrypt custom and standard fields.
Universal Containers (UC) is in the process of migrating legacy inventory data from an enterprise resources planning (ERP) system into Sales Cloud with the following requirements:
- Legacy inventory data will be stored in a custom child object called Inventory_c.
- Inventory data should be related to the standard Account object.
- The Inventory_c object should inherit the same sharing rules as the Account object.
- Anytime an Account record is deleted in Salesforce, the related Inventory_c record(s) should be deleted as well.
What type of relationship field should a data architect recommend in this scenario?
A. Master-detail relationship field on Account, related to Inventory_c
B. Master-detail relationship field on Inventory_c, related to Account
C. Indirect lookup relationship field on Account, related to Inventory_c
D. Lookup relationship field on Inventory_c, related to Account
B. Master-detail relationship field on Inventory_c, related to Account
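The cascade-delete requirement is what rules out a plain lookup: in a master-detail relationship, deleting the master removes its detail records. A toy Python model of that behavior (the record IDs and field names are invented for illustration; Salesforce provides this natively):

```python
# Toy master-detail model: each Inventory row carries a required
# reference to its parent Account, and deleting the Account
# cascades the delete to its detail records.
accounts = {"001A": {"Name": "Acme"}}
inventory = [
    {"Id": "a01X", "Account__c": "001A"},
    {"Id": "a01Y", "Account__c": "001A"},
]

def delete_account(account_id):
    """Delete an account and cascade the delete to its detail records."""
    accounts.pop(account_id)
    inventory[:] = [r for r in inventory if r["Account__c"] != account_id]

delete_account("001A")
print(len(inventory))  # 0 -- detail records removed with their master
```

Note the master-detail field itself is defined on the detail object (Inventory_c), pointing up at its master (Account).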
Universal Containers (UC) is in the process of selling half of its company. As part of this split, UC’s main Salesforce org will be divided into two orgs: org A and org B. UC has delivered these requirements to its data architect:
- The data model for Org B will drastically change with different objects, fields, and picklist values.
- Three million records will need to be migrated from org A to org B for compliance reasons.
- The migration will need to occur within the next two months, prior to the split.
Which migration strategy should a data architect use to successfully migrate the data?
A. Use an ETL tool to orchestrate the migration.
B. Use Data Loader for export and Data Import Wizard for import
C. Write a script to use the Bulk API
D. Use the Salesforce CLI to query, export, and import
A. Use an ETL tool to orchestrate the migration.
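Whichever tool performs the migration, the drastic schema change means every org A record must be transformed before it is loaded into org B. A minimal Python sketch of that mapping step (the field names and picklist values are invented for illustration):

```python
# Invented mapping from the org A schema to the org B schema
FIELD_MAP = {"Region__c": "Territory__c", "Status__c": "Stage__c"}
PICKLIST_MAP = {"Open": "In Progress", "Done": "Closed"}

def transform(record_a):
    """Rename fields and remap picklist values for the org B schema."""
    record_b = {}
    for field, value in record_a.items():
        target = FIELD_MAP.get(field, field)          # rename, or keep as-is
        record_b[target] = PICKLIST_MAP.get(value, value)  # remap picklists
    return record_b

print(transform({"Region__c": "EMEA", "Status__c": "Open"}))
# {'Territory__c': 'EMEA', 'Stage__c': 'In Progress'}
```

An ETL tool encapsulates exactly this kind of mapping, plus scheduling, error handling, and bulk loading, which is why it suits a three-million-record, two-month migration.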
Universal Containers (UC) stores 10 million rows of inventory data in a cloud database. As part of creating a connected experience in Salesforce, UC would like to surface this inventory data in Sales Cloud without importing it. UC has asked its data architect to determine whether Salesforce Connect is needed.
Which three considerations should the data architect make when evaluating the need for Salesforce Connect?
Choose 3 answers
A. You want real-time access to the latest data from other systems.
B. You have a large amount of data and would like to copy subsets of it into Salesforce.
C. You need to expose data via a virtual private connection.
D. You have a large amount of data that you don’t want to copy into your Salesforce org.
E. You need to access small amounts of external data at any one time.
A. You want real-time access to the latest data from other systems.
D. You have a large amount of data that you don’t want to copy into your Salesforce org.
E. You need to access small amounts of external data at any one time.
Northern Trail Outfitters (NTO) wants to implement backup and restore for Salesforce data. Currently, it has a data backup process that runs weekly and backs up all Salesforce data to an enterprise data warehouse (EDW). NTO wants to move to daily backups and provide restore capability to avoid any data loss in case of an outage.
What should a data architect recommend for a daily backup and restore solution?
A. Use an AppExchange package for backup and restore.
B. Use ETL for backup and restore from EDW.
C. Use Bulk API to extract data on a daily basis to EDW and REST API for restore.
D. Change weekly backup process to daily backup, and implement a custom restore solution.
A. Use an AppExchange package for backup and restore.
Universal Containers (UC) owns a complex Salesforce org with many Apex classes, triggers, and automated processes that modify records when available. UC has identified that, in its current state, it runs the risk of encountering race conditions on the same record.
What should a data architect recommend to guarantee that records are not being updated at the same time?
A. Embed the keywords FOR UPDATE after SOQL statements.
B. Disable classes or triggers that have the potential to obtain the same record.
C. Migrate programmatic logic to processes and flows.
D. Refactor or optimize classes and trigger for maximum CPU performance.
A. Embed the keywords FOR UPDATE after SOQL statements.
Universal Containers (UC) is replacing a homegrown CRM solution with Salesforce. UC has decided to migrate operational (open and active) records to Salesforce, while keeping historical records in the legacy system. UC would like historical records to be available in Salesforce on an as-needed basis.
Which solution should a data architect recommend to meet business requirements?
A. Leverage real-time integration to pull records into Salesforce.
B. Bring all data to Salesforce, and delete it after a year.
C. Leverage a mashup to display historical records in Salesforce.
D. Build a custom solution to go to the legacy system and display records.
C. Leverage a mashup to display historical records in Salesforce.
Universal Containers (UC) has implemented Salesforce. UC is running out of storage and needs to have an archiving solution. UC would like to maintain two years of data in Salesforce and archive older data out of Salesforce.
Which solution should a data architect recommend as an archiving solution?
A. Use a third-party backup solution to backup all data off platform.
B. Build a batch job to move all records off platform, and delete all records from Salesforce.
C. Build a batch job to move two-year-old records off platform, and delete records from Salesforce.
D. Build a batch job to move all records off platform, and delete old records from Salesforce.
C. Build a batch job to move two-year-old records off platform, and delete records from Salesforce.
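The selection logic behind option C (retain the most recent two years, archive everything older) can be sketched in Python; the cutoff date and records below are invented for illustration:

```python
from datetime import date

def partition(records, cutoff):
    """Split records into (keep, archive) around a retention cutoff."""
    keep = [r for r in records if r["CreatedDate"] >= cutoff]
    archive = [r for r in records if r["CreatedDate"] < cutoff]
    return keep, archive

records = [
    {"Id": "1", "CreatedDate": date(2024, 6, 1)},
    {"Id": "2", "CreatedDate": date(2021, 3, 15)},
]
keep, archive = partition(records, cutoff=date(2023, 1, 1))
print([r["Id"] for r in archive])  # ['2'] -- only the older record is archived
```

The batch job would export the archive partition off platform, verify the export, and only then delete those records from Salesforce to free storage.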