Privacy Industry Specialist Flashcards
Managing Consents in Salesforce
Salesforce enables you to honor people’s requests about how your business uses their data.
The Salesforce platform supports the GDPR as well as regional data protection laws such as the CCPA in California (United States) and CASL in Canada.
The most common use case is implementing data privacy preferences to manage customer privacy through the Consent Management objects.
These objects let you record consent authorizations with audit details and manage the communication channels through which customers express their preferences. For example, they can capture a customer’s preference not to receive emails or not to have their data shared.
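As a rough illustration of working with these objects programmatically, here is a minimal Python sketch using the simple-salesforce library. The object and field names (ContactPointTypeConsent, PrivacyConsentStatus, ContactPointType) follow the standard Consent Management data model, but the credentials and record ID are placeholders and your org’s schema may differ.

```python
# Minimal sketch, assuming the standard Consent Management data model.
# Credentials and the PartyId value below are placeholders.
from simple_salesforce import Salesforce

sf = Salesforce(username="user@example.com",
                password="password",
                security_token="token")

# Does this individual consent to being contacted by email?
records = sf.query(
    "SELECT Id, PrivacyConsentStatus "
    "FROM ContactPointTypeConsent "
    "WHERE PartyId = '0PKxx0000000001' AND ContactPointType = 'Email'"
)["records"]

email_allowed = any(r["PrivacyConsentStatus"] == "OptIn" for r in records)

# Record a withdrawal of consent (e.g. the customer opted out of email).
if records:
    sf.ContactPointTypeConsent.update(records[0]["Id"],
                                      {"PrivacyConsentStatus": "OptOut"})
```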
GDPR workflow for Microsoft Dynamics overview
Like the event registration workflow, the consent workflow relies on Surveys and campaign responses to manage consent status. As part of the package we have introduced two new response code options (Consented and Withdrawn), which should be used when building the consent survey. As in the event registration workflow, the integration updates the response code to Consented or Withdrawn, which triggers a Dynamics workflow that automatically updates the contact’s ‘Consent’ field.
The consent status of your campaign can be managed from the ‘Consent Report’ form available within the Campaign.
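The following is only an illustrative Python sketch of the mapping that the Dynamics workflow performs; it is not Concep Send’s actual implementation, and the field and value names are assumptions for the example.

```python
# Illustrative sketch: how survey response codes could drive a contact's
# 'Consent' field. Field names and status values are assumptions.
CONSENT_RESPONSE_CODES = {"Consented", "Withdrawn"}

def apply_consent_response(contact: dict, response_code: str) -> dict:
    """Mimic the Dynamics workflow: map a campaign-response code
    onto the contact's 'Consent' field."""
    if response_code not in CONSENT_RESPONSE_CODES:
        return contact  # other response codes are ignored here
    contact["Consent"] = ("Opted in" if response_code == "Consented"
                          else "Opted out")
    return contact

# Example: a survey response marked 'Withdrawn' flips the contact to opted out.
print(apply_consent_response({"fullname": "Ada Lovelace"}, "Withdrawn"))
```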
Requirements
Concep Send for Microsoft Dynamics solution v1.4 or above installed
Microsoft Dynamics v7.0 or above
Why Amazon?
What is the most innovative idea?
When you did not meet a commitment, how did you pivot?
- Tell me about a time when you had to work outside of your comfort zone. How did you manage?
- Tell me about a time when you innovated on a project and how it helped.
- Tell me about a time when you solved a big problem with a small solution.
Criteria: Ask clarifying questions to scope-down and define requirements.
Your interviewer is there to help you by answering clarifying questions, validating assumptions, and providing a customer’s perspective. As a designer you should start with the (primary) customer and work backwards:
Who are you designing the system for and why?
What expectations do they have in terms of functionality?
What things would a customer just assume will be in the system but may not consciously think about? (e.g. it’ll be fast and secure)
What happens if we become hyper-popular with customers? What does 2x growth look like? Or 10x? And how would that influence the design?
Understand first what problem your system is supposed to solve. Ask clarifying questions if this is not clear.
See the interviewer as the customer: requirements might be intentionally vague, and he or she can give you clarifications.
Write/raise the requirements or assumptions you are making, and base your design on them.
Feel free to create a diagram if that helps you clarify your thoughts.
The candidate is asked to review a system designed to drive customers who visit an e-commerce website to use the mobile application instead by leveraging ads. The interviewer asks questions to understand how the candidate would design for performance.
Interviewer: Alright, now that you’ve had some time to review this ads scenario, I’m interested to learn how you would monitor this solution.
Candidate: There are some baseline technical metrics that we could monitor, such as CPU and memory utilization for the hosts or processes that are executing our workflow (e.g. if we were using Spark jobs or something similar). We should also emit metrics on workflow success/failure. This can start at the overall workflow level, but we should also be able to emit these for each workflow step in order to identify whether particular steps have errors or are less reliable. Timing metrics, per workflow and per step, would also be important here so that we can catch any performance degradation early. We should also look at business metrics and data quality metrics. For example, if we processed an unexpected number of records (too many or too few), that could be something an operator needs to look into. We should consult with our business users on what kind of quality metrics are important for them, as they may be closer to the data or have a better understanding of what certain fields mean and what their expected values are.
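As a concrete sketch of the per-step metrics the candidate describes, here is a minimal Python example assuming AWS CloudWatch via boto3; the namespace, metric names, and dimensions are illustrative choices, not a prescribed schema.

```python
# Minimal sketch: emit success/failure and timing metrics per workflow step.
# Namespace, metric names, and dimensions are illustrative assumptions.
import time
import boto3

cloudwatch = boto3.client("cloudwatch")

def run_step(workflow: str, step: str, fn):
    """Run one workflow step, emitting success/failure and latency metrics."""
    start = time.time()
    try:
        result = fn()
        status = "Success"
        return result
    except Exception:
        status = "Failure"
        raise
    finally:
        elapsed_ms = (time.time() - start) * 1000.0
        dims = [{"Name": "Workflow", "Value": workflow},
                {"Name": "Step", "Value": step}]
        cloudwatch.put_metric_data(
            Namespace="AdsWorkflow",
            MetricData=[
                {"MetricName": status, "Dimensions": dims,
                 "Value": 1, "Unit": "Count"},
                {"MetricName": "StepLatency", "Dimensions": dims,
                 "Value": elapsed_ms, "Unit": "Milliseconds"},
            ],
        )
```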
Interviewer: Thanks for walking me through those examples. I agree capturing both technical and business metrics is important. You also mentioned having someone look into issues. How would you report and respond to failures or issues?
Candidate: Well, we could send notifications or reports (for example, via email) for someone to investigate. I’ve seen in the past that workflows can have intermittent failures, so we could also configure retries and only alarm if all retries are exhausted, in order to prevent noisy work for our operators.
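A minimal sketch of that retry-then-alarm idea, with a hypothetical notify_operators() hook standing in for email or paging:

```python
# Sketch: retry a step and only alert operators once all retries are exhausted.
import time

def notify_operators(message: str) -> None:
    print(f"ALERT: {message}")  # placeholder for an email/paging integration

def run_with_retries(step_fn, max_attempts: int = 3, backoff_seconds: float = 30.0):
    for attempt in range(1, max_attempts + 1):
        try:
            return step_fn()
        except Exception as exc:
            if attempt == max_attempts:
                notify_operators(f"Step failed after {max_attempts} attempts: {exc}")
                raise
            time.sleep(backoff_seconds * attempt)  # simple linear backoff
```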
What makes this a strong system design?
This response demonstrates several strength criteria, such as:
Designs for operational performance, plans for failure, and measures the results (e.g., metrics)
Considers both technical and business metrics, as well as data quality
Thinks about how to monitor for performance and catch potential problems
Considers resiliency and making things easier for operators
One study found that 23.3% of skills that requested privacy-sensitive information did not have a complete privacy policy. In other words, those developers neglected to address exactly how user data will be accessed, used, and protected. What’s more, the researchers also discovered that many skills use the same wake word. Consequently, users may inadvertently share information with the wrong developer.
Alexa does record your voice. But it only records a short snippet of audio whenever it detects the wake word. Those recordings are automatically sent to the cloud, where they’re accessible to the user through the Alexa app.
Perhaps worst of all, Amazon doesn’t verify the developers on their skills store. In other words, phishing scammers could pose as legitimate developers to fool unwitting consumers into sharing private information.
Alexa best practices
To mitigate these privacy concerns, you need to know some best practices:
Routinely delete your voice recordings: At the end of every day, ask Alexa to delete your recordings. You can also choose to delete your entire history of recordings in the Alexa app.
Review your history to see what was recorded: In the Alexa app you can listen to your entire history of voice recordings to understand what information may have been collected.
Opt out of the quality assurance program: This is how you can ensure your recordings aren’t being sampled and exposed to Amazon’s team of specialists.
Mute the microphone when not in use: By physically turning off your Alexa device’s microphone, you won’t be able to make requests, but you also won’t risk unwanted recording.
Amazon proudly states they are not in the business of selling your personal information to others, which is good. However, a good question to ask is, why would Amazon need to sell your data when they have their own advertising and retail juggernaut to use your data to sell you more stuff? Because Amazon is in the business of selling you more stuff. This means Amazon collects a whole lot of data on you – records of your shopping habits, Alexa search requests, the music you stream, the podcasts you listen to, when you turn your lights on and off, when you lock your doors, and on and on and on.
What’s good with Alexa? They make it possible to automatically delete voice recordings immediately after they are processed. That’s a nice feature after the controversy around human reviewers listening in to Alexa voice recordings. However, Amazon says when you delete your voice recordings, they still can keep data of the interactions those recordings triggered. So, if you buy a pregnancy test through Amazon Alexa, they won’t forget you bought that pregnancy test just because you ask them to delete the voice recording of that purchase. That record of the purchase is data they have on you going forward and may use to target you with ads for more stuff.
And then there are Alexa Skills, those little apps you use to interact with Alexa. These Skills can be developed by just about anyone with the, uhm, skill. And with too many of the Skills, third-party privacy policies are misleading, incomplete or simply nonexistent, according to one recent study. When your data is processed by an Alexa Skill, deleting your voice recordings doesn’t delete the data the developer of that Skill collects on you. With over 100,000 Alexa Skills out there, many of them developed by third parties, now your data is floating around in places you might never have imagined.
These days Alexa is built into everything from your Echo Dot smart speakers to your kids’ toys to your glasses, headphones, and thermostats. And while Amazon doesn’t sell your personal information, they sure do use the heck out of it to target you with more stuff to buy. Is this creepy? Well, with so much data floating around in so many places (and we’re talking a lot of places, both within Amazon and with third parties too), yeah, Amazon’s Echo Dot smart speaker with Alexa can feel pretty creepy.