Module 7 Flashcards
Technology Challenges for Privacy
Artificial Intelligence
The simulation of human intelligence by machines and computer systems, with the ability to learn, reason and evaluate. Despite its advantages, AI also has the ability (when used unethically) to mislead, exploit, manipulate, influence decisions and cause damage.
Machine Learning
Systems that learn from experience and develop skills without the direct involvement of humans. Machine learning can be prone to bias if it uses biased algorithms or biased data. When designing machine learning models, here are three guiding principles that a privacy technologist should consider:
o Define and document fairness goals, ensuring that data fed to algorithms are drawn from unbiased sources, and automated decisions remain in line with those goals and policy choices.
o Recognize that machine learning is not always the appropriate solution to a problem. Consider value-sensitive design. Does it meet the organization’s ethical values?
o Consider the privacy implications: machine learning has the ability to impact all three types of interference, so design algorithms that address the risk in order to minimize or eliminate it.
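One concrete way to act on the first principle (define and document fairness goals, then check automated decisions against them) is to measure selection rates per group and flag large gaps. The sketch below uses the "four-fifths rule" heuristic, a common red-flag threshold; the group names and decision data are made up for illustration.

```python
from collections import defaultdict

def selection_rates(decisions):
    """Compute the approval rate per group from (group, approved) pairs."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest to the highest selection rate across groups.
    Values below 0.8 are a common red flag (the 'four-fifths rule')."""
    return min(rates.values()) / max(rates.values())

# Hypothetical audit data: group label plus the automated decision.
decisions = [("A", True), ("A", True), ("A", False), ("A", True),
             ("B", True), ("B", False), ("B", False), ("B", False)]
rates = selection_rates(decisions)   # A: 0.75, B: 0.25
ratio = disparate_impact_ratio(rates)
needs_review = ratio < 0.8           # below the documented fairness goal
```

A check like this does not prove an algorithm is fair, but it turns a documented fairness goal into something that can be monitored automatically.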
Deep Learning
A subset of AI and machine learning, deep learning learns by performing a task repeatedly, adjusting along the way and adding layers of data to improve the outcome. Deep learning algorithms (neural networks) try to copy the way the human brain works. Deep learning can create audio, video or photographs that appear real, and can learn to recognize people's faces and tag them in photos automatically, raising consent issues.
Context-aware computing
When hardware adapts to its environment. This is a type of context awareness (location, video, audio and overall activity) and has privacy implications. A technologist needs to consider how context-aware computing affects individuals' emotional well-being and self-image, given its potential to intrude into sensitive personal information. This technology needs to allow the user to make adjustments, set time limits and even disable the function.
Internet Monitoring
Tools for monitoring internet activity, placed along the path from a source to its destination across intermediate networking devices (hubs, switches, repeaters and routers).
Authoritative
Type of internet monitoring; Some countries, employers and schools monitor network traffic to enforce policies for security and appropriate behavior. An example of this is when certain keywords or addresses are monitored and added to blacklists or control lists to block access to certain websites deemed inappropriate.
Wi-Fi Eavesdropping
Type of internet monitoring; When unsecured communication sent over a network is intercepted via packet sniffing and analysis tools.
Behavioral
Type of internet monitoring; When companies monitor browsing history, geo data and behavior for targeted advertising.
Secure Transfer
A monitoring practice; Secure Sockets Layer (SSL) and its successor, Transport Layer Security (TLS), can be used to encrypt data transmitted over the internet. An example would be to use HTTPS instead of HTTP.
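The idea behind secure transfer can be sketched with Python's standard ssl module: a plain TCP connection is wrapped in TLS so that data is encrypted in transit and the server's identity is verified. The host and port below are placeholders, not part of the original material.

```python
import socket
import ssl

# A default context enables certificate validation and hostname checking,
# and disables protocol versions known to be insecure (e.g. SSLv2/SSLv3).
context = ssl.create_default_context()

def open_secure_connection(host, port=443):
    """Wrap a plain TCP socket in TLS so data sent over it is encrypted."""
    sock = socket.create_connection((host, port))
    return context.wrap_socket(sock, server_hostname=host)
```

This is the same mechanism behind HTTPS: the application protocol (HTTP) is unchanged, but it runs over a TLS-wrapped socket instead of a plain one.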
Secure Wi-Fi networks
A monitoring practice; Only use secure Wi-Fi networks. Emails should be encrypted to provide an additional layer of protection.
Deep Packet Inspection
A monitoring practice; DPI is a method of examining and analyzing data within a packet before it leaves a network.
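At its core, DPI means looking inside a packet's payload, not just its headers. Real DPI engines parse protocols and reassemble streams, but the basic pattern-matching step can be sketched as below; the blocked terms are a made-up example policy, not a real ruleset.

```python
# Hypothetical policy: payloads containing these byte patterns are flagged.
BLOCKED_TERMS = [b"confidential", b"ssn="]

def inspect_payload(payload: bytes) -> bool:
    """Return True if the packet payload matches a blocked term,
    i.e. the packet should be held for review before leaving the network."""
    data = payload.lower()
    return any(term in data for term in BLOCKED_TERMS)
```

For example, `inspect_payload(b"Subject: CONFIDENTIAL plans")` is flagged, while ordinary traffic passes through unflagged.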
Anthropomorphism
The attribution of human traits, feelings and behaviors to inanimate objects, nonhuman animals or nature. With technology, voice recognition software that responds, or robotics expressing emotions, can raise privacy issues.
Speech Recognition
This technology recognizes a person's voice and relays information. Since speech recognition is on many devices used in living areas and personal spaces, it raises privacy concerns. Speech recognition devices that send information to remote servers for processing are of particular concern, as conversations may be inadvertently or maliciously recorded and transmitted.
Natural Language Understanding
Utilizes machine reading comprehension through algorithms to identify and extract meaning from natural language so the computer can understand it.
Natural Language Generation
Information is transformed into content, enabling such functions as text-to-speech, automation of reports and the production of content for a web or mobile application.
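The simplest form of natural language generation is template-based: structured data is turned into readable text, as in automated reports. A minimal sketch, with a made-up record format:

```python
def generate_report(record):
    """Turn a structured data record into a human-readable sentence
    (template-based natural language generation)."""
    trend = "rose" if record["change"] > 0 else "fell"
    return (f"{record['metric']} {trend} by {abs(record['change'])}% "
            f"in {record['period']}.")

sentence = generate_report({"metric": "Site traffic",
                            "change": -12,
                            "period": "March"})
# sentence == "Site traffic fell by 12% in March."
```

Production NLG systems are far more sophisticated, but the function is the same: data in, content out, which can then feed text-to-speech or a web or mobile application.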