SB Flashcards

1
Q

Social engineering

A

People are integral to security, and their behaviour can’t always be controlled by policies

“Entire ruse was based on one of the fundamental tactics of social engineering: gaining access to information that a company employee treats as innocuous, when it isn’t” - Mitnick, 2001

2
Q

Six principles of influence

A

Cialdini: RCASLS
Reciprocity
Commitment and consistency
Authority
Social Proof
Liking
Scarcity

“You say you’re an author or a movie writer, and everybody opens up” - Mitnick, 2001
-> Liking/social proof pretext

3
Q

Final stage of social engineering attack

A

Escalation and exploitation

“Burning the source … allows a victim to recognise that an attack has taken place, making it extremely difficult to exploit the same source in future attacks” - Mitnick, 2001
-> By avoiding burning the source, the attacker maintains long-term access

4
Q

Mitnick paper

A

The Art of Deception - shows how social engineers use harmless-seeming information to exploit systems

5
Q

Kane Gamble

A

Gained access to sensitive accounts by impersonating customer service representatives, using social engineering to reset account credentials.
Targeted high-ranking CIA and FBI officials by pretending to be them, including posing as the CIA director to manipulate support staff into granting access.

6
Q

Cheswick paper

A

1992 - an account of defending AT&T’s systems against the hacker “Berferd” using honeypots (fake services) and a monitored “jail” environment
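
A minimal sketch of the honeypot idea (my own illustration, not Cheswick’s actual setup): a fake login service that accepts connections, records whatever the intruder types, and never grants real access.

# Illustrative honeypot sketch (assumed port and prompt, not Cheswick's real system)
import socket
import datetime

def run_honeypot(host="0.0.0.0", port=2323, logfile="honeypot.log"):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    while True:
        conn, addr = srv.accept()
        conn.sendall(b"login: ")              # present a fake login prompt
        data = conn.recv(1024)                # record whatever the intruder sends
        with open(logfile, "a") as log:
            log.write(f"{datetime.datetime.now()} {addr[0]} sent {data!r}\n")
        conn.sendall(b"Login incorrect\n")    # always refuse; waste the attacker's time
        conn.close()

if __name__ == "__main__":
    run_honeypot()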

7
Q

Cyber kill chain, countermeasures, and limitations

A

A model that breaks a cyberattack into stages so defenders can disrupt it at each step:
Reconnaissance = gathering data
Weaponisation = creating exploit/attack payload
Delivery = transmitting payload
Exploitation = using payload to exploit weakness
Installation = setting up a backdoor for long-term access
Command and control = connecting system to attacker infrastructure
Actions on objectives = achieving attack’s goal

Countermeasures = detect, deny, disrupt, degrade, deceive

Limitations - focuses on technical attacks and assumes the attacker has clear objectives

“Attackers have remarkable persistence, delaying them gives defenders time to identify their methods and plan responses”
Cheswick, 1992
-> Deception
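
An illustrative sketch (the stage-to-countermeasure pairings below are my own assumptions, not from the lecture) of how each kill-chain stage can be matched with a detect/deny/disrupt/degrade/deceive course of action:

# Hypothetical mapping of kill-chain stages to defensive courses of action
KILL_CHAIN_COUNTERMEASURES = {
    "Reconnaissance":        ("detect",  "web-log monitoring"),
    "Weaponisation":         ("deny",    "patching, hardened builds"),
    "Delivery":              ("disrupt", "email filtering, proxy blocking"),
    "Exploitation":          ("deny",    "least privilege, exploit mitigations"),
    "Installation":          ("disrupt", "application allow-listing"),
    "Command and control":   ("degrade", "egress filtering, DNS sinkholing"),
    "Actions on objectives": ("deceive", "honeypots, canary data"),
}

for stage, (action, example) in KILL_CHAIN_COUNTERMEASURES.items():
    print(f"{stage:23} -> {action:8} e.g. {example}")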

8
Q

IDS

A

Intrusion Detection System - monitors a system for attacks via misuse detection (known attack patterns/signatures) or anomaly detection (deviations from normal behaviour)

“We led him on to study his techniques, feeding him false information to waste his time and protect real systems” Cheswick, 1992
-> Learning patterns, jail environment
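
A minimal sketch of the two approaches (the signatures and threshold are invented for illustration): misuse detection matches known attack patterns, while anomaly detection flags deviations from a baseline of normal behaviour.

# Illustrative IDS sketch: misuse (signature) vs anomaly (statistical) detection
import statistics

SIGNATURES = ["' OR 1=1", "/etc/passwd", "cmd.exe"]   # hypothetical known-bad patterns

def misuse_detect(request):
    """Flag requests containing a known attack signature."""
    return any(sig in request for sig in SIGNATURES)

def anomaly_detect(history, current, k=3.0):
    """Flag values more than k standard deviations from the historical mean."""
    mean = statistics.mean(history)
    std = statistics.pstdev(history) or 1.0
    return abs(current - mean) > k * std

# Example: a SQL-injection-style request and a sudden spike in requests per hour
print(misuse_detect("GET /index.php?id=' OR 1=1"))        # True  (signature match)
print(anomaly_detect([95, 102, 98, 110, 105], 400.0))     # True  (anomalous spike)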

9
Q

Hutchings and Pastrana

A

2019, discuss the practice of eWhoring, which defrauds individuals online through fake personas

10
Q

Silk Road

A

A darknet marketplace that facilitated the anonymous trade of illegal goods, such as drugs and weapons, using cryptocurrency

11
Q

Types of organised cybercrime

A

Swarms - loosely coordinated groups with shared goals like Anonymous
Hubs - centralised groups with core members and supporting roles
Traditional organised crime groups that have extended their operations online

“eWhorers capitalise on the emotional aspects of their victims, creating a sense of attachment or trust” - Hutchings and Pastrana, 2019
-> Example of a traditional method of crime that has moved online

12
Q

Countermeasures and consequences of cybercrime

A

Human-focused interventions like warning messages or mass media campaigns
However, these may simply displace crime to new targets or methods

“Awareness and education are essential in equipping individuals with the tools to recognise and avoid eWhoring schemes” - Hutchings and Pastrana, 2019
-> Education for cybercrime

13
Q

Cybercrime in recent times

A

Cybercrime has evolved significantly with more organised and professionalised methods

“Platforms should prioritise user safety by implementing stronger verification processes and reporting mechanisms” - Hutchings and Pastrana, 2019
-> Most cybercrime is profit-driven, though some is ideological; this evolution of cybercrime needs to be matched by platforms

14
Q

“Tragedy of the Commons”

A

When individual actors prioritise their own short-term gains over collective security, degrading the shared resource of overall security

“Buyers generally have no idea whether what they are buying is secure software, so a security lemon market is born—cheaper, less-secure products drive out more secure products.” (Rao et al., 2019)
-> Results of short-term gains and economic pressures

15
Q

Rao et al.

A

2019, explain the importance of open-source software and why vulnerabilities persist due to insufficient incentives for security

16
Q

Gordon-Loeb

A

Cost-benefit model that determines the optimal level of security investment

“Participants like end-users, ISPs, and software developers all make choices that increase their individual reward while decreasing collective trust and security.” (Rao et al., 2019)
-> Reason for security breaches
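
A worked sketch of the Gordon-Loeb idea (the breach-probability function and all parameter values are illustrative assumptions): choose the investment z that maximises the expected net benefit [v - S(z, v)]·L - z, where v is the vulnerability, L the potential loss, and S(z, v) the breach probability remaining after investing z. The model’s headline result is that the optimal investment never exceeds (1/e), roughly 37%, of the expected loss v·L.

# Illustrative Gordon-Loeb calculation with an assumed breach-probability function
import math

v, L = 0.6, 1_000_000          # assumed 60% vulnerability, £1m potential loss
alpha, beta = 0.00002, 1.0     # assumed shape of S(z) = v / (alpha*z + 1)**beta

def S(z):
    return v / (alpha * z + 1) ** beta    # breach probability after investing z

def enbis(z):
    return (v - S(z)) * L - z             # expected net benefit of the investment

best_z = max(range(0, 600_000, 1_000), key=enbis)    # crude numerical search
print(f"Optimal investment ~ £{best_z:,}, net benefit ~ £{enbis(best_z):,.0f}")
print(f"Gordon-Loeb cap: at most (1/e)*v*L ~ £{v * L / math.e:,.0f}")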

17
Q

Spam economy

A

Relies on specialised roles, such as harvesters and online vendors, with each role benefiting from an anonymous value chain
Payment processing bottlenecks allowed law enforcement to significantly disrupt spam operations

18
Q

Law enforcement interventions

A

Targeting bottlenecks such as payment processors can disrupt entire cybercriminal networks, so interventions should focus on exploiting weak points in the value chain

“A defender that invests 10 million hours of work might recover and patch 10,000 bugs, while an attacker investing just 1,000 hours can find one vulnerability—gaps in defenses allow entry.” (Rao et al., 2019)
-> Attackers have an efficiency advantage: defenders must find and patch every bug, while an attacker needs only one overlooked vulnerability

19
Q

Human in the Loop

A

Humans play a critical role in the operation of security systems and are often required to make security decisions, but this can fail due to human error

“Despite a worldwide recession, the computer security industry grew 18.6% in 2008, totaling over $13 billion.” (Walsh, 2010)
-> Economic impact of resultant cyber attacks

20
Q

How to improve practices

A

Better communication and user-centred design are necessary, rather than briefing models that users are less likely to understand

“security education efforts should focus not only on recommending what actions to take, but also emphasise why those actions are necessary.” (Walsh, 2010)
-> Would lead to better practices

21
Q

Over-reliance on technology

A

Can lead to users ignoring critical updates, which leaves entry points open for potential attacks

“users often find ways to delegate the responsibility for security to some external entity … technological (like a firewall), social (another person or IT staff), or institutional (like a bank)” (Walsh, 2010)
-> Could be because they do not believe that they possess the technical knowledge to manage the threat

22
Q

Mirai botnet

A

Demonstrated the interplay between poor user practices (e.g. unchanged default credentials) and systemic design flaws, which allowed attackers to weaponise IoT devices for large-scale disruption

23
Q

Walsh paper

A

2010, discusses how home computer users conceptualise and make decisions regarding security threats

24
Q

Sasse and Flechais paper

A

2005, importance of designing security systems that users can effectively engage with

25
Q

Blame culture

A

Blame culture focuses on individual failures, whereas a systemic approach views errors as consequences of poor design and mitigates failures by addressing root causes

26
Q

Economic constraints for organisations

A

Many organisations adopt ad hoc fixes, e.g. password reminder systems, which may introduce new vulnerabilities

“If secure systems require users to behave in a manner that conflicts with their norms, values, or self-image, most users will not comply” (Sasse and Flechais, 2005)
-> Security should align with business goals, requiring buy-in from all stakeholders

27
Q

Chernobyl disaster

A

A result of operators’ active failures, exacerbated by latent design flaws in the reactor

28
Q

Craggs and Rashid

A

2017, explore the concept of security ergonomics, emphasising the need to integrate human factors into system design

29
Q

Fundamental Attribution Error

A

Blaming individuals for errors without considering systemic or contextual factors

“The more effort placed into better smarter technology the more likely it is that the human is seen as an error” (Craggs and Rashid, 2017)
-> excessive reliance on technology

30
Q

Just culture

A

Emphasises learning and accountability rather than blame. Also encourages open reporting of security incidents

“Being able to recognise and learn from those errors is fundamental in moving socio-technical systems, such as IoT, forward.” (Craggs and Rashid, 2017)
-> Errors should be seen as opportunity to learn rather than place blame

31
Q

Proactive design

A

Anticipate human errors and latent failures by learning from the past (e.g. historical data on phishing attacks)

32
Q

Tay

A

Microsoft’s AI chatbot, Tay, influenced by biased training data, quickly generated racist and sexist outputs