Technology Flashcards
Elden (2010)
Argues that ‘territory’ itself is a political technology: the way the earth is mapped and calculated is a techno-political device (a way of calculating space). We therefore need to pay closer attention to the ways in which spaces are arranged and governed. Contrary to common claims about globalisation, the world has not been de-territorialized.
Bratton (2015)
Argues that the world of nation-states and territorial power is being overtaken by planetary-scale computation.
He highlights the uncertainty of citizens’ national rights in online space: where does a person belong territorially, within their nation-state or within their online networks?
The author argues that the interface of a technology is where its users are spatially linked; implied within this is a sense of overcoming gaps in physical space. A person’s address now lies in their digital traces, and physical addresses matter less. Digital relationships are where we now reside.
As a consequence, Bratton suggests that the urban landscape no longer looks the same: smart cities process huge amounts of data constantly, and cities are increasingly digitally mediated spaces.
We need to rethink how we understand the nation-state and borders themselves. Technology makes us question how much control the nation-state has over individuals online (specifically in relation to data). Online platforms are increasingly displacing ideas of citizenship and national identity.
Gregory (2012)
Argues that it is important to pay attention to the geographies of violence and harm produced by new military technologies.
Technologies like drones are becoming increasingly normalised instruments of violence; they allow the world to be seen from a vertical viewpoint.
“It is precisely the ways in which drones- their technologies, visuality and dispositions- have become part of everyday life that needs the closest scrutiny”
Laney (2001)
Big data = volume, velocity and variety (the ‘three Vs’).
The sheer volume of data enables the creation of new algorithms that search for patterns within it.
The variety of data means that humans cannot comprehend it unaided; it requires non-human forms of calculation. Additionally, more types of data are now being analysed (video, text, speech etc.).
Data is now formed in constant streams: real-time data of actual circulations (velocity).
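A minimal sketch of these three characteristics in code (plain Python, with invented event names) can make them concrete: the generator below mimics a real-time feed whose events arrive continuously (velocity), mix types (variety), and could in principle run unbounded (volume).

```python
import random
import time

# Toy event stream illustrating the three Vs in miniature. All event
# names here are invented for illustration, not from any real system.
EVENT_TYPES = ["clickstream", "retail_purchase", "cctv_frame", "speech_transcript"]

def event_stream(n_events=20):
    """Yield timestamped events one by one, mimicking a real-time feed."""
    for i in range(n_events):
        yield {"id": i, "type": random.choice(EVENT_TYPES), "timestamp": time.time()}

# Streaming consumption: each event is processed as it arrives,
# rather than being collected into a static data set first.
counts = {}
for event in event_stream():
    counts[event["type"]] = counts.get(event["type"], 0) + 1

print(counts)  # e.g. {'retail_purchase': 7, 'clickstream': 5, ...}
```

The design point is the consumption pattern: the consumer never waits for a complete data set, which is what distinguishes a stream from a conventional archive.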
Amoore (2018)
The architecture of cloud computing is becoming ever more closely intertwined with geopolitics, from the sharing of intelligence data to border controls and immigration decisions.
EXAMPLE: The ICITE (Intelligence Community Information Technology Enterprise) programme allows 17 US intelligence agencies to store, share and analyse data. It was developed in response to criticism that intelligence data was not shared between agencies in the run-up to 9/11.
The cloud promises to transform not only what kinds of data can be stored, where and by whom, but also what can be discovered (the politics of possibility).
Importantly, when asked where the cloud is, we are given the spatial locations where huge amounts of data are stored, processed and analysed. BUT Amoore argues that this needs to be looked at critically: cloud analysts attempt to visualize and render perceptible that which could not be brought into view directly.
This has profound critical consequences: knowledge discovery is conducted by automatic, unsupervised processes that cannot be seen by the human eye.
This data archive is, critically, a site of active operation: its algorithms determine huge decisions (who can apply for a credit card, who is flagged as a terrorist, who can cross the border…).
Amoore (2006)
In the immediate months after 9/11, the war on terror was framed as a problem of risk management; it was argued that had controls and surveillance been stricter, the attack could have been avoided.
Two years later, the US DHS announced the Smart Border Alliance, a $10 billion project aimed at restructuring and managing all aspects of security at US air, land and sea ports of entry. The US-VISIT programme represents an example of the proliferation of risk management techniques as a means of governing mobility.
The virtual borders used within the programme are designed to operate beyond the physical borders of the US, aiming to identify the security risks of all inbound travellers before they arrive. This should be understood in terms of biopolitics: the programme is a mobile regulatory site through which everyday lives are opened up to intervention and management.
The biometric border signals a dual-faced phenomenon in the war on terror: a significant turn to scientific and managerial techniques in governing the mobility of bodies, AND an extension of biopower such that the body itself becomes the carrier of the border.
Kitchin (2014)
Argues that ‘big data’ creates a new epistemological approach to making sense of the world: rather than testing theories by analysing relevant data, new algorithms seek to gain insights ‘born from the data’.
Big Data is generated continuously; examples include CCTV footage, records of retail purchases, digital devices recording and communicating the history of their use, and clickstream data.
The main challenges of analysing Big Data are coping with its volume and variety, and the fact that much of it is generated with no specific question in mind.
Analysis of Big Data has now become possible due to high-powered computation and analytical techniques rooted in research concerning AI and algorithms.
Crucially, this analytics-led approach is changing how knowledge is produced, business is conducted, and governance is enacted.
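A minimal sketch of what insights ‘born from the data’ can mean in practice, using synthetic data and plain Python: an unsupervised algorithm such as k-means recovers groupings without any prior hypothesis about what the groups are or why they exist.

```python
import random

# Synthetic 2-D data with two latent groupings; no labels, no hypothesis.
random.seed(0)
points = ([(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(50)]
          + [(random.gauss(5, 1), random.gauss(5, 1)) for _ in range(50)])

def kmeans(data, k=2, iterations=10):
    """Plain-Python k-means: structure emerges from the data itself."""
    centroids = random.sample(data, k)
    for _ in range(iterations):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for x, y in data:
            nearest = min(range(k),
                          key=lambda c: (x - centroids[c][0]) ** 2
                                        + (y - centroids[c][1]) ** 2)
            clusters[nearest].append((x, y))
        # Update step: each centroid moves to the mean of its cluster
        # (an empty cluster keeps its old centroid).
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c)) if c else old
            for c, old in zip(clusters, centroids)
        ]
    return centroids

print(kmeans(points))  # two centroids emerge, near (0, 0) and (5, 5)
```

No theory told the algorithm that two groups exist; the pattern is inferred from the data alone, which is the epistemological shift Kitchin describes.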
Graham and Shelton (2013)
Defining big data sets by their volume alone can be problematic; the word ‘big’ is always relative (data sets considered small today would have counted as huge half a century ago).
As digital social data has become increasingly ubiquitous, many academics have turned their attention to harnessing these massive data sets in order to produce purportedly more accurate and complete understandings of social processes.
Graham (2005)
Software sorting techniques constitute critical political sites. These technologies must be at the centre of any attempt to conceptualize the formation, maintenance and experience of social and geographical inequalities within contemporary societies.
We need to address and excavate the power of code and software sorting techniques in continually orchestrating the geographies of inequality (mobility, consumption, security etc.). Such techniques have become powerful yet largely invisible forms of techno-social power.
Ultimately, sorting practices must become transparent if we are to critically evaluate the politics of inequality.
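A hypothetical sketch of the kind of software sorting Graham describes: a simple, invisible scoring function routes travellers into unequal treatment. Every field, weight and threshold below is invented for illustration and drawn from no real system.

```python
# Hypothetical software sorting at a border: a few lines of opaque code
# assign people to differential treatment. All fields and thresholds
# here are invented, not taken from any actual deployment.

def sort_traveller(profile: dict) -> str:
    """Route a traveller into a 'lane' via a risk score buried in code."""
    score = 0
    if profile.get("watchlist_match"):
        score += 50
    if profile.get("visits_to_flagged_countries", 0) > 2:
        score += 30
    if profile.get("years_of_travel_history", 0) < 1:
        score += 10
    # The rules doing the sorting are invisible to the people they sort.
    if score >= 50:
        return "detain_and_question"
    if score >= 20:
        return "secondary_screening"
    return "fast_track"

print(sort_traveller({"visits_to_flagged_countries": 3,
                      "years_of_travel_history": 5}))  # -> 'secondary_screening'
```

Making rules like these visible and contestable is, on Graham’s argument, a precondition for evaluating the inequalities they produce.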