EXTRACTIONS: Digital infrastructures and the public good
Presentations and discussions
- Kate O’Riordan (University of Sussex): “Revisiting the Biodigital”
- Susanne Bauer (UiO): “Bioeconomic Publics? Multispecies Extractions for Re-engineering Salmon and Human Metabolism”
- David Ribes (University of Washington): “Prospecting (in) the data sciences”
- Cat Kramer and Zack Denfeld (Office of Life+Art/UiB): “Creative Strategies for Speculative Policy on the Bioeconomy” – Art interactive workshop
- Ana Delgado (UiO): “Bioprospecting: On microbes, infrastructures and the public good”
- Tommas Måløy (UiO): “Valuation and genomic databases: a case study on the aqua genome project”
- Ageliki Lefkaditou (Norwegian Museum of Science and Technology): Visiting the “Folk” exhibit – “Easy extractions? Mouth swabs, socio-genomics, and the new/old public good”
- Round table final discussion (Day 2)
All presentations during the workshop were concerned with digital ‘collections’ of different kinds and with their relevance for ‘ordering’ practices. All participants referred to collections that are ‘infrastructured’ in accordance with certain ‘inference tools’ aimed at producing a certain order within research or policy making. In every case study discussed, the public (as a political figure/agency) was invoked as a key element needed for the making of orderings, yet simultaneously neglected as a political agent.
|‘Extraction’ refers to the action in which an element is taken out of its place without being replaced, leaving some traces behind.|
In this context, ‘extraction’ in research is understood as a process in which the public is ‘extracted’ and eventually replaced by ‘abstractions’ of what the public should be. What is ‘extracted’ is the public as a political figure (with political agency), never to be replaced. This was shown empirically across all presentations, in which issues of common concern such as nutrition, sex, antibiotic resistance, big data and genetic mapping were ‘infrastructured’ (made researchable) in a way that entailed a chain of inferences (i.e. extractions). What ‘extraction’ acts upon is not just the natural element (bodies, blood, genetic information, fish, metabolic insights, etc.) but also the ‘public’ as a political figure. In other words, by invoking the common good within digital infrastructures, the public is set up as an element to be extracted.
|Absent publics: Where is the public in research projects that are developed in the name of the public? What is it to make a science for someone who is not there, and what happens when they are there? (Cat)|
Key ideas discussed within the workshop:
The ‘public good’ appears recursively within practices of ‘digital extraction’ and data-driven science, where it plays an ambivalent role: on one hand it is seen as ‘ethically good’, on the other as ‘economically good’. In this ambivalent use, ‘the public good’ can be mobilized for innovation- and market-driven research and infrastructuring, presenting potential economic gains as the main goal and even as ethically good. This distinction matters because the general, broad sense of the public good risks being reduced to economic terms. It is in this sense that the concept of ‘extraction’ becomes important, as it makes this threat visible.
‘The absent public’ is key in innovation policies, but although it is mentioned, this is usually done in an unspecific way that does not translate into concrete actions. Research funding projects in Europe try to address the public by describing how they respond to ‘grand social challenges’ or ‘societal needs’. However, the public is set beforehand and, although invoked, ends up excluded from the definition of such grand challenges.
Extractions in nature and extractions of the public are intertwined processes that de-contextualize nature and the public, moving them from their reality of origin to a virtual one.
What does data-driven research do? It produces hybrid elements of analysis, formed through the formulation of predictions and explanations.
Can extractions be seen as processes of ‘mediation’? Yes, as processes of inference and translation that may induce transformation and leave traces behind. Extraction is conducted by the ‘inference tools’ at play, the way in which relations are made, and the conventions under which tools and actors readdress each other.
Extractions happen at different levels and scales: from DNA, cells or organisms to landscapes and populations, but also in attempts at scaling up from the lab to industry.
The process of defining universal or ‘domain logics’ is a ‘domination attempt’ that determines which domains are the right ones, which are not, and which should be excluded.
Images are extractions that try to provide a visualization of the world. This is very relevant for the future interpretation of the Norwegian bioeconomy, as images are used to illustrate, justify and convince audiences and experts. What implications and effects does the design of visuals have on science? How can visuals become effective, reliable and complete? Visuals can support different interpretations and be subjective; do images not have the power not only to extract data, but also to transform it?
Key questions during the discussion:
Who owns and/or manages data collections?
How is ownership projected towards the future? What will happen with data collections?
What is left behind and never replaced when extracting information?
Where does trust from the public come into all of this?
How can the public trust the domains that are created, justified as good and used as the best alternative?