


ThingsCon December 2019: NGI workshop ‘Digital Welfare through Trust Frameworks & Disposable Identities’


ThingsCon 2019

12 & 13 December; Rotterdam, the Netherlands. NGI Workshop on Trust-frameworks

‘If we are going to look for ways to build a better, more just, more dignified and respectful future, it is imperative that we consider governance mechanisms that can put communities and governments in positions of negotiating power with respect to optimization goals in the design of algorithmic systems. This implies identifying what data infrastructures must be owned or controlled by citizens, groups, communities and governments, what skills and expertise must be cultivated and what governance mechanisms must be put in place in order to be able to negotiate conditions of engagement with powerful platform economy actors.’ (Irina Shklovski, output from the session)

Workshop by Rob van Kranenburg (NGI Forward) and Theo Veltman, Rainmaker Innovation/Program manager, gemeente Amsterdam with Manon den Dunnen (strategic advisor Police Innovation) on Thursday December 12, 14-16:30h as part of the track ‘How to shape a responsible society’.1

1 Workshop: digital trust infrastructure. An independent and open digital trust infrastructure, case study The Netherlands. How do we get there? The key question of how to safeguard (digital) autonomy has no simple answer. The time to act is now. It is necessary to act ‘now’ in order to continue to make the most of the benefits of technology without jeopardizing individual autonomy. It would be wise to limit the monopoly position of large organizations, the digital platforms, and to strengthen the position of the individual, while this is still possible. This will require a concerted effort, including at the international level. Read more: Data makes the world go round; Proposal for research into three policy instruments designed to strengthen (digital) autonomy.


The workshop consisted of two lightning talks and a general discussion among 25 participants (designers, civil servants from ministries and the City, artists, independent policy advisors, and academics from Industrial Engineering). As it was organized at the ThingsCon event, large industry was largely absent or underrepresented.2 The group did include several SMEs, self-employed designers, consultants and advisors. Of the participants, 11 agreed to be added to the dedicated Identity mailing list (bringing the total to 146).

Lightning Talk 1: Why do we need a Trust-framework from a societal perspective?

To safeguard constitutional values, the notion of an inclusive society and a focus back on real people and their needs, wants and dreams.

Manon den Dunnen, working at the Dutch Police as a strategic specialist on digital transformation: Today I want to share with you the concept of the Trust Framework, or Digitale VertrouwensInfrastructuur (DVI, Digital Trust Infrastructure) as we call it. As it has been quite an exploratory journey for us, I can imagine that you have a lot of questions and comments.

For me this journey started when I became aware of the shadow side of data. The way in which personal data is collected and used plays an important role in this. Moreover, poor security of digital infrastructure, devices and personal data facilitates identity theft and cybercrime. Our increasing dependence on this infrastructure makes us extra vulnerable.

To summarize, there are three issues involved:

• Control over your personal data. Data is often collected without consent, or with disguised consent, causing citizens and other data owners to lose control over (the use of) their personal data. It is not transparent what data is collected, or how it is combined, out of sight, into a profile for personalized services or decision-making procedures.

• Availability of data for social goals. While data is shared easily in exchange for digital services such as Google’s, it is very difficult for citizens and public organizations to access personal data for social goals. Obtaining personal data in a transparent and responsible way is laborious and complex.

• Data exchange is costly and inefficient. The current way of exchanging data is complex and inefficient: every provider of digital services has to arrange the identification, permissions and logging of transactions itself.

2 The next WP4 NGI FORWARD event will focus on the Industrial Internet and related industry domains: the 10th Living Bits and Things 2020, the leading digital transformation event in the Central and East Europe (CEE) region, in Portorož, Slovenia. The event gives attendees current insights into and understanding of IoT (internet of things), AI (artificial intelligence), blockchain and digital transformation technologies, raising hot topics and future trends and challenging professionals and experts from digital and traditional industries. https://www.livingbitsandthings.com/about-event.html

As an example of the added value of data, and of the importance of a Trust Framework, I would like to introduce Jannie. She lives in an apartment complex, has difficulty walking and sleeps with an oxygen bottle. The neighbour at number 38 down the street has her spare key. Normally nobody needs to know that, but if there is a fire, it is very important for both Jannie and the fire brigade that this information is known. So how do you ensure that this information is visible only to the fire brigade, and only if there is a real fire, without disclosing all her other personal and medical information? In short, it’s about:

Who may do what with my information, under what conditions, and for what purpose?

This seems simple, but there is quite a lot involved. For example, how do you know for sure that it really is Jannie and the fire department, or that Jannie still lives there? In addition, it is impracticable and undesirable for the fire brigade to make one-on-one agreements with all residents, register all of this, and carry out checks.

On the one hand we want to protect public values, on the other hand seize opportunities. To realize this, the government must take responsibility. There is a need for a trust infrastructure that increases the availability of data in a safe and privacy-friendly manner, while at the same time preventing abuse through control and transparency. For this we need a set of building blocks, with the government taking a role in safeguarding them. That does not mean that there will be one single solution.

As a matter of fact, several solutions for all the building blocks of the core Trust Framework services are already available, such as DigiD and iDIN for identification, consent registers, and, among others, IRMA for attribution and authorization. These building blocks have been tested against several use cases. Now they have to be made operational in mutual coherence and tested again against (new) use cases. At the same time, a legal entity and an exploitation model will have to be developed for governance and (operational) management.

It is costly and complex to make one-on-one agreements with everyone involved, and it also makes it difficult to safeguard constitutional values. That is why the DVI works with standardized consent agreements. The idea is that these are drawn up in a collaborative in which a neutral party, e.g. Bits of Freedom in the Netherlands, acts on behalf of the residents. In the example of Jannie, a kind of 112-alert app is created, for which you can register and in which you can indicate which data you want to share. The great thing about this solution is that anyone who wants to can participate, but it is not necessary: the fire department does not know who does or does not participate, so there is no pressure. In addition, everything is logged, so that it can be checked transparently. The DVI does not contain any data; it only checks the permissions and conditions, after which the two parties can exchange data with each other. You can imagine that many different applications are possible on the same infrastructure; however, time is too short for more examples.
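The permission-checking role described above (the DVI holds no data, only verifies who may do what with whose data under which condition, and logs every check) can be sketched roughly as follows. This is a minimal illustration only; the class and function names are hypothetical, not part of any actual DVI, DigiD or IRMA API:

```python
# Minimal sketch of the DVI consent check: the broker stores permissions
# and an audit log, never the personal data itself. All names here are
# hypothetical illustrations of the workshop's description.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Consent:
    subject: str      # verified identity of the data owner, e.g. "jannie"
    requester: str    # verified identity of the requesting party
    attribute: str    # which data item may be disclosed
    condition: str    # condition under which disclosure is allowed
    purpose: str      # purpose the data may be used for

@dataclass
class ConsentRegister:
    consents: List[Consent] = field(default_factory=list)
    audit_log: List[str] = field(default_factory=list)

    def check_exchange(self, subject, requester, attribute, condition, purpose):
        """Return True iff a matching consent exists; log every check."""
        granted = any(
            c.subject == subject and c.requester == requester
            and c.attribute == attribute and c.condition == condition
            and c.purpose == purpose
            for c in self.consents
        )
        self.audit_log.append(
            f"{requester} -> {subject}: {attribute} "
            f"[{condition}/{purpose}] {'GRANTED' if granted else 'DENIED'}"
        )
        return granted

# Jannie registers via the (hypothetical) 112-alert app:
dvi = ConsentRegister()
dvi.consents.append(Consent("jannie", "fire_brigade",
                            "spare_key_location", "confirmed_fire", "rescue"))

# Only a confirmed fire satisfies the condition; the actual data is then
# exchanged directly between the two parties, never stored in the DVI.
assert dvi.check_exchange("jannie", "fire_brigade",
                          "spare_key_location", "confirmed_fire", "rescue")
assert not dvi.check_exchange("jannie", "fire_brigade",
                              "spare_key_location", "routine_visit", "rescue")
assert len(dvi.audit_log) == 2
```

Because the register only answers yes/no and logs each request, transparency comes for free: anyone auditing the DVI sees who asked for what and under which condition, without ever seeing Jannie’s data.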

The Trust Infrastructure consists of a generic facility providing core services such as identification, authentication, consent and security. Next to this, it consists of data collaboratives that ensure responsibly functioning data markets. These organizations develop data-sharing agreements, draft the related consent agreements, and manage the granted permissions. For the verification of identities and data-sharing permissions, these organizations use the generic provisions of the DVI. Their incentive to do so is that it allows them to access data in a GDPR-compliant, low-cost and easily accessible way. In return, they must meet requirements in terms of transparency and privacy & security by design.

The program to realize this infrastructure will start the design phase in January 2020, based on prioritized use cases like the one described. Feedback from, among others, this meeting will be included! It will be a continuous learning cycle.


Lightning Talk 2: Why do we need a Trust-framework from a technical perspective?

To safeguard the ‘disposability’ of identities, that is, to provide a trustworthy engagement experience for all stakeholders, in such a way that the combination of seamless connectivity and personalized support is not built on continuous, real-time tracking and tracing of full personal identities, but only on attribute-based relational identities when engaging with services.

As already developed and tested in ongoing projects (DECODE, DOWSE by dyne.org), we propose to explore the feasibility of an environment that is currently not widely available. As a technical framework we propose a provable-computing triangle of the router (dowse.eu), the applications (wearables, home applications, connected cars and bikes, and smart-city furniture such as traffic cameras, lantern poles and smart recycling bins) and dedicated smartphones (running the Estonian e-card). With this we can demonstrate privacy-preserving and secure gateway ownership between the networks, offering the owners of these applications full data ownership plus insight into their potential future behaviour by running our analytics in the devices (at the edge), based on the fact that all three run zenroom.org smart contracts on embedded SIMs.

As a next step, we will demonstrate that new applications can be added to our ecosystem using a strictly attribute-based solution, requiring no full disclosure (of identity) beyond age, ability to pay for the service, and legal compliance in terms of insurance and accountability, so that digital services can be delivered to authenticated users without the full set of identifying data being shared.

This is what we mean by Disposable Identities. Importantly, this eliminates risks of leaks and unauthorized reuse of personal data by those third parties providing the service.
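A toy sketch of that attribute-based idea, with hypothetical names and data: a real scheme (e.g. IRMA credentials or a zenroom smart contract) would use cryptographic attribute proofs rather than plain values, but the shape of the disclosure is the same:

```python
# Illustrative "disposable identity": the service receives only the
# attributes it needs plus a random per-engagement identifier, never the
# underlying civil identity. Hypothetical names and data throughout.
import secrets

FULL_IDENTITY = {
    "name": "J. de Vries", "date_of_birth": "1952-03-14",
    "address": "Apartment 12", "over_18": True,
    "can_pay": True, "insured": True,
}

def disposable_identity(full_identity, required_attributes):
    """Disclose only the required attributes under a one-off pseudonym."""
    return {
        "pseudonym": secrets.token_hex(8),  # fresh for every engagement
        "claims": {a: full_identity[a] for a in required_attributes},
    }

# A service only needs to know: adult, able to pay, insured.
presented = disposable_identity(FULL_IDENTITY, ["over_18", "can_pay", "insured"])
assert "name" not in presented["claims"]   # no full disclosure
assert presented["claims"] == {"over_18": True, "can_pay": True, "insured": True}

# Two engagements are unlinkable at the identifier level:
again = disposable_identity(FULL_IDENTITY, ["over_18"])
assert presented["pseudonym"] != again["pseudonym"]
```

Discarding the pseudonym after the engagement is what makes the identity ‘disposable’: a leak at the service side exposes only the yes/no claims, not the person behind them.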

The first reaction to Disposable Identities (DI) in ETSI, the European standards organization, was to start with a Technical Report in an existing ETSI Technical Committee (TC). Technical Committees, which are made up of representatives of ETSI members and led by a ‘Rapporteur’, draft most ETSI standards. ETSI members may participate in any group and work activity (other than certain security-related work, where participation is controlled by the ETSI Board). Any proposal to start an item of work, such as creating a new standard or updating an existing one, must come from at least four ETSI members and be agreed by the relevant standards group.

Initially the course was that the proposal could grow inside an existing ETSI Technical Committee such as TC CYBER. If the proposal gained momentum, more participating ETSI members (in TC CYBER) could propose to support it and create a new Working Group in TC CYBER (WG ‘Disposable Identities’). The invitation from ETSI proposed attendance at either TC CYBER (22-24 January, at ETSI in Sophia Antipolis, France) or TC ESI (Electronic Signatures and Infrastructures; 28-29 January at ETSI), as the latter works on the trust services for eIDAS, and identities are very close to its work. Based on the feedback received from the Policy Officer of NGI FORWARD, Loretta Anania, and the originator of eIDAS, Andrea Servida, the choice was made to go with TC ESI.


Discussion

The discussion raised three main issues:

1. Problem
a. Whose problem are we solving?
b. What is the problem and why?

2. Personalized data management
a. How do we reconcile a focus on personalized data management?
b. Is profiling always ‘bad’; is it not also building ‘reputation’?
c. How can we ‘live a life’ without continuously holding a ‘meta-position’, checking on our data behaviour?

3. How can we shift the discussion from digital inclusion to digital welfare?

PROBLEM

1. Whose problem are we solving? Who are the stakeholders and what exactly is the problem? We are facing the full effect of non-reciprocity (TCP/IP, WWW, IoT) on institutional actors, and both ‘systems’, the commercial shareholder-value service providers and the enablers of current governments (regulation, the monopoly on violence, defining the concrete definition of ‘democracy’), must be re-evaluated as enablers of inclusive governance. Inclusive then entails persons, machines (objects) and nature as stakeholders. It was contested that ‘objects’ are a stakeholder.

2. What is the problem and why? a. Losing democratic influence, meaning: the one with the most power (financially and network-wise) wins. More and more companies like Facebook, Google and Amazon (GAFA) are growing into ‘too huge to fail, or dismiss, or disagree with’. Let’s face it: the CEO of any big company will have the ear of a secretary of state, because of (1) employment, (2) economic growth, and/or (3) network: support in a political and/or post-political career. Nowadays not even an individual European country has a fair chance against those companies; even Europe has limited influence. Fines of 100 or 200 million or more are a farce in relation to the revenue made.

b. Losing individual freedom by being continuously monitored and steered into ‘appropriate’ and desired behaviour or decisions. Commercial companies and politicians have always tried that; the difference is that it can now be done at the individual level. Imagine when quantum computing is the norm at all levels of society, combined with AI and nanotechnology. All good things in themselves, making life more comfortable and safer. But as with all good things, there are dangers when they are used by people who want to have their way at all cost.

c. Governments are losing control. Tax is paid in order to have certain functions performed that are beneficial for all, such as safety, or the freedom to decide, think, move and develop, for each and every individual. This is apart from the discussion about how governments themselves are behaving: giving away control to companies that execute services for government without the democratic control of council or parliament, and giving the rule more importance than the individual.

Democracy has its faults and weaknesses. It depends on people, with all their pros and cons. Those people, however, are elected; they had to go through a system in order to get there, AND we are able to get rid of them if we need to (although that is really difficult). We cannot get rid of Google and the like, since (1) they are companies with lots of money and (2) we depend on their technology. It has been proven again and again that those companies decide based on what gives them the best options to grow in size, revenue, market share and influence. For instance, without a fifteen-year-old saying what many people increasingly felt, there would be no serious discussion about climate change; it is likely that there would be no ‘Green Deal’.


That is what humans can do, and that is what can be done within a democracy governed by elected people, who depend on the people choosing them without knowing what they will choose. You need to, at least now and then, think about what the electorate wants and needs. Facebook and the like do not need to, and they most likely will not when it comes down to it. The case of Cambridge Analytica is only one example.

PERSONALIZED DATA MANAGEMENT

3. How do we reconcile a focus on personalized data management with non-zero history (context) and relational data? If I delete or withhold a personal item, I may hamper someone else in completing a task, or deprive them of a piece of data that is information (context-related) about another person.

4. Is profiling always ‘bad’; is it not also building ‘reputation’? Profiling for comfort, i.e. advice based on what you did in order to make life easier, is OK, provided that the profile being made is your choice, and is manageable in such a way that it is not shared without your consent. Profiling without you knowing it, or without you being aware of the range of companies and/or persons with whom that profile can be shared (e.g. under PSD2), is off limits. It endangers freedom of choice, freedom of movement without being monitored, etc.

5. How can we ‘live a life’ without continuously holding a ‘meta-position’, checking on our data behaviour? What kind of person does this imply? We seem to assume quite rational, Habermas-like consenting and communicating persons, but how do we include people who cannot read or write, or who have trouble understanding formal language and signs? How do we digitally include everyone?

Managing your personal data is important, even if you don’t understand every aspect of the digital society and its risks, just as being able to manage your own money is important, even though… You can always get support. At the moment nobody is in charge of your personal data. You are nudged into giving it away (cookies: making it complex and tiring to choose any setting other than ‘accept all’) or forced to (if you want to have access to services). We run the risk of becoming more and more a puppet on a digital string.

There is no need to give away your personal data. There are already systems that will give the certainty that you are who you say you are without you needing to give away all your personal data (e.g. IRMA, itsme). Similarly, we need an infrastructure in which we can avoid being traced and profiled without creating uncertainty and unsafety for others, while at the same time creating more ease and comfort in dealing with companies, by being able to set rules for access to data, the home, etc. It would also create more efficiency and safety through disposable identities in situations where persons and objects each need an identity in order to tell the story (e.g. after an accident), where every object and individual needs to be identified. The same goes for parts of airplanes: you need to be sure that a part is new, not an old one being revamped. You also want to know where parts and people (military and officers) are, etc. Using disposable identities you can do that without giving away all your personal data, even if you are a police officer or a soldier doing your job.

DIGITAL WELFARE


6. How can we shift the discussion from digital inclusion to digital welfare, for each and every one, independent of social position, race, religion, cultural background, etc.?

The spider and the web

We agree that we cannot go back into an analogue world of no digital connectivity. We agree we cannot go forward with our current notions of identity and decision-making systems. We realize that the political power issue is not identity, but taxes. Identity in the hands of non-state actors is the end of the business model of the state. The issue is thus extremely explosive and vital, as it touches the heart of society and its workings.

As citizens we have a choice. We never had more agency as non-formal but potentially systemic actors. We can be the spider in the web. This implies that all end connections of the web have a clear view on our full personalities. Or we can build our desired connected world, not on the endpoints but on the intentionally combined (if ‘you’ consent) separate connections with each and every endpoint (any service).

The choice is ours.


Figure 1 Drawing by Raffaela Rovida

We point all participants to the NGI Open Calls to look for funding of innovative projects in this domain:

https://www.ngi.eu/opencalls/


Input from participant Gael van Weyenbergh

https://www.transformabxl.be/community/gael-van-weyenbergh-co-founder-meoh/

In contrast to a risk-averse approach to trust that deals with threats, we suggest also looking at the opportunities of trusting others, and at how we could leverage synergies and multi-stakeholder cooperation. It is indeed critical to approach trust from a risk-averse point of view in a world that is increasingly dominated by social distrust and by binary antagonisms that cultivate an in-group/out-group mentality. Nevertheless, each and every one of us is also part of support networks made of trust, emotional resonance and reciprocity, be it with our family, close friends or colleagues. These trust networks largely overlap and are in fact strongly intertwined with each other, so that they cover the entirety of human society (six-degrees networks).

Trust is an inherent risk that people take in order to achieve a positive outcome. Therefore trust can be tackled from a risk-aware point of view, where people mitigate the risk by teaming up with a small set of peers they know well. Trusting others is indeed a fragile experience that can either take a turn for the worse or help build strong relationships. In turn, these can lead to high levels of social engagement and to great outcomes. Can technology help foster and further these trust networks at scale? Tech- and business-driven approaches to social networking do not implement a community-centric approach for users to build trust and confidence among themselves. As a result, there is a void waiting to be filled. Transposing the social dynamics of cohesive communities into the online realm is what we could aim to achieve with an approach to social networking informed by the social sciences and complexity theory.

Input from participant Jennifer Veldman

https://jenniferveldman.org

In this salon new issues for consideration were addressed:


Transparency is distrust. Trust has a foundation in relations: I trust you, you trust him/her, therefore I trust him/her. By definition, wanting to know everything about him/her is a lack of trust. Trust is about being willing to take a risk; transparency is about minimizing every chance of risk.

The Dutch insurance system provides an excellent example. Insurance by default is based on covering risks. Every Dutch citizen is required by law to have health insurance; this way, as a national collective, we have a savings account with which we can cover health costs for those who need it. In order to understand the costs they cover, insurance companies make calculations. It has always been this way; however, now that we are able to collect more data, these calculations become more precise. What insurance companies regard as ‘good behavior’ (eating healthily, driving cleanly, but also having no serious illnesses) is rewarded. This means that those who need insurance the most, those with chronic or life-threatening diseases, pay the highest bill and receive, relatively, the least, even though collective insurance is required precisely so that these people can also benefit from it.

Trust is built on the same foundations: sometimes your trust is rewarded, sometimes it’s not. This is a risk you are willing to take. The more data we have, the more we think we know, yet the context of data, good or bad, is often lacking. Good or bad luck will thus continue to be a vicious circle.

Individual decision-making. In line with empowering people, we tend to build digital alternatives that put the power with the individual: the person is in control of his or her own data. Although this is a noble idea, in practice it may have a different outcome, for several reasons: 1) We may not be able to oversee the consequences of sharing data, any data, now or in the future, no matter how well trained we are in this subject. 2) We may not want to make this decision over and over again. Deciding whether to share data is no longer something we have to do once in a while, but a couple of times a day; the mere frequency of it is overwhelming. 3) We are already overwhelmed by global issues that may threaten our livelihood. The urgency, size and speed of climate change, and of the transformation from a fully analogue world to one covered completely by a digital layer, with all of its warnings, are too big to grasp for just one person, let alone the daily challenges of surviving school, work, family and everything that comes with them. Increasing burn-out rates at ever younger ages show that we are already under too much pressure and/or do not know how to cope with life. Can we really ask these people to stress about yet another thing, handing over more responsibility for something we do not really understand, with the possibility (not even the certainty) of dire consequences?

Learned helplessness. Martin Seligman and others have studied learned helplessness. Put a dog in a cage whose floor is wired with electricity. When the dog receives shocks on the right side of the cage, it quickly learns to stay on the left side. Repeat the experiment on the other side and the results are similar: the dog learns to keep itself safe. But once the shocks no longer come from one side and can come from every part of the cage, the dog becomes confused, panics, and eventually lies down in the cage, defeated, surrendering to the shocks. Even when the cage door is opened and the dog has a way out, it stays in the cage, defeated. People react similarly: once they have learned that they are punished, or left unrewarded, for their efforts towards a better life, they will no longer rise. It takes time, effort and trust to rebuild the will to stand up for oneself. The rise in strikes is a sign that we are on our way to getting up. But with problems on such a big, global scale, will we endure, and have the patience to trust in results, even though they may be far away in time or place?

Input from participant Irina Shklovski

https://pure.itu.dk/portal/en/persons/irina-shklovski(e44f9c2b-6ec3-4e19-9391-f68b30f6b88e).html

There are two main assumptions of the NGI framework that need to be reconsidered.


First, there is the assumption that individuals have the right to disclose “their” data. However, if disclosure of data reveals information about more than one person, at times even without anyone being directly aware of this, then how can we make this assumption? Even shifting from “my data” to “data about me”, as Peter Bihr proposed in his closing keynote at ThingsCon, does not address the issue. Data are by nature relational, not individual, so handing the individual responsibility for disclosure decisions is not a viable approach.

Second, there is the assumption that surveillance (a range of commercial and government stakeholders collecting too much data about people with impunity) is the main problem that needs addressing. Solutions to this issue range from encryption to proxies to data-minimization approaches. The notion of a disposable identity also fits into this range of solutions, which focus primarily on maintaining secrecy as the primary objective.

I propose to reframe the problem: away from worrying about how much data someone might be disclosing, or to whom, and towards considering who has a say in subsequent uses of these data. Most importantly, if data are intended for use in algorithmic systems with the purpose of creating greater efficiency and precision, the question is what the goals of such optimizations are, and who sets these goals (as well as the premises and assumptions on which they are based). By focusing too narrowly on the problem of data control and maintaining secrecy, we overlook the fact that in many ways it does not matter to algorithmic systems whether individuals are identifiable or not, as long as the efficiencies of the market-driven goals of profit are maximized. If we are going to look for ways to build a better, more just, more dignified and respectful future, it is imperative that we consider governance mechanisms that can put communities and governments in positions of negotiating power with respect to optimization goals in the design of algorithmic systems. This implies identifying what data infrastructures must be owned or controlled by citizens, groups, communities and governments, what skills and expertise must be cultivated, and what governance mechanisms must be put in place in order to be able to negotiate conditions of engagement with powerful platform economy actors.