Gresham College Lectures

Data Protection for Thrillseekers - Dr Victoria Baines

April 17, 2024
Gresham College

We increasingly share with online services intimate details of our lives, such as mental health and reproductive data. Far from being a ‘tick box’ legal exercise, data protection is about fair and responsible use of our personal information.

It gives us rights which we are entitled to exercise against mega corporations, governments, and anyone who processes our data.

It’s time to get empowered. Because if we don’t use it, we might lose it.


This lecture was recorded by Dr Victoria Baines on 18th March 2024 at Barnard's Inn Hall, London

The transcript and downloadable versions of the lecture are available from the Gresham College website:
https://www.gresham.ac.uk/watch-now/immigration-detention

Gresham College has offered free public lectures for over 400 years, thanks to the generosity of our supporters. There are currently over 2,500 lectures free to access. We believe that everyone should have the opportunity to learn from some of the greatest minds. To support Gresham's mission, please consider making a donation: https://gresham.ac.uk/support/

Website:  https://gresham.ac.uk
Twitter:  https://twitter.com/greshamcollege
Facebook: https://facebook.com/greshamcollege
Instagram: https://instagram.com/greshamcollege

Support the Show.


Good evening, everyone. I have a question for you. Do you consider yourself to be a thrill seeker? Are you the kind of person who engages in extreme online activities, like sharing personal information with others in order to receive their approval?

The Gresham College lecture that you're listening to right now is giving you knowledge and insight from one of the world's leading academic experts. Making it takes a lot of time, but because we want to encourage a love of learning, we think it's well worth it. We never make you pay for lectures, although donations are needed. All we ask in return is this: send a link to this lecture to someone you think would benefit, and if you haven't already, click the follow or subscribe button from wherever you are listening right now. Now let's get back to the lecture.

Or perhaps you're the kind of person who uses apps to build emotional connections across the ether and enhance your physical pleasure. Perhaps you're someone whose habits are formed by the quest for rewards and dopamine hits, or who uses technology to experience emotion, excitement, and an influx of feeling? Would you see yourself as a risk taker? Someone who willingly exposes themselves to the possibility of loss or injury; who flings their hard-earned cash through thin air; who broadcasts their opinions and shows solidarity for others; who checks in on social media at the airport bar or at a holiday destination, thereby telling potential burglars that they're not at home; who entrusts their personal information to numerous people they have never met; and who permits those same people to track their movements? Now, of course, there are very few of us who have not done at least some of these things online. Indeed, I would go so far as to say that we are all now digital thrill seekers and risk takers to some degree. Managing the risks that we take in pursuit of those thrills entails protecting their digital components, and as users, we are partly responsible for this.
There are measures that we as individuals can take to improve both our security (that's who can have access to our accounts and our devices) and our privacy (what we share and who can see it). Now, we looked at both of these in my lectures last year, and if you want to know more about those topics specifically, please do check out my lectures on cybersecurity for humans and on encryption. By signing up to services such as social media and online retailers, we agree to their terms and conditions, their Ts and Cs. We sign a contract with them in which we consent to their processing our personal data. And this means that those service providers also have a responsibility to protect this data. Data protection regimes and regulations enforce that responsibility, and they also give us rights as data subjects, the humans described by that data. Knowing and exercising those rights is not a trivial matter, but it is possible. And for us, just as much as for the organizations that process our data, information is power. Our digital information is of value to a diverse array of service providers, retailers, and advertisers. When we supply our email addresses to an online shop or service, it gives them a means to send us offers and updates. If I give them my date of birth, they can target tailored offers to me around my birthday. Marketing emails often include pixels: HTML code that tracks when they are opened and also when someone clicks through to a website. These tools give brands the ability to measure engagement with their advertising campaigns. Search engines, social media, video-sharing platforms: they store the content we share, but also data about the content we view, engage with, and search for. And many flavors of online provider make extensive use of cookies, small text files that are downloaded onto your device when you visit a website.
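To make the tracking-pixel idea concrete, here is a minimal sketch in Python of how one is typically constructed. The tracking domain, URL parameters, and helper function are all hypothetical, invented for illustration; real marketing platforms generate equivalents of this automatically.

```python
# A sketch of how a marketing email embeds an invisible tracking pixel.
# The domain and parameter names are made up for illustration.
import uuid

def tracking_pixel(recipient_id: str) -> str:
    """Return a 1x1 image tag for embedding in an HTML email.

    When the recipient's mail client fetches the image, the sender's
    server logs the request, revealing that (and when) the message
    was opened, and by which recipient.
    """
    open_token = uuid.uuid4().hex  # unique token for this individual send
    return (
        f'<img src="https://track.example.com/open.gif'
        f'?r={recipient_id}&t={open_token}" '
        f'width="1" height="1" style="display:none" alt="">'
    )

html = tracking_pixel("subscriber-42")
print(html)
```

The giveaway is the request itself: the moment the email is rendered, the mail client fetches the image from the sender's server, handing over a timestamp, an IP address, and the identifiers baked into the URL. This is also why blocking remote images, as some mail apps now offer, defeats open tracking.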
These can save you time by doing things like remembering what you have in your online shopping basket, or caching parts of pages to help them load faster. But they can also track your browsing history to gain insights into your interests and things you might be persuaded to purchase. And this is the business model that, certainly until very recently, big tech has thrived upon: by learning more about our likes on their platforms and our browsing habits off them, they've been able to sell ads to brands on the premise that they can target them more effectively at people who have already looked for cars, raincoats, or guitars, for example. Google monetizes our web searches through a similar process: it sells ads that are prioritized as sponsored results when we search for a specific item or a particular interest. As an indication of just how extensive this digital marketing ecosystem has become, let's compare two visual overviews five years apart. This first graphic was produced in 2015, and you can see that the landscape was already fairly densely populated. But just five years later, in 2020, it was so crowded that even marketing people needed a map to navigate it. There are 8,000 companies and solutions represented here. It is very big business indeed. Now, online services would say that their terms and conditions inform users of what handing over their personal data means, and that those users have a choice whether or not to accept them. You may be not entirely surprised to hear, however, that the vast majority of us do not read those terms and conditions. In a survey by the European Commission, as many as 90% of Brits and the same average proportion of Europeans reported that they always accept the terms and conditions of online providers, but only 21% had read them in full. In a similar study conducted in the US, just 9% reported that they always read a company's privacy policy before agreeing to the terms.
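To make the cookie mechanism above concrete, here is a small sketch using only Python's standard library. The cookie names and values are invented for the example; the point is the split between a short-lived convenience cookie and a long-lived tracking identifier.

```python
# Sketch of the Set-Cookie headers a website might send. The browser
# stores these and returns them with every subsequent request, which
# is how a site (or an embedded third party) recognises a returning
# visitor. Names and values here are illustrative.
from http.cookies import SimpleCookie

cookie = SimpleCookie()

# Convenience: remember the visitor's shopping basket between pages.
cookie["basket"] = "item-1138"
cookie["basket"]["path"] = "/"

# Tracking: a persistent identifier that lasts a year.
cookie["visitor_id"] = "a1b2c3d4"
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365

print(cookie.output())
```

Opting out of "all but the strictly necessary cookies", as discussed later in the lecture, essentially means accepting the first kind of cookie while refusing the second.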
Well, the sheer length of these policies would appear to have something to do with this. In 2020, digital bank Thinkmoney compared the word counts of the terms and conditions for 13 of the most popular apps in the UK. These ranged from just under 5,000 words for Google Meet to over 18,000 words for Microsoft Teams; the total word count for the apps reviewed came to a whopping 128,415. And that, as the academics amongst you will know, is longer than most PhD theses, certainly mine included, and I was a very wordy humanities student. If we were being rather cynical about this, we might argue that it's to the benefit of online providers not to have too many users enforcing their rights to opt out. But once we know how digital tracking and profiling works, there are things we can do to stop it. We can unsubscribe from marketing emails that we no longer want to receive. We can use a tool like this one, developed by Rightly, which sends companies clear instructions to delete our contact details from their databases. We can opt out of all but the strictly necessary cookies. And if you have an Apple device, you can block third-party cookies and invisible pixels in your email. If we do continue to "accept all", we should at least be able to satisfy ourselves that we've done so consciously and not through ignorance. Why is this so important? Well, because as we shall see, not only is the data collected on us very revealing indeed, sometimes it's not even correct. Information is power, remember. And there are tools that can help us identify what online services have on us. I've included links to some of them in the text accompanying this lecture, so you can try them out for yourselves. Meta (formerly known as Facebook), Google, Microsoft, Apple, X (formerly known as Twitter), and Amazon all allow users to download copies of the data held on them.
These downloads contain data that we have generated and shared, but also inferences made from that data about who we are and the likely advertising audiences that we belong to. So here are some things that X (formerly known as Twitter) has inferred about me, all 377 of them. Now, this is from my professional account, so it's not surprising that there are interests linked to my work. I've colored these in blue, and you can see highlighted that they include artificial intelligence, cybersecurity, data privacy and protection, the Internet of Things, and virtual reality. So that's not bad. We can add in, in magenta, a selection of media outlets that have featured me or tweeted about me, accounts that I know I follow for professional reasons, and places that I've worked. So far, so sensible, but it then starts to get a bit more personal. In green are some of my real-life, non-work interests: archeology, dance, electronic and folk music. Dogs. Dogs and dogs. And Tupperware. Tupperware, actually <laugh>, that does seem a bit odd. Where has that come from? What I started to see as well, and have highlighted in red, are lots of names, presumably of celebrities, that I just don't recognize, but also some things that I have no interest in or actively dislike. So I'm going to let you draw your own conclusions about which is which. But we have college sports, Doctor Who, James Bond, Olympic weightlifting, Star Trek, the San Francisco Giants, and U2. There was something about this that I couldn't quite put my finger on, and then it suddenly dawned on me that one might describe at least some of these interests as, well, a bit stereotypically blokey. Could it be that the platform's algorithms had identified me as a man? Sure enough <laugh>, this is precisely what X's advertising models have inferred about me. And here's my profile photo for comparison.
The logical, and therefore likely, explanation for this is that I work in the IT industry, where the stereotypical profile is a male who is into science fiction and who either lives or wants to live in California. To be fair, I do also interact with people who regularly post about these subjects and interests. And on that note, I would like to thank Professor Daniel Dresner of the University of Manchester for the particular prominence of Doctor Who in my results. Now, as you may remember if you listened to my lecture on encryption last year, privacy is a human right. It's enshrined in Article 12 of the Universal Declaration of Human Rights, no less. In 2018, the UN's High Commissioner for Human Rights published a report urging states to implement laws and institutions for the protection of personal data. In 2021, the UN Conference on Trade and Development found that 137 out of 194 countries had put such legislation in place. Foremost amongst these has been the EU's General Data Protection Regulation (GDPR), which came into force in May 2018. It applies to any organization that processes the data of EU citizens, regardless of where that organization is physically located. For now at least, it's still in force in the UK, because it was adopted before the UK left the EU. GDPR sets out our rights as data subjects, and these begin with the right to clear and transparent information on the processing of personal data, whether or not it has been obtained directly from us. That subordinate clause is quite curious, isn't it? It suggests that other people may be sharing our personal data, and the eagle-eyed of you may have noticed that the Facebook signup page that I showed a few minutes ago told us as much. It informed us that other users may upload our contact information from their address books. We also have the right to obtain a copy of any personal data held on us via a subject access request.
We have the right to have inaccurate personal data corrected, which hopefully includes the erroneous belief that I am an enthusiast of Olympic weightlifting. We have the right to have data erased where certain conditions are met, also known as the right to be forgotten. We have the right to obtain portable data for reuse in another context; the right to object to processing of our personal data where this is in connection with tasks carried out in the public interest, in the exercise of official authority, in the legitimate interests of others, or for the purpose of direct marketing. And we have the right not to be subject to a decision based solely on automated processing or profiling; think machine learning and artificial intelligence. So these read as powerful means to hold data controllers to account. In practice, what it requires is for each of us to contact a provider directly whenever we want to exercise our rights. The data controller has to respond within a month, unless they can demonstrate that they need more time or that there is no merit in the request. And if the request is not answered to our satisfaction, individuals in the 28 EU member states (as they were in 2018) can complain to their national data protection authority; in the UK, that's the Information Commissioner's Office, the ICO. National authorities likewise have the power to fine data controllers if it's demonstrated that they have not acted in accordance with the core principles of GDPR, and these include lawfulness, fairness, transparency, and accuracy. Fines can be as large as 4% of total annual turnover. That may not sound like very much, but for companies like Google and Microsoft, whose respective annual turnovers are 300 billion and 200 billion US dollars, this is by no means a small sum. And fines do happen.
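The 4% figure is easier to appreciate with the arithmetic written out. The sketch below just applies that rate to the approximate turnover figures quoted above; the function name is illustrative shorthand, not anything from the regulation.

```python
# Upper bound on a GDPR fine for the most serious infringements:
# 4% of total annual worldwide turnover.
def max_gdpr_fine(annual_turnover: float, rate: float = 0.04) -> float:
    return annual_turnover * rate

# Approximate annual turnovers quoted in the lecture, in US dollars.
for company, turnover_usd in [("Google", 300e9), ("Microsoft", 200e9)]:
    billions = max_gdpr_fine(turnover_usd) / 1e9
    print(f"{company}: up to ${billions:.0f} billion")
# Google: up to $12 billion
# Microsoft: up to $8 billion
```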
In May of last year, 2023, Meta (formerly Facebook) was fined 1.2 billion euros by the Irish Data Protection Commissioner for transferring the personal data of European users to the US without adequate data protection mechanisms. And in 2021, the Data Protection Commission in Luxembourg fined Amazon 746 million euros for targeting ads at people without proper consent. This concept of consent is central to our data rights, but it can sometimes be difficult to ensure that it is meaningful in practice. On services like search engines and social media, rejecting the terms may mean having access to less information than other people, or being isolated from our peer networks. When our use of it is so essential to knowledge acquisition and community building, sometimes our choice can feel like no choice at all. And this is particularly true, I think, given that some of the companies who process our data are less visible to us than others. These are the data brokers: companies who scrape personal data, often from publicly available sources like the electoral register and pages on the open web, and then combine it in lists that they sell on so that their clients can use it for marketing purposes. They include credit reference agencies like Experian, Equifax, and TransUnion, and an investigation by the UK Information Commissioner in 2018 found that these agencies were not sufficiently transparent with consumers about how their data would be used when they performed credit checks. It also identified several other areas of concern. As a result, all three companies were served with preliminary enforcement notices by the data protection authority, and this led to Equifax and TransUnion making improvements to their products and even withdrawing certain products and services. So sometimes these authorities do have teeth.
Just recently, I have started to spot a pattern that suggests my data may have been sold by data brokers, and it all seems to center on the suburb of Urmston in Greater Manchester. I have no link to this place whatsoever. So imagine my surprise when, last month, I received an email from solicitors addressed to a married couple who were not me, enclosing some very official-looking documents for a house that they were in the process of buying. Now, I don't know about you, but for me, buying a house was one of the biggest and most personal events in my life so far. And if information about my purchase had been shared with a complete stranger by my legal representatives, I would be quite concerned. As some of you, the regulars, will know, I am something of a digital busybody, and I saw it as my duty to notify the sender of the error, with some added emphasis that I hoped would result in them taking action. Rereading this email, I think I've struck a really nice balance between friendly and mildly threatening: "Hi, you've sent this to me by mistake. Could you please remove my email address from your records? I never gave it to you. And I work in data protection, so this is pretty worrying given the personal documentation you've sent me. You may want to consult the Information Commissioner's Office, as this may constitute a breach of Mr. and Mrs. X's personal data. Best, ta-ta for now, Professor Victoria Baines." I sent this email three weeks ago, and I have yet to receive a response. Under GDPR, they have one more week to respond to me before I can report this to the Information Commissioner, so the clock is very much ticking. Now look, it could be that this was an honest mistake, but I had a vague memory of something similar happening before. And sure enough, in my inbox there are emails going back to 2010 from different estate agents, offering me properties in Urmston or to value my house in Urmston.
Each time, I've responded that I want to be removed from their database, but they've kept popping up. And this suggests that I may be on a common list used by estate agents, perhaps supplied by data brokers, based on information that they have trawled. That said, if you are someone formerly known as Victoria Baines, living in the Urmston area of Greater Manchester and currently in the process of buying a property, could you please stop giving people my email address? Thank you. Personal data can, of course, be much more sensitive than an email address. It can be data that defines us, or even makes us who we are. GDPR, the European legislation, identifies special categories of data which are subject to additional conditions for processing. These concern our racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetics and biometrics, health, sex life, and sexual orientation. And in my lectures this year, we've considered how an ever-increasing number of connected devices generate, process, and store medically sensitive data and data about our sex lives. We share this kind of data with private companies much more frequently than we might realize. For instance, if we use an app to help us with our mental health, for guided meditation or journaling, or to connect with therapists, we are routinely sharing health-related personal data. Apple Health allows users to generate a medical ID, complete with blood type, allergies, medical conditions, and medications, for sharing with emergency responders. And along with its competitor Fitbit, it allows menstruating users to track their cycles. Researchers at Privacy International and Coding Rights discovered that several period-tracking apps encouraged users to log additional lifestyle information, including when, how, and how often they have sex, and their birth control habits. They also found that some of these apps were sharing data with third parties, including Facebook.
And this kind of data is being viewed in a new light since, in 2022, the US Supreme Court overturned the legal ruling that a woman's right to terminate her own pregnancy was protected by the US Constitution, a ruling that is more often known as Roe v. Wade. In US states where abortion is now illegal, law enforcement can compel providers such as online pharmacies and social media platforms to disclose user data relevant to an investigation. And as research by ProPublica has found, some online pharmacies that retail abortion pills share data with Google that can potentially identify customers, which could then be requested from Google by the authorities. Even before Roe v. Wade was overturned, Meta reportedly disclosed the private messages of a 17-year-old girl and her mother facing criminal charges in Nebraska for carrying out an abortion after 20 weeks of pregnancy. Now, in this case, the content sought was actually on Facebook Messenger, not on a third-party app, and the investigators served a search warrant on the company issued by a court. In my previous lecture, Sex and the Internet, we looked briefly at the 2015 hack of dating site Ashley Madison, which is known for connecting people who want to have affairs. The hackers threatened to publish the personal data of users, including their real names, their home addresses, and their credit card payment details, and they demanded the immediate closure of the service. When this didn't happen, they published the data of millions of users. And among that data were users who had paid Ashley Madison's parent company to close their accounts and delete their personal information. Their appearance in those stolen data sets suggested that the platform had retained the personal data even of those people who had paid them not to. Now, this breach happened before the General Data Protection Regulation came into force in the EU.
Under the current regime, the company would potentially have failed to honor the right of individuals to have their data erased, that right to be forgotten. Millions of us have also shared our genetic data with private companies by taking DNA tests: 14 million so far with 23andMe, and over 25 million with AncestryDNA. Consumer genotyping opens up the possibility to profile our predispositions to certain health conditions, and it gives us insights into our heritage. Some users have discovered their birth parents and other close relatives by permitting companies to perform this kind of analysis. In 2015, researchers at Princeton's Center for Information Technology Policy noticed that the privacy policy of Ancestry.com appeared to give the company permission to use customers' genetic information for advertising purposes. Ancestry.com has since changed this policy to exclude advertising from how it uses genetic information. However, it does still state that it shares inferences which, quote, "are derived from personal information", such as to suggest familial relationships and to create consumer profiles for the purposes of research, product development, and marketing. Examples of this include, quote, "your ethnicity estimates, traits and genetic communities". The company defines genetic communities as groups of AncestryDNA members who are connected through DNA, most likely because they descend from a population of common ancestors. So those of us who have taken their test seem to have agreed, albeit tacitly and perhaps unwittingly, to being served online ads on the basis of our genetic makeup. Now, it can often be difficult, and arguably disingenuous, to disentangle data protection on the one hand from cybersecurity on the other. If data is secure, it is by definition better protected.
When an organization suffers a cyber attack, it needs to be able to satisfy the authorities either that no personal data has been compromised or, failing that, that the required protection measures had been put in place. In December of last year, 23andMe confirmed that hackers had stolen ancestry data on 6.9 million users. In a letter sent to a group of victims, the company's lawyers stated, quote, that the incident was a result of users' failure to safeguard their own account credentials, for which 23andMe bears no responsibility. The hackers appeared to have gained access by reusing stolen login credentials for other services, a type of attack known as credential stuffing. But they didn't do this 6.9 million times; they compromised around 14,000 accounts. Because of the way the platform works, they were able to gain access also to the data of customers who had automatically shared data with their hacked DNA relatives. 23andMe responded by resetting all user passwords and requiring everyone to use multi-factor authentication, so there was an additional barrier to logging in over and above a password. But if that additional authentication had already been in place before the incident, far fewer accounts would have been compromised in the first place. And one could also argue that the company had a duty to air-gap that user data, shutting off linked data for relatives where there were indications of account compromise: red flags such as logging in from an unexpected location or from an unexpected device. Now, I would be the very first person to advise people to use different strong passwords for all their accounts, and in fact, some of you may remember we produced a cyber safety video on precisely this topic, which you can find on YouTube. But in this case, I would say that blaming users for an attack of this scale and impact is not only not in keeping with the spirit of data protection, it smacks of a conscious attempt to deflect attention.
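The mechanics of credential stuffing, and why a second factor blunts it, can be sketched in a few lines. Every account, password, and outcome below is invented for the example; this is a toy model, not how any real service stores passwords (which should of course be hashed, not held in plain text).

```python
# Toy model of credential stuffing: an attacker replays username and
# password pairs leaked from one service against a different service,
# betting that people reuse passwords. All data here is fictional.

leaked_credentials = [                 # stolen from some other site
    ("alice@example.com", "hunter2"),
    ("bob@example.com", "correct-horse-battery-staple"),
]

target_accounts = {                    # the service being attacked
    "alice@example.com": {"password": "hunter2", "mfa_enabled": False},
    "bob@example.com": {"password": "correct-horse-battery-staple",
                        "mfa_enabled": True},
}

def stuffing_attempt(user: str, password: str) -> str:
    account = target_accounts.get(user)
    if account is None or account["password"] != password:
        return "login failed"
    if account["mfa_enabled"]:
        return "blocked by second factor"   # reused password alone is not enough
    return "account compromised"

for user, password in leaked_credentials:
    print(f"{user}: {stuffing_attempt(user, password)}")
# alice@example.com: account compromised
# bob@example.com: blocked by second factor
```

Both users reused a leaked password, but only the one without multi-factor authentication is compromised, which is the argument for a provider enabling it by default rather than after the breach.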
According to GDPR, our faces are also personal data. The increasing use of live automated facial recognition in public places has understandably proved to be controversial, and of particular concern is the practice of scanning people's faces and processing their facial data without their knowledge or consent. Hence the massive sign. In a high-profile case in 2020, a court heard that South Wales Police had captured half a million faces, quote, "the overwhelming majority of whom were not suspected of any wrongdoing". Facial recognition is also being discovered, by accident, in unusual places, such as vending machines on student campuses. Companies like Clearview AI, which scrape billions of publicly available images of people's faces from websites and social media, again without their knowledge or consent, for processing by law enforcement, are now coming under increasing scrutiny by national regulators. Scrutiny intensifies also when big tech companies seek to acquire smaller providers. When Facebook (now Meta) bought WhatsApp in 2014, it informed the European Commission that it would not be able to conduct reliable matches between Facebook users' accounts and WhatsApp users' accounts. This meant that they couldn't join the two data sets together and enable deeper insights into users' lives and behaviors. But in 2016, WhatsApp updated its terms of service and its privacy policy, and these included the possibility of linking WhatsApp users' phone numbers to Facebook accounts. As a result, the EU fined Facebook 110 million euros for providing incorrect information during the investigation of the merger. If you wear an Apple Watch, you will be used to the idea that data is shared with your other Apple devices. If you own a rival Fitbit tracker, you may not know that Google now owns Fitbit and therefore has access to your health and fitness data. This merger was originally announced in 2019, but it was completed only in 2021.
And that's because the European Commission investigated the acquisition, and the chief focus was on whether Google's access to Fitbit data would give its advertising business an unfair advantage over competitors. Both Google and Fitbit were quick to reassure the public that Fitbit data would not be used to target Google ads at them. But had the EU not explicitly banned this, it's entirely possible that this safeguard may not have been introduced. Data is captured by connected devices in our homes; it's processed on providers' servers, and it's sometimes shared across services. As the owner of both Ring doorbell technology and Alexa Echo smart speaker technology, Amazon processes video recordings of users' properties, and it uses voice recordings to train its speech recognition and natural language systems, but also for targeting adverts. Our smart home device data is also of interest to, and requested by, law enforcement authorities. When so much of our very personal and very sensitive data can be processed on servers in other countries, it makes sense for data protection regimes to be international, or at least approximate to one another, so that each of us can be assured that our data will be protected to a similar standard wherever it is and wherever we are. And one of the reasons why the EU legislation, GDPR, is held up as the gold standard of data protection is that it applies to everyone who wants to process the personal data of EU citizens, wherever that processor may be located. And it required the countries that adopted it to introduce the same level of legal protections. Seen through this lens, the UK government's proposal of a new Data Protection and Digital Information Bill may be viewed as an expensive attempt to reinvent the wheel, in a post-Brexit flexing of national sovereignty. And indeed, that is exactly how the government introduced it in 2022: primary legislation that will "harness our post-Brexit freedoms to create an independent data protection framework".
And one of its publicized promises is that it will reduce the number of annoying cookie popups that we see. Annoying they may be, I would agree with that entirely, but reducing the number of opportunities we have to exercise our rights to object to being profiled through our online browsing habits doesn't seem to me to be the most just solution. The intention, I think, may be to give the consumer a more convenient experience, but this should not be at the expense of their right to control how their personal data is collected and processed. Personally, I would prefer to have the opportunity to consent or to reject every time that data is requested. The alternative is a one-time blanket consent, and that can open up opportunities for organizations to use data in ways that subjects may not have originally intended or agreed to. Now, this draft legislation is being scrutinized in the House of Lords right now, tomorrow in fact, if you would like to follow it live online. It has the potential to give people in the UK different personal data protections from people elsewhere in Europe, so it is worth all of us keeping a very close eye on. Incidentally, this is the annoying cookie popup on the government's website about the legislation that proposes to do away with annoying cookie popups. So it's good to see that, for the time being at least, they are still complying with international law. Now, it may be tempting to feel like we as individuals can't do anything about the business model that was famously described as "surveillance capitalism" by Shoshana Zuboff in her book of the same name. But there are people fighting our corner. There are those state, national, and international regulators.
There are civil society organizations, particularly those focused on privacy, surveillance, and freedoms of expression and information. There are investigative journalists like Carole Cadwalladr, who first exposed that the data of millions of Facebook users had been collected by the consulting firm Cambridge Analytica without their informed consent. And an honorable mention must go to Max Schrems, an Austrian lawyer who first started filing complaints against Facebook when he was still a student. Between 2011 and 2013, Schrems filed a total of 22 complaints with the Irish Data Protection Commissioner about the operations and policies of Facebook's European data controller. The last of these concerned the export of European users' data to the US, in light of claims by Edward Snowden that the PRISM surveillance program enabled the US National Security Agency to access this data. And this eventually led to the European Court of Justice declaring the legal basis for these transfers invalid, which meant that the EU and the US had to go back to the drawing board and agree a completely new framework for transatlantic data transfers.

If we had the chance to do this all again, would we design the global data ecosystem differently, so that humans could have greater control over their own personal information? Well, the inventor of the World Wide Web, Tim Berners-Lee, certainly thinks that we still can. In 2016, he launched Solid (Social Linked Data), a protocol that allows individuals to store their data securely in "pods" on decentralized web servers. I couldn't resist the temptation to show you an image of a futuristic space pod, but of course, in reality it's more like a folder, sorry. Pod owners control which people and applications can access their data, and that means it's a user-centric rather than a company-centric model. The government in Belgium is currently trialing the use of pods so that citizens can share their education certificates and their medical records securely.
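For readers who want a feel for the user-centric idea behind pods, here is a minimal toy sketch in Python. It is purely illustrative and is not the actual Solid protocol or its API: the class name, methods, and access rules are all invented for this example. The point it demonstrates is the inversion of control: the data lives with its owner, and applications only ever see what the owner has explicitly granted.

```python
# Toy model of a user-centric "pod": the owner stores resources and keeps
# a per-resource access-control list of which applications may read them.
# NOT the real Solid protocol -- all names here are hypothetical.

class Pod:
    def __init__(self, owner: str):
        self.owner = owner
        self._data = {}   # resource name -> stored value
        self._acl = {}    # resource name -> set of apps allowed to read it

    def store(self, resource: str, value) -> None:
        """Owner places a resource in the pod; nothing is shared by default."""
        self._data[resource] = value
        self._acl.setdefault(resource, set())

    def grant(self, app: str, resource: str) -> None:
        """Owner allows a specific application to read a specific resource."""
        self._acl[resource].add(app)

    def revoke(self, app: str, resource: str) -> None:
        """Owner withdraws access; the app is cut off immediately."""
        self._acl[resource].discard(app)

    def read(self, app: str, resource: str):
        # Access is decided by the owner's ACL, not by the app's provider.
        if app not in self._acl.get(resource, set()):
            raise PermissionError(f"{app} may not read {resource}")
        return self._data[resource]


# Example in the spirit of the Belgian trial: a citizen shares a diploma
# with an (hypothetical) education portal, then withdraws consent.
pod = Pod("alice")
pod.store("diploma", "BSc Computer Science, 2020")
pod.grant("edu-portal", "diploma")
print(pod.read("edu-portal", "diploma"))
pod.revoke("edu-portal", "diploma")  # subsequent reads now raise PermissionError
```

The design choice worth noticing is that `grant` and `revoke` belong to the pod owner, mirroring the GDPR idea that consent is given for a specific purpose and can be withdrawn, rather than being a one-time blanket handover of the data itself.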
So a user-friendly solution for consumers may not be too far off, but for the time being there are no shortcuts, I'm afraid, for those of us who want to exercise our rights over how our personal data is collected and processed. It's up to each of us to decide how much we are willing to share with tech companies and other service providers. And it seems that there is potential for us to reframe the deal, from one in which we simply trade privacy for convenience to one in which our willingness to let companies see into our lives is matched by transparency on their part about what they do with our data, and a greater willingness to let us see under the bonnet of their operations. This is certainly a common theme emerging from recent online safety and consumer protection legislation, both in the UK and in Europe.

Meaningful consent relies on us being properly informed. And while this can be a little labor-intensive, it's too important for us to sleepwalk through. The more we as individuals exercise our rights, the greater our chance of holding accountable the organizations who process our data. We don't, of course, all have the tenacity or the legal training of a Max Schrems. You don't all have to be a data protection busybody like me, but we can all be our own digital defenders. The dominant business model depends on the commodification of the details of our lives, and on individuals acquiescing to that in sufficient numbers. But there is a financial cost to companies attached to processing our subject access requests: an average of 20,000 pounds for each one, according to a recent report.
So if we were feeling collectively devilish, we might perhaps even devalue the business model simply by exercising our data rights. By showing companies and governments that we care about the amount of data collected, and about our rights to object, we certainly stand a greater chance of transforming it into something fairer, more transparent, perhaps even more privacy-focused. So the choice over how much of your life you share is yours, and it needs to remain so. So come on, let's do this. What are we all waiting for? Thank you very much.

Is it possible to track where our data is, or is it too late? It's all very well saying, oh no, I don't want my data shared, but who's got it?

Yes. So I didn't have time to talk about it in the lecture, but in the accompanying text there are various tools that you can use, firstly, to find out who's got your data. The screenshot that I showed was of Rightly (other tools are available), which analyzes your email inbox to identify all the people who are sending you emails and therefore have your email address. It's just a kind of time-saving tool. As a security person, as you can imagine, I was a little bit nervous about allowing it to scrutinize the metadata in my personal email, but you'll be pleased to hear that in their terms of service they do state very clearly that they don't look at the contents, and that you can rescind your permissions, et cetera. So I took one for the team so I could show you on screen. There are lots of tools like that. There are also some browser extensions, and there's one from Mozilla, Mozilla Monitor. Again, I'm not pushing particular solutions, but there are ones that will look for where your data has been scraped by data brokers.
And I think that's the most difficult one, really, because if you've got a Facebook account, you at least know that Facebook's getting your data. What you can also do, in all of those social media services and your email, is go into your settings, and go into the settings on your phone. If you do nothing else, go into the settings on your phone and see, for those different services, what they are sharing. When you download your data archive from Facebook, from Meta, it shows you your ad preferences, and then you can go through a privacy checkup tool to control that data and get it locked down so that you're not sharing as much. Now, if you are the kind of person who thinks that convenience is more important than privacy, that's okay too. I'm a security person; I want you to lock all of your stuff down. But it's okay to think, well, actually, I want the web to remember when I've got something in my online shopping basket. You know, no one cares what my deal is. So if that's your decision, that's fine, but my only wish would be that you engage with it actively and make that decision yourself, rather than letting somebody else make it for you.

Yeah, please engage brain before clicking cookies. Yes. So, a couple of questions from people here, which are a little bit linked. The first one, actually, is not anonymous: it was from Andrew Eson, who asked, were the fines levied on the big data companies for GDPR offenses actually paid? So that's the first question; I don't know the answer, so I'll let you keep thinking. And the second one, from an anonymous person: well, fining Facebook a hundred million euros, which sounded quite a lot to me, for lying about its data capabilities in WhatsApp seems pretty paltry. Well, there are lots of rich people in the City of London, so probably it is paltry for this person. Are penalties just the cost of doing business? Right. Yes.
So I'll take those two together. And thank you, Andy, for your question, and thank you for joining in. I know Andy offline, so it's nice to have some real people in our digital world. You will be unsurprised to hear that big tech companies employ teams of lawyers and outside counsel to fight the judgments against them, and one of the reasons why it can take a really, really long time for fines to be paid is that they do get contested. They do sometimes get struck down, but generally speaking they'll be fined something in the end. And, to some extent, with the public authorities (I mentioned this in the handout, but not in the lecture), even if a company is successfully fined, that money will go back to the treasury; it doesn't come back to you as data subjects. It comes back to you as data subjects if you bring a civil lawsuit, which is something that's been happening quite a lot in the US, you'll be unsurprised to hear, where users are potentially going to be in for some money, particularly over the Cambridge Analytica scandal. But not outside of the US, because that's where the civil lawsuit has been brought. It does raise a really interesting question about whether there is a different way to do this, so that the money doesn't just go back to a government treasury but goes back to the people who've been affected.

In terms of the sums, I think the answer to both of those questions is actually that sometimes the rulings are symbolic enough for even large tech companies to change their policies. And sometimes it's the symbol that's more important than the amount, even if the amount looks absolutely whopping.

Okay. Now I've got time for two quick ones, I think. The first one is an interesting one.
If you are based in the United Kingdom (and we are giving this lecture from the United Kingdom), would you recommend withdrawing consent for the NHS to share our health data, even for research purposes or to improve their services? So this is a hot topic for Brits; we're all members of the NHS. I don't think you can not be a member of the NHS. I guess not. So what should we do?

For me, this is all about informed consent, and about being asked to give your consent every time it's required by a different research agency. If I think back, during COVID the vast majority of us, I would wager, willingly gave our data to the ZOE study and various other projects, because we knew it was going to help people. But we also understood that it would be for those limited purposes; that my data on my exercise and my eating habits wasn't suddenly going to end up being commercialized. You would hope. There have been some potentially controversial partnerships between the NHS and some big tech and data-analytics companies, and that's a good example for me of why I don't want to give a blanket consent to somebody. I want them to come back to me every time they want my data for a new purpose. And that's a really important part of the GDPR, certainly, and as we brought it into UK legislation: you give consent for a specific purpose, and if that purpose changes, the data controller has to come back to you. Yeah. And ask you again.

Yeah. Blanket consent is so last century, don't you think? Yeah, it's very 20th century. Right, right. Okay. I have a question from Professor Danny Dresner. Aha! Yes, who you referred to in the lecture. Should I be looking to enforce my right to have all things Doctor Who edited out of this lecture? <laugh> I mean, he absolutely can do, but I know Danny quite well, and I think he's very keen to have any publicity. Oh, so <laugh>. Oh.
So Danny, if you don't mind, I'm going to keep this one in. Oh, that's a wounding way to end. Feel free to fight me in the courts. <laugh> But that's a brilliant end to a really wonderful and fascinating hour. So, Victoria, thank you very much. Thank you, everyone.