Tag: Privacy

  • Book review: Stolen Focus

    Book review: Stolen Focus

    In 2007 Douglas Coupland released the novel jPod. During a trip to the Czech Republic I read it at the behest of my girlfriend, and I utterly loved it. Five nerds in cubicles (pods), assigned to their places because their surnames begin with J, sit in a basement at Neotronic Arts designing the gore in video/computer games. They’re joined by a sixth member, whose surname also begins with a J and who initially thinks they’re morons. They’re all born at the end of the 1970s or the beginning of the 1980s, and their attention span at work is fifteen minutes at most. Morally they differ from their parents; they belong to the ego of the digital age and spend lots of time not working (a Gen X trait, Coupland’s generation I dare say). Having read it thrice, it remains one of my favourite books of all time.

    Fast forward to 2008, the year we travelled to the Czech Republic, and “Twitter makes you feel that the whole world is obsessed with you and your little ego – it loves you, it hates you, it’s talking about you right now”, as Johann Hari writes. As someone who has managed Twitter, Instagram and Facebook accounts for organisations, I can only agree – it’s invasive and takes control of you. I’m happy jPod was released before social media and the new generation of smartphones completely wrecked our attention span and ability to focus.

    “How to slow down in a world that is speeding up?” Hari continues in the book Stolen Focus: Why You Can’t Pay Attention – and How to Think Deeply Again. He outlines twelve problems for our individual and collective attention spans and ability to focus. Not all of them will be covered here, though. For that, you have to read the book.

    On average, a person working in an office is undisturbed for approximately 2 minutes and 30 seconds – undisturbed by others, that is. The average attention span is merely 47 seconds, because people also interrupt themselves. All the time. Meanwhile, it takes 23 minutes to return to a state of focus. Meaning we basically never stay focused.

    Hari interviewed lots of people for this book, James Williams at the Oxford Internet Institute being one of them. His words resonate more deeply than most (and there are tons of important words said by intelligent people in the book). We need to take on crucial issues such as climate change, but “when attention breaks down, problem-solving breaks down.” This is a hypothesis Hari clings to, and I concur: tearing attention apart means people can’t concentrate, can’t direct energy at the proper things. As Hari writes, “Depth takes time. And depth takes reflection.” Mind-wandering is a state of mind people should enjoy more, but instead we block it out more or less completely by staring at screens – and because of the notion that directed thoughts, meaningful thoughts and chores are good, while letting your brain do “nothing” is useless.

    Flooding social media with more information is a very effective way of blocking debates and conversations – it shortens the collective attention span. Add actual noise and sounds, both of which deteriorate our hearing. Somehow we believe it’s an equilibrium: you listen to noise and sounds 50 % of the day, and you can recuperate if 50 % is quiet. But that really depends on the noise (background chatter, for instance, or cars passing by), the sounds (simpler, more occasional sounds) and the silence. Being exposed to sounds and noise for hours each day, combined with voices and music, hurts the ears and hearing. Eventually it deteriorates from system overload. The same goes for your brain. It cannot evade being disturbed, and it deteriorates slowly, making you more stupid.

    Hari interviews Sune Lehmann, a Danish researcher on time, who exclaims that the new upper class will be the ones with very long attention spans, always able to limit information input and aware of what they are actually doing. The rest of us will simply react to the information fed to us. We read and watch stories about people who can sleep less, eat poorly, and still outperform the average person: the Bond villains and the tech prodigies. They never experience sleep deprivation, never seem to slow down. It’s the opposite of Andy Weir’s protagonist in Project Hail Mary, who states that humans become stupid when tired. We don’t comprehend that the reason behind “greatness” is mind-wandering, thoughtful discussions, promenades, information intake (and helpers, such as wives, butlers or servants): Abraham Lincoln and Theodore Roosevelt writing their speeches and pondering tough decisions, Harry S. Truman thinking through information and memos before making extremely hard decisions. We desperately need the ability to think in order to grasp and tackle climate change, artificial intelligence and other important issues, but with ruined minds and attention spans we won’t. Another quote from James Williams: “You can only find your starlight and your daylight if you have sustained periods of reflection, mind-wandering and deep thought.”

    Lehmann reminds me of Cal Newport’s Deep Work: the future will belong to the people who can focus, who can work deeply. Earl Miller of the Massachusetts Institute of Technology says we’ve learned to compare ourselves to computer processors, machine parts with the ability to multitask, when in fact we can’t. When we try to do two or three things simultaneously, our brains are reconfiguring relentlessly. While we may believe we’re doing several things at the same time, our brain constantly starts a new chore, gets interrupted by another one, stops and initiates the new chore, then gets interrupted again, stops and tries to reinitiate the first chore but actually has to restart a little bit further back than before, because of the interruption. On it goes. In small doses, checking your Facebook feed continuously is worse for you than getting stoned – and who’s allowed to get stoned at work?
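
    As a non-scientific illustration of that restart cost, here is a toy model of my own (the numbers are arbitrary assumptions, not taken from Miller or Hari): three chores done back to back versus the same chores done in constantly interrupted slices, where every switch forces a small restart.

    ```python
    # Toy model of task switching (my own illustration, arbitrary numbers):
    # every switch forces us to "restart a little bit further back",
    # so the same amount of work costs noticeably more effort.

    TASKS = 3              # number of chores
    WORK_PER_TASK = 30     # units of effort each chore needs
    SLICE = 5              # how much we get done before the next interruption
    RESTART_COST = 3       # effort lost at every switch, re-finding our place

    # Focused: finish one chore at a time, no switching.
    focused_total = TASKS * WORK_PER_TASK

    # Interrupted: rotate between chores, paying the restart cost each time.
    remaining = [WORK_PER_TASK] * TASKS
    interrupted_total = 0
    current = 0
    while any(r > 0 for r in remaining):
        if remaining[current] > 0:
            done = min(SLICE, remaining[current])
            remaining[current] -= done
            interrupted_total += done + RESTART_COST
        current = (current + 1) % TASKS

    print(focused_total, interrupted_total)  # 90 vs. 144 units of effort
    ```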

    Hari continues to tackle issues such as school systems reining in our children’s ability to learn and move (more) freely, the diagnosing of children with ADHD, how reading on screens is bleeding into how we read on paper, and the Western world’s issues with nutrition and obesity (your tired body craves sugar and fat, which are omnipresent and impossible to evade).

    One thing I appreciate about Hari is how he allows different arguments to meet in the book, carried by people who oppose one another, or by Hari himself. And he ends with hope, telling us about the generation his grandmothers belong to and how one of them fought for universal suffrage in Switzerland in the 1970s. Regarding the possibility of challenging these twelve distractions, which are destroying our ability to focus, Hari writes:

    “No source of power, no set of ideas, is so large it can’t be challenged.”

  • Technoreligion: youth and grown-ups

    Recently, the Swedish right-wing government proposed to outlaw mobile phones in classrooms (compulsory school) because of negative results in the PISA survey, which “measures 15-year-olds’ ability to use their reading, mathematics and science knowledge and skills to meet real-life challenges.” Previous centrist governments have proposed the same.

    Simple causation: bad results lead to a conclusion about bad influence, which leads to an obvious solution.

    Approximately 80 % of all schools already ban phones during the school day. So the proposals are just populist-like ideas, supposed to prove that governments are perceptive.

    Rather, I see the usual conclusion as backwards, which is an unpopular stance whenever I discuss it with grown-ups. The solution, from my perspective, is very simple: the grown-ups, first and foremost, must stop using their phones so recklessly, so disrespectfully, so much. Don’t expect children and adolescents to put down their devices when the grown-ups show them how it’s done: device in hand at all times.

    Our digital toys (a distant relative called them theme parks) are so precious we simply can’t let go of them. But since children are children and can be harmed, they should learn their place and proper behaviour. The idea that grown-ups are role models is completely absent in this conversation.

    It’s considered rude to question the technological “evolution”. Virtually nothing is possible or plausible or feasible to hinder. It’s “development”, the ubiquitous unstoppable force of linear thought and progression.

    Plenty of young children have Snapchat or TikTok. Amazingly, they’re not even 13 years old. Does that matter? They’re not allowed to use these companies’ platforms because they’re underage. It’s a violation of the terms of service. Do parents seem to care? No. Do companies seem to care? No.

    Discord, among others, and video games have been blamed for exposing children and youth to extremism. Or rather, Discord is a platform serving extremism, while video games are the vehicle used by the extremists to disseminate information, misinformation and disinformation.

    YouTube Kids provides 9-year-olds and 14-year-olds with recommendations for videos about guns and gun-related violence, gun modification and injuries.

    Studies in the U.S. show deteriorating health among teens, especially girls, coinciding with the introduction of the smartphone. Adam Alter, in his book Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, mentions a conversation with a young teenage girl lamenting the presence of a friend who doesn’t listen to her, because the friend is too focused on her phone to care. Absence in presence.

    Imagine being ten years old and getting your first smartphone. You have the usual flora of social media apps. Let’s say you receive approximately 100 notifications every 24 hours – every day, for ten years. When the child has left adolescence and become an adult, they have received 100 × 365 × 10 = 365 000 notifications. A notification is designed to create a reaction: vibration, sound, light. Even one is usually enough to increase your heart rate. 365 000 bodily reactions in your child.
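
    The same back-of-the-envelope sum as a tiny script, for the curious (the inputs are the assumptions above, nothing measured):

    ```python
    # Back-of-the-envelope: the post's assumptions, not measured data.
    notifications_per_day = 100
    days_per_year = 365
    years = 10

    total = notifications_per_day * days_per_year * years
    print(f"{total:,} notifications before adulthood")  # 365,000
    ```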

    Then there is the issue of parents systematically exposing their own children on social media. Once upon a time I blogged about being a parent, even co-creating a podcast on the subject. One reason was the almost total lack of parents writing about being a parent, what it entails: your tips and tricks, shortcomings, logistics, fears. Most bloggers wrote about consumption or clothing: buying things for your children. And the almost relentless pictorial depiction of children dressed in clothes, playing with computers or iPads. Grown-ups exposing children. I even saw pictures of children, only three years old, sitting on the toilet, smiling. Imagine that child as a teenager, knowing you did that. Once you turn 80 and need to rely on a walker or support taking a shower – imagine your grandchildren photographing you and posting the picture with a funny comment on the Internet. Lucky you!

    It’s fascinating how much the grown-ups want to keep squeezing their own phones. No matter how much youth suffer or are exposed, we simply can’t let go of the screen ourselves. When will it stop?

    My argument? As long as grown-ups can’t be grown-ups, the school cannot help solve the issue. The governmental solution is based on the false causation and the false premise that mobile phones are used in compulsory school. But in many schools, electronic devices aren’t allowed. They’re handed over to the staff, kept in lockers during the school day and returned to the children when they leave for home.

    Since the issue isn’t children using phones in schools, my argument is that grown-ups are to blame. They must stop being addicted – much like a smoking parent must stop smoking, rather than merely stopping their children from seeing them smoke – or this issue won’t be solved. But since we’re a technoreligious society, where the grown-ups hold the power in accordance with the law, the grown-ups won’t put down their phones.

  • Book review: Weapons of math destruction

    Book review: Weapons of math destruction

    This book was mandatory reading for a course on democracy. I actually read it approximately three years ago and never reviewed it (this website didn’t exist then), so I thought it was time for a proper review.

    Cathy O’Neil is a computer scientist and mathematician who left academic life for the financial industry in the early 2000s, working with computers for companies making lots of money. There she discovered what is now called Big Data and later became troubled by the purposes and intents of algorithms. After realising the even more troublesome side effects on society, she wrote this book, with the subtitle How Big Data Increases Inequality and Threatens Democracy.

    Through ten chapters, O’Neil takes the reader through what a data model is and how it can affect people in real life: the effects of university ranking models on the possibility of getting an adequate education, evaluations of teachers, online advertising, criminal justice (and injustice), and getting insurance, among other things. How come a data model deems a teacher unsuccessful or a job applicant unfit? Is the model correctly constructed, or does it inherit its lack of perspective, and mathematical incoherence, from its creator? Data models with destructive effects on people’s lives are what she calls weapons of math destruction, WMDs.

    By and large, I agree with her and appreciate her arguments and conclusions. Negative feedback loops can make it look as if black men are more prone to commit crimes, because the police have flagged black neighbourhoods as more exposed to petty crime and keep sending patrols there rather than to white communities whose more hidden crimes never end up on a map. This kind of feedback loop creates or maintains inequalities, which has destructive consequences for society.
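
    To make the mechanism concrete, here is a minimal simulation of my own (not from the book): two neighbourhoods with the exact same underlying crime rate, where the patrol is always sent to wherever the most crime has already been recorded, and crime is only recorded where the patrol happens to be.

    ```python
    import random

    # Minimal sketch (my own, not from the book): areas A and B have the SAME
    # underlying rate of petty crime. The single patrol follows past records,
    # and crime only gets recorded where the patrol is - so the initially
    # flagged area keeps "confirming" the model's prediction.
    random.seed(42)

    TRUE_RATE = 0.3                 # identical real crime probability per day
    recorded = {"A": 5, "B": 0}     # area A starts out flagged with a few incidents

    for day in range(365):
        patrolled = max(recorded, key=recorded.get)  # patrol follows the records
        for area in recorded:
            crime_happened = random.random() < TRUE_RATE
            if crime_happened and area == patrolled:
                recorded[area] += 1  # only patrolled crime ends up on the map

    print(recorded)  # A accumulates far more recorded crime than B
    ```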

    Sometimes, though, she contradicts herself. The extremes in statistical data are more likely to be pointed out and punished, she writes, although she also writes (rightly) that black men become the average in criminal statistics – simply the median and mean, rather than the extreme. In a black community with more black men than white men, black men are the average. In a sense, being an average person in a big data model, financially for instance, can be very punishing, while being an extreme in the form of extremely rich is better.

    On average (huh!), though, this book is still highly relevant, even though we’ve moved into the “age of AI”. AI programmes rely on the same statistical inferences, and are prone to the same errors, as the programmes O’Neil discusses. Personally, I think the book is good for social scientists. She presents statistical models used by scientists and businesses, and shows how easily they can turn into stupid models that discriminate against people. It’s nice to get a mathematician’s perspective and logical thinking.

    Conclusion: It still stands. Brief as that.

  • Book review: Click here to kill everybody

    Book review: Click here to kill everybody

    For those who don’t know of Bruce Schneier, he’s one of the world’s most famous and prominent cybersecurity experts. If there’s one person you’d like to guide you and hold your hand in a time of need, Schneier is the one. This book is about the basics of cybersecurity – not the technical aspects, but rather security on the Internet and the Internet+, the interconnected world of the Internet of Things.

    Driverless cars, thermostats, drones, locks on doors, baby dolls and monitors, and pacemakers are interconnected – without any concern for security. Virtually all companies except Apple and Microsoft sell inadequate and incomplete consumer products without testing, whereas in the airplane industry a line of code can cost millions of dollars and must pass very rigorous testing before being applied in reality.

    “Click here to kill everybody” is a thorough and deep book about how this neglect of cybersecurity has consequences for people, society, companies and governments/authorities. It stems from incentives to rush and from meddling by many governments.

    I love the metaphor “The Four Horsemen of the Internet Apocalypse – terrorists, drug dealers, pedophiles, and organized crime” that states and companies use to frighten people. If we standardize encryption for texting, telephone calls and the files on your phone, the dark sides will become even stronger and the good forces will fail at catching and prosecuting villains (so the usual argument goes). The paradox is that states use front companies to do some of this work themselves: North Korea with organized crime and drugs, China (companies connected to the People’s Liberation Army), Russia (the Internet Research Agency, run by the now well-known Yevgeny Prigozhin) and the US (the military-industrial complex and NSA-connected entrepreneurs) all engage companies to do their bidding, no strings attached.

    The situation we’re in: From bad to worse

    An entire chapter is named “Everyone favors insecurity”, a telling title. What it basically comes down to is that companies are unwilling to pay for security – much like eco-friendly products are more expensive because taking ecological considerations into account costs more than not caring. Apple and Microsoft are two of the very few companies that actually pay attention to security, making sure that products are released only when they’re as secure as possible. Most companies follow the former Facebook motto “Move fast and break things” and would rather release than delay and miss the launch.

    What people, companies and authorities then miss is that our overall security is decreased – put in peril – simply because it’s considered too expensive or too troublesome.

    Security should be the default, like encryption should be the default – not optional, and not an afterthought. When products are ready for sale, they should be as complete as possible. The ideal of move fast and break things should be abolished.

    Regulation

    Authorities need more transparency, less secrecy, more oversight and accountability, Schneier argues (and he isn’t alone). The FBI, the NSA and others don’t want encryption and do want backdoors. This is completely contradictory security-wise. If the population is being preyed upon, if rogue elements can infect and steal from people, then companies and authorities will also be easier targets. The more people who risk being infected and preyed upon, the more who will be in peril. Less security for civil society and for people means states are less secure – yet authorities want to weaken encryption and install backdoors. Everyone gains access to do damage; everyone loses.

    An argument often lost in the debate on regulation is that the losing parties are small companies, without assets or time on their side, while big corporations, who can adapt much more easily, are favoured. Big corporations also tend to fall within the regulators’ attention span and are tended to, whereas smaller companies are seldom even seen, mostly overlooked. I think this is one of the most important aspects of the entire book.

    Another issue with regulation is its tendency to focus on particular technologies. Schneier’s suggestion is to “focus on human aspects of the law” instead of technologies, apps, or functions. Also, it’s better to aim for a result and let experts work to achieve that result than, again, to focus on a specific technology.

    Summary

    The rights of computer scientists / software developers / programmers are still very strong, and they can develop pretty much whatever they want. We’re too short-sighted and can’t, or refuse to, see possible outcomes and changes from a longer perspective. “We accepted it because what they decided didn’t matter very much. Now it very much matters, and I think this privilege needs to end.” Just because products are digital doesn’t mean they have more right to exist, and living in a society where technology has become a kind of religious belief doesn’t mean technology is impervious to criticism or bad outcomes.

    Schneier argues that only states should have the capability to confront cyber attacks, not companies or other organizations. Considering the industry of spyware (or mercenary spyware, as it’s called), I concur, though companies can help by being part of cyber defense.

    One of Schneier’s guesses is that the security issues with “Internet+ will creep into their networks” in unexpected ways. Someone brings a device to work, it connects to the Internet and starts to leak data. Suddenly a company or authority realizes it has serious issues with real-life implications.

    If you need a basic book about cybersecurity, without any technical details or prerequisites, this is a book for you. It’ll teach you what cybersecurity is about.

  • Book review: Reset

    Book review: Reset

    “We can reclaim the internet for civil society. The principle of restraint should be our guide.”

    The end.

    Basically, I could stop here and write no more. These are the last two sentences of Reset: Reclaiming the Internet for Civil Society by Ronald J. Deibert, and they are the profound solution to the problems with the internet, social media, tech companies, surveillance, espionage and cyberwar that the book is about.

    Deibert founded The Citizen Lab in 2001 as a research group with a mission he had come up with himself: investigating the dirty backside of the Internet. For once I read a book by an author who doesn’t need to resort to the creepiest descriptions and depictions of what could happen – he has total command of the subject. You can read it between the lines and see it in the examples given, often from the Citizen Lab’s own research and from various other, not the usual, sources; in the fact that he doesn’t inundate you with details and a massive amount of examples of everything that is inherently wrong with the internet (for that, listen to Grumpy old geeks); and in the chapters he’s chosen to focus on. He knows this stuff without the restless need to show how well he has (begun to) master the subject after a couple of years. The combination of chapters is his strength.

    Causes

    Surveillance capitalism as a concept is the first subject Deibert touches on, writing about the omnipresent tech in our lives, the gadgets we surround ourselves with day and night for all sorts of reasons. This has been covered by Carissa Véliz, Adam Alter (review coming) and (obviously) Shoshana Zuboff (review coming), to name a few. Deibert writes about various absurd apps, ideas to capture more personal data and dangerous paths taken by companies, paths that can easily lead to authoritarian perspectives on society and societal change.

    How our addictive machines are used to spread propaganda, disinformation and misinformation, to destabilize societies, and to divide and rule among foreign adversaries is another bleak chapter. Companies, state actors and organisations are playing a very perilous game with democratic states, risking all progress on human rights. Institutions are seemingly falling apart, or at least unable to thwart a slide towards more fragile societies.

    Thirdly, there are the intrusive powers: how technology is used by (nation) states to circumvent human rights and deliberation. Abuses of power become harder to track, inhibit and hold accountable. Technology is more often used to suppress minorities and people than to elevate them.

    Aspects of climate and environment are usually completely excluded from books by tech-related authors. The link to the natural world is often exempt from questioning. Two of the few exceptions are Kate Crawford and Tung-Hui Hu, both of whom I’ll cover in time.

    I worked in politics for almost seven years, and I concur with Deibert that “material factors play a major role in shaping political outcomes”, that they must be taken into account, and that politics should, at times, adapt to societal changes rather than neglect them. Sometimes you simply follow, not lead. And tech is very much physical, material.

    No other expert, that I have encountered, has been able to combine all these issues and subjects into one coherent text about the state of the internet and democracy. A fellow Canadian and political scientist at that, Taylor Owen (yes, listen to his podcast), is the closest one we’ve got.

    Solutions

    Deibert is a political scientist at heart, although you might think (or deceive yourself) that he’s a computer scientist, and it shows when he delves into solutions. He presents the ideas and theory of republicanism, the theory of how “to tie down and restrain the exercise of power not only domestically, but also across borders.” Politics usually moves rather slowly in democratic states, and rightfully so, argue Deibert and the republicans, because decisions should take time, and deliberation is necessary so as not to react emotionally or irrationally to some passing fancy. Deliberation has become a word with negative connotations: things should be decided quickly, without thoughtful processes, almost impulsively. Deibert argues that deliberation offers restraint and keeps decisions from being simple (and often stupid) reactions to very contemporary issues. As such, restraint should be exercised much more – in social media, in politics, in technology-related decisions. Deliberation should be a guideline, not an insult.

    At first my thoughts were similar to those during my reading of Beijmo’s De kan inte stoppa oss – basically, we’re f*cked. After a while I actually felt hope. For once, here’s a person with vast experience and knowledge of how bad things have turned over more than two decades, who can show us real, adequate and suitable actions on a systemic level. There are no individual recommendations here to “block cookies”, “encrypt all your communications” or “refuse to use social media”. Deibert has spent more time than most humans on these issues, so what he writes is very much what we should do. We should move slower, more deliberately, in order to reclaim the internet for civil society, not for states or companies.

    Conclusion

    If there’s one book to rule them all, this is the one.

  • Book review: The perfect police state

    Book review: The perfect police state

    This could well be a follow-up to Beijmo’s De kan inte stoppa oss. Instead of Syria as the main stage, the focus of the story is China. And as Europe was an outlier in Beijmo’s book, Turkey and the US are the outliers here.

    Writer Geoffrey Cain presents himself early on as the journalist he is, travelling through Xinjiang in western China, though the main character of the narrative is Maysem, an Uighur who has escaped Xinjiang. Cain introduces “the Situation”: the extreme oppression of Uighurs and other minorities in Xinjiang, and the equally extreme surveillance created there by the Chinese Communist Party. Its war on the three evils (terrorism, separatism and extremism) has created an omnipresent surveillance like no other – a dream for predecessors like the Gestapo, the Stasi and the KGB – much of it in the form of Sky Net (yes, it sounds like Terminator) and the Integrated Joint Operations Platform. The purposes are reeducation (brainwashing), genocide (e.g. by sterilization) and dismantling the entire culture by sowing distrust between every person.

    Cain does a much better job than Beijmo at explaining his sources, his knowledge of things usually hidden, why he cross-checked interviewees, reasons for altering names and how he ended up covering this dire subject.

    Excerpt from pages 6-7 in the book, where Cain watches heavily equipped counterterrorism officers in the city of Kashgar:

    “I casually snapped a photo of the scene with my cell phone, and started to walk away. One of the police officers was wearing sunglasses with a built-in camera linked to China’s Sky Net surveillance database; the camera was connected by a wire to a minicomputer in his pocket. He turned left and glanced at me. If I were a local resident, he could probably see my name and national ID number on his lenses within seconds. Before I knew it, I was surrounded by police. I didn’t know where they came from, or how long they had been watching me. […] It’s likely that I’d been watched from the moment I arrived. Fellow journalists had warned me that my hotel room would be bugged and any laptops or smartphones I left in my room would be scanned. With 170 million cameras nationwide, some able to identify anyone from up to nine miles away, and government devices called Wi-Fi sniffers gathering data on all smartphones and computers within their range, the state probably knew a great deal about me the moment I stepped off the plane.”

    This occurred in 2017, just a couple of years after Beijmo’s story. Much has happened in technology, albeit in another country. It reminds me of the planes with cameras circling Baltimore, recording every vehicle and person in the city, as told by Bloomberg in 2016.

    By retelling Maysem’s story, and the stories of multiple other people, we’re given a glimpse of the horrible oppression of the ethnic minorities in Xinjiang: reeducation centers, concentration camps, forced labor for Chinese and international corporations, the propaganda emanating from the Communist Party, the male watchers allowed to sleep in female Uighurs’ homes and beds. Technologically, it’s on a totalitarian scale the Nazis could only have dreamed of. So despicable and omnipresent is the system that the retelling is haunting, even if it’s not as blunt and violent as in Syria.

    Part of the creation of this surveillance lies with the technological giants of China, with an unwitting Microsoft having initiated the search for ever better surveillance many years ago. Some of these tech giants are Huawei, Hikvision and Tencent, the creator and owner of WeChat.

    Then, what is the actual difference, technologically, from our Western societies? Facebook Messenger and Alphabet/Google register as much as they can about us on any device and with any trackers they can get hold of. Facebook with their glasses, Alphabet with their glasses and their watches. Android as an operating system with Google services installed allows, by default, unrestrained data collection. Tech companies wish to create AI with our information and sell us as products to advertisers and corporations. In China, the Communist Party wishes to create AI and control people in order to create a harmonious society without friction. The Citizen Lab released a report in 2020 named We Chat, They Watch, revealing how WeChat is used to spy on non-Chinese residents and to train AI.

    Cain’s book is a good read, and the seriousness of the issue is written on every page. To follow Maysem’s story, you need to read the book, because I won’t give you any more details. Usually, I don’t bother with graphics or design, but the cover of the book (in hardcover) is appealing.

  • Book review: Privacy is power

    “It’s not about something to hide, it’s about something to lose.” This quote by Edward Snowden sums up this book by Carissa Véliz, first released in 2020. Here, I will present some of the book’s topics for the interested. I should emphasize that this review concerns the 2020 edition.

    Carissa is an associate professor at the University of Oxford and has led an extensive research project on privacy, Data, privacy & the individual.

    Introduction of everyday life

    She takes us on a tour through everyday life with an array of technological devices and the related privacy issues: electronic doorbells, cameras of various kinds and genetic tests. It’s nice to read something that’s actually relatable, set in everyday life, starting in the morning and ending the same day. At times, though, it’s a bit far-fetched. All those devices and the lack of privacy are there to depict a bleak and likely future more than life today, because very few people encounter all of those devices every single day. In the book, we meet them all in one day, as one person.

    Collective aspect of privacy

    By far my favorite part of the entire book, and probably the most important one too. We’re not isolated people, but interconnected and interdependent. On my phone there’s personal data on the people I call, text, mail and photograph. In the photos there’s location data and biometric data on people. In my calendar I reveal information about the people I meet: when, where, why and how.

    Perhaps my neighbor’s phone contains photos of me, processed by apps I didn’t even know existed, now fed some of my biometric data. What are those apps, who owns them, which personal data do they share and disseminate, and with which third parties? Where is my personal data actually stored, and what is the actual purpose behind the collection in the first place?

    Our urge to willingly share information about ourselves with people we know is a gateway into sharing information with an unknown number of people. How many people read the privacy policy of a new app or service?

    Privacy is power

    Thus, one conclusion is that privacy is about power, because personal data is power. Collection of personal data is power, so abstaining from or avoiding being “harvested” is key to keeping the autonomy and privacy of individuals (somewhat) intact.

    Companies, which is most often the case in Western countries, collect lots of personal data on lots of people. Virtually no one can avoid or escape this massive collection. Holding personal data means power, because people can be “nudged” into doing things they aren’t even aware of. Tristan Harris’ famous article “How Technology is Hijacking Your Mind” is a telling example, the Cambridge Analytica scandal another.

    Carissa shares a story of someone she knows who works as a programmer and was assigned the task of surveilling one single person for a period of time. His job was to follow and study this person, in order to understand what computer systems can do and how much personal data one is able to collect. It wasn’t long ago that it was revealed how tech giants assigned staff to actually listen to people’s conversations through voice assistants.

    What happens if a state turns authoritarian, as is happening in Poland and Hungary, from within the European Union itself? What happens when that state also uses the personal data companies have collected? Carissa tells a moving story from World War II, which I will write about separately.

    The inevitable technological progress

    A very common trope in the debate in many countries is that technological progress is more or less absolute, inevitable. No matter what we say or do, technological progress cannot be stopped. It has become a religion of sorts, a belief rather than fact. Carissa names Google Glass as an example of hampered technological progress. After the reinvention of the smartphone and the smartwatch, the glasses were to become the next inevitable device for the masses. After heavy criticism, much of it concerning privacy, and outright bans, the Google Glass project was officially abandoned.

    Read it

    These are some of the topics Carissa covers in her book, and I have only briefly reviewed parts of the content. There’s plenty more, and all I can do is urge you to read it.

    If you would like a good introduction on privacy, I recommend the episode Privacy by the podcast Constitutional. It’s set in the American context, but is a very good story of how privacy became a more complicated issue in the United States one hundred years ago and the importance of one man, Louis Brandeis.