Friday, Mar 11, 2022 • 12min

What tech companies know about your kids

Anthropologist Veronica Barassi urges parents to demand protections and look closely at the ways their kids' data is being collected online. This talk was filmed at TEDxMileHigh. All TEDx events are organized independently by volunteers in the spirit of TED's mission of ideas worth spreading. To learn more about TEDxSHORTS, the TEDx program, or to give feedback on this episode, please visit http://go.ted.com/tedxshorts. Follow TEDx on Twitter: https://twitter.com/TEDx. Follow TEDx on Instagram: https://www.instagram.com/tedx_official
Speakers: Veronica Barassi, Atossa Leoni
Transcript
Atossa Leoni
01:50
A quick new idea, daily, from the world's greatest TEDx Talks. I'm your host Atossa Leoni, and this is TEDxSHORTS.
02:04
Families use digital platforms every day, from online games to education apps and medical portals. But anthropologist Veronica Barassi says they may be collecting and selling your children's data. With her eye-opening research, Barassi urges parents to look twice at digital terms and conditions instead of blindly accepting them, and to demand protections that ensure their kids' data doesn't skew their future.
Veronica Barassi
02:40
Every day, every week, we agree to terms and conditions. And when we do this, we provide companies with the lawful right to do whatever they want with our data and with the data of our children. Which makes us wonder: how much of our children's data are we giving away, and what are its implications?
03:06
I'm an anthropologist, and I'm also the mother of two little girls. And I started to become interested in this question in 2015, when I suddenly realized that there were vast, almost unimaginable amounts of data traces being produced and collected about children. So I launched a research project, which is called "Child Data Citizen," and I aimed at filling in the blank.
03:32
Now you may think that I'm here to blame you for posting photos of your children on social media, but that's not really the point. The problem is way bigger than so-called "sharenting." This is about systems, not individuals. You and your habits are not to blame.
03:52
For the very first time in history, we are tracking the individual data of children from long before they're born - sometimes from the moment of conception, and then throughout their lives.
04:05
You see, when parents decide to conceive, they go online to look for "ways to get pregnant," or they download ovulation-tracking apps. When they do get pregnant, they post ultrasounds of their babies on social media, they download pregnancy apps, or they consult Dr. Google for all sorts of things, like, you know, "miscarriage risk when flying" or "abdominal cramps in early pregnancy." I know because I've done it, and many times.
04:38
And then, when the baby is born, they track every nap, every feed, every life event on different technologies. And all of these technologies transform the baby's most intimate behavioral and health data into profit by sharing it with others.
04:56
So to give you an idea of how this works, in 2019, the British Medical Journal published research that showed that out of 24 mobile health apps, 19 shared information with third parties. And these third parties shared information with 216 other organizations. Of these 216 other fourth parties, only three belonged to the health sector. The other companies that had access to that data were big tech companies like Google, Facebook or Oracle; they were digital advertising companies; and there was also a consumer credit reporting agency.
05:41
So you get it right: ad companies and credit agencies may already have data points on little babies. But mobile apps, web searches and social media are really just the tip of the iceberg, because children are being tracked by multiple technologies in their everyday lives.
05:59
They're tracked by home technologies and virtual assistants in their homes. They're tracked by educational platforms and educational technologies in their schools. They're tracked by online records and online portals at their doctor's office. They're tracked by their internet-connected toys, their online games and many, many, many, many other technologies.
06:19
So during my research, a lot of parents came up to me and they were like, "So what? Why does it matter if my children are being tracked? We've got nothing to hide." Well, it matters. It matters because today individuals are not only being tracked, they're also being profiled on the basis of their data traces. Artificial intelligence and predictive analytics are being used to harness as much data as possible of an individual life from different sources: family history, purchasing habits, social media comments.
06:59
And then they bring this data together to make data-driven decisions about the individual. And these technologies are used everywhere. Banks use them to decide loans. Insurance uses them to decide premiums. Recruiters and employers use them to decide whether one is a good fit for a job or not. Also the police and courts use them to determine whether one is a potential criminal or is likely to recommit a crime.
07:32
We have no knowledge or control over the ways in which those who buy, sell and process our data are profiling us and our children. But these profiles can come to impact our rights in significant ways.
07:49
To give you an example, in 2018 the "New York Times" published the news that the data that had been gathered through online college-planning services - which are actually completed by millions of high school kids across the US who are looking for a college program or a scholarship - had been sold to educational data brokers.
08:15
Now, researchers at Fordham who studied educational data brokers revealed that these companies profiled kids as young as two on the basis of different categories: ethnicity, religion, affluence, social awkwardness and many other random categories. And then they sell these profiles, together with the name of the kid, their home address and the contact details, to different companies, including trade and career institutions, student loans and student credit card companies.
08:56
To push the boundaries, the researchers at Fordham asked an educational data broker to provide them with a list of 14-to-15-year-old girls who were interested in family planning services. The data broker agreed to provide them the list. So imagine how intimate and how intrusive that is for our kids. But educational data brokers are really just an example. The truth is that our children are being profiled in ways that we cannot control, but that can significantly impact their chances in life.
09:34
So we need to ask ourselves: can we trust these technologies when it comes to profiling our children? Can we? My answer is no. As an anthropologist, I believe that artificial intelligence and predictive analytics can be great to predict the course of a disease or to fight climate change. But we need to abandon the belief that these technologies can objectively profile humans and that we can rely on them to make data-driven decisions about individual lives. Because they can't profile humans.
10:11
Data traces are not the mirror of who we are. Humans think one thing and say the opposite, feel one way and act differently. Algorithmic predictions of our digital practices cannot account for the unpredictability and complexity of human experience.
10:28
But on top of that, these technologies are always, always, in one way or another, biased. You see, algorithms are by definition sets of rules or steps that have been designed to achieve a specific result, OK? But these sets of rules or steps cannot be objective, because they've been designed by human beings within a specific cultural context and are shaped by specific cultural values. So when machines learn, they learn from biased algorithms, and they often learn from biased databases as well.
11:05
At the moment, we're seeing the first examples of algorithmic bias. And some of these examples are frankly terrifying. This year, the AI Now Institute in New York published a report that revealed that the AI technologies being used for predictive policing have been trained on "dirty" data. This is basically data that had been gathered during historical periods of known racial bias and nontransparent police practices. Because these technologies are being trained with dirty data, they're not objective, and their outcomes are only amplifying and perpetuating police bias and error.
11:53
So I think we are faced with a fundamental problem in our society. We are starting to trust technologies when it comes to profiling human beings. We know that in profiling humans, these technologies are always going to be biased and are never really going to be accurate. So what we need now is actually a political solution. We need governments to recognize that our data rights are our human rights.
12:27
Until this happens, we cannot hope for a more just future. I worry that my daughters are going to be exposed to all sorts of algorithmic discrimination and error. You see the difference between me and my daughters is that there's no public record out there of my childhood. There's certainly no database of all the stupid things that I've done and thought when I was a teenager.
12:53
But for my daughters this may be different. The
data
that is being collected from them today may be used to judge them in the future and can come to prevent their hopes and dreams.
Share
13:08
I think it's time. It's time that we all step up. It's time that we start working together as individuals, as organizations and as institutions, and that we demand greater data justice for us and for our children before it's too late.
Atossa Leoni
13:23
The TEDx Talk you just listened to was recorded at a TEDx event in Denver, Colorado. All TEDx events are independently organized by volunteers, who believe in TED's mission of ideas worth spreading.
13:41
Special thanks to the organizing team at TEDxMileHigh. Wanna listen to more TEDx Talks? Explore the entire archive on the TEDx YouTube channel. I'm Atossa Leoni. Thanks for listening, and see you tomorrow.