Instagram Shows Kids' Contact Details in Plain Sight
Sharing Email Addresses, Phone Numbers May Be Risky, Experts Say
Tens of thousands of minors on Instagram expose their email addresses and phone numbers to the public, a situation that child-safety and privacy experts say could be exploited by scammers or predators.
David Stier, a San Francisco-based data scientist and business adviser who recently brought a data exposure issue to Instagram’s attention, uncovered the situation.
Over the past month, Stier has been investigating Instagram profiles run by those who self-identify as minors, such as by writing “13yo” in the short bio space.
The minors in question have converted their Instagram profiles from personal ones to “business” ones, a category of profile Instagram introduced three years ago.
During that process, users are given the option to list their email address, their phone number or both. Instagram requires that at least one of those pieces of information be public, and business profiles can't be private - meaning all photos are visible.
Stier conservatively estimates that 20,000 children in Australia have public contact information in their business Instagram profiles, although he's been examining the scope worldwide. It's not clear why so many kids have converted their profiles to business ones, although Instagram has promoted the feature.
Stier says as he dug deeper and saw more and more minors with their contact details exposed, he “had trouble sleeping.”
“When I've shown parents how kids can reveal their phone and email, the response is visceral, and they react as any protective parent would,” Stier says. “They can't believe that Instagram allows someone's kids - and maybe even their own - to display such private information that could put their kids in harm's way."
An Instagram spokesman tells Information Security Media Group that the social media platform, which is owned by Facebook, is concerned about child safety and publishes guides for parents on social media safety. While the guides do have many sensible tips, none addresses this specific scenario of minors being allowed to create business profiles.
Instagram introduced business profiles to help companies grow their followers; it offers businesses access to analytics tools and other features.
Many business profiles appear to conform to what Instagram envisioned - they're for legitimate businesses. In fact, Instagram initially required individuals who wanted to convert their profile to already have a Facebook business page, but that requirement no longer applies.
A search of minors' business profiles turns up many that appear to have no relationship with a business. One 14-year-old girl in Perth, for example, lists a business category of “grocery store,” although her profile doesn’t indicate any other connection to that type of business.
Another 14-year-old girl, also in Perth, lists a business category of “dancer,” but nothing else about her profile indicates a connection to a business. Clicking the “call” button on her profile within Instagram’s mobile app brings up her number, making it possible to text her immediately.
Combined with the public photos and bio information, it’s often easy to figure out what city a child lives in, where they go to school and their interests.
There’s no question about consent: All business users, including minors, are informed that they must display, at minimum, either an email address or a phone number.
Too Much Information
The presence of contact details within Instagram immediately provides direct channels to a child. Leonie Smith, a cyber safety educator based in Sydney, says children are usually unaware of the possible negative ramifications of exposing their contact information online.
The Instagram situation “is concerning,” she says.
Exposing direct contact details for a child is “absolutely a concern,” says Toby Dagg of Australia’s Office of the eSafety Commissioner, which oversees child-safety internet issues. Dagg manages the office’s Cyber Report group, which takes complaints about illegal content related to children.
Dagg says his office sees the effects of bad actors developing online relationships remotely through flattery and, sometimes, threats. The large-scale availability of minors' contact details gives those bad actors many potential targets. “You only have to be successful one out of 1,000 times,” Dagg says.
It also opens a door to possible social engineering. Instagram is a big player in the so-called “influencer” space, where private companies recruit social media users for product promotions.
For individuals who shoot to the top of that sphere, it can prove lucrative, and some people have crafted online careers out of opening boxes, putting on makeup and other types of soft promotion. But Dagg says that could leave children vulnerable to approaches that falsely promise those kinds of gigs and - in the process - extract photos or other sensitive material.
More broadly, Dagg says the eSafety Commissioner works to raise awareness among software developers about application design principles that keep in mind differences between adults and minors.
“This is a really good example of why that sensitivity to safety along with privacy and security is fundamentally needed when providing vulnerable people access to services,” Dagg says.
Legal, But Risky?
Instagram isn’t violating any laws or data protection regulations. But there are evolving views of how online service providers should be evaluating risk, says Melanie Marks, principal of the Sydney-based privacy and cybersecurity consultancy elevenM.
Many privacy laws revolve around transparency and how adequately a service provider is gaining consent, Marks says. The issue of consent, however, can be trickier with minors, she says.
Privacy principles also often revolve around references to reasonableness, Marks says. In recent years, entities have sought to address reasonableness requirements through a risk-based approach.
"There are signs that privacy regulators are accepting that risk management approaches provide a practical way to make decisions in what is often a grey and subjective framework,” she says.
From a risk perspective, one question would be “whether the current risk-based approach that tech companies are taking is adequate or not – after all each business has a different view on what risks are tolerable and how to manage risks,” Marks says.
Whether kids who have exposed their details online via Instagram have faced unwanted contact isn’t easy to quantify. During the reporting of this story, no anecdotal evidence emerged.
But a survey published by Australia’s eSafety Commissioner last year titled “State of play – youth, kids and digital dangers” gives insight into the types of negative online experiences encountered by minors.
Over a one-year period between 2016 and 2017, the commissioner surveyed 3,000 minors between the ages of 8 and 17, asking a variety of questions about their online experiences and how they handled certain situations.
Contact from a stranger or someone the minor didn’t know was the top-ranked negative experience. Only 55 percent of kids told their parents about a negative online experience, according to the survey results.
Smith says that for many reasons, children may be reluctant to report to their parents that undesirable people approached them. For example, they may fear that their apps or device usage may be restricted.
“We’ve got a big problem with disclosure if they’ve [children] been contacted by strangers,” Smith says.
Masking Real Details
Stier says it’s unclear why kids are allowed to convert to business profiles, because most don’t appear to have businesses. At a minimum, he says, Instagram should mask their real contact details.
For example, Instagram could forward communications to a user's real email address rather than revealing that address. And for minors - even those who claim to have a business - Instagram could disallow posting a phone number.
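Email masking of the kind Stier describes is commonly built as a relay: the platform publishes a randomly generated alias and forwards any mail sent to it on to the real address, which never appears publicly. A minimal sketch of that idea in Python (the alias store, function names and relay domain here are hypothetical illustrations, not Instagram's actual system):

```python
import secrets

# Hypothetical in-memory alias store; a real relay would persist this
# mapping and sit behind an inbound mail server that forwards messages.
_aliases = {}

def create_alias(real_email: str, domain: str = "relay.example.com") -> str:
    """Generate a public alias that hides the user's real address."""
    alias = f"{secrets.token_hex(8)}@{domain}"
    _aliases[alias] = real_email
    return alias

def resolve_alias(alias: str) -> str:
    """Look up where mail sent to an alias should be forwarded."""
    return _aliases[alias]

public = create_alias("teen@example.com")
# The alias resolves back to the real address internally...
assert resolve_alias(public) == "teen@example.com"
# ...but the real address never appears in what is shown publicly.
assert "teen@example.com" not in public
```

The same indirection exists for phone numbers (number-masking services used by ride-hailing and delivery apps), which is why publishing a child's literal number is avoidable by design.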
The point of social media is in many ways to be reachable. And it is possible to send anyone a message on Instagram, regardless of whether the person’s profile is set to private.
But that communication goes through Instagram’s own system, where presumably the company is running analytics to detect fraud or abuse. And it’s also possible for users to block those communications within the app.
With a business profile, an initial communication could occur outside of Instagram’s purview. Smith says that “the first people who learn of this [these kinds of changes] are people who have ill intent.”
Stier says it’s likely that other people have already found what he found and “not said anything about it,” which is why he has raised it with Instagram.
“It's not the hypothetical thousands of kids that could be harmed by Instagram's practices,” Stier says. “Instead, the focus should be on ensuring that Instagram adopts some simple fixes so that not a single child comes to harm.”