No, Senator Blumenthal, the problem is much bigger than finstas
“Will you commit to ending finsta?” Senator Richard Blumenthal asked Facebook’s Head of Global Safety, Antigone Davis, at Thursday’s hearing on the risks Instagram poses to teenagers.
Finsta, short for “fake Instagram,” is slang for a category of private Instagram accounts users turn to for a variety of purposes — evading parental oversight, posting stupid memes and stream-of-consciousness content, and ducking the level of scrutiny that comes with public, widely seen accounts.
The concept is a bit slippery, but compared with Instagram’s standard filtered and curated profiles, it becomes clearer — finstas are for whatever doesn’t fit in those carefully calibrated images. (Though of course, anything posted online cultivates some kind of identity — finstas just create a different, imperfect one.)
What finstas are not is a service Instagram chose to roll out; from a technical perspective, they’re the same as any other private Instagram account. I keep my own Instagram account private, for example, simply because I value my privacy. But it’s not a finsta, because it’s my main — and only — account, not one I turn to for a specific type of more casual, less curated content. Instagram cannot end finstas short of making it impossible to associate multiple accounts with a single email address, or making all profiles public.
“Senator, again, let me explain. We don’t actually, uh, we don’t actually do finsta,” Davis responded to Blumenthal.
The clip of the question immediately went viral, much like the moment in Mark Zuckerberg’s 2018 congressional hearing when he had to explain to a rather boggled senator that social media makes its money off ads.
Blumenthal’s concerns about finstas — and an earlier statement from the hearing suggests he has a better grasp of the term than his question let on — stem from new revelations about Facebook’s internal research identifying Instagram as “toxic” for teen users, exacerbating mental health issues such as anxiety and poor body image. Finsta accounts in particular, he said, let users hide from parental oversight, making it difficult for parents to know what their kids are seeing or posting.
These are vital concerns — Facebook only last week paused its plans for an Instagram for Kids platform, targeted at those under 13, and we have little evidence on how long-term use of social media impacts kids.
But that framing fundamentally misunderstands both finstas and where the dangers of social media lie. Finsta is a bit of a misnomer; in many ways, the accounts are where users go to be more real, to post thoughts and worries or pictures in which they don’t look perfect. Finstas serve as a safer, less judgmental space than the platform at large, or as a way to seek support from a specific circle of friends. Finstas are, if anything, an attempt to resist the dangerous undertow of the larger social media landscape.
More importantly, however, the dangers of social media go far beyond parental oversight. Even if parents are hip and familiar with every social media platform, they cannot track everything their child looks at, nor, crucially, can they control what content the algorithm pushes.
For example, Blumenthal himself set up a fake account — effectively a finsta — for a 13-year-old, and followed accounts associated with extreme dieting. Within days, nearly all the suggested accounts were for users encouraging disordered eating, with bios reading “Thinner By The Day” or “TW <3 ED,” standing for “trigger warning: eating disorder.”
But this could happen on any kind of account. I know from my own Instagram use that you don’t need to follow accounts to start receiving suggestions of similar accounts — looking is plenty. Whenever I’m looking for photos of a hairstyle in advance of a haircut appointment, for example, I’m inundated by salon accounts for weeks. When I’m reporting a story about TikTok, all the suggested accounts become TikTok influencers.
And while it might be helpful for a parent to see this, the bigger issue is that it happens at all, and there are no safety measures in place to prevent harmful accounts from existing, let alone being recommended to anyone.
The same goes for attempts to police hate speech, conspiracy theories and antisemitic accounts. Facebook and Instagram need to put people over profit and value healthy use of their platforms, rather than hooking users by any means possible in order to grow their audience and sell more advertising. One lawmaker compared the platform to cigarettes: a product designed to get users hooked so they become lifelong customers.
Finstas aren’t the problem; the issue is a more fundamental one, inherent to the culture of social media. You don’t need to be looking at pro-eating disorder accounts to feel bad about your body — all it takes is a thin, rich, professionally styled influencer looking more beautiful and more popular than you ever could. And that’s the currency that drives social media. It’s not going away.