Instagram may soon ask users to verify themselves with video selfies

“To help confirm that you’re a real person,” said Instagram.

According to screenshots shared on Twitter by social media strategist Matt Navarra, Instagram is asking some users to record a video selfie showing different angles of their face to verify that they’re a real person. Bot accounts, which can send spam messages, harass individuals, or artificially inflate like or follower counts, have long plagued the social network, and it’s likely that Meta (formerly Facebook, Instagram’s parent company) is turning to this feature to help curb their prevalence.

Image Source: GSMArena.com

The company began testing the feature last year but ran into technical hurdles, according to XDA Developers. Several users have since reported being asked to record a video selfie to verify their existing accounts.

“Instagram is now using video selfies to confirm users’ identity.

Meta promises not to collect biometric data.”

Bettina Makalintal, another writer on Twitter, shared a screenshot of the help screen for the step where you actually take the video selfie — it reiterates that it’s looking at “all angles of your face” to prove that you’re a real person, and it shows that the verification screen is appearing for multiple people.

“why the fuck is Instagram making me take a video selfie in order to access my account”

tender juicy tofu pup (@bettinamak)

It’s unclear whether this tool is still in beta or is being gradually rolled out; I tried multiple times to set up a shady-looking Instagram account but was never given the video challenge. Meta didn’t say whether everyone will eventually have to take a video selfie, but Instagram said on Twitter that accounts exhibiting suspicious activity (such as rapidly following a large number of profiles) would be asked to take one. Instagram also stated that the tool does not use facial recognition and that the videos are reviewed by Instagram personnel.

“One of the ways we use video selfies is when we think an account could be a bot. For example, if the account likes lots of posts or follows a ton of accounts in a matter of seconds, video selfies help us determine if there’s a real person behind the account or not.”

Given Meta’s recent announcement that it would shut down one of its face recognition systems, the move may come as a surprise to some. However, the company has since clarified that it was only shutting down a single Facebook feature, not ending Meta’s use of face recognition entirely. The wording at the bottom of the screenshot further states that face recognition will not be used and that the video will be deleted after 30 days.

Image Source: Variety

Meta’s assurance that the data will not be stored or shared may not be enough to comfort users who are already wary of Facebook. Some may recall a vulnerability that allowed attackers to obtain Instagram users’ supposedly private birthday information (which would soon be required to use the service) with nothing more than a DM. Of course, Instagram hasn’t pledged to delete that birthday information as it has with the video selfies, but it’s hard to blame anyone (especially children or those who want to remain anonymous) for being hesitant to hand over such data.

Reactions on Twitter to this step by Instagram:


“Oh there’s no way at all that this will go sideways”

-The North Pole is a fusion centre (@hypervisible)

“I guess I’ll be deleting my Instagram account, then.”

-tomato.eth (@thefaketomato)

“Currently can’t access my Instagram because I’m protesting this video selfie thing. Will probably cave by tomorrow, but it feels good to take a stand today.”

-Kendall Ostraw (@KendallOstraw)


This news has left many users uneasy. Let us know your thoughts on this step taken by Instagram in the comment section below!


Also Read: Apple Music expands Chinese music reservoir via Tencent deal
