Instagram is working on a programme that will protect users from unsolicited nude pics in their DMs - Technology News, Firstpost

People who send unsolicited nudes in DMs really are the scum of the earth. Such messages are one of the many things that make using social media platforms like Instagram disturbing, especially for women. Instagram, finally, is working on a solution that might put an end to this.
Up until now, the only way to deal with unsolicited nudes was for a user to report the sender to Instagram. We all know how well that usually worked. Now, Instagram is working on a new programme that would filter out unsolicited nude messages sent over direct messages.

The discovery was made by Alessandro Paluzzi, an app researcher, earlier this week. In a tweet, he showed that Instagram is working on technology that covers up photos that may contain nudity, and noted that the company would not be able to access the photos itself.

Instagram has also confirmed the feature to a number of publications. The Meta-owned company said the feature is in the early stages of development and is not being tested yet.

“We’re developing a set of optional user controls to help people protect themselves from unwanted DMs, like photos containing nudity,” Meta spokesperson Liz Fernandez told a publication. “This technology doesn’t allow Meta to see anyone’s private messages, nor are they shared with us or anyone else. We’re working closely with experts to ensure these new features preserve people’s privacy while giving them control over the messages they receive,” she added.   

Screenshots of the feature posted by Paluzzi suggest that Instagram will process all images for this feature on the device, so nothing is sent to its servers. Plus, you can choose to see the photo if you think it’s from a trusted person. When Instagram rolls the feature out widely, it will be an optional setting for users who want to weed out messages with nude photos.
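The on-device flow described above can be sketched roughly as follows. This is a hypothetical illustration, not Instagram's actual implementation: the classifier, threshold, and "reveal" mechanic are all assumptions, and the real feature would use a local ML model rather than the stub shown here.

```python
from dataclasses import dataclass

@dataclass
class IncomingPhoto:
    pixels: list          # grayscale rows of 0-255 values
    revealed: bool = False  # set when the user taps to view anyway

def nudity_score(photo: IncomingPhoto) -> float:
    """Stub for the on-device classifier (a real app would run a local ML model)."""
    return 0.9  # assume the classifier flagged this photo

def box_blur(rows: list) -> list:
    """Crude 3x3 box blur so a flagged photo is obscured before display."""
    h, w = len(rows), len(rows[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [rows[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) // len(vals)
    return out

def render(photo: IncomingPhoto, threshold: float = 0.8) -> list:
    """Blur on-device if flagged, unless the recipient chose to reveal it.

    Nothing leaves the device: classification and blurring both happen locally.
    """
    if photo.revealed or nudity_score(photo) < threshold:
        return photo.pixels
    return box_blur(photo.pixels)

photo = IncomingPhoto(pixels=[[0, 255, 0], [255, 0, 255], [0, 255, 0]])
blurred = render(photo)   # obscured version shown first
photo.revealed = True
original = render(photo)  # shown as-is once the user opts to view it
```

The key design point the screenshots suggest is that `render` runs entirely on the recipient's device, which is why Meta says it cannot see the photos.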

Last year, Instagram launched DM controls that enable keyword-based filters for abusive words, phrases and emojis. Earlier this year, the company introduced a "Sensitive Content" filter that keeps certain kinds of content — including nudity and graphic violence — out of users' feeds.
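A keyword-based filter of the kind described above is conceptually simple. The sketch below is an assumption about how such matching might work, not Instagram's actual code; the blocked-term list is user-configurable in the real feature.

```python
import re

# Hypothetical user-configured block list (words, phrases, or emoji).
BLOCKED_TERMS = {"spamword", "abusiveword"}

def is_filtered(message: str) -> bool:
    """Return True if the DM contains any blocked term (case-insensitive).

    Splits into word tokens plus single non-space symbols so emoji
    entries in the block list can match too.
    """
    tokens = re.findall(r"\w+|\S", message.lower())
    return any(tok in BLOCKED_TERMS for tok in tokens)

print(is_filtered("hello there"))      # False
print(is_filtered("some SPAMWORD!"))   # True
```

Messages that match would be routed to a hidden-requests folder rather than deleted, so the recipient stays in control.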

Social media platforms have long grappled with the problem of unsolicited nude photos. While some apps like Bumble have tried tools like AI-powered blurring, the likes of Twitter have struggled to catch child sexual abuse material (CSAM) and non-consensual nudity at scale.

Because of the lack of solid steps from platforms, lawmakers have been forced to look at this issue with a stern eye. For instance, the U.K.'s upcoming Online Safety Bill aims to make cyberflashing a crime. Last month, the state of California in the United States passed a law that allows recipients of unsolicited graphic material to sue the senders. The state of Texas passed a cyberflashing law of its own in 2019, classifying the act as a "misdemeanour" punishable by a fine of up to $500.
