The internet has changed how kids learn about sex, but sex ed in the classroom still sucks. In Sex Ed 2.0, Mashable explores the state of sex ed and imagines a future where digital innovations are used to teach consent, sex positivity, respect, and responsibility.
The algorithms that drive products like YouTube, Facebook, and Apple's iOS software share a common challenge: They can't seem to consistently distinguish between pornography and sexual and reproductive health content.
That's because the code engineered to prevent "adult" material from popping up in your timeline or search results can also easily block educational content meant to offer internet users candid, factual information about sex, sexuality, and health.
Critics say the algorithmic confusion may reflect lazy engineering and tech's infamous diversity problem. When the engineers who write code meant to push nudity or porn to the web's margins don't understand or care about the importance of accessing sexual and reproductive health content, especially for LGBTQ youth and other users who've been historically marginalized online, of course algorithms will block the widest possible swath of content. Critics also believe a straightforward solution to this problem exists, but say tech companies aren't interested in addressing their concerns.
SEE ALSO: Sex ed is missing something key for kids who've endured sexual trauma

The online sexual health company O.school reported in October that the iPhone's new software, with the parental control setting enabled, blocked not just its website but numerous entertainment sites and health resources for teens and adolescents. While the filter restricted sites like Teen Vogue and Scarleteen, it didn't deny users access to websites like the neo-Nazi Daily Stormer or the anti-gay Westboro Baptist Church.
That shocking contrast convinced O.school founder Andrea Barrica that Apple's algorithm might just be blocking certain terms, like "teen," wholesale in order to prevent any clicks that might possibly send a user to prohibited content (i.e., "teen porn"). Yet Barrica couldn't confirm or dispel her suspicions — or learn anything about Apple's algorithm.
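The kind of wholesale term-blocking Barrica suspects can be sketched in a few lines. This is purely illustrative — Apple has not disclosed how its filter works, and every name and blocklist entry below is hypothetical — but it shows why a context-blind substring match would catch Teen Vogue and Scarleteen while waving through sites whose names contain no flagged term:

```python
# Illustrative sketch only: a naive substring blocklist of the kind Barrica
# suspects. Apple's actual filter is unknown; the blocklist here is invented.

BLOCKED_TERMS = {"teen", "porn", "xxx"}  # hypothetical blocklist entries

def is_blocked(url: str) -> bool:
    """Block any URL containing a flagged substring, with no context check."""
    lowered = url.lower()
    return any(term in lowered for term in BLOCKED_TERMS)

# Overblocking in action: legitimate sex-ed and culture sites are caught...
print(is_blocked("https://www.teenvogue.com"))   # True: "teen" is a substring
print(is_blocked("https://www.scarleteen.com"))  # True: "teen" is a substring
# ...while a site with no flagged substring in its name passes untouched,
# which is the pattern O.school observed with hate sites.
print(is_blocked("https://www.example.com"))     # False
```

The filter never asks what the page is about, only whether the URL contains a forbidden string — the cheapest possible engineering choice, and the one that blocks the widest swath of content.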
Barrica used her own network to reach Apple employees with hopes of discussing the situation but was met with silence. Then she published a blog post entitled "Censorship and Sex Ed" with pointed questions for Apple: Who designed the filter? Were parents consulted? Conservatives and religious groups? Doctors and sex educators? What non-porn sites are being blocked?
She never heard from Apple. The company did not comment to Mashable about its algorithms or Barrica's post.
"They’re writing the policies in the most conservative way to avoid the problem."
"They're not targeting sex ed; they’re writing the policies in the most conservative way to avoid the problem," claims Barrica. "Apple’s conservative views on sexuality have so many far-reaching effects."
Barrica isn't the only one who's written an open letter to the platform controlling whether her company's content is seen online. In May 2017, a writer for the menstrual tracking app Clue detailed in a blog post how Facebook blocked the company's ads boosting its sexual and reproductive health content. Educational illustrations that featured vulvas, breasts, and penises were blocked. Ads to promote posts about underwear, birth stories, and puberty advice were also rejected.
A representative for Clue declined to comment further but said the company stood by its 2017 post. A spokesperson for Facebook said Clue's ads ran afoul of its advertising policy's restrictions on "adult content," which, among other things, forbid ads that include nudity or images focused on individual body parts. Facebook's advertising policies are applied globally and are stricter than its community standards. Clue continues to publish content and advertise on Facebook.
AMAZE, a sex ed video series for adolescents and teens, has faced a similar problem on YouTube. Since its channel launched nearly three years ago, several of its 84 videos have been rejected for advertising because they were deemed to be "adult" content. Those include videos about female and male biological anatomy.
"YouTube advertising is critical to our work at AMAZE because it allows us to reach young people all over the world who are searching for guidance around sex, mental health, and more," Lincoln Mondy, a spokesperson for AMAZE, said in an email.
Though AMAZE is not considered adult content, its videos do include accurate depictions of genitalia and discussions of sexual health. That forthrightness, which is sometimes graphic, could be perceived by an algorithm or a human reviewer as violating the platform's policy against advertising adult content that's "non-family safe." YouTube declined to comment on the policies and practices that inform its algorithms.
Tech companies might argue that their algorithms are actually working as designed by flagging content that violates their policies. Yet the fact that, for example, a benign illustration of a breast in an educational context is deemed objectionable gets at a bigger issue.
Part of the challenge facing engineers and tech companies is the reality that sexual health material produced for the internet today is often free of the stigma and shame traditionally associated with talking about sex. Instead of staid explainers that use vague terms and descriptions, this new generation of content asks and answers potentially embarrassing questions, sensitively addresses the diverse concerns of marginalized readers, and is unafraid to use accurate depictions of genitalia. Along the way, it makes what once were awkward conversations sound pretty fun.
So engineers who aren't paying attention to this trend, or don't even realize it's happening, are likely to write code that assumes any explicit word or image on the internet is a gateway to porn.
"One of the dynamics is they're not thinking about this as a case at all," says Jon Pincus, a software engineer and entrepreneur who is an adviser to O.school. "Whether it's lazy or overly simplified, my guess is they’re not actually trying to measure if they’re letting legitimate [sexual and reproductive health] stuff in while keeping other stuff out."
Pincus says designing algorithms that perform substantially better than they do today wouldn't be hard. Engineers and the companies that employ them could embrace fairness, accountability, and transparency as guiding principles, particularly because the availability of accurate sex ed information online is a public health issue.
Ideally, companies using machine learning algorithms would train them with words, images, and descriptions of valid sexual and reproductive health information they want to accept, as well as the adult content or pornography they want to reject.
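A minimal sketch of that idea follows. It trains a tiny Naive Bayes text classifier on labeled examples of both content to allow and content to block, so that words are judged in context rather than banned outright. Everything here — the class, the training phrases, the labels — is invented for illustration; a real system would use vastly more data and a production-grade model:

```python
# Sketch of training a filter on BOTH allowed health content and rejected
# adult content, instead of keyword-matching alone. All training phrases,
# labels, and class names are hypothetical illustrations.
from collections import Counter
import math

def tokenize(text):
    return text.lower().split()

class TinyNaiveBayes:
    def __init__(self):
        self.word_counts = {}        # label -> Counter of word frequencies
        self.doc_counts = Counter()  # label -> number of training documents

    def train(self, text, label):
        self.word_counts.setdefault(label, Counter()).update(tokenize(text))
        self.doc_counts[label] += 1

    def classify(self, text):
        # Shared vocabulary across classes, for add-one smoothing.
        vocab = set()
        for counts in self.word_counts.values():
            vocab.update(counts)
        total_docs = sum(self.doc_counts.values())
        best_label, best_score = None, float("-inf")
        for label, counts in self.word_counts.items():
            # Log prior plus smoothed log likelihood of each query word.
            score = math.log(self.doc_counts[label] / total_docs)
            total_words = sum(counts.values())
            for word in tokenize(text):
                score += math.log(
                    (counts[word] + 1) / (total_words + len(vocab))
                )
            if score > best_score:
                best_label, best_score = label, score
        return best_label

clf = TinyNaiveBayes()
# Hypothetical labeled examples of content to accept and content to reject.
clf.train("anatomy lesson about puberty and reproductive health", "allow")
clf.train("doctor explains menstrual cycle and birth control", "allow")
clf.train("explicit adult video hot scenes", "block")
clf.train("hardcore adult film explicit", "block")

print(clf.classify("lesson about reproductive anatomy and health"))  # allow
print(clf.classify("explicit adult scenes"))                         # block
```

The point of the sketch is the training data, not the model: because the classifier has seen legitimate health language labeled as acceptable, words like "anatomy" push a query toward "allow" instead of triggering a block on sight.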
Beyond their philosophical approach, Pincus says tech companies could invite sexual and reproductive health experts to provide feedback on how algorithms are designed, or even hire them to consult. Pincus says that's common practice in the industry when there are no subject matter experts on staff.
Mondy, of AMAZE, agrees with such an approach.
"To us, the only solution involves an intentional partnership between tech giants and sexual health experts when they’re creating algorithms and content blockers," he said. "Tech giants aren’t sexual health experts and shouldn’t make such consequential decisions on what is and isn’t 'age appropriate' when it comes to online information."
"Tech giants aren’t sexual health experts."
Those companies, however, are reluctant to surrender that power and give outsiders influence over their product. When Tumblr announced last week that it would ban adult content, a spokesperson for the company declined to explain the criteria by which its algorithms and human reviewers would distinguish sex ed from nudity or porn but instead noted that "health-related situations" would still be allowed on the platform.
The resistance to transparency makes sense given the ruthless competition in Silicon Valley. Beyond that, Barrica believes tech companies have no incentive to endanger major advertising revenue or a spot in Apple's app store by writing more nuanced algorithms that could maximize access to sexual and reproductive health information but potentially let pornographic content slip through the cracks.
"It's really fear-based," she says. "It goes back to lack of inclusion and diversity, and back to stigma."
"There’s so much power to control what people do and don’t see."