Child Safety Advocate Warns of the Hyper-Sexualization of Boys and Girls Who Spend a Lot of Time Online and on Social Media

Romans 1:28 “And since they did not see fit to acknowledge God, God gave them up to a debased mind to do what ought not to be done.”

Important Takeaways:

  • ‘Hyper-Sexualization of Boys and Girls’: Child Safety Advocates Demand Action to Protect Kids from Porn
  • If you have kids, you know they spend a lot of time online, and they use social media to share just about everything, from videos of dance moves to the latest fashion trends. But experts warn this can be risky behavior, as these online platforms have become hubs for sexual predators.
  • Child safety advocates warn social media exposes kids to harmful content, which can make them vulnerable to sexual exploitation.
  • “What we see over and over again is the hyper-sexualization of both boys and girls,” Lina Nealon of the National Center on Sexual Exploitation told CBN News.
  • According to the Internet Watch Foundation, 2021 broke all records for online child sexual abuse, with 252,000 URLs found containing images or videos of children being sexually abused, compared with 153,000 the previous year.
  • Meanwhile, many child safety advocates stress the importance of parents talking with their kids about online predatory dangers as well as the need for legislation to protect them.
  • “We need to change federal and state policies to hold these corporations accountable,” Nealon said. “There’s so much more they can be doing. They have the resources, the talent to really augment and prioritize child protection, but from what we can see they’re prioritizing profit over and over again.”

Read the original article by clicking here.

Top hotels sued for ‘industry-wide failures’ to prevent U.S. sex trafficking

By Matthew Lavietes

NEW YORK (Thomson Reuters Foundation) – A landmark U.S. legal action was filed on Monday on behalf of 13 women who claimed they were sold for sex in hotel rooms, accusing several major hotel groups of profiting from sex trafficking.

Twelve hotel chains were named and accused of knowing about and ignoring warning signs that women and children were being sold as sex slaves on their premises, according to the filing, a consolidation of 13 existing cases brought in U.S. federal court in Columbus, Ohio.

The filing marked the first time the hotel industry – which has long been accused of serving as a breeding ground for sexual exploitation of women and children – faced action as a group.

The case drew together 13 separate actions that had been filed in Ohio, Massachusetts, Georgia, Texas and New York.

Among those named in the 13 cases were Hilton Worldwide Holdings Inc., Red Roof Inn, Intercontinental Hotels & Resorts, Best Western Hotels & Resorts and Wyndham Hotels and Resorts Inc.

Representatives of the hotel groups did not immediately respond to requests for comment.

The milestone case was filed by the New York law firm Weitz & Luxenberg on behalf of 13 women, many of whom were minors when they said the trafficking occurred.

The hotels “derived profit” and “benefited financially” by “providing a marketplace for sex trafficking,” the case said, citing “industry-wide failures.”

“Such corporate malfeasance has led to a burgeoning of sex trafficking occurring in … hotels that has reached the level of a nationwide epidemic,” it said.

An estimated 400,000 people are believed to be trapped in modern slavery in the United States, from forced labor to sex trafficking, according to the Global Slavery Index, published by the human rights group Walk Free Foundation.

“This is not one bad apple that needs to be dealt with,” said Luis CdeBaca, former U.S. anti-trafficking ambassador-at-large.

“The entire barrel has a problem … For years the hospitality industry has known that sex trafficking and especially child sex trafficking has occurred on their properties and yet it continues to happen.”

One of the women in the complaint said she was held captive at age 26 at various locations of Wyndham Hotels for six weeks in 2012.

During her captivity, she said her nose was broken twice, her lip was permanently scarred and her face grew infected from repeated beatings.

“I just wish that people realize how much it really is here in the U.S.,” she told the Thomson Reuters Foundation. “It doesn’t matter if it’s a shady hotel or a nice hotel, it’s going on in all of them.”

Several hotel chains have launched initiatives in recent years to tackle trafficking, such as training staff to identify potential victims and raising awareness of the crime among guests.

“These changes have arrived far too late,” said the court documents. “Profit motives, not adherence to the law, continue to drive their decision-making.”

The case seeks unspecified damages.

Weitz & Luxenberg has earned a reputation for personal injury and malpractice cases against companies that made or used asbestos, which has been linked to cancer.

“This is bringing that expertise from the multi-district litigation space to see if it could have the kind of impact in the trafficking world that it’s had in other spaces,” said Bridgette Carr, head of the University of Michigan’s Human Trafficking Clinic.

(Reporting by Matthew Lavietes; Editing by Ellen Wulfhorst. Please credit the Thomson Reuters Foundation, the charitable arm of Thomson Reuters, that covers humanitarian news, women’s and LGBT+ rights, human trafficking, property rights and climate change. Visit http://news.trust.org)

Facebook removes 8.7 million sexual photos of kids in last three months


By Paresh Dave

SAN FRANCISCO (Reuters) – Facebook Inc said on Wednesday that company moderators during the last quarter removed 8.7 million user images of child nudity with the help of previously undisclosed software that automatically flags such photos.

The machine learning tool rolled out over the last year identifies images that contain both nudity and a child, allowing increased enforcement of Facebook’s ban on photos that show minors in a sexualized context.

A similar system also disclosed on Wednesday catches users engaged in “grooming,” or befriending minors for sexual exploitation.

Facebook’s global head of safety Antigone Davis told Reuters in an interview that the “machine helps us prioritize” and “more efficiently queue” problematic content for the company’s trained team of reviewers.
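Davis's description, an automated flag followed by a prioritized queue for human reviewers, maps onto a common moderation pattern. The sketch below is purely illustrative and is not Facebook's implementation: the classify function, the 0.8 threshold, and the FlaggedImage structure are all assumptions standing in for the undisclosed system.

    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class FlaggedImage:
        """An image flagged by the classifier, ordered for review priority."""
        neg_score: float                      # negated so the min-heap pops the highest score first
        image_id: str = field(compare=False)

    def flag_and_queue(images, classify, threshold=0.8):
        """Score each image with a (hypothetical) nudity-plus-minor classifier
        and queue those at or above the threshold, most confident first."""
        queue = []
        for image_id, pixels in images:
            score = classify(pixels)          # assumed to return a probability in [0, 1]
            if score >= threshold:
                heapq.heappush(queue, FlaggedImage(-score, image_id))
        return queue

    def next_for_review(queue):
        """Pop the highest-confidence flagged image for a trained human reviewer."""
        return heapq.heappop(queue).image_id if queue else None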

The company is exploring applying the same technology to its Instagram app.

Under pressure from regulators and lawmakers, Facebook has vowed to speed up removal of extremist and illicit material. Machine learning programs that sift through the billions of pieces of content users post each day are essential to its plan.

Machine learning is imperfect, and news agencies and advertisers are among those that have complained this year about Facebook’s automated systems wrongly blocking their posts.

Davis said the child safety systems would make mistakes but users could appeal.

“We’d rather err on the side of caution with children,” she said.

For years, Facebook’s rules have banned even family photos of lightly clothed children uploaded with “good intentions,” out of concern about how others might abuse such images.

Before the new software, Facebook relied on users or its adult nudity filters to catch child images. A separate system blocks child pornography that has previously been reported to authorities.
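The article does not say how that separate blocking system works; in practice, matching previously reported images is typically done by comparing image fingerprints against a database of known hashes. The sketch below is an assumption for illustration only: a plain SHA-256 digest stands in for the perceptual hashing such systems generally use, and the KNOWN_REPORTED_HASHES set is hypothetical.

    import hashlib

    # Hypothetical: fingerprints of images already reported to authorities.
    KNOWN_REPORTED_HASHES: set[str] = set()

    def is_known_reported_image(image_bytes: bytes) -> bool:
        """Return True if this exact file matches a previously reported image.
        Real systems typically use perceptual hashes that survive resizing
        and re-encoding; SHA-256 is used here only to keep the sketch simple."""
        return hashlib.sha256(image_bytes).hexdigest() in KNOWN_REPORTED_HASHES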

Facebook has not previously disclosed data on child nudity removals, though some would have been counted among the 21 million posts and comments it removed in the first quarter for sexual activity and adult nudity.

Shares of Facebook fell 5 percent on Wednesday.

Facebook said the program, which learned from its collection of nude adult photos and photos of clothed children, has led to more removals. It makes exceptions for art and history, such as the Pulitzer Prize-winning photo of a naked girl fleeing a Vietnam War napalm attack.

The child grooming system evaluates factors such as how many people have blocked a particular user and whether that user quickly attempts to contact many children, Davis said.
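The article names only two signals the grooming system weighs: how many people have blocked a user, and how quickly that user tries to contact many children. The sketch below combines those two reported signals into a single score; the weights, the squashing step, and the name grooming_risk_score are invented for illustration and do not reflect Facebook's actual model.

    def grooming_risk_score(block_count, minors_contacted, window_hours,
                            block_weight=0.1, contact_weight=0.3):
        """Combine the two signals reported in the article into one score in [0, 1).
        The weights and the squashing function are assumptions, not Facebook's model."""
        contact_rate = minors_contacted / max(window_hours, 1)
        raw = block_weight * block_count + contact_weight * contact_rate
        return raw / (1.0 + raw)

    # Example: an account blocked by 12 people that messaged 20 minors in 24 hours.
    print(grooming_risk_score(block_count=12, minors_contacted=20, window_hours=24))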

Michelle DeLaune, chief operating officer at the National Center for Missing and Exploited Children (NCMEC), said the organization expects to receive about 16 million child porn tips worldwide this year from Facebook and other tech companies, up from 10 million last year.

With the increase, NCMEC said it is working with Facebook to develop software to decide which tips to assess first.

Still, DeLaune acknowledged that a crucial blind spot is encrypted chat apps and secretive “dark web” sites where much of new child pornography originates.

Encryption of messages on Facebook-owned WhatsApp, for example, prevents machine learning from analyzing them.

DeLaune said NCMEC would educate tech companies and “hope they use creativity” to address the issue.

(Reporting by Paresh Dave; Editing by Greg Mitchell)