
August 14, 2023

Big Brother News Watch

Biden Censors Battered — Expect an Epic Supreme Court Showdown + More

The Defender’s Big Brother NewsWatch brings you the latest headlines related to governments’ abuse of power, including attacks on democracy, civil liberties and use of mass surveillance. The views expressed in the excerpts from other news sources do not necessarily reflect the views of The Defender.


Biden Censors Battered — Expect an Epic Supreme Court Showdown

New York Post reported:

Federal judges hammered fresh nails into the coffin of the Biden censorship regime Thursday in New Orleans. The thrashing the administration received will likely set up an epic Supreme Court battle that could help redefine freedom for our era.

The Biden administration rushed to sway the appeals court to postpone enforcement of the injunction and then sought to redefine all its closed-door shenanigans as public service.

At least two of the three judges on last week’s panel will likely uphold all or part of the injunction against federal censorship. The Biden administration will probably speedily appeal the case to the Supreme Court, setting up an epic showdown.

If Team Biden can destroy freedom of speech by renaming censorship “content moderation,” what other freedoms will it destroy with rhetorical scams? If endless demands by the FBI and other agencies don’t amount to “coercion,” then it is folly to expect the feds to ever admit how they are decimating Americans’ rights and liberties.

The Kids Online Safety Act Isn’t All Right, Critics Say

Ars Technica reported:

Debate continues to rage over the federal Kids Online Safety Act (KOSA), which seeks to hold platforms liable for feeding harmful content to minors. KOSA is lawmakers’ answer to whistleblower Frances Haugen’s shocking revelations to Congress. In 2021, Haugen leaked documents and provided testimony alleging that Facebook knew that its platform was addictive and was harming teens — but blinded by its pursuit of profits, it chose to ignore the harms.

Sen. Richard Blumenthal (D-Conn.), who sponsored KOSA, was among the lawmakers stunned by Haugen’s testimony. He said in 2021 that Haugen had shown that “Facebook exploited teens using powerful algorithms that amplified their insecurities.” Haugen’s testimony, Blumenthal claimed, provided “powerful proof that Facebook knew its products were harming teenagers.”

But when Blumenthal introduced KOSA last year, the bill faced immediate and massive blowback from more than 90 organizations — including tech groups, digital rights advocates, legal experts, child safety organizations, and civil rights groups. These critics warned lawmakers of KOSA’s many flaws, but they were most concerned that the bill imposed a vague “duty of care” on platforms that was “effectively an instruction to employ broad content filtering to limit minors’ access to certain online content.”

The fear was that the duty of care provision would likely lead platforms to over-moderate and imprecisely filter content deemed controversial — things like information on LGBTQ+ issues, drug addiction, eating disorders, mental health issues, or escape from abusive situations.

Not all critics agree that recent changes to the bill go far enough to fix its biggest flaws. In fact, the bill’s staunchest critics told Ars that the legislation is incurably flawed — due to the barely changed duty of care provision — and that it still risks creating more harm than good for kids. These critics also warn that all Internet users could be harmed, as platforms would likely start to censor a wide range of protected speech and limit user privacy by age-gating the Internet.

A Huge Scam Targeting Kids With Roblox and Fortnite ‘Offers’ Has Been Hiding in Plain Sight

Wired reported:

Thousands of websites belonging to U.S. government agencies, leading universities, and professional organizations have been hijacked over the last half decade and used to push scammy offers and promotions, new research has found. Many of these scams are aimed at children and attempt to trick them into downloading apps or malware, or into submitting personal details in exchange for nonexistent rewards in Fortnite and Roblox.

For more than three years, security researcher Zach Edwards has been tracking these website hijackings and scams. He says the activity can be linked back to affiliate users of one advertising company. The U.S.-registered company acts as a service that sends web traffic to a range of online advertisers, allowing individuals to sign up and use its systems. However, on any given day, Edwards, a senior manager of threat insights at Human Security, uncovers scores of .gov, .org, and .edu domains being compromised.

The schemes and ways people make money are complex, but each of the websites is hijacked in a similar way. Vulnerabilities or weaknesses in a website’s backend, or its content management system, are exploited by attackers who upload malicious PDF files to the website. These documents, which Edwards calls “poison PDFs,” are designed to show up in search engines and promote “free Fortnite skins,” generators for Roblox’s in-game currency, or cheap streams of Barbie, Oppenheimer, and other popular films. The files are packed with words people may search for on these subjects.

When someone clicks the links in the poison PDFs, they can be pushed through multiple websites, which ultimately direct them to scam landing pages, says Edwards, who presented the findings at the Black Hat security conference in Las Vegas. There are “lots of landing pages that appear super targeted to children,” he says.

Mental Health & Social Media: What Message Prevails?

ZeroHedge reported:

Mental health is a topic never far from the minds of girls aged 11 to 15 in the United States, according to a recent study by Common Sense Media, an online parental guidance platform.

Nearly seven in ten girls reported having had exposure to helpful mental health content and information in real life each month. But on the flip side, just under half (45%) said they heard or saw harmful content about suicide or self-harm, while just under four in ten (38%) said the same of harmful content on eating disorders.

The report found that exposure to both topics is prevalent across TikTok, Instagram, YouTube, Snapchat and even messaging apps. More than one in three girls said they “hear or see things about suicide or self-harm that is upsetting to [them]” at least monthly across the platforms, and 15% of girls who use TikTok and Instagram said they come across this type of content daily. The figures are even higher for those with depressive symptoms.

Some 75% of the girls who reported moderate to severe depressive symptoms and who use Instagram said they come across harmful suicide-related content on the platform at least once a month. That is nearly three times the rate among girls without depressive symptoms (26%).

According to the report, a similar pattern emerges for TikTok users (69% of girls with moderate to severe depressive symptoms see the harmful content versus 27% of girls without depressive symptoms) as well as for the other platforms.

TikTok Is Letting People Shut Off Its Infamous Algorithm — and Think for Themselves

Wired reported:

TikTok recently announced that its users in the European Union will soon be able to switch off its infamously engaging content-selection algorithm. The EU’s Digital Services Act (DSA) is driving this change as part of the region’s broader effort to regulate AI and digital services in accordance with human rights and values.

TikTok’s algorithm learns from users’ interactions — how long they watch, what they like, when they share a video — to create a highly tailored and immersive experience that can shape their mental states, preferences, and behaviors without their full awareness or consent. An opt-out feature is a great step toward protecting cognitive liberty, the fundamental right to self-determination over our brains and mental experiences.

Rather than being confined to algorithmically curated For You pages and live feeds, users will be able to see trending videos in their region and language, or a “Following and Friends” feed that lists the creators they follow in chronological order. This prioritizes popular content in their region rather than content selected for its stickiness. The law also bans targeted advertisements to users between 13 and 17 years old and provides more information and reporting options to flag illegal or harmful content.

A well-structured plan requires a combination of regulations, incentives, and commercial redesigns focusing on cognitive liberty. Regulatory standards must govern user engagement models, information sharing, and data privacy. Strong legal safeguards must be in place against interfering with mental privacy and manipulation. Companies must be transparent about how the algorithms they’re deploying work, and have a duty to assess, disclose, and adopt safeguards against undue influence.

Government Targeting U.K. Minorities With Social Media Ads Despite Facebook Ban

The Guardian reported:

Government agencies and police forces are using hyper-targeted social media adverts to push messages about migration, jobs and crime to minority groups. Many of the ads are targeted using data linked to protected characteristics including race, religious beliefs and sexual orientation. Stereotypes about interests and traits such as music taste and hair type are also widely used.

In one case, a government campaign aimed at helping young people off benefits was targeted at Facebook users with interests including “afro-textured hair” and the “West Indies cricket team”.

The “microtargeting” is revealed in an analysis of more than 12,000 ads that ran on Facebook and Instagram between late 2020 and 2023. Supplied to U.K. academics by Facebook’s parent company Meta, and shared with the Observer, the data gives an insight into the use of targeted advertising by the state based on profiling by the world’s biggest social media company.

In 2021, Facebook announced a ban on targeting based on race, religion and sexual orientation amid concerns about discrimination, which led to the removal of several interest categories that had been used by advertisers to reach and exclude minority groups. But the latest analysis suggests interest labels assigned by Facebook based on web browsing and social media activity are routinely used as a proxy.

New Zealand, Whose Pandemic Response Was Closely Watched, Removes Last of COVID Restrictions

Associated Press reported:

New Zealand on Monday removed the last of its remaining COVID-19 restrictions, marking the end of a government response to the pandemic that was watched closely around the world.

Prime Minister Chris Hipkins said the requirement to wear masks in hospitals and other healthcare facilities would end at midnight, as would a requirement for people who caught the virus to isolate themselves for seven days.

Reflecting on the government’s response to the virus over more than three years, Hipkins said that during the height of the pandemic he had longed for the day he could end all restrictions, but now it felt anticlimactic.
