The Defender Children’s Health Defense News and Views
October 5, 2022

Big Brother News Watch

Head Start Educator Says She Went ‘Through Hell’ Fighting to Keep Job Due to State, Federal Vax Mandates + More

The Defender’s Big Brother NewsWatch brings you the latest headlines related to governments’ abuse of power, including attacks on democracy, civil liberties and use of mass surveillance.

Head Start Educator Says She Went ‘Through Hell’ Fighting to Keep Job Due to State, Federal Vax Mandates

Fox News reported:

The administrator of a Head Start program in Washington state said she went “through hell” fighting to keep her job in the face of state and federal vaccine mandates for program staffers — which still remain even as the rest of the country rolls back pandemic restrictions.

Sarah Werling, a Head Start center manager in Friday Harbor, Washington, told Fox News Digital that due to state regulations, she was barred from the classroom until this fall when Gov. Jay Inslee lifted the state’s coronavirus emergency.

Last October, she was initially granted a religious exemption by her program but was told she would need to be reassigned from her job. Werling said she pushed back against that decision before being allowed to keep her job, with an accommodation that barred her from coming into contact with “children, families and staff” — essentially exiling her from the classroom.

However, even now with Washington’s regulations lifted, Werling says she is forced to get tested for COVID-19 once per week and must wear a mask when in the classroom, due to Head Start’s rules for religious exemptions. The testing requirement is consistent with guidelines laid out in the HHS interim final rule for religious exemptions.

Doctors Sue California Over Law That Restricts Their COVID Advice to Patients

The Washington Times reported:

Two California doctors have filed a federal lawsuit to overturn a new state law restricting the advice they can give patients about COVID-19.

Gov. Gavin Newsom, a Democrat, signed Assembly Bill 2098 into law on Friday. It authorizes the Medical Board of California to levy professional sanctions against and revoke the licenses of doctors who share with patients “misinformation” that challenges the scientific consensus about COVID-19.

Filed Tuesday in the U.S. District Court for the Central District of California, the lawsuit claims the law violates doctors’ freedom of speech and the spirit of scientific inquiry. It names the 12 members of the state medical board and state Attorney General Robert Bonta, a Democrat, as defendants.

“AB 2098 intrudes into the privacy of the doctor-patient relationship, replacing the medical judgment of the government for that of the licensed professional, and chilling the speech of those who dissent from the official view,” the complaint states.

‘Our Hands Are Tied’: Bend-La Pine School Board Fires Three Teachers Who Defied State’s COVID Vaccine Mandate

KTVZ News 21 reported:

Before a crowd, some holding signs like “Rehire Don’t Fire” and “Jobs Not Jabs,” the Bend-La Pine School Board unanimously accepted Superintendent Steve Cook’s recommendations and fired three teachers Tuesday evening for failing to meet a state requirement to receive a COVID-19 vaccination or sign a religious or medical exception form.

Supporters of the three teachers rallied outside the school district Administration Building during the board’s closed-door executive session just before the hearings.

Supporter Adin Hess explained why he came out to support the teachers. “They were mandated in order to have a job,” he said. “So it’s unfair, and we see that there needs to be justice for them. We’re here in support of them.”

Former Mountain View High teacher and freshman football coach Mark Schulz, a 25-year member of the Cougars coaching staff, along with La Pine Middle School teacher Zachary Webb and Ensworth Elementary kindergarten teacher Kelly Lundy, each gave statements to explain why, from their moral and religious perspectives, the school district and board had a choice — and were making the wrong one.

Challenges to Vaccine Mandate at CU Medical Campus Narrowed

The Gazette reported:

A federal judge has taken a sprawling lawsuit from 17 anonymous staff and students at the University of Colorado’s Anschutz Medical Campus and pared down the number of legal claims that may go forward challenging the school’s COVID-19 vaccination policy.

U.S. District Court Judge Raymond P. Moore previously declined to block the medical campus’ vaccine mandate, indicating it did not appear to burden the plaintiffs’ religious exercise in an unconstitutional manner. Now, Moore has reiterated that the university acted reasonably by requiring COVID-19 vaccinations but has nevertheless permitted some plaintiffs’ allegations to proceed against only some CU defendants.

The primary allegations of the unnamed staff and students remain intact — that the school’s September 2021 policy for obtaining religious exemptions to vaccination violates the religious freedom provisions of the state and federal constitutions.

The case is in an unusual stage, as Moore has refused to pause the proceedings while the Denver-based federal appeals court reviews his previous rulings denying the plaintiffs a preliminary injunction. The U.S. Court of Appeals for the 10th Circuit heard arguments in late September about that issue. Lawyers for the plaintiffs said a decision from the appellate court could affect Moore’s dismissal order.

How a British Teen’s Death Changed Social Media

Wired reported:

Last week, the senior coroner for north London, Andrew Walker, concluded that it was not right to say the British teenager Molly Russell died by suicide and said that posts on Instagram and Pinterest contributed to her death. “She died from an act of self-harm while suffering from depression and the negative effects of online content,” Walker said.

More children than Molly are exposed to disturbing content online. Almost two-thirds of British children aged 3-15 use social media, and one-third of online children aged 8-15 have seen worrying or upsetting content online in the past 12 months, according to a 2022 report by British media regulator Ofcom. Child protection campaigners say posts showing self-harm are still available, even if they are now harder to find than in 2017.

But Molly’s case is believed to be the first time social media companies have been required to take part in legal proceedings that linked their services to the death of a child. The platforms were found to have hosted content that glamorized self-harm and promoted keeping feelings about depression secret, says Merry Varney, solicitor at Leigh Day, the law firm representing the Russell family. Those findings “captured all the elements of why this material is so harmful,” she adds.

Pinterest and Instagram have tweaked the way their platforms work since Molly’s death, but child safety campaigners hope that looming U.K. regulation will bring more radical change. The inquest has renewed pressure on the new British government to introduce the long-awaited Online Safety Bill, which culture minister Michelle Donelan promised to bring back to Parliament before Christmas this year. “We need the bill to be brought forward as quickly as possible now,” says Hannah Rüschen, senior policy and public affairs officer at British child protection charity, NSPCC.

Elon Musk’s Plans for Twitter May Take Inspiration From Chinese Super Apps

CNBC reported:

Elon Musk’s revived $44 billion deal to buy Twitter sparked fresh debate over what the billionaire will do with the service if he eventually owns it. On Tuesday, Musk tweeted that buying Twitter is an “accelerant to creating X, the everything app.” He did not provide further details.

Musk may be hinting toward so-called “super apps” which are popular in China and other parts of Asia and pioneered by the likes of Chinese technology giant Tencent.

“Super app” is a term for an app that acts as a one-stop shop for all your mobile needs. For example, you might order a taxi or food via the app while also making payments and sending messages, eliminating the need for multiple apps for different functions.

Chinese app WeChat, run by Tencent, is the biggest super app in the world, with over a billion users.

Kansas City Police Look to Amazon Alexa for Clues Into Double Homicide

New York Post reported:

Kansas City police are seeking access to an Amazon Alexa they hope can help their investigation into the slayings of two medical researchers found dead over the weekend.

Detectives with the police department filed a search warrant in Jackson County for access to the device’s cloud storage, according to a report obtained by KSHB.

The investigators hope the Alexa contains clues as to who fatally shot researchers Camila Behrensen, 24, and Pablo Guzman-Palma, 25, before setting fire to the apartment they were found in Saturday morning.

The Alexa may have captured conversations between the killer and the victims before the shooting, the filing says.

The News Media’s Credibility Is at an All-Time Low: Why Businesses Should Care

Newsweek reported:

The credibility of the news media in the U.S. is just above its all-time low. In fact, of U.S. adult respondents to a Gallup poll, 29% said they had little trust in news media reporting and 34% had “none at all.” This should alarm everyone — but especially businesses.

We’ve certainly witnessed more news media outlets across the board become less neutral and more slanted in their reporting in recent years, which isn’t helping the cause. At times, modern news stories are nearly indistinguishable from opinion and commentary pieces.

What’s more, Americans have different opinions on which outlets are misrepresenting facts or fabricating information based on their political biases and worldviews. Some believe that the mainstream media is fake news and the alternative media is telling the truth, whereas others believe that alternative news is fake news and the mainstream media is telling the truth.

As a result, many are trying to fight fake news through increased censorship measures designed to combat disinformation by flagging potential “fake” stories and making it easier for users to report them. Although these may seem like quick solutions, they could actually further damage the credibility of the media among those who already fear the mainstream media is biased or fake.

China Bans Residents From Leaving Xinjiang, Just Weeks After Its Last COVID Lockdown

CNN World reported:

China has banned residents from leaving Xinjiang over a COVID-19 outbreak — just weeks after the far-western region began relaxing restrictions from a stringent extended lockdown, fueling public frustration among those scarred by food shortages and plunging incomes.

On Tuesday, the region — home to 22 million people, many belonging to ethnic minorities — reported 38 new asymptomatic COVID cases.

It was enough to alarm officials, with Xinjiang’s Vice Chairman Liu Sushe vowing to “strengthen the control of cross-regional personnel and insist that people do not leave the region unless it is necessary.”

Liu added that Xinjiang will strengthen control measures in airports, train stations and checkpoints to prevent the virus from spreading to other parts of the country. All outbound trains, inter-provincial buses and most flights will be suspended until further notice.

To Prevent Unnecessary Biopsies, Scientists Train an AI Model to Predict Breast Cancer Risk From MRI Scans

STAT News reported:

A biopsy that turns out to have benign results can be a relief. But in some cases, it could also mean a patient whose risk of cancer was low from the start has gone through an unnecessarily invasive procedure.

By and large, radiologists recommend that patients whose breast MRI scans raise suspicion of a cancerous growth get a biopsy done. But MRIs often pick up on benign lesions that other mammograms and ultrasounds may not. This leads to some patients having their lesions falsely classified as higher risk than they are, and undergoing a biopsy.

In a new paper published recently in Science Translational Medicine, Jan Witowski, a postdoctoral research fellow at New York University Langone Health, and his colleagues at NYU and Jagiellonian University in Poland present an artificial intelligence tool that can predict the probability of breast cancer in MRI scans as well as a panel of board-certified radiologists.

In a retrospective analysis, it was also capable of reducing unnecessary biopsies by up to 20% for patients whose MRIs show suspicious lesions that might warrant a biopsy, officially known as BI-RADS category 4 lesions.

Spotify Acquires Content Moderation Tech Company Kinzen to Address Platform Safety Issues

TechCrunch reported:

Spotify this morning announced it’s acquiring Dublin, Ireland-based content moderation tech company Kinzen, which had been working in partnership with the streamer since 2020. Deal terms were not disclosed. At Spotify, Kinzen’s technology will be put to use to help the company better moderate podcasts and other audio using a combination of machine learning and human expertise — the latter of which includes analysis from local academics and journalists, the company says.

Founded in 2017 by Áine Kerr, Mark Little and Paul Watson, Kinzen’s mission has focused on protecting public conversations from “dangerous misinformation and harmful content,” according to its website. This is an area Spotify has had direct experience with due to the controversy over its top podcaster, Joe Rogan, who spread COVID-19 vaccine-related misinformation on his show, leading to a public backlash and PR nightmare for the company.

At one point, 270 physicians and scientists signed an open letter to Spotify demanding that it create misinformation policies to address the matter. The hashtag #deletespotify was trending, and high-profile artists like Neil Young and Joni Mitchell pulled their music from the service in protest.

Spotify later revised its policies around COVID-19 and misinformation in early 2022, though critics and experts argued the actual changes fell short of making a sizable impact. This June, Spotify took another step toward getting a better handle on the content published to its platform with the creation of a “Safety Advisory Council,” whose job is to help guide Spotify’s future content moderation decisions.
