Google

RTH10260
Posts: 7229
Joined: Mon Feb 22, 2021 10:16 am
Location: Switzerland, near the Alps
Verified: eurobot

Google

#1

Post by RTH10260 »

Revealed: Google illegally underpaid thousands of workers across dozens of countries
The documents show Google executives and attorneys at one point pursued a plan to come into compliance slowly and at the least possible cost to itself.


Julia Carrie Wong
Fri 10 Sep 2021 18.13 BST

Google has been illegally underpaying thousands of temporary workers in dozens of countries and delayed correcting the pay rates for more than two years as it attempted to cover up the problem, the Guardian can reveal.

Google executives have been aware since at least May 2019 that the company was failing to comply with local laws in the UK, Europe and Asia that mandate temporary workers be paid equal rates to full-time employees performing similar work, internal Google documents and emails reviewed by the Guardian show.

But rather than immediately correct the errors, the company dragged its feet for more than two years, the documents show, citing concern about the increased cost to departments that rely heavily on temporary workers, potential exposure to legal claims, and fear of negative press attention.

Google executives and attorneys at one point pursued a plan to come into compliance slowly and at the least possible cost to itself, despite acknowledging that such a move was not “the correct outcome from a compliance perspective” and could place the staffing companies it contracts with “in a difficult position, legally and ethically”.

Google admitted the failures and said it would conduct an investigation after being contacted by the Guardian.


https://www.theguardian.com/technology/ ... -documents


bill_g
Posts: 3199
Joined: Mon Feb 22, 2021 5:52 pm
Location: Portland OR
Occupation: Retired (kind of)
Verified: ✅ Very Fied 🐱‍🏍✨

Re: Google

#2

Post by bill_g »

This is where Gomer Pyle says "Sooprize Sooprize Soooprize".


RTH10260
Posts: 7229
Joined: Mon Feb 22, 2021 10:16 am
Location: Switzerland, near the Alps
Verified: eurobot

Re: Google

#3

Post by RTH10260 »

Google refuses to reinstate man’s account after he took medical images of son’s groin
Experts say case highlights well-known dangers of automated detection of child sexual abuse images

Johana Bhuiyan
Tue 23 Aug 2022 00.32 BST

Google has refused to reinstate a man’s account after it wrongly flagged medical images he took of his son’s groin as child sexual abuse material (CSAM), the New York Times first reported. Experts say it’s an inevitable pitfall of trying to apply a technological solution to a societal problem.

Experts have long warned about the limitations of automated child sexual abuse image detection systems, particularly as companies face regulatory and public pressure to help address the existence of sexual abuse material.

“These companies have access to a tremendously invasive amount of data about people’s lives. And still they don’t have the context of what people’s lives actually are,” said Daniel Kahn Gillmor, a senior staff technologist at the ACLU. “There’s all kinds of things where just the fact of your life is not as legible to these information giants.” He added that the use of these systems by tech companies that “act as proxies” for law enforcement puts people at risk of being “swept up” by “the power of the state.”

The man, only identified as Mark by the New York Times, took pictures of his son’s groin to send to a doctor after realizing it was inflamed. The doctor used that image to diagnose Mark’s son and prescribe antibiotics. When the photos were automatically uploaded to the cloud, Google’s system identified them as CSAM. Two days later, Mark’s Gmail and other Google accounts, including Google Fi, which provides his phone service, were disabled over “harmful content” that was “a severe violation of the company’s policies and might be illegal”, the Times reported, citing a message on his phone. He later found out that Google had flagged another video he had on his phone and that the San Francisco police department opened an investigation into him.

Mark was cleared of any criminal wrongdoing, but Google has said it will stand by its decision.

“We follow US law in defining what constitutes CSAM and use a combination of hash matching technology and artificial intelligence to identify it and remove it from our platforms,” said Christa Muldoon, a Google spokesperson.
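The "hash matching technology" Muldoon mentions can be illustrated in miniature. This is a hypothetical sketch, not Google's implementation: production systems use perceptual hashes (in the vein of PhotoDNA) that survive resizing and re-encoding, whereas the plain cryptographic hash below only catches byte-identical copies. The `KNOWN_HASHES` set and the sample byte strings are placeholders.

```python
import hashlib

# Placeholder standing in for a database of digests of known flagged images.
# A real system would hold perceptual hashes, not SHA-256 of raw bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's SHA-256 digest appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_known_image(b"known-flagged-image-bytes"))  # True: exact byte match
print(is_known_image(b"harmless-family-photo"))      # False: digest not in set
```

The limitation the sketch makes visible is the one the article turns on: a lookup like this only says "this file matches a known item", so novel images have to be judged by classifiers and human reviewers, which is where the false positives described below arise.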

Muldoon added that Google staffers who review CSAM were trained by medical experts to look for rashes or other issues. However, the reviewers were not medical experts themselves, and medical experts were not consulted when reviewing each case, she said.

That’s just one way these systems can cause harm, according to Gillmor. To address, for instance, any limitations algorithms might have in distinguishing between harmful sexual abuse images and medical images, companies often have a human in the loop. But those humans are themselves inherently limited in their expertise, and getting the proper context for each case requires further access to user data. Gillmor said it was a much more intrusive process that could still be an ineffective method of detecting CSAM.

“These systems can cause real problems for people,” he said. “And it’s not just that I don’t think that these systems can catch every case of child abuse, it’s that they have really terrible consequences in terms of false positives for people. People’s lives can be really upended by the machinery and the humans in the loop simply making a bad decision because they don’t have any reason to try to fix it.”

https://www.theguardian.com/technology/ ... nt-blocked


raison de arizona
Posts: 10601
Joined: Mon Feb 22, 2021 10:21 am
Location: Nothing, Arizona
Occupation: bit twiddler
Verified: ✔️ certifiable

Re: Google

#4

Post by raison de arizona »



"Take these seeds and put them in your pocket, so at least sunflowers will grow when you all die here."
sugar magnolia
Posts: 2257
Joined: Mon Feb 22, 2021 12:54 pm

Re: Google

#5

Post by sugar magnolia »

We still laugh about our friend getting into a screaming match with her Google Maps in the car when it kept trying to take us across the lake at City Park in NOLA instead of around it. If I never hear "recalculating" again, I'll be fine with that.


Volkonski
Posts: 8775
Joined: Mon Feb 22, 2021 11:06 am
Location: Texoma and North Fork of Long Island
Occupation: Retired mechanical engineer
Verified:

Re: Google

#6

Post by Volkonski »

Google Maps gives our North Fork address as being in the Hamlet of Laurel. In fact it is in Jamesport. However the directions it gives are correct.


“If everyone fought for their own convictions there would be no war.” ― Leo Tolstoy, War and Peace
AndyinPA
Posts: 5926
Joined: Mon Feb 22, 2021 10:42 am
Location: Pittsburgh
Verified:

Re: Google

#7

Post by AndyinPA »

Google Maps has a road running next to our property for 500 feet. It's never been there. It exists on paper only. The owner once thought about putting in a small development there, but it never happened.


"When enough people make false promises, words stop meaning anything. Then there are no more answers, only better and better lies." - Jon Snow, GOT