• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems, let me know.

Cont: Musk buys Twitter II

I like the way the walls are decorated with all the company's achievements.
From what I've been told, one of the first things Elon Musk did at the Twitter headquarters was tear out all the creature comforts. They work 20 hours a day in a soulless pit of despair so that the richest man on the planet has a chance to become the first trillionaire.
 
The latest iteration of Grok has been producing sexualised images of children when prompted.

Link to conversation

Statement by the UK's Ofcom:

We are aware of serious concerns raised about a feature on Grok on X that produces undressed images of people and sexualised images of children.

We have made urgent contact with X and xAI to understand what steps they have taken to comply with their legal duties to protect users in the UK. Based on their response we will undertake a swift assessment to determine whether there are potential compliance issues that warrant investigation.
 
Elon Musk's CSAM-generating machine apologizes for generating CSAM.

In a related story, this morning my toaster apologized to me for burning my toast.
Yup. X is providing a service through Grok, and X didn't see fit to prevent illegal use. Worse, when notified of the issue, at best they treated it about as seriously as a minor usability bug, not a safety issue. X could have stopped it as soon as it was notified. Instead, they asked people not to do bad things. Which, I guess, is sufficient to show that they were knowingly allowing illegal activity.

Even if they were unknowing, that would still be a failure of the system. I'm a fan of the naval presumption that the captain is responsible if their ship runs aground, even if they are asleep at the time.
 

Pedocon Theory remains undefeated.
 
I'm sure that if I offered a "print anything service" and someone asked me to print forged £50 notes, I'd be able to get away with it on the basis that someone else asked me to do it.
 

"There are no restrictions on fictional adult sexual content with dark …"

Well, if you're in the UK you should delete X from all your devices because you are ONE received tweet or DM away from committing a strict liability offense under Section 63 of the Criminal Justice and Immigration Act (2008) (as amended 2015), carrying a 2-3 year prison sentence.

(◊◊◊◊◊◊◊ American techbro morons think their local laws apply everywhere.)
 
