
Does 'rape culture' accurately describe (many) societies?

Curious if this question has been asked yet -

"This thing only exists to stimulate sexual arousal". Well, ... so what? How is that inherently a bad thing, absent any other factors like non-consent, etc. ?
Anything that feels good must necessarily be morally evil.
 
Do you think discussing slavery would help you arrive at a definition of the porn you want to ban? I really don't see how. Is there some question over whether or not the definition of porn should specifically include or exclude depictions or accounts of slavery? Maybe you could explain the connection between the definitions.

That doesn't sound like a problem with the definition of trafficking or slavery, but rather the distinctly different legal question of under what conditions a trafficked person is or is not culpable for the crimes they commit, which may or may not have been committed under duress.
I'll try to clarify.

Based on your:
I'm not the one proposing major new legislation that (justifiably or not) reduces citizens' rights and is Constitutionally questionable (likely unconstitutional) in my country.

it seems that you are saying a porn law would inevitably overreach and we would end up criminalising those who are not actually making porn. What I am saying is that, in the same way, the Modern Slavery Act 2015 (MSA) suffers from the issue of defining what is criminal and what is not - to the extent that, as I quoted:

"...you have one government body finding that a young person has been trafficked, while another insists that they are a perpetrator who should face the full force of the law."

Guilt will be dependent on how one defines 'compulsion' and 'duress' - just as guilt under a porn law will depend on how one defines what is and is not pornographic. As One Pump Court outlines regarding the application of section 45 of the MSA:

The Defence is tailored to the particular factual scenarios that can be commonly present in cases of human trafficking. This makes it more applicable, and therefore potentially easier in principle for defendants to rely upon, than the common law defence of duress (though it may often be the case that both defences are brought simultaneously).

For example, in trafficking cases, the defence of duress could potentially fall short where a traumatised victim is compelled to commit an act without the specific threat of immediate danger. One example of such cases are those that have been documented by The Guardian, whereby Nigerian victims of human trafficking are subjugated through the use of ‘voodoo’ or ‘juju’ rituals. These rituals operate to make a victim feel completely bound to and controlled by their trafficker and extremely afraid of being cursed.


So even though the common law defence of duress might fall short, Section 45 could still cover the defendant.

In the same way that we did not balk at updating the law on slavery (indeed, did not balk at passing the Slavery Abolition Act 1833), we shouldn't balk at a law against porn just because someone is pointing out potential hurdles.

The UK already bans porn - extreme porn. I expect we could find grey areas if we tried.

That depends entirely on what definition of porn you wish to codify into a legal ban. Please state that definition.
I'll go with:
The relationship between pornography use and harmful sexual attitudes and behaviours: literature review:
For the purpose of this review, ‘pornography’ is defined as ‘any media (including: internet, books, videos, magazines etc.) intended to sexually arouse consumers through the depiction of nudity or explicit sexual behaviour.’
Should it or should it not include "Let's you and me get it on tonight!" printed on a Valentine card along with pictures of hearts and flowers?
If that is expressing the thoughts of an individual to their partner, then I can't see that that would fall foul of a porn law.
Should it or should it not include "Let's you and me get it on tonight!" printed on a Valentine card along with pictures of handcuffs, a blindfold, a ball gag, and a riding crop?
Getting warmer - but without further details I would suspect not.
Should it or should it not include a stage production of Romeo and Juliet where the lead performers are costumed and made up to portray teenagers, including briefly appearing nude on stage, but not visibly engaging in any real or simulated sex act except kissing?
Not sure. What we do know, as I posted previously, is that American Beauty was never banned under the CPPA. You never responded to that fact.
Should it or should it not include vibrators as I defined them in my earlier post?
I guess that wouldn't violate the definition given.
Should it or should it not include a detailed illustration in a health textbook for sixth graders (11-12 year olds) showing how to insert and position a tampon?
No.
And don't forget, the question "should it include..." actually means "should the person or persons responsible for... have their freedom or property taken away by the apparatus of the state?" Because that's how laws actually work.
Indeed, yes.

Now list all the scenarios that might equally fall between the cracks regarding the MSA and tell us we should never have legislated.
 
No. There's a lot more that goes into the perceived age of the actor than just their acting. Makeup, lighting, wardrobe, the script... You may not have clocked Buffy as 16, but that doesn't automatically mean you clocked her at 20-something.
Is there an issue with a particular scene of BTVS (of a sexual nature) that we need to know about?
 
Hypothesis. If the Internet were "overrun" with such material, then we would see certain things.
  • You and I would be stumbling across it all the time, in our normal websurfing.
A reminder that the 'such material' in question is child abuse content. Was Nicholas Kristof wrong when he said that Pornhub was infested with such? A reminder they removed over 80% of their content.
  • Avoiding/blocking/reporting such material would be touted as a key feature of anti-virus, ad-blocking, and vpn products.
  • Half a dozen commentary YouTubers with over a million followers each would have each done a piece on it by now.
We don't see any of those things, ergo, the headline is full of ◊◊◊◊. I've seen more articles and commentary about the child-exploitive stuff that is overrunning large swaths of the Internet, than I have about whatever scourge the headline editor imagines has swamped us all.
Obviously a lot of this stuff won't be easily accessible.
 

In a detailed report released today by the Internet Watch Foundation, experts warned that legitimate generative AI software packages were being trained on real child sexual abuse images offline so they could then generate ‘realistic’ AI-generated child sexual abuse material (AI CSAM).
 
“Perpetrators can legally download everything they need to generate these images, then can produce as many images as they want – offline, with no opportunity for detection. Various tools exist for improving and editing generated images until they look exactly like the perpetrator wants,” the report noted.
 
A reminder that the 'such material' in question is child abuse content. Was Nicholas Kristof wrong when he said that Pornhub was infested with such? A reminder they removed over 80% of their content.
I never saw any. Never even saw any reporting in the media about such a thing. So, not "overrun".
Obviously a lot of this stuff won't be easily accessible.
Then "overrun" is the wrong word. I think you've fallen victim to fear mongering. Willingly?
 
“Perpetrators can legally download everything they need to generate these images, then can produce as many images as they want – offline, with no opportunity for detection. Various tools exist for improving and editing generated images until they look exactly like the perpetrator wants,” the report noted.
So? People have always been able to create the images they want in private. The new tech does not change the fundamental principle. Personally I do not consider the creation of artwork depicting fictional scenarios to be a matter for government regulation.

It's also a red herring, here. We already know you want to ban all pornography, with a broad definition. So don't waste our time and insult our intelligence, pretending your concern is much narrower.
 
So? People have always been able to create the images they want in private. The new tech does not change the fundamental principle. Personally I do not consider the creation of artwork depicting fictional scenarios to be a matter for government regulation.
No, they haven't always been able to create child porn videos in the way described. That you don't think we should regulate this is off-the-scale wrong.

Where did the money go? Did it go into GenAI to create these kinds of videos - or did it go into making sure they were never created?
It's also a red herring, here. We already know you want to ban all pornography, with a broad definition. So don't waste our time and insult our intelligence, pretending your concern is much narrower.
I have no idea what you are talking about.
 
  • Legalise porn
  • Porn becomes normalised
  • Develop GenAI, which learns from human-created data, including porn
  • GenAI will now create CSAM
 
