• 1 Post
• 378 Comments
Joined 1 year ago
Cake day: August 22nd, 2023

  • Man, they could have made the letter something that would persuade people about the importance of ideas and how no nation is a monolith, but they just couldn’t help but make it a blatant “Israel is right” letter.

    “We continue to be shocked and disappointed to see members of the literary community harass and ostracise their colleagues because they don’t share a one-sided narrative in response to the greatest massacre of Jews since the Holocaust.

    “Israel is fighting existential wars against Hamas and Hezbollah…”

    Someone here is obfuscating reality, and it’s not the boycotters. These people are insane.


  • Absolutely agree. My comment above was focused on whether a minimal amount of CSEM in the training data would, on its own, cause similar images to appear when just prompting for porn. But there are a few mechanics that likely bias a model toward generating young-looking faces in porn, and with intentional prompt crafting I have no doubt you could at least get an approximation of it.

    I’m glad to hear about the models that intentionally separate adult content from images of children. That’s a good idea. There’s not much reason an adult-focused model needs to be mixed with much other data; there’s already so much porn out there. Maybe if you want to tune something unrelated to the naked parts (like the background), or you want some mundane activity performed naked, but neither of those use cases needs kids in the data.


  • I have not personally explored AI porn, but as someone with experience in machine learning and its accidental biases, that’s not very surprising to me.

    On top of the general societal bias towards youth in “beauty”-related roles, smoother and less-featured faces (which in general look younger) are closer to an average face, so defaulting to that gets a bit of a training boost (when in doubt, target the mean). It’s probably also not helped by youth-related porn keywords (teen, daughter, young) that further associate other porn prompts (even ones not about youth) with non-porn images of underage girls that also carry those keywords.
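
    For what it’s worth, “target the mean” is easy to demonstrate: under a squared-error style objective, the best constant output a model can fall back on is the average of its training data. A minimal sketch with made-up numbers (assuming MSE stands in for the real training loss; nothing here comes from any actual model):

    ```python
    # Toy demo: the constant prediction that minimizes mean squared error
    # is exactly the data mean: the "safe" default a model regresses toward.
    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical scalar feature (e.g. apparent age) across a dataset.
    ages = rng.normal(loc=35.0, scale=12.0, size=10_000)

    # Evaluate the MSE of every candidate constant prediction on a grid.
    candidates = np.linspace(ages.min(), ages.max(), 501)
    mse = [np.mean((ages - c) ** 2) for c in candidates]

    best = candidates[int(np.argmin(mse))]
    print(f"data mean:          {ages.mean():.2f}")
    print(f"MSE-optimal output: {best:.2f}")  # lands on the mean
    ```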


  • I assume any CSEM ingested into these models is absolutely swamped by the massive amount of adult porn that’s much more easily available. A handful of images isn’t going to drive model output in datasets at the scale of the image-generation models’. Maybe there are keywords that drill down to be more strongly associated with the child porn, but a lot of “young”-type keywords are already plentifully applied to adults, and I imagine accidentally ingested child porn is much less likely to be as conveniently labeled.

    So maybe you can figure out how to get it to produce child porn, but it probably won’t just randomly produce it for an innocent porn prompt.


  • I don’t trust it because there’s no believable plan to make it commercially viable, so it’s just going to end up defunct or enshittified. Mastodon is up front about it: it’s a volunteer service, and you can either pay for your instance or roll the dice on it staying up. And there’s a built-in way to move on when one goes down.

    BlueSky is a B-corp, which theoretically means they can argue in court that their mission takes priority if an investor sues them, but it doesn’t in any way require them to make that mission the primary goal, and the realities of funding, money, and investors mean that’s almost certainly not going to happen.