• 0 Posts
  • 59 Comments
Joined 1 year ago
Cake day: June 30th, 2023

  • Problem with one line of data? Better shut down the airspace.

    Amazing this hadn’t happened before with a strategy like that.

    Also, duplicate waypoint identifiers are allowed, just not in the same region. But exit points also don’t have to be explicitly indicated; the system will just search for the nearest waypoint in another region (a sketch of that failure mode follows this comment).

    Sounds like the whole thing was a needlessly hacky, messy standard. I’ve dealt with quite a few of those, but tolerating one in air traffic control? Good grief…
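
    To illustrate why that kind of name matching is fragile, here’s a minimal Python sketch. Everything in it is hypothetical (the identifier, regions, coordinates, and both lookup functions are stand-ins, not the actual system’s code); it just shows how “take the first waypoint matching the name” can silently pick the wrong one once duplicates are allowed across regions.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Waypoint:
        ident: str    # short route code; NOT globally unique
        region: str
        lat: float
        lon: float

    # Hypothetical data: two unrelated waypoints sharing the code "WPXYZ",
    # which is legal as long as they sit in different regions.
    WAYPOINTS = [
        Waypoint("WPXYZ", "REGION_A", 49.35, 0.17),
        Waypoint("WPXYZ", "REGION_B", 48.11, -98.91),
    ]

    def naive_exit_lookup(ident: str) -> Waypoint:
        """The failure mode: with no explicit exit point, take the first
        waypoint anywhere that matches the name. With duplicates allowed
        across regions, this can pick a point thousands of miles off."""
        return next(wp for wp in WAYPOINTS if wp.ident == ident)

    def safer_exit_lookup(ident: str, region: str) -> Waypoint:
        """Disambiguate by region and fail loudly instead of guessing."""
        matches = [wp for wp in WAYPOINTS
                   if wp.ident == ident and wp.region == region]
        if len(matches) != 1:
            raise LookupError(f"{len(matches)} candidates for {ident!r} "
                              f"in {region!r}; refusing to guess")
        return matches[0]

    print(naive_exit_lookup("WPXYZ").region)              # REGION_A, always
    print(safer_exit_lookup("WPXYZ", "REGION_B").region)  # REGION_B
    ```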





  • Relevant part of the Coroners and Justice Act 2009 (UK)

    Section 65 (regarding what “child” means in the context of indecent images):

    (6) Where an image shows a person the image is to be treated as an image of a child if—

    (a) the impression conveyed by the image is that the person shown is a child, or

    (b) the predominant impression conveyed is that the person shown is a child despite the fact that some of the physical characteristics shown are not those of a child.

    (end quote)

    In other words, an image can be treated as an indecent image of a child if the “impression conveyed” is that the person is under 18, even if that person has older “physical characteristics”.

    This legislation is more directed at non-photographic imagery (hentai, CGI, etc.), and the reference to physical characteristics is apparently about large breasts or “1000-year-old vampire teeth” not being viable as an excuse that the image doesn’t give the impression of a child.

    I can’t recall specifically what legislation was used regarding the age-play couple I referenced. I can’t find a specific law that says it’s wrong for a photograph of an adult to appear underage, so it may just be that they were reported to the police because they shared their images online without context. I don’t know if they were subsequently prosecuted.





  • Bound to be tested in court sooner or later. As far as I understand it, one is “in possession” if they have access to a set of steps or procedures that would recover an image. So this prevents offenders from hiding behind the fact that their images were compressed in a zip file or something: they don’t have a literal offending image, but they possess it in a form that they can transform (see the sketch after this comment).

    What would need to be tested is that AI generators are coming up with novel images rather than retrieving existing ones. It seems like common sense but the law is quite pedantic. The more significant issue is that generators don’t need to be trained on csem to come up with it. So proving someone had it with the intent of producing it would always be hard. Even generators trained on illegal material I’m not sure it would be straight forward to prove that someone knew what it was capable of.