This is kinda old information, but my understanding was that there were three issues with daisy-chained UPSes.
The first is that you’re potentially going to cause a ground loop, which is not healthy for the life of anything plugged into those UPSes.
The second is the potential for voltage droop going from the first UPS to the second, which means the downstream UPS will flap on and off battery constantly and screw its batteries up, though I’d be shocked if that was still true for modern high-quality units.
And the third: the first UPS won’t be outputting a proper sine wave when it’s on battery (cheaper units put out a stepped ‘modified sine wave’ approximation), which means the second UPS in the chain will freak out and treat it as bad power (though again, maybe modern ones don’t have that limitation).
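If you want to see why the waveform thing matters, here’s a rough numeric sketch. The waveform shape and the acceptance threshold are made-up assumptions for illustration, not any vendor’s actual firmware logic; the point is just that a stepped ‘modified sine’ can match utility power on RMS voltage while being wildly distorted harmonically, which is exactly the kind of thing a picky downstream UPS’s input check trips on.

```python
# Illustrative only: compare a clean utility sine against a crude stepped
# "modified sine" UPS output. Thresholds and shapes are assumptions.
import numpy as np

def waveform_stats(v, fs, f0=60.0):
    """Return RMS and total harmonic distortion (THD) of a sampled waveform."""
    rms = np.sqrt(np.mean(v ** 2))
    spectrum = np.abs(np.fft.rfft(v)) / len(v)
    freqs = np.fft.rfftfreq(len(v), d=1.0 / fs)
    fund = spectrum[np.argmin(np.abs(freqs - f0))]  # fundamental (60 Hz) bin
    harmonics = [spectrum[np.argmin(np.abs(freqs - f0 * k))] for k in range(2, 20)]
    thd = np.sqrt(np.sum(np.square(harmonics))) / fund
    return rms, thd

fs = 60 * 256                    # 256 samples per 60 Hz cycle
t = np.arange(0, 0.5, 1.0 / fs)  # half a second of waveform

pure = 170.0 * np.sin(2 * np.pi * 60 * t)  # ~120 Vrms utility sine

# Crude stepped wave: +V, 0, -V, 0 quarters; amplitude chosen for ~120 Vrms.
phase = (60 * t) % 1.0
stepped = 170.0 * np.where(phase < 0.25, 1.0,
          np.where(phase < 0.5, 0.0, np.where(phase < 0.75, -1.0, 0.0)))

for name, v in [("utility sine", pure), ("modified-sine UPS", stepped)]:
    rms, thd = waveform_stats(v, fs)
    # Hypothetical acceptance rule; real line-interactive UPSes are often
    # pickier, which is why they drop to battery on stepped input.
    ok = thd < 0.08
    print(f"{name}: RMS={rms:.1f} V, THD={thd:.1%}, accepted={ok}")
```

Both waveforms come out around 120 Vrms, but the stepped one shows THD in the tens of percent, so the hypothetical check rejects it even though a dumb voltmeter would call them identical.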
Yeah, I think you’ve made a mistake in thinking that this is going to be usable for generative AI.
I’d bet $5 this is just a fancy machine learning algorithm that takes a submitted image, does machine learning nonsense with it, and returns a ‘there is a high probability this is an illicit image of a child’, and not something you could actually use to generate CSAM.
You want something capable of assessing the similarity between a submitted image and a group of known-bad images, but that doesn’t mean the dataset is in any way usable for anything other than that one specific task. AI/ML covers use cases far broader than generation, and classifiers like this were a thing for decades before the whole ‘AI == generative AI’ framing became what everyone thinks of.
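For a sense of the shape of the thing: PhotoDNA’s actual algorithm is proprietary and different, but here’s a generic perceptual-hash sketch (a simple average hash) showing why a match-only system is match-only. The hash is a lossy fingerprint you can compare against a database, but you can’t reverse it back into an image; the hash value and threshold below are placeholders, not real data.

```python
# Generic perceptual-hash sketch, NOT PhotoDNA's actual algorithm.
# Requires Pillow: pip install Pillow
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale, then set one bit per pixel above the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits  # 64-bit fingerprint; the original image detail is unrecoverable

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical workflow: the database stores only fingerprints like these,
# never the source images. This value is a placeholder, not a real hash.
KNOWN_BAD_HASHES = {0x9F3A00C47721B0E5}

def looks_like_known_bad(path: str, threshold: int = 10) -> bool:
    h = average_hash(path)
    return any(hamming(h, bad) <= threshold for bad in KNOWN_BAD_HASHES)
```

Even with the whole database in hand, all an attacker would have is a pile of fingerprints that can answer ‘does this match?’, not training data you could generate anything from.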
But, in any case: the PhotoDNA database is in one place, and access to it is gated by the merit of, uh, lots of money?
And of course, any ‘unscrupulous engineer’ who might have plans for doing anything with this is probably not a complete idiot, even if a pedo. These systems are going to have shockingly good access controls and logging, and if you’re in the US and the dude takes this database and generates a couple of CSAM images with it, the penalty for most people is spending the rest of their life in prison.
Feds don’t fuck around with creation or distribution charges.