Some of the image generators have attempted to put up guardrails to prevent generating sexual images of children, but their creators and operators haven’t been able to eliminate it. There was also a Stanford University investigation showing that the training data behind most of the really popular image generators contained a non-trivial amount of CSAM, and that the models could be fairly easily manipulated into producing more.
The creators and operators of these generative “AIs” have done little to nothing in the way of curation, and have routinely tried to fob responsibility off onto their users, much the way Tesla has with its “Full Self-Driving.”
Normally, I would agree with you, but this summary, which omits the explicit description of the torture, is a lot easier to read.