carpelbridgesyndrome@sh.itjust.works to Technology@lemmy.world • "Somebody managed to coax the Gab AI chatbot to reveal its prompt" (English)
7 months ago: Based on the comments, it appears the prompt doesn't even fully work. It mainly seems to be something to laugh at while despairing over the writer's nonexistent command of logic.
Probably wiping process control code from the systems that contain tons of fiddly, hard-to-find constants and other information.