Salamendacious@lemmy.world to News@lemmy.world · 1 year ago
Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data (venturebeat.com)
119 comments · cross-posted to: [email protected], [email protected], [email protected]
ubermeisters@lemmy.world · edited · 1 year ago
deleted by creator
Asifall@lemmy.world · 1 year ago
I don’t think the idea is to protect specific images; it’s to create enough of these poisoned images that training a model on random free images pulled off the internet becomes risky.
SCB@lemmy.world · 1 year ago
Which, honestly, should be criminal.
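For readers curious how the kind of poisoning described above works in principle, here is a minimal, hypothetical sketch. It is not Nightshade’s actual implementation; it only illustrates the general adversarial-perturbation idea: nudge an image’s pixels, within a small budget, so a feature extractor embeds it like a different concept. The choice of ResNet-18 as a stand-in encoder, the `poison` function, and all parameter values are assumptions made for illustration.

```python
# Hedged sketch of concept-level data poisoning. NOT Nightshade's real
# algorithm; ResNet-18 merely stands in for whatever encoder a
# text-to-image model uses, and all parameters are illustrative.
import torch
import torchvision.models as models

encoder = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
encoder.fc = torch.nn.Identity()  # use penultimate features as the "embedding"
encoder.eval()

def poison(image, target_image, epsilon=8 / 255, steps=100, lr=0.01):
    """Nudge `image` so its embedding drifts toward `target_image`'s,
    while keeping every pixel change within +/- epsilon."""
    with torch.no_grad():
        target_emb = encoder(target_image.unsqueeze(0))
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        emb = encoder((image + delta).clamp(0, 1).unsqueeze(0))
        loss = torch.nn.functional.mse_loss(emb, target_emb)
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # keep the edit visually small
    return (image + delta).clamp(0, 1).detach()

# Usage: a "dog" photo whose features resemble a "cat" photo. Placeholder
# random tensors here; real use would load and normalize actual images.
dog = torch.rand(3, 224, 224)
cat = torch.rand(3, 224, 224)
poisoned_dog = poison(dog, cat)
```

At scale, enough such images in a scraped dataset can skew what a model learns about the targeted concept, which is exactly the risk to indiscriminate scraping that the comment above describes.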