Microsoft is blocking prompts that make Copilot generate violent and sexual images
Microsoft Copilot's Designer AI image creator was reportedly generating violent and harmful images from prompts like 'pro life' and 'pro choice', but the tech giant is now blocking them.
Microsoft is starting to block prompts that led Copilot's Designer AI image creator to generate violent and sexually inappropriate images. The news comes a week after an engineer working on AI rais… [+1369 chars]