Microsoft Worker Warns AI Tool Creates ‘Sexually Objectified’ Images, Urges FTC and Lawmakers to Take Action
REDMOND, Washington – A Microsoft software engineer has raised concerns that the tech giant’s AI image generation tool, Copilot Designer, can create harmful and offensive content. Shane Jones says he discovered a vulnerability in OpenAI’s DALL-E model, which is integrated into Microsoft’s AI tools, that allows the generation of abusive and violent images. In a letter addressed to Microsoft’s board, lawmakers, and the Federal Trade Commission, Jones criticized the company for not taking sufficient measures to …