Microsoft AI Engineer Warns Copilot Designer Can Generate Violent and Sexual Imagery Accessible to Kids
Seattle, Washington – A Microsoft AI engineer has raised concerns about the company’s AI text-to-image generator, Copilot Designer, warning that it can produce violent and sexual imagery. Despite repeated warnings from the engineer, Shane Jones, Microsoft has allegedly failed to address the issue. Jones, who took part in red-teaming efforts to test the tool for vulnerabilities, claims the company neither implemented safeguards nor disclosed the mature content rating …