Microsoft addresses issues with its AI image generation tool
Microsoft is addressing issues with its AI image generation tool, Copilot Designer, following revelations by company engineer Shane Jones. Jones, who is responsible for testing the tool's safety, discovered that it could produce disturbing content, including violent scenes involving teenagers, sexualized imagery, and other sensitive material. As previously reported, the tool also ignored copyright, generating inappropriate images of Disney characters.
Jones reported these issues internally in December 2023. Although Microsoft acknowledged the problems, it did not take the tool offline. Jones therefore escalated the matter, reaching out to OpenAI and U.S. senators and sending letters to the FTC and Microsoft's board.
In response, Microsoft has taken initial steps: it has blocked specific prompts, begun displaying warnings to users who violate its policies, and is working to improve the tool's safety filters.
The incident highlights the challenges of AI image generation: however powerful the technology, it requires robust safeguards. It also raises concerns about internal communication at major tech companies and how quickly they respond to ethical issues.