Actress Scarlett Johansson has been invited to testify before a House Technology Subcommittee regarding her recent accusations against OpenAI. Johansson claimed that a voice used in OpenAI’s ChatGPT was “eerily similar” to her own, despite not agreeing to any collaboration. Rep. Nancy Mace (R-SC), who chairs the subcommittee, sent the invitation, aiming to shed light on the broader implications of deepfake technology.
The controversy began when Johansson voiced her frustration over ChatGPT’s “Sky” voice, which she believes mimics her own. She said OpenAI CEO Sam Altman had previously asked her to lend her voice to the feature, an offer she declined. When a similar-sounding voice appeared in ChatGPT, her legal team contacted OpenAI.
OpenAI has refuted the claim, stating that the “Sky” voice was created with a different professional actress using her own natural speaking voice. Altman apologized for the lack of clear communication and announced a halt to the use of Sky’s voice out of respect for Johansson.
Rep. Mace’s office told The Daily Wire that Johansson is unable to attend the hearing scheduled for July 9 but may be available in October. Mace stressed the importance of Johansson’s perspective in the ongoing debate over deepfakes and technological advancement.
In her statement, Johansson emphasized the need for clarity and for legislation protecting individual identities and rights. “In a time when we are all grappling with deepfakes and the protection of our own likeness, our own work, our own identities, I believe these are questions that deserve absolute clarity,” she remarked, calling for transparency and appropriate safeguards against the misuse of technology.
This development comes as Congress sharpens its focus on the implications of deepfake technology and artificial intelligence. Johansson’s potential testimony could play a significant role in shaping future legislation aimed at protecting personal identities and setting ethical standards for AI.