In Short:
Generative AI technology is being used in political campaigns, with tools from companies like ElevenLabs and OpenAI involved in creating deepfakes and chatbots for politicians. While some companies, including Midjourney and OpenAI, have placed restrictions on using their tools for political purposes, enforcement is lacking. In Indonesia, an app called Pemilu used ChatGPT to generate campaign speeches in various local languages, tailored to different demographics, during elections.
Generative AI and Political Misuse
Leah Feiger: Sometimes you are able to link it back to specific companies-
Vittoria Elliott: Yes.
Leah Feiger: That are doing the generative AI itself.
Vittoria Elliott: Yeah, totally. For instance, there was a deepfake made of the former Prime Minister of Pakistan, Imran Khan, who has been in jail on corruption charges. His party was disqualified from running in the general election earlier this year. Even so, he was able to make campaign speeches from jail using generative AI.
Leah Feiger: Wild.
Vittoria Elliott: To do that, they used ElevenLabs, which is the same company whose technology was used for the fake Joe Biden robocall earlier this year. Sometimes we do know the companies involved; a lot of times we don’t.
Regulations by Legitimate Companies
Leah Feiger: How have these companies said that they’re going to approach elections this year?
Vittoria Elliott: Well, more legitimate companies like Midjourney, OpenAI, Google, et cetera, they’ve said, “We’re going to put guardrails on. We’re not going to allow generating political images.” OpenAI, whose ChatGPT is text-based, has said, “It’s not cool to use our tool to generate political stuff for campaigns,” or whatever, “You can’t run a chatbot on top of our interface,” basically. But they’re not doing great at enforcing it.
Global Implications
Vittoria Elliott: In Indonesia, there was a company that built an app called Pemilu for the Indonesian elections. The founder of that app claimed that they had built something on top of ChatGPT that allowed them to write campaign speeches in a bunch of local languages. It pulled in information to tailor messages to particular demographics, whether that was young people, women, et cetera.