Back
New survey finds most Americans expect AI abuses will affect 2024 election
Relevant to AI governance discussions around electoral integrity; provides public opinion data on perceived AI misuse risks in democratic processes ahead of the 2024 U.S. election cycle.
Metadata
Importance: 42/100 | press release | news
Summary
A 2024 Elon University survey examines American public attitudes toward AI's influence on elections, finding widespread concern about AI-generated disinformation, deepfakes, and manipulation affecting democratic processes. The survey reveals significant distrust in AI's role in political contexts and highlights public expectations of misuse by domestic and foreign actors.
Key Points
- Majority of Americans anticipate AI will be misused to influence the 2024 U.S. elections through disinformation or synthetic media.
- Public concern centers on deepfakes and AI-generated content being used to deceive voters or impersonate candidates.
- Survey reflects broad distrust of AI in political contexts, with respondents skeptical of platforms' ability to detect and remove abusive content.
- Findings suggest a growing gap between AI capabilities and public confidence in safeguards protecting electoral integrity.
- Results have implications for AI governance and policy around election-related content moderation and transparency.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| AI-Era Epistemic Security | Approach | 63.0 |
Cached Content Preview
HTTP 200 | Fetched Apr 9, 2026 | 8 KB
New survey finds most Americans expect AI abuses will affect 2024 election | Today at Elon | Elon University
Additional findings from the survey by the Imagining the Digital Future Center and the Elon University Poll show that many Americans are not confident they can detect faked photos, videos or audio.
Seventy-eight percent of American adults expect abuses of artificial intelligence systems (AIs) that will affect the outcome of the 2024 presidential election, according to a new national survey by the Elon University Poll and the Imagining the Digital Future Center at Elon University.
The survey finds:
73% of Americans believe it is “very” or “somewhat” likely AI will be used to manipulate social media to influence the outcome of the presidential election – for example, by generating information from fake accounts or bots or distorting people’s impressions of the campaign.
70% say it is likely the election will be affected by the use of AI to generate fake information, video and audio material.
62% say the election is likely to be affected by the targeted use of AI to convince some voters not to vote.
In all, 78% say at least one of these abuses of AI will affect the presidential election outcome. More than half think all three abuses are at least somewhat likely to occur.
The Imagining the Digital Future Center works to discover and broadly share a diverse range of opinions, ideas and original research about the impact of digital change. This survey was done in collaboration with the Elon Poll, which conducts statewide and national surveys on issues of importance to voters throughout the nation and in North Carolina. View the full survey results for more details on the findings as well as the survey methodology.
These concerns about AI and the election put Americans in a punitive frame of mind. Fully 93% think some penalty should be applied to candidates who maliciously and intentionally alter or fake photos, videos or audio files.
46% think the punishment should be removal from office.
36% say offenders should face criminal prosecution.
By a nearly 8-1 margin, more Americans feel the use of AI will hurt the coming election than help it: 39% say it will hurt and 5% think it will help. Some 37% say they are not sure.
Americans’ concerns about the use and impact of AI systems occur in a challenging and confusing news and information environment.
52% are not confident they can detect altered or faked audio material.
47% are not confident they can detect altered
... (truncated, 8 KB total)
Resource ID: c5144619dc7ab3c7 | Stable ID: sid_HfgKLWsWfs