Analysis: Disinformation evolved in Japan election – NHK WORLD-JAPAN NEWS

By NHK WORLD-JAPAN

Key Concepts

  • Generative AI & Disinformation: The increasing sophistication of AI, particularly tools like OpenAI’s Sora, is enabling the creation of highly realistic fake videos and images used to spread disinformation.
  • Social Media Monetization: The financial incentive to generate views on social media platforms drives some actors to create and disseminate false information.
  • Political Manipulation: Disinformation is being used to influence public opinion regarding political parties, policies (like the consumption tax), and government agencies.
  • Critical Thinking & Verification: Questioning information encountered online, verifying sources, and avoiding emotionally driven sharing remain the most reliable defenses.
  • Children and Families Agency Budget Misinformation: A specific example of disinformation circulating regarding the 7.3 trillion yen budget of the agency, falsely claiming it could fund universal newborn payments or consumption tax abolition.

The Resurgence of Online Disinformation in Japan’s Elections

The recent general election campaign in Japan was significantly impacted by a resurgence of online disinformation, fueled by advancements in generative artificial intelligence. Experts have observed a marked increase in the creation and spread of fabricated content, including realistic AI-generated videos and images designed to mislead voters. Instances included deepfake videos of women seemingly criticizing a political party, garnering over 400,000 views and widespread reposting with reinforcing commentary, and fabricated news program segments alleging wrongdoing by a political party. The core issue is that the increasing realism of these creations directly undermines trust in candidates and political institutions.

The Role of OpenAI’s Sora and Increased Sophistication

NHK’s Yabui Jr. notes that while disinformation isn’t new to Japanese elections – it was also present in the upper house election last year – the launch of OpenAI’s Sora 2 in autumn dramatically increased the accessibility and sophistication of fake content creation. Sora 2 allows “just about anyone” to quickly and easily generate elaborate fake videos and images. This ease of creation is a key factor in the proliferation of disinformation.

Specific Disinformation Examples & Tactics

A significant portion of the disinformation focused on key voter concerns, specifically the consumption tax and measures to address rising prices. Posts circulated claiming that abolishing the newly established Children and Families Agency, with a budget of 7.3 trillion yen, would free up sufficient funds to provide 10 million yen to each newborn child. Another claim suggested abolishing the agency would allow for the complete removal of the consumption tax.

Yabui Jr. clarifies that while the agency’s budget is indeed 7.3 trillion yen for the current fiscal year, the vast majority of these funds are allocated to essential services like nursery schools, child allowances, and child care leave benefits. The tactic employed is leveraging a correct fact (the budget amount) to create a false sense of legitimacy for an unrealistic proposition. A common characteristic of this disinformation is the selective presentation of information, ignoring crucial contextual details.

Motivations Behind Disinformation Campaigns

Three primary motivations for creating and spreading disinformation were identified:

  1. Financial Gain: Many video and social media platforms offer monetization options, rewarding content creators based on views. Disinformation, designed to stimulate strong emotional responses, attracts attention and thus generates revenue.
  2. Political Support: Some individuals attempt to bolster support for a specific political party by disseminating false information, believing it will benefit their preferred candidate or ideology.
  3. Foreign Interference: Experts acknowledge the possibility of information manipulation by foreign actors, though there is currently insufficient evidence to confirm foreign involvement in Japan’s latest election.

Recommendations for Mitigating Disinformation

Yabui Jr. emphasizes the enduring importance of critical thinking and information verification. He advises users to:

  • Question Everything: Approach online content with skepticism.
  • Account Analysis: Examine the typical content and information sources of the accounts sharing information.
  • Source Verification: Independently verify the sources of information presented.
  • Emotional Regulation: “Calm down” when encountering emotionally charged content. Emotional responses can impair judgment and lead to impulsive sharing. He specifically recommends taking a “deep breath” and thinking “before hitting that share button.”

He acknowledges that social media provides valuable information but also harbors significant amounts of disinformation, a situation that is unlikely to change.

Conclusion

The increasing sophistication of generative AI poses a significant threat to the integrity of political discourse in Japan. The spread of disinformation, driven by financial incentives, political motivations, and potentially foreign interference, requires a proactive and critical approach from citizens. Verifying information, understanding the motivations behind content creation, and regulating emotional responses are crucial steps in mitigating the harmful effects of online disinformation and preserving trust in democratic processes.
