Social platform X (formerly Twitter) recently announced that it will pilot a closely watched new feature: allowing artificial intelligence (AI) chatbots to help generate "Community Notes". The initiative, driven by Elon Musk, integrates the company's Grok large-model technology with the aim of further improving accuracy, contextual transparency, and public information literacy on the platform.
What is "Community Notes"?
Community Notes is a user-driven fact-checking and context mechanism. It allows platform users to write notes on specific content (such as misleading posts, AI-generated videos, or public statements), adding important background or corrections.
Before a note is published, it must pass a "consensus review": it is displayed only if users with differing viewpoints rate it as helpful, which keeps the content fair and impartial. The system has already shown some success in improving the credibility of content on X.
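To make the "consensus review" idea concrete, here is a minimal, illustrative sketch in Python. X's actual scorer works differently (it uses matrix factorization over the full rating matrix to find agreement across viewpoints); the cluster labels, thresholds, and function names below are simplifying assumptions for illustration only, not X's API.

```python
from dataclasses import dataclass

@dataclass
class Rating:
    rater_cluster: str   # stand-in for an inferred viewpoint group, e.g. "A" or "B"
    helpful: bool

def note_reaches_consensus(ratings: list[Rating],
                           min_per_cluster: int = 3,
                           min_helpful_share: float = 0.7) -> bool:
    """Return True only if raters from at least two different viewpoint
    clusters each supply enough ratings and a large majority in *every*
    cluster marks the note as helpful."""
    clusters: dict[str, list[bool]] = {}
    for r in ratings:
        clusters.setdefault(r.rater_cluster, []).append(r.helpful)

    if len(clusters) < 2:            # agreement within one camp is not enough
        return False
    for votes in clusters.values():
        if len(votes) < min_per_cluster:
            return False
        if sum(votes) / len(votes) < min_helpful_share:
            return False
    return True
```

The key design point the sketch captures is that a note cannot be published by appealing to only one side: it needs broad helpfulness across opposing viewpoint groups.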
Can AI also write notes? X is testing it
In the new pilot, X will allow AI models to automatically generate candidate Community Notes and submit them into the review process. These AI notes may draw on the platform's public APIs, search indexes, fact databases, and cross-platform context, and will mainly address:
ambiguous or fake video content;
misleading statements by politicians;
missing background on trending events;
misstated scientific information, etc.
AI-generated content will be held to the same review standards as notes written by human users: it will not be published automatically, but will wait for community review to decide whether it is displayed.
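The workflow described above can be sketched as follows. Every name here (draft_note_with_llm, ReviewQueue, CandidateNote) is a hypothetical stand-in used for illustration, not X's or Grok's actual API: the point is simply that an AI draft enters the same pending-review queue as a human note and is never published at creation time.

```python
from dataclasses import dataclass, field

@dataclass
class CandidateNote:
    post_id: str
    text: str
    sources: list[str]
    author: str                      # "ai" or "human"
    status: str = "pending_review"   # never "published" when first created

@dataclass
class ReviewQueue:
    pending: list[CandidateNote] = field(default_factory=list)

    def submit(self, note: CandidateNote) -> None:
        # AI-written and human-written notes land in the same queue,
        # subject to the same consensus-review rules.
        self.pending.append(note)

def draft_note_with_llm(post_id: str, context_snippets: list[str]) -> CandidateNote:
    # Stand-in for a language-model call plus retrieval over public sources.
    draft_text = ("Context: " + " ".join(context_snippets))[:280]
    return CandidateNote(post_id=post_id, text=draft_text,
                         sources=context_snippets, author="ai")

queue = ReviewQueue()
queue.submit(draft_note_with_llm("12345", ["Excerpt from a public report ..."]))
# Whether the note is ever shown is decided later by community consensus
# (see the earlier sketch), not by the model itself.
```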
Human-AI collaboration at the core: AI will not replace human judgment
Researchers have pointed out that although AI can improve information coverage and speed, it still suffers from "hallucination" (i.e., fabricating facts). X has therefore made its position clear:
"The goal is not to let AI tell users what to think, but to build a knowledge-collaboration ecosystem in which humans hold the ultimate decision-making power."
This model positions AI as an auxiliary tool that improves the quality and speed of Community Notes, rather than one that leads the interpretation of information.
Industry ripple effect: Meta, TikTok, and other platforms are following suit
X's Community Notes mechanism has attracted wide industry attention. Meta recently stopped relying on third-party fact-checking services and is instead building a community-driven annotation system, while platforms such as TikTok and YouTube are also experimenting with user-driven correction mechanisms.
This suggests that, in the era of generative AI, platforms are gradually shifting from "centralized moderation" toward a hybrid model of "community collaboration + AI support".
Risks and challenges remain: the feature will be rolled out cautiously
Despite the feature's potential, X also acknowledges the risks:
a flood of AI-generated notes could overwhelm human reviewers;
third-party model access (such as users plugging in custom LLMs) will bring new challenges for content credibility and security;
if model optimization overemphasizes "helpfulness" while neglecting "accuracy", it could undermine the seriousness of Community Notes.
For now, X is testing the feature only at a small scale and plans to adjust the rollout pace based on community feedback.
Outlook: a new paradigm for the information-verification ecosystem?
As information spreads through increasingly complex platform mechanisms, the "community + AI" verification model may become a new standard for content platforms: AI quickly generates first drafts or suggestions, while the human community handles review and verification. This "decentralized collaboration" model may gradually replace traditional "closed moderation" systems.
AIbase believes X's move could mark an important turning point for AI's participation in public information governance; against the backdrop of a global election year and a surge in deepfake content, its experimental significance is all the more apparent.