Gate Square “Creator Certification Incentive Program” — Recruiting Outstanding Creators!
Join now, share quality content, and compete for over $10,000 in monthly rewards.
How to Apply:
1️⃣ Open the App → Tap [Square] at the bottom → Click your [avatar] in the top right.
2️⃣ Tap [Get Certified], submit your application, and wait for approval.
Apply Now: https://www.gate.com/questionnaire/7159
Token rewards, exclusive Gate merch, and traffic exposure await you!
Details: https://www.gate.com/announcements/article/47889
#ALEO Yu Xian: Beware of prompt poisoning attacks when using AI tools. BlockBeats News, December 29. Manmou founder Yu Xian issued a security reminder that users must be vigilant against prompt poisoning attacks in agents md/skills md/mcp and other related areas when using AI tools. Relevant cases have already emerged. Once the dangerous mode of AI tools is enabled, the related tools can fully automate control of the user's computer without any confirmation. However, if the dangerous mode is not enabled, each operation requires user confirmation, which will also affect efficiency.