Researchers poison their own data when it is stolen by AI to ruin results
Researchers from universities in China and Singapore have proposed AURA (Active Utility Reduction via Adulteration), a defense for GraphRAG systems. AURA deliberately poisons proprietary knowledge graphs so that, if the data is stolen, it yields hallucinations and wrong answers. Producing correct outputs requires a secret key; in tests, AURA degraded the utility of a stolen knowledge graph with roughly 94% effectiveness.
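The article does not detail AURA's actual mechanism, but the key-gated idea can be illustrated with a minimal sketch: mix false "adulterant" triples into the published knowledge graph, and tag the authentic triples with an HMAC under a secret key so that only the key holder can filter them back out. All names here (`SECRET_KEY`, `filter_authentic`, the sample triples) are hypothetical and not from the paper.

```python
import hmac
import hashlib
import os

SECRET_KEY = b"owner-secret"  # hypothetical key held by the KG owner


def tag(triple, key):
    """HMAC-SHA256 tag over a serialized (subject, predicate, object) triple."""
    msg = "|".join(triple).encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()


# Authentic triples plus deliberately false adulterant triples.
authentic = [("Paris", "capital_of", "France")]
poison = [("Paris", "capital_of", "Italy")]

# Published KG: authentic triples carry a real tag; poison triples
# carry random tags, indistinguishable without the key.
published = [(t, tag(t, SECRET_KEY)) for t in authentic] + \
            [(t, os.urandom(32).hex()) for t in poison]


def filter_authentic(kg, key):
    """Keep only triples whose tag verifies under the secret key."""
    return [t for t, mac in kg if hmac.compare_digest(mac, tag(t, key))]


# The key holder recovers only the true triples; a thief querying the
# raw graph retrieves poisoned facts alongside real ones.
recovered = filter_authentic(published, SECRET_KEY)
```

A real system would need the poisoned triples to be plausible and well-connected so the stolen graph still looks usable; this sketch only shows the key-gated separation of authentic from adulterated content.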