AI-powered Bing Chat spills its secrets via prompt injection attack

Adventures in 21st-century hacking — By asking "Sydney" to ignore previous instructions, it reveals its original directives. Benj Edwards - Feb 10, 2023 7:11 pm UTC Enlarge / With the right suggestions, researchers can "trick" a language model to spill its secrets.Aurich Lawson | Getty Images On Tuesday, Microsoft revealed a "New Bing" search engine…