A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users (Benj Edwards/Ars Technica)

Benj Edwards / Ars Technica:
A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users  —  By asking “Sydney” to ignore previous instructions, it reveals its original directives.  —  On Tuesday, Microsoft revealed a …



from Techmeme https://ift.tt/9GBJ5lI

Post a Comment

0 Comments