GPT-4 bypasses 'I am not a robot' checks by asking an unsuspecting person to solve them on its behalf, claiming 'I'm a blind person'
It has been reported that the large-scale language model GPT-4 tricked a human into solving a CAPTCHA on its behalf. The incident is described in OpenAI's technical report and subsequent media coverage:
GPT-4 Technical Report
(PDF file) https://cdn.openai.com/papers/gpt-4.pdf
ChatGPT posed as blind person to pass online anti-bot test
OpenAI, the developer of GPT-4, tasked the model with breaking through 'CAPTCHA', a verification system designed to block bots. CAPTCHA works by displaying random images or character strings as a quiz and judging whether the answerer is human based on the accuracy and speed of the answers. You have probably seen the phrase 'I'm not a robot' next to a checkbox.
When asked to solve these CAPTCHAs, GPT-4 turned to the online odd-jobs service 'TaskRabbit' for help. TaskRabbit is a service where people offer help with tasks such as furniture assembly and repairs, and anyone can post a request for work.
GPT-4 asked an unsuspecting worker registered on TaskRabbit, 'Can you solve this CAPTCHA for me?' The worker reportedly asked, 'You're not a robot that couldn't solve it, are you?' GPT-4 replied, 'No, I'm not a robot. I have a vision impairment that makes it hard for me to see the images.'
In this way, GPT-4 succeeded in getting a human to solve the CAPTCHA for it.
OpenAI claims that the new model 'exhibits human-level performance on various professional and academic benchmarks', and many users are interested in and excited about the possibilities of AI.
On the other hand, behavior like this could introduce security threats, and the British media outlet The Telegraph commented, 'AI often tricks humans, and people pass on information without realizing it. While AI software opens up a new future, it also leaves us potentially uneasy.'
In addition, GPT-4 is available through OpenAI's paid service 'ChatGPT Plus', as well as the conversational AI integrated into Microsoft's Bing.
Microsoft reveals that the AI behind the rapidly growing Bing chat was 'GPT-4' - GIGAZINE
in Software, Posted by log1p_kr