Generative AI chatbots are now used by millions of people for everyday conversations. New tools appear almost daily, making it hard for users to judge which are trustworthy and which are not. Some of these tools are built specifically to enable cyber attacks on unsuspecting people.
Recently, a generative AI tool called WormGPT was released that works much like ChatGPT. Its creator markets it as a direct rival to ChatGPT, with one key difference: it places no restrictions on malicious or illegal content generation.
Threat of WormGPT
People with malicious intent can use WormGPT to carry out harmful activities, particularly phishing and business email compromise (BEC) attacks.
It is a blackhat alternative to mainstream GPT models, designed specifically for malicious use. Security researcher Daniel Kelly noted that cybercriminals can use such technology to automate the creation of convincing fake emails personalised to each recipient, significantly increasing the odds that an attack succeeds.
Attackers can easily leverage WormGPT's chat memory retention and code formatting features to craft persuasive text and phishing emails. No prior technical proficiency is required to use the tool for malicious purposes.
Unlike ChatGPT, WormGPT imposes no predefined rules on the content it generates, which could drive an increase in cyber attacks, scams and other cyber crime. Ordinary users need to stay alert to the emails and messages they receive, whether through social media or any other channel, to avoid falling victim to such scams. The safest approach is to avoid engaging in conversations with strangers on the internet.
Cybercriminals are using AI-based tools not only for text chats and messages but also to create AI-generated videos that impersonate friends or family members. They use these to initiate conversations on platforms like WhatsApp and defraud people of money.