Hi, this is a leak from your chatbot: WotNot exposes 346K confidential client files.
ChatGPT and chatbots from other well-known AI companies have all been duped into disclosing private information, raising concerns over data security and privacy. But what happens when another link is added to the chain so that people can reach the chatbot? Customer data may then pass through more systems, increasing the risk from third-party vendors and adding further exposure points.
WotNot, a tailored chatbot builder for companies, is a prime illustration of this risk materializing. On August 27, 2024, while carrying out a routine investigation using OSINT techniques, the Cybernews research team discovered a massive Google Cloud Storage bucket containing 346,381 files. The misconfigured storage was accessible to anyone without authentication and held a treasure trove of personal data.
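To illustrate what "accessible to anyone without authentication" means in practice: when a Cloud Storage bucket grants read access to allUsers, its object listing can be fetched over the public Cloud Storage JSON API with no credentials at all. The sketch below shows the idea; the bucket name is hypothetical and unrelated to the WotNot bucket.

```python
# Sketch: listing objects in a publicly readable GCS bucket without credentials.
# This only succeeds if the bucket's IAM policy grants object listing to allUsers;
# "example-public-bucket" is a hypothetical name used purely for illustration.
import requests


def list_public_objects(bucket_name: str) -> list[str]:
    # Public Cloud Storage JSON API endpoint for listing a bucket's objects.
    url = f"https://storage.googleapis.com/storage/v1/b/{bucket_name}/o"
    resp = requests.get(url, timeout=10)  # note: no Authorization header is sent
    resp.raise_for_status()               # a 401/403 here means the bucket is not public
    return [item["name"] for item in resp.json().get("items", [])]


if __name__ == "__main__":
    for name in list_public_objects("example-public-bucket"):
        print(name)
```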
The bucket was attributed to WotNot, an AI startup whose roughly 3,000 clients include Merck, the University of California, Chenening, Amneal Pharmaceuticals, Donna Italia, Zydus Group, and other businesses. The variety of data stored in the bucket suggests it may have belonged to the customers of multiple companies. The exposed data included:
National identification documents: passports and other IDs used in identity verification procedures, containing key identifying details such as full names, passport numbers, and birth dates.
Medical records: documents with comprehensive health information, including test results, treatment histories, diagnoses, and other private health data that together give a thorough picture of a person's medical background and current state of health.
Resumes: documents containing full names, contact information, work experience, and educational background, and, most crucially, sensitive data such as home addresses, email addresses, and phone numbers.
Additional documents: items such as train tickets and trip schedules.
Despite WotNot's relatively small scale, this exposure poses a serious security and privacy risk to the affected individuals. According to Cybernews researchers, the leaked personal documents give threat actors a full arsenal for identity theft, medical or employment fraud, and other scams. They caution that cybercriminals might use the data to launch spearphishing attacks, file fictitious insurance claims, open bogus financial accounts, and run other social engineering operations.
AI introduces a new shadow IT resource
WotNot's chatbot development platform lets businesses build intelligent, customizable, and engaging bots for a variety of applications with little to no coding knowledge. The platform supports deploying chatbots across multiple channels, including SMS, Facebook Messenger, WhatsApp, and websites. WotNot has offices in the USA and India.
The incident illustrates that companies' data security obligations extend beyond their internal systems. "In the case of WotNot, confidential data originating from their business clients was ultimately made public, demonstrating how a single vendor's security breach can jeopardize data from numerous businesses and thousands of people downstream," Cybernews researchers said.
AI services introduce a new shadow IT resource that is not directly under the organization's control. "Shadow IT" refers to systems outside the direct supervision of the central IT department that bypass various security and control measures. Because AI services are interconnected, businesses need to protect their own systems and also carefully assess the security policies of every partner in their AI implementation chain.
Like any other shadow IT resource, AI services raise the possibility of uncontrolled data transfers, making it harder to safeguard data and to adhere to best practices or legal requirements. Cybernews researchers responsibly disclosed the issue to WotNot on September 9th. It took more than two months and a dozen follow-up emails before the company closed access to the leaked data.
It is unknown whether any threat actors or other third parties accessed the exposed WotNot data. Consequently, it is advisable to conduct regular security audits, review historical access logs, and ensure that appropriate encryption and access control measures are in place.
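For teams running their own Google Cloud Storage buckets, the access-control part of such an audit can be scripted. Below is a minimal sketch using the google-cloud-storage Python client: it flags IAM bindings granting access to allUsers or allAuthenticatedUsers and, if any are found, enforces public access prevention on the bucket. The bucket name and the choice to auto-remediate are assumptions for illustration; credentials with permission to read and update the bucket are required.

```python
# Minimal audit sketch: flag publicly readable GCS buckets and lock them down.
# Assumes the google-cloud-storage client library and credentials that can read
# and update the bucket. "example-client-uploads" is a hypothetical bucket name.
from google.cloud import storage
from google.cloud.storage.constants import PUBLIC_ACCESS_PREVENTION_ENFORCED

PUBLIC_MEMBERS = {"allUsers", "allAuthenticatedUsers"}


def audit_bucket(bucket_name: str) -> None:
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)

    # Find IAM bindings that grant roles to anonymous or any-authenticated users.
    policy = bucket.get_iam_policy(requested_policy_version=3)
    public_bindings = [
        b for b in policy.bindings if PUBLIC_MEMBERS & set(b["members"])
    ]
    for binding in public_bindings:
        print(f"{bucket_name}: role {binding['role']} granted to {binding['members']}")

    if public_bindings:
        # Enforce public access prevention so objects can no longer be read anonymously.
        bucket.iam_configuration.public_access_prevention = PUBLIC_ACCESS_PREVENTION_ENFORCED
        bucket.patch()
        print(f"{bucket_name}: public access prevention enforced")


if __name__ == "__main__":
    audit_bucket("example-client-uploads")  # hypothetical bucket name
```

A script like this covers only one control; regular reviews of access logs, retention policies, and encryption settings still need to happen alongside it.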