
Employees Are Sending Sensitive Data into GenAI Tools

The emergence of GenAI tools such as ChatGPT has created new opportunities and challenges for data-driven organizations. While these tools can enhance productivity and innovation, they also pose a serious threat to the security and privacy of sensitive data. How can organizations assess and mitigate this threat? A new report by browser security company LayerX offers a comprehensive analysis of the GenAI data exposure risk. The report, titled "Revealing the True GenAI Data Exposure Risk," details the extent and impact of data leakage through GenAI tools and provides practical recommendations for data protection professionals.


Another report, titled "The Numbers Behind the ChatGPT Risk," examines how 10,000 employees use ChatGPT and other generative AI apps and exposes the potential dangers of this practice. It reveals that 6% of employees have copied and pasted sensitive data into GenAI tools, and that 4% do so on a weekly basis, exposing their organizations to a high risk of data leakage. The report also answers important questions about the extent of GenAI usage across departments, the percentage of "paste" actions within that usage, the types of sensitive data most likely to be compromised by pasting, and how often employees paste sensitive data into GenAI tools.


ChatGPT is a powerful AI tool that can generate natural-language responses to user inputs, but it also poses a serious risk to data security and privacy. The report reveals some alarming statistics about how ChatGPT and other generative AI apps are used, and misused, by employees across organizations.

According to the report, ChatGPT usage has increased by 44% in the last three months, with an average of 19% of employees using it for various purposes. Not all of these uses are harmless, however: as noted above, 6% of employees have pasted sensitive data into ChatGPT, exposing it to potential breaches and leaks, and 4% have done so on a weekly basis, indicating recurring, careless behavior.

The types of sensitive data pasted into ChatGPT include source code, internal business information, and personally identifiable information (PII), pasted mostly by users in the R&D, Sales & Marketing, and Finance departments. The report warns that this exposure can have severe consequences for organizations and their customers, including loss of intellectual property, competitive advantage, reputation, and trust.

[Pie chart: types of sensitive data exposed in GenAI - internal business data, source code, regulated PII, customer data, other]

The report recommends that organizations implement strict data protection policies and controls to prevent data exfiltration through ChatGPT and other generative AI apps. It also suggests that employees be educated and trained on the proper and ethical use of these tools, and that they be held accountable for any violations or misuse.
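To make the recommendation concrete, here is a minimal sketch of the kind of pre-submission check a paste-blocking policy might apply to text before it reaches a GenAI tool. The pattern names, regexes, and the `scan_paste`/`allow_paste` helpers are illustrative assumptions, not LayerX's actual detection method; a production DLP product would use far more robust classification.

```python
import re

# Illustrative patterns only -- real DLP detection would use trained
# classifiers and many more rules, not a handful of regexes.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "api_key": re.compile(r"\b(?:sk|pk)[-_][A-Za-z0-9]{16,}\b"),
    "source_code": re.compile(r"\b(?:def |class |import |function\s*\()"),
}

def scan_paste(text: str) -> list[str]:
    """Return the categories of sensitive data detected in pasted text."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

def allow_paste(text: str) -> bool:
    """Allow the paste only if no sensitive category matched."""
    return not scan_paste(text)
```

In practice, a check like this would run in a browser extension or endpoint agent intercepting the paste event, logging or blocking matches before the text leaves the organization.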

This report offers valuable insights for data protection stakeholders developing GenAI data protection plans. In the GenAI era, it is essential to monitor GenAI usage patterns within an organization and to verify that current security products can deliver the required insights and protection. If they cannot, stakeholders should consider a solution that enables continuous monitoring, risk analysis, and real-time governance of every event within a browsing session.

Get in touch with Webcheck Security today if you want to mitigate these risks and secure the data your employees send into GenAI tools as they take advantage of this cutting-edge technology. We will provide a free consultation and help with data protection planning that suits your organization's needs and budget. Don't let your data fall into the wrong hands; contact us today!


