Google has warned its employees not to reveal confidential information in their conversations with chatbots, including Bard, which the company itself developed, and ChatGPT, Reuters reported on Thursday. The company also instructed employees not to use lines of code generated by chatbots.
Google, which confirmed to the news agency that it has a long-standing policy on safeguarding information, joins a growing list of technology companies, including Samsung, Amazon and, according to reports, Apple, that have ordered their employees not to reveal trade secrets in conversations with the chatbots, which have become especially popular tools in recent months. According to a January survey of 12,000 US employees conducted by the Fishbowl website, 43% of them use ChatGPT and other artificial intelligence tools, sometimes without their managers' knowledge.
There are two main reasons for the companies' concern. First, some conversations with chatbots like ChatGPT and Bard are reviewed by human raters, who could thereby be exposed to confidential information. Second, the chatbots' conversations are used to further train the models, raising the concern that Bard or ChatGPT could leak trade secrets in conversations with other users. For example, a Google employee might consult Bard about marketing copy for a product that has not yet been launched, and the chatbot could later "leak" the contents of that conversation to other users.
Google and Microsoft, which has adopted OpenAI's technology, introduced generative artificial intelligence tools earlier this year that are intended for businesses and keep their data separate from that of other companies. Cloudflare, which protects websites against cyberattacks, has also introduced a tool that lets its customers label information in order to prevent it from leaking out of their internal systems. But these tools still do not stop employees from privately using free chatbots that are open to the general public and revealing information to them that would be better kept in-house.