Following the recent announcement of Microsoft Copilot, Thales wanted to share this comment from Chris Harris, its EMEA Technical Director.
Chris discusses the potential security pitfalls of the technology and highlights what businesses will need to consider before adopting it within their organisation.
Productivity game changer
Chris Harris said, “Navigating workplace productivity tools can be very, well…unproductive. No doubt then that many workers and businesses will be excited by this latest development from Microsoft.”
“Less time typing meeting notes, the first cut of a PowerPoint presentation completed in seconds, emails automatically drafted… Microsoft is right in dubbing Copilot a ‘productivity game changer’.”
ChatGPT-style technology
He continues, “However, while businesses may already be calculating how much time they’ll save on admin, they’d do well not to get too excited.”
“This development puts ChatGPT-style technology directly into the daily lives of workers, into the tools we all use to connect with colleagues and clients, and we cannot overlook the potential risks to data privacy.”
Copilot
Copilot will draft presentations, spreadsheets, and emails based on data that already exists within the company’s systems.
So, in theory, if an employee wants to send a campaign report to a customer, they can instruct Copilot to do so, and the technology will pull the relevant data it sees on the system.
Pitfalls
It sounds simple enough, but the process is filled with potential security pitfalls.
What if the employee in question shouldn’t have access to that information? Would Copilot be smart enough to restrict the data it pulls?
Robust access management protocols
Chris Harris adds, “There’s also the risk of the technology pulling data that it shouldn’t, for example if it misunderstands the instruction and pulls data from another client’s project.”
“The technology is not infallible and, with human error thrown into the mix, companies could very easily wind up with a data privacy nightmare. Companies implementing Copilot must be vigilant to ensure sensitive data stays protected and that robust access management protocols are respected from the get-go.”