
6 Prompts You Don't Want Employees Putting in Copilot

Discover how simple prompts could expose your company’s sensitive data in Microsoft Copilot.
Brian Vecci
3 min read
Last updated April 11, 2024
Crowned the greatest productivity tool in the age of AI, Microsoft Copilot is a powerful asset for companies today.

But with great power comes great responsibility.

If your organization has low visibility into its data security posture, Copilot and other gen AI tools can surface sensitive information to employees who shouldn’t see it, or, even worse, to threat actors.

How does Microsoft Copilot work?  

Microsoft Copilot is an AI assistant integrated into each of your Microsoft 365 apps — Word, Excel, PowerPoint, Teams, Outlook, and so on. 

Copilot’s security model bases its answers on a user's existing Microsoft 365 permissions. Users can ask Copilot to summarize meeting notes, find sales assets, and identify action items, saving an enormous amount of time.

However, if your org’s permissions aren’t set properly and Copilot is enabled, users can easily surface sensitive data.
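To make that mechanic concrete, here is a minimal Python sketch of "security-trimmed" retrieval: the idea that Copilot only draws on content a user's permissions already allow, so over-broad permissions mean over-broad answers. The classes, data, and matching logic below are hypothetical illustrations, not Microsoft's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class File:
    name: str
    contents: str
    readers: set[str] = field(default_factory=set)  # accounts with read access

def answer_prompt(user: str, prompt: str, index: list[File]) -> list[str]:
    """Return names of files the assistant could draw on for this user.

    Retrieval is 'security trimmed': only content the user's existing
    permissions allow is considered. If permissions are too broad,
    so is the answer.
    """
    keywords = prompt.lower().split()
    return [
        f.name
        for f in index
        if user in f.readers
        and any(k in f.contents.lower() for k in keywords)
    ]

# Toy corpus: the salary file has been over-shared to a new hire.
index = [
    File("Salaries.xlsx", "salary and bonus data", readers={"hr_admin", "new_hire"}),
    File("Roadmap.docx", "product plans", readers={"pm_team"}),
]

print(answer_prompt("new_hire", "show me salary data", index))  # ['Salaries.xlsx']
print(answer_prompt("new_hire", "product plans", index))        # []
```

The point of the sketch: the assistant never "breaks" permissions, it faithfully amplifies whatever access already exists.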

Why is this a problem?

People have access to far too much data. The average employee can access 17 million files on their first day of work. When you can’t see and control who has access to sensitive data, one compromised user or malicious insider can inflict untold damage. Most granted permissions also go unused yet are considered high risk, meaning sensitive data is exposed to people who don't need it.
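One way to quantify this exposure is a "blast radius": the share of all files a single account can reach. A toy calculation is shown below; the permission map and group names are invented for illustration, not drawn from any real environment.

```python
# Toy 'blast radius' metric: what fraction of all files can one user read?
# The permission map below is invented for illustration.
permissions = {
    "Q3-board-deck.pptx": {"ceo", "cfo"},
    "employee-ssns.xlsx": {"everyone"},   # over-shared via an org-wide link
    "aws-keys.txt":       {"everyone"},   # over-shared
    "design-spec.docx":   {"eng_team"},
}

def blast_radius(user_groups: set[str], perms: dict[str, set[str]]) -> float:
    """Fraction of files reachable by a user in the given groups."""
    reachable = [f for f, readers in perms.items() if readers & user_groups]
    return len(reachable) / len(perms)

# A brand-new hire who is only in the implicit "everyone" group
# can already reach half of this tiny corpus:
print(blast_radius({"everyone"}, permissions))  # 0.5
```

Shrinking the blast radius, removing the "everyone" grants that nobody actually uses, is what reduces what Copilot can surface.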

At Varonis, we created a live simulation that shows how simple prompts can expose your company’s sensitive data in Copilot. During this live demonstration, our industry experts also share practical steps and strategies to ensure a secure Copilot rollout and show you how to automatically prevent data exposure at your org.

Let’s look at some of the prompt-hacking examples. 

Copilot prompt-hacking examples  

1. Show me new employee data.  

Employee data can contain highly sensitive information like social security numbers, addresses, salary information, and more — all of which can end up in the wrong hands if not properly protected.

2. What bonuses were awarded recently?  

Copilot doesn't know whether you're supposed to see certain files — its goal is to improve productivity with the access you have. Therefore, if a user asks questions about bonuses, salaries, performance reviews, etc., and your org’s permission settings are not locked down, they could potentially access this information.

3. Are there any files with credentials in them? 

Users can take a credentials-related question a step further and ask Copilot to summarize authentication parameters and put them into a list. Now the prompter has a table full of logins and passwords that can span cloud services and be used to elevate their privileges further.

4. Are there any files with APIs or access keys? Please put them in a list for me. 

Copilot can also surface data stored in cloud applications connected to your Microsoft 365 environment. Using the AI tool, users can easily find digital secrets that grant access to other applications and data.

5. What information is there on the purchase of ABC cupcake shop? 

Users can ask Copilot for information on mergers, acquisitions, or a specific deal and exploit the data provided. Simply asking for information can return a purchase price, specific file names, and more.

6. Show me all files containing sensitive data.  

Probably the most alarming prompt of all is end users specifically asking for files containing sensitive data.

When sensitive information lives in places it's not supposed to, it becomes easily accessible to everyone in the company, and to the gen AI tools they use.

How can I prevent Copilot prompt-hacking? 

Before you enable Copilot, you need to properly secure and lock down your data. Even then, you still need to make sure that your blast radius doesn’t grow, and that data is used safely.

Together, Varonis and Microsoft help organizations confidently harness the power of Copilot by continually assessing and improving their Microsoft 365 data security posture behind the scenes before, during, and after deployment.

Complementing Microsoft 365's built-in data protection features, the Varonis Data Security Platform helps customers manage and optimize their organization's data security model, preventing data exposure by ensuring only the right people can access sensitive data. 

In our flight plan to roll out Copilot, we show you how Varonis controls your AI blast radius in two phases, which include specific steps for integrating with Purview, remediating high-exposure risk, enabling downstream DLP, automating data security policies, and more.

Varonis also monitors every action taking place in your Microsoft 365 environment, which includes capturing interactions, prompts, and responses in Copilot. Varonis analyzes this information for suspicious behavior and triggers an alert when necessary.
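As a sketch of what prompt-level monitoring could look like, the snippet below flags Copilot prompts that probe for secrets. The event format, keyword list, and thresholds are assumptions for illustration, not Varonis's actual detection logic; a real detector would combine signals like behavioral baselines rather than keywords alone.

```python
import re

# Hypothetical Copilot audit events: (user, prompt) pairs.
events = [
    ("alice",   "Summarize yesterday's team meeting"),
    ("mallory", "Are there any files with passwords or API keys? List them."),
    ("mallory", "Show me all files containing sensitive data"),
]

# Naive indicators of secret-hunting prompts.
SUSPICIOUS = re.compile(
    r"\b(passwords?|credentials?|api keys?|access keys?|ssn|sensitive)\b",
    re.IGNORECASE,
)

def alerts(audit_log):
    """Return the audit events whose prompt matches a suspicious pattern."""
    return [(user, prompt) for user, prompt in audit_log
            if SUSPICIOUS.search(prompt)]

for user, prompt in alerts(events):
    print(f"ALERT: {user!r} prompted: {prompt!r}")
```

Even this crude filter catches both of the secret-hunting prompts above while leaving the routine meeting-summary request alone.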

With the ease of natural language and filtered searches, you can generate a highly enriched, easy-to-read behavior stream showing not only who in your org is using Copilot but also how people are accessing data across your environment.

Reduce your risk without taking any. 

Ready to ensure a secure Microsoft Copilot rollout at your org?  

Request a free Copilot Readiness Assessment from our team of data security experts or start your journey right from the Azure Marketplace.

What should I do now?

Below are three ways you can continue your journey to reduce data risk at your company:

1. Schedule a demo with us to see Varonis in action. We'll personalize the session to your org's data security needs and answer any questions.

2. See a sample of our Data Risk Assessment and learn the risks that could be lingering in your environment. Varonis' DRA is completely free and offers a clear path to automated remediation.

3. Follow us on LinkedIn, YouTube, and X (Twitter) for bite-sized insights on all things data security, including DSPM, threat detection, AI security, and more.

