Salesforce Einstein Copilot: Boosting Productivity With a Focus on Security

AI tools like Salesforce Einstein Copilot can improve efficiency, but also increase risk. Check out these tips on preparing for a Copilot rollout.
Tomer Ronen
2 min read
Last updated July 3, 2024

Gen AI tools increase efficiency, provide insights, and offer recommendations, but with those capabilities comes increased risk.

Salesforce Einstein Copilot is a powerful conversational AI assistant integrated into the CRM platform, designed to enhance productivity by automating tasks. However, many Salesforce users don’t realize the role they play in protecting their org’s sensitive data.

Salesforce Ben Technical Instructor Andrew Cook and Varonis Field CTO Brian Vecci recently held a session on tips for preparing for a Copilot rollout. They discussed how to lock down data the AI tool shouldn’t access and explained why security is every user’s responsibility.

Salesforce’s shared responsibility model

While Salesforce is responsible for securing the SaaS app’s infrastructure, platform, and services, the security of the information stored within the CRM tool lies squarely on the user’s shoulders.

Your data is your responsibility. Everyone should be aware of that, especially as the topic of data becomes more and more important.

Andrew Cook, Salesforce Ben Technical Instructor


Salesforce’s shared responsibility model states that although Einstein Copilot securely processes and does not retain data, customers are responsible for the data itself, its permissions, and its usage.

“Copilot has no way to know who and what is supposed to have access to what data,” Brian said. “So customers must ensure Copilot is used responsibly.”

Using the shared responsibility model to understand which security aspects an organization is responsible for and which fall under the SaaS application’s domain can help security teams mitigate risks and protect sensitive information.

A traditional breakdown of shared responsibility, which cloud providers such as Salesforce, AWS, and Microsoft all follow, is outlined below.


A traditional shared responsibility model.

Einstein Copilot security challenges

Copilot can make a Salesforce user’s life much easier but can also be an insider threat’s best friend, Brian said.

The AI tool can help users quickly find C-level contacts for revenue accounts, identify sensitive personally identifiable information (PII), and easily determine the highest- and lowest-paying customers.

“All of these actions are potentially dangerous,” Brian said. “But if you’re doing a good job with preventative controls, making sure that people and applications only have access to what they’re supposed to, and are monitoring how data is being accessed and when there’s potentially malicious access, then everything should be locked down.”

However, Brian said that employees often have more access to data than they need, and sensitive data may be hidden in unexpected places.
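Brian’s point about sensitive data hiding in unexpected places can be illustrated with a simple pattern scan over free-text fields. The sketch below is a minimal, hypothetical example in Python — the record, field names, and regex patterns are invented for illustration, and real discovery tools (including Varonis) use far more sophisticated classification than regexes:

```python
import re

# Illustrative patterns for two common PII types. Production classifiers
# are far more robust than simple regexes like these.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_record(record: dict) -> dict:
    """Return {field: [pii_types]} for any text field matching a pattern."""
    findings = {}
    for field, value in record.items():
        if not isinstance(value, str):
            continue
        hits = [name for name, pat in PII_PATTERNS.items() if pat.search(value)]
        if hits:
            findings[field] = hits
    return findings

# Sensitive data often lurks in free-text fields like notes or descriptions,
# not just in the fields designed to hold it (record data is invented):
record = {
    "Name": "Acme Corp",
    "Description": "Renewal call notes",
    "Notes__c": "Customer SSN on file: 123-45-6789",
}
print(scan_record(record))  # {'Notes__c': ['ssn']}
```

The takeaway: a field’s label tells you nothing about what users have pasted into it, which is why discovery has to look at content, not just schema.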

Preparing your organization for Einstein Copilot

With complex roles, permission sets, and org-wide configurations, it’s virtually impossible to see which users can do the most damage in Salesforce.

To protect your org before, during, and after a Copilot deployment, Brian recommends locking down permissions, ensuring proper data handling through training and guardrails, and identifying sensitive data that AI shouldn’t have access to.

Users should only have access to the data they need to do their jobs.

Brian Vecci, Varonis Field CTO
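One reason effective access is so hard to see in Salesforce is that a user’s permissions are the union of their profile and every permission set assigned to them, so reviewing either in isolation understates risk. The sketch below illustrates that idea in plain Python on invented data — it is not the Salesforce API, and the profiles, permission sets, and assignments are hypothetical:

```python
# Illustrative data only: these profiles, permission sets, and
# assignments are invented, not real Salesforce objects.
profiles = {
    "Sales Rep": {"Account:Read", "Contact:Read"},
}
permission_sets = {
    "Quarter-End Reporting": {"Opportunity:Read", "Opportunity:Edit"},
    "Legacy Data Cleanup": {"Contact:Delete"},  # often forgotten after a project ends
}
assignments = {
    "alice": {
        "profile": "Sales Rep",
        "permission_sets": ["Quarter-End Reporting", "Legacy Data Cleanup"],
    },
}

def effective_access(user: str) -> set:
    """Union of profile permissions and all assigned permission sets."""
    a = assignments[user]
    perms = set(profiles[a["profile"]])
    for ps in a["permission_sets"]:
        perms |= permission_sets[ps]
    return perms

def excess_access(user: str, needed: set) -> set:
    """Permissions a user holds beyond what their job requires."""
    return effective_access(user) - needed

# Alice only needs read access for day-to-day work:
needed = {"Account:Read", "Contact:Read", "Opportunity:Read"}
print(sorted(excess_access("alice", needed)))  # ['Contact:Delete', 'Opportunity:Edit']
```

Even in this toy model, the stale "Legacy Data Cleanup" assignment quietly grants delete rights; Copilot will happily act on whatever access the user already has, which is why right-sizing permissions comes before rollout.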

Protect your sensitive Salesforce data with Varonis.

Varonis gives you a complete view of effective access for every Salesforce user, allowing you to easily right-size permissions and reach a least-privilege model. Ensure compliance by only allowing people to access the data they need to do their jobs.

Varonis for Salesforce also gives you the power to locate and control hard-to-find regulated data across all your Salesforce instances. Whether it’s stored in records and fields or files and attachments, we’ll find, surface, and protect it.

“You’ll be able to automatically visualize where you have exposures and where you have risk,” Brian said. “You’ll be able to automatically lock everything down.”

Watch the entire discussion to see how Salesforce Einstein Copilot can help your org and learn more about your role in securing data.

What should I do now?

Below are three ways you can continue your journey to reduce data risk at your company:

1. Schedule a demo with us to see Varonis in action. We'll personalize the session to your org's data security needs and answer any questions.

2. See a sample of our Data Risk Assessment and learn the risks that could be lingering in your environment. Varonis' DRA is completely free and offers a clear path to automated remediation.

3. Follow us on LinkedIn, YouTube, and X (Twitter) for bite-sized insights on all things data security, including DSPM, threat detection, AI security, and more.

Try Varonis free.

Get a detailed data risk report based on your company’s data.
Deploys in minutes.

Keep reading

Varonis tackles hundreds of use cases, making it the ultimate platform to stop data breaches and ensure compliance.

Generative AI Security: Preparing for Salesforce Einstein Copilot
See how Salesforce Einstein Copilot’s security model works and the risks you must mitigate to ensure a safe and secure rollout.
Copilot Security: Ensuring a Secure Microsoft Copilot Rollout
This article describes how Microsoft 365 Copilot's security model works and the risks that must be considered to ensure a safe rollout.
Why Your Org Needs a Copilot Security Scan Before Deploying AI Tools
Assessing your security posture before deploying gen AI tools like Copilot for Microsoft 365 is a crucial first step.
6 Prompts You Don't Want Employees Putting in Copilot
Discover what simple prompts could expose your company’s sensitive data in Microsoft Copilot.