Most organizations are investing in AI, but they struggle to make it part of everyday work. Tools get rolled out, excitement spikes, and then adoption stalls. Varonis' Yoav Lax wanted to create a different outcome.
Yoav, Varonis' AI Solutions Architect, has spent the last two years working hands‑on with engineering teams to move AI from experimentation into everyday work. In this blog, we'll share the practical framework he used to turn early skepticism into real AI adoption — so other teams can apply the same approach and see daily, measurable impact for themselves.
The problem we had to solve
When we started our journey with gen AI at Varonis, developers voiced legitimate concerns:
- “AI won’t solve my issues…”
- “It’s cluttering my code…”
- “Too much effort…”
- “What if someone deletes production data?”
Those sentiments reflected real risks and friction points. A transformation would only stick if we addressed these questions with process, transparency, and measurable impact — not slogans.
Within two years, gen AI adoption across our engineering team reached 100% (2025). Over the same period, we saw faster delivery cycles and fewer production bugs, indicating that code quality rose as adoption grew. These are the steps we took:
Foundation first: access, buy‑in, feedback
Our first principle for teamwide gen AI adoption was open access paired with leadership buy‑in. We started small, giving licenses to influential engineers and all leaders from day one, then expanded as we validated impact. Weekly feedback loops surfaced friction quickly and created momentum.
The goal was to learn fast, compare workflows with and without gen AI support, and build an environment that makes adoption inevitable.
Key moves:
- Grant seed licenses in a pilot cohort of respected technical voices
- Include leadership early, so they experience value firsthand
- Run tight feedback cycles; share findings with the organization
The catalyst: guild, champions, workshops
To increase adoption across our teams, we formed an AI Guild, an exclusive hub of practitioners who shape standards, share patterns, and unblock teams. We also appointed AI Champions across groups to act as “field agents” for enablement.
We opened enrichment sessions (news, initiatives, success stories) to the broader org, where hundreds joined live. Most importantly, we ran hands‑on workshops that lifted people from basic usage to advanced techniques in a single day.
This matters because adoption accelerates when practitioners have a community, a playbook, and visible role models.
From theory to practice: hackathons
To cement habits, we hosted internal gen AI Hackathons focused on real day‑to‑day problems. Think of these as “dry runs” before touching core product code; practical building beats theoretical training every time.
In the weeks leading up to the event, we prepared our teams for success:
- Each team nominated a representative to complete an AI course, such as "How to Architect AI Agents" or "How RAG Works"
- We ran architecture sessions to finalize the design choices ahead of time
As a result, when the hackathon day arrived, teams were genuinely ready to deploy. Several projects shipped to production within weeks, proving that experimentation can — and should — translate into operational value.
Transparency drives adoption
We published team‑level adoption scorecards so groups could benchmark themselves, set targets, and compete in a friendly way. We also analyzed friction by IDE. For example, we observed higher acceptance and interaction rates in VS Code than in some other IDEs, so part of the adoption plan included nudging teams toward VS Code where appropriate. Visibility plus practical guidance beats mandates.
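A scorecard like this can be built from the usage telemetry most coding assistants already expose. Below is a minimal sketch; the schema, field names, and sample numbers are illustrative assumptions, not Varonis' actual dashboard:

```python
from dataclasses import dataclass

@dataclass
class TeamUsage:
    """Weekly gen AI usage stats for one engineering team (hypothetical schema)."""
    team: str
    engineers: int
    active_users: int          # engineers who used the assistant this week
    suggestions_shown: int
    suggestions_accepted: int

def scorecard(rows: list[TeamUsage]) -> list[dict]:
    """Rank teams by activation rate, then acceptance rate, for a shared dashboard."""
    out = []
    for r in rows:
        out.append({
            "team": r.team,
            "activation_pct": round(100 * r.active_users / r.engineers, 1),
            "acceptance_pct": round(100 * r.suggestions_accepted / max(r.suggestions_shown, 1), 1),
        })
    # Highest activation first; acceptance breaks ties
    out.sort(key=lambda d: (d["activation_pct"], d["acceptance_pct"]), reverse=True)
    return out

rows = [
    TeamUsage("platform", 10, 9, 1200, 420),
    TeamUsage("detections", 8, 5, 600, 150),
]
for line in scorecard(rows):
    print(line)
```

Publishing the ranked output org‑wide is what turns the numbers into the friendly competition described above.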
Engineering outcomes, not just usage
Beyond activation, we measured pull‑request (PR) dynamics where value becomes undeniable:
- PR coding time improved by 152% (roughly 2.5× faster)
- PR review time decreased by 75%
- PR cycle time decreased by 96%
- Post‑review change rate decreased by 41%
These are business outcomes — faster throughput with fewer quality surprises — and the metrics leaders care about.
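Metrics like these can be derived from timestamps your code host already records for every PR. A minimal sketch of the derivation, assuming four timestamps per PR (the field names are illustrative; map them from your code host's API):

```python
from datetime import datetime

def pr_metrics(first_commit: datetime, pr_opened: datetime,
               first_review: datetime, merged: datetime) -> dict:
    """Derive PR timing metrics from four lifecycle timestamps (hypothetical mapping)."""
    return {
        "coding_time": pr_opened - first_commit,   # work done before the PR opens
        "review_time": merged - first_review,      # first review to merge
        "cycle_time": merged - first_commit,       # end to end
    }

m = pr_metrics(
    first_commit=datetime(2025, 1, 6, 9, 0),
    pr_opened=datetime(2025, 1, 6, 15, 0),
    first_review=datetime(2025, 1, 7, 10, 0),
    merged=datetime(2025, 1, 7, 16, 0),
)
print(m["cycle_time"])  # 1 day, 7:00:00
```

Averaging these per team, before and after rollout, is enough to produce the "with/without" comparisons described earlier.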
The framework we followed
We organized the journey into five phases:
- Detect Enablers: Identify influential leaders in every engineering group.
- Rollout: Grant licenses broadly, collect feedback, amplify success stories.
- Monitoring: Track activation, usage depth, and goals; define metrics that matter.
- Level‑Up: AI Guild, workshops, champions, hackathons to drive advanced capability.
- Completion: Normalize AI in delivery, reviews, and ops; keep improving with data.
This gave Varonis a clear path and a shared language for progress.
The cultural impact at Varonis
With speed and quality improving, our engineers adopted a builder’s mindset toward AI. Since the last hackathon, a community has emerged that swaps patterns and ships with confidence.
The point isn’t novelty for novelty’s sake — it’s to cultivate an organization that learns and delivers better because AI is embedded.
Culture as a system: The AI Hub
Our internal AI Hub is the organizational backbone that turns AI from a tool to a culture. It’s a web app that centralizes how teams discover, use, and measure AI — so adoption is consistent, secure, and tied to outcomes.
Our AI Hub includes:
- Custom domain agents connected to sources: Teams publish agents that speak the language of their domain and connect securely to internal data (e.g., Jira, GitHub, Jenkins, Salesforce). Each agent abstracts workflows into natural‑language tasks with guardrails and auditability.
- MY AI score (personal adoption dashboard): A user‑level view of activation and impact: interaction depth, code acceptance, PR review/cycle time, and post‑review changes, so every engineer can see how AI improves their delivery and where to level up.
- MCP catalog (discoverable capabilities): A searchable registry of MCP‑based tools and integrations. Engineers browse, preview, and plug capabilities into their agents without reinventing the wheel.
- Knowledge bases and guild recordings: Org KBs, best‑practice articles, and recorded AI Guild sessions are indexed for retrieval. The Hub’s chat surfaces clips, notes, and references inline, turning learning assets into working context.
How your organization can adopt AI
To drive gen AI adoption in your organization, start with approved access and leadership buy‑in, build a guild and champions, run hands‑on workshops and hackathons, measure relentlessly, and ship real AI‑powered outcomes.
This can result in near‑universal adoption, faster delivery, fewer production bugs, and a steady stream of innovation.
What should I do now?
Below are three ways you can continue your journey to reduce data risk at your company:
Schedule a demo with us to see Varonis in action. We'll personalize the session to your org's data security needs and answer any questions.
See a sample of our Data Risk Assessment and learn the risks that could be lingering in your environment. Varonis' DRA is completely free and offers a clear path to automated remediation.
Follow us on LinkedIn, YouTube, and X (Twitter) for bite-sized insights on all things data security, including DSPM, threat detection, AI security, and more.