Securing Gen AI RAG Data using Azure AI Search

Large Language Models (LLMs) and Generative AI applications have inherent limitations: outdated knowledge, no access to private data, and a tendency to hallucinate. In this session, we will introduce a strategy for overcoming these challenges: Retrieval-Augmented Generation (RAG). Attendees will see how a GenAI RAG application can ground responses in real-time, private data stored in an external knowledge base without fine-tuning the base LLM.

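The retrieval pattern described above can be sketched in a few lines. This is a deliberately minimal stand-in, not the Azure AI Search API: the in-memory knowledge base, the keyword-overlap scoring, and the prompt template are all hypothetical simplifications of what a real deployment would do against a vector or keyword index.

```python
# Minimal RAG sketch: retrieve relevant documents, then augment the
# user's question with that context before sending it to an LLM.
# All data and scoring here are illustrative placeholders.

KNOWLEDGE_BASE = [
    "2025 security policy: all search indexes must use Entra ID auth.",
    "The HR portal migrated to Azure Container Apps in March.",
    "Lunch menu for Friday: pizza.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the question."""
    q_terms = set(question.lower().split())
    scored = sorted(
        KNOWLEDGE_BASE,
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str) -> str:
    """Augment the question with retrieved context for the LLM call."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

prompt = build_prompt("What auth must search indexes use?")
```

Because the model answers from retrieved context rather than from its weights, the knowledge base can hold fresh, private data the base LLM was never trained on.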
With that foundation in place, we will examine an example cloud infrastructure hosting the application on Azure AI Search, Azure Storage, and Azure Container Apps. The architecture review will uncover attack vectors and cloud security misconfigurations that can unintentionally expose RAG data to an attacker, and attendees will see how these weaknesses can be exploited to gain unauthorized access to AI data. Finally, we will walk through the cloud security controls needed to properly authorize access to the RAG data.

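As a rough illustration of the kind of misconfiguration review described above, the sketch below flags two common exposure points on a search service: key-based (local) auth left enabled and public network access left open. The input dict mirrors the shape of an ARM `Microsoft.Search/searchServices` resource; the property names (`disableLocalAuth`, `publicNetworkAccess`) are assumptions drawn from that schema, so verify them against your API version before relying on this.

```python
# Hedged sketch: flag Azure AI Search settings that can expose RAG data.
# Property names assume the ARM Microsoft.Search/searchServices schema.

def audit_search_service(resource: dict) -> list[str]:
    """Return a list of findings for a search service resource dict."""
    props = resource.get("properties", {})
    findings = []
    # API keys are a single shared secret; Entra ID (RBAC) is preferred.
    if not props.get("disableLocalAuth", False):
        findings.append("API keys enabled: a leaked admin key grants full index access")
    # A publicly reachable endpoint widens the attack surface of the index.
    if props.get("publicNetworkAccess", "enabled").lower() != "disabled":
        findings.append("Service reachable from the public internet")
    return findings

risky = {"properties": {"disableLocalAuth": False, "publicNetworkAccess": "enabled"}}
hardened = {"properties": {"disableLocalAuth": True, "publicNetworkAccess": "disabled"}}
```

In practice, a hardened service would pair these settings with private endpoints and Entra ID role assignments scoped to the application identity.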
Attendees will walk away with an understanding of GenAI RAG applications, the cloud infrastructure that powers them, and the security controls needed to protect sensitive RAG data.

Learning Objectives:

  • Review GenAI RAG application architecture
  • Identify misconfigurations in GenAI RAG cloud infrastructure
  • Learn GenAI RAG cloud security controls

Event Topic
Cybersecurity, Data Center / Infrastructure, Security
Relevant Audiences
All State and Local Government, All Federal Government
Other Agency
Other Federal Agencies
Event Type
Virtual / Online
Event Subtype
Webinar / Webcast
When
Tue, Dec 09, 2025 | 12:00 pm - 1:00 pm ET
Registration Cost
Complimentary
Organizer
SANS Institute