Denver, February 19
Microsoft, AWS and Google subject matter experts discuss Generative AI - including LLM Model Comparisons, GenAI Cost Management, RAG Data Architecture, NIST AI Risk Management Framework, Responsible AI - plus Cybersecurity/Compliance, Data Analytics/Privacy and Application Modernization/Cloud Migration.
Angelbeat provides a unique opportunity to directly interact with, ask questions of, and learn from the world’s leading Cloud/AI firms in one focused program.
Scroll down to see the detailed agenda and speakers. Click on a speaker’s name to view their LinkedIn profile, or on a session title for additional information. CPE credit hours are provided.
While the event is designed for on-site attendees, every presentation is also Livestreamed via Zoom, like a regular webinar, for remote/online viewers.
Please use your Angelbeat account - created on the secure Memberspace platform - to sign up by clicking the green REGISTRATION button. You are automatically registered for the Zoom LiveStream, and can then request to attend in person (limited to 30 individuals, so act quickly). If you already have an account, you will be asked to confirm your registration. Otherwise, you will be prompted to fill out a simple contact form.
TIME: 9 am MT until Noon
Speakers, Topics, Agenda
9 am MT Ron Gerber, CEO Angelbeat
Angelbeat Overview and Introductory Comments
Ron summarizes the day’s agenda, describes how Angelbeat can organize private/custom AI workshops for your organization, and explains why you should document an Artificial Intelligence Risk Management Framework based on NIST standards. Click on the image above to watch Gerber’s videoblog/podcast, the_angel_beat, about Top AI Issues in 2025.
9:10 am Chris Wiederspan, Director Application Innovation, Azure, Microsoft
Microsoft OpenAI Generative AI Partnership Update
Learn about all the new innovations in the Microsoft and OpenAI partnership.
Practical Insights and GenAI Technical Deployment Recommendations
AI Data Architecture & Governance: RAG, Data Lakes, Data Privacy
Hallucinations/Deepfakes: AI-Generated Bad, Wrong, or Malicious Output/Content
Strategies to Optimize GenAI Output/Content: Prompt Engineering vs Fine Tuning vs Data Labeling
Hybrid Orchestrations and Models: LLM vs SLM vs Industry-Specific vs Functional-Specific
AI and ML workloads in Azure Kubernetes Service (AKS)
Artificial intelligence (AI) and machine learning (ML) are transforming the way we solve complex problems and deliver value to customers. Generative AI has produced a paradigm shift, transforming applications into data-driven, personalized experiences across text, code, images, videos, voice, music, and more. We've also seen an increase in model size and operational complexity, along with a growing need for security and cost-efficiency.
To develop cutting-edge AI models, you need to be able to allocate computing resources across diverse workloads. This includes model training, serving, inference, and managing auxiliary tasks across infrastructure and workflow orchestration.
It's essential to enable easy integration with open-source software, frameworks, and data platforms. Developing, refining, optimizing, deploying, and monitoring ML models can be challenging and complex. These inherent complexities require a platform, like AKS, that allows you to maximize the performance of your AI and ML workloads while reducing inefficiencies and bottlenecks.
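To make the AKS discussion concrete, here is a minimal sketch, assuming the official Kubernetes Python client and a GPU-enabled AKS node pool, of deploying a containerized inference service. The image name, labels, and GPU count are illustrative placeholders, not part of the session material.

```python
# Minimal sketch: deploy a GPU-backed inference service to AKS with the
# official Kubernetes Python client. Image, labels, and GPU count are
# illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()  # assumes credentials from `az aks get-credentials`

container = client.V1Container(
    name="llm-inference",
    image="myregistry.azurecr.io/llm-inference:latest",  # hypothetical image
    ports=[client.V1ContainerPort(container_port=8080)],
    resources=client.V1ResourceRequirements(
        limits={"nvidia.com/gpu": "1"},  # schedules the pod onto a GPU node pool
    ),
)

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="llm-inference"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "llm-inference"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "llm-inference"}),
            spec=client.V1PodSpec(containers=[container]),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```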
9:40 am AWS Keynotes
Todd Moore, Senior Solutions Architect
David Marsh, Senior Solutions Architect
AWS Security Best Practices that Accelerate Innovation
At AWS, security is the top priority. As cyber threats become increasingly complex, organizations implement more complex tools to mitigate them.
In this session, you will learn how automation, centralized security tooling, and AI-assisted DevOps processes contribute to a security posture that allows your teams to innovate quickly without compromising on security.
Amazon SageMaker Unified Studio: An Integrated Experience for All Your Data and AI
As AI and analytics use cases converge, SageMaker Unified Studio transforms how data teams work together.
Discover your data and put it to work using familiar AWS tools for complete development workflows, including model development, generative AI app development, data processing, and SQL analytics, in a single governed environment.
Create projects to collaborate with your teams, securely share AI and analytics artifacts, and access your data stored in Amazon Simple Storage Service (Amazon S3), Amazon Redshift, and other sources through Amazon SageMaker Lakehouse (a rough code sketch of the SQL-analytics piece follows below).
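SageMaker Unified Studio itself is a governed, UI-driven environment, but the "SQL analytics over data in S3" idea can be approximated from code. The sketch below is a hedged analogue that uses Amazon Athena via boto3 rather than the Unified Studio tooling; the database, table, and bucket names are placeholders.

```python
# Hedged analogue of SQL analytics over S3-resident data, using Amazon Athena
# through boto3 rather than the Unified Studio UI. All names are placeholders.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

query = athena.start_query_execution(
    QueryString="SELECT region, SUM(revenue) AS total FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics_db"},  # placeholder database
    ResultConfiguration={"OutputLocation": "s3://my-results-bucket/athena/"},
)
query_id = query["QueryExecutionId"]

# Poll until the query finishes, then fetch results.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```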
10:10 Speaker TBD
Unlocking AI-Driven Security: Trellix Wise and Amazon Bedrock in Action
In this session, attendees will explore how the integration of Amazon Bedrock telemetry with Trellix XDR delivers unparalleled observability into AI-enabled applications. This integration offers complete visibility into the outputs of generative AI (GenAI) workloads, helping to detect suspicious activities and malicious content. Attendees will also learn how to enforce guardrails around data privacy and promote responsible AI practices, ensuring the safe deployment and management of GenAI in their environments.
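As background for this session, here is a minimal, hedged sketch (not the Trellix integration itself) of attaching a pre-created Amazon Bedrock guardrail to a model call with boto3's Converse API. The guardrail ID/version and model ID are placeholders, and the enabled trace is the kind of guardrail telemetry an observability or XDR pipeline could consume.

```python
# Hedged sketch: call a Bedrock model through the Converse API with a
# pre-created guardrail attached, so turns that violate its content or
# privacy policies are blocked. Guardrail and model IDs are placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

response = bedrock.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": [{"text": "Summarize this customer record..."}]}],
    guardrailConfig={
        "guardrailIdentifier": "gr-example123",  # placeholder guardrail ID
        "guardrailVersion": "1",
        "trace": "enabled",  # emit guardrail trace for downstream monitoring
    },
)

print(response["stopReason"])  # 'guardrail_intervened' if the guardrail blocked the turn
print(response["output"]["message"]["content"][0]["text"])
```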
10:30 Roger Grimes, Data-Driven Defense Evangelist, KnowBe4
Hacking the Hacker: Assessing and Addressing Your Organization’s Cyber Defense Weaknesses
Cybercriminals are out there, watching and waiting for the perfect opportunity. They are gathering information about your organization and users, devising the perfect plan to infiltrate your defenses.
But with a strategic approach to cyber defense you can hack the hacker before they strike! Roger A. Grimes, Data-Driven Defense Evangelist at KnowBe4, exposes the mind of a hacker to help you see your cyber risks from the outside. You’ll learn:
How hackers collect “private” details about your organization and your users
The most common root causes that lead to damaging cyber attacks
Common mistakes made when designing cyber defenses and how to fix them
Data-driven strategies for mitigating your biggest weaknesses
Why a strong human firewall is your best, last line of defense
Get the details you need to know now to outsmart cybercriminals before you become their next victim.
10:50 am Tinu Anand, Cloud Strategy Leader, Google
Bring generative AI to real-world experiences quickly, efficiently, and responsibly, powered by Google’s most advanced technology and models including Gemini. Here are some of the specific topics to be addressed:
Build Applications and Experiences Powered by Generative AI: With Vertex AI, you can interact with, customize, and embed foundation models into your applications. Access foundation models on Model Garden, tune models via a simple UI on Vertex AI Studio, or use models directly in a data science notebook. Plus, with Vertex AI Agent Builder, developers can build and deploy AI agents grounded in their data.
Customize and Deploy Gemini Models to Production in Vertex AI: Gemini, a multimodal model from Google DeepMind, is capable of understanding virtually any input, combining different types of information, and generating almost any output. Prompt and test Gemini in Vertex AI using text, images, video, or code. With Gemini’s advanced reasoning and generation capabilities, developers can try sample prompts for extracting text from images, converting image text to JSON, and even generating answers about uploaded images.
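As a concrete illustration of the prompting workflow described above, here is a minimal sketch using the Vertex AI Python SDK; the project ID, Cloud Storage path, and exact Gemini model name are illustrative assumptions rather than session material.

```python
# Minimal sketch: prompt Gemini in Vertex AI with mixed image/text input and
# ask for JSON output. Project, bucket, and model name are placeholders.
import vertexai
from vertexai.generative_models import GenerativeModel, Part

vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")

image = Part.from_uri("gs://my-bucket/receipt.jpg", mime_type="image/jpeg")
response = model.generate_content(
    [image, "Extract every line item from this receipt and return the result as JSON."]
)

print(response.text)
```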
New Generation of AI Assistants for Developers, Google Cloud Services, and Applications: Gemini Code Assist offers AI-powered assistance to help developers build applications with higher velocity, quality, and security in popular code editors like VS Code and JetBrains, and on developer platforms like Firebase. Built with robust enterprise features, it enables organizations to adopt AI assistance at scale while meeting security, privacy, and compliance requirements. Additionally, Gemini for Google Cloud offerings assist users in working and coding more effectively, gaining deeper data insights, navigating security challenges, and more.