Tuesday November 12, 2024 5:00pm - 5:10pm MST
Over the past year, there has been significant work and discussion around running Large Language Models (LLMs) on Kubernetes. But what happens once your LLM is running in your cluster? How can you build an application around it in which the LLM acts as an agent, calling out to the services deployed in your cluster to solve complex tasks that would otherwise cause it to hallucinate? Is there a way to do this without changing the services and deployments already in your cluster? In this talk I will explore what an LLM agent is and how to build an agent that calls out to your services. I will demonstrate this with an LLM agent running in a Kubernetes cluster that automatically detects new tools and agents deployed to the cluster and coordinates them to complete complex tasks in a conversation with the user.
Speakers
Calum Murray

Knative Eventing Maintainer and UX Lead, University of Toronto, Canada
I'm a software engineer, and I love building cool things in open source. I like to seek out the most interesting and challenging problems which I think will have a large impact, and build creative solutions to them. I also like to share my passion for open source with others, and…
Salt Palace | Level 1 | 151 G