


AI infra & open-source models #3
We're hosting an AI infra & open-source models meetup in San Francisco during
Open Source AI Week, bringing together CUDA and ROCm experts, AI researchers, and infrastructure engineers interested in GPUs and open-source models. Join us for lightning talks and focused discussions.
This curated meetup gives people deeply involved in the AI space a chance to exchange ideas, share technical insights, and connect.
Schedule:
6:00pm – Arrival & chats
6:15pm – Lightning talks and demos (Part 1)
7:00pm – Break
7:15pm – Lightning talks and demos (Part 2)
8:00pm – Networking, food, and drinks
Each talk or demo runs for 5 minutes, followed by a 5-minute Q&A.
Talks:
TBA
Your hosts:
About dstack
dstack is an open-source container orchestrator that streamlines AI workload management and drives GPU utilization for ML teams. It works with any GPU cloud, on-prem cluster, or accelerated hardware.
About Lambda
Lambda is where AI teams find infinite scale to produce intelligence: from prototyping on on-demand compute to serving billions of users in production, we guide and equip the world's most advanced AI organizations to securely build and deploy AI products.
