Join the vLLM community to discuss optimizing LLM inference!

vLLM Inference Meetup: New Delhi, India

About Event

We are excited to invite you to the vLLM meetup in Delhi hosted by Red Hat.

This is your chance to connect with a growing community of vLLM users, developers, maintainers, and engineers from Red Hat. We'll dive into technical talks, share insights, and discuss our journey in optimizing LLM inference for performance and efficiency.

What to expect:
Technical insights
Networking with industry experts
Hands-on learning & demos

Agenda

09:30-10:00: Kick-off and Opening Remarks
10:00-10:30: How AI Inference Works
10:30-11:00: vLLM and Advanced Inference Techniques
11:00-11:30: LLM Compressor
11:30-12:00: Running GenAI Models on vLLM
12:00-12:30: Break

12:30-14:00: Hands-on Lab

Bring your laptop with an SSH client installed. GPU instances will be provided by the organizers.
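If you'd like a head start on the lab, the typical vLLM workflow on a remote GPU instance looks roughly like this. This is a minimal sketch, not the lab instructions: the instance address, username, model, and port below are placeholders, and the organizers will share the actual connection details on the day.

```shell
# 1. Connect to the GPU instance provided by the organizers
#    (address and username are placeholders).
ssh user@<gpu-instance-address>

# 2. Install vLLM (on a GPU machine this pulls in CUDA-enabled PyTorch).
pip install vllm

# 3. Start vLLM's OpenAI-compatible server with a small example model.
vllm serve Qwen/Qwen2.5-0.5B-Instruct --port 8000

# 4. From a second terminal, send a test completion request.
curl http://localhost:8000/v1/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "Qwen/Qwen2.5-0.5B-Instruct", "prompt": "Hello", "max_tokens": 16}'
```

The same server also exposes `/v1/chat/completions`, so any OpenAI-compatible client can talk to it once it is running.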

Hosts:
eprasad96@gmail.com
jpathani@redhat.com

Location
Worldmark Aerocity
Aerocity, Indira Gandhi International Airport, New Delhi, Delhi 110037, India
Red Hat India Pvt. Ltd., 5th Floor, West Wing, Worldmark 1, Aerocity, New Delhi - 110037. Phone: 011 66442029