Unstructured Data Meetup SF
This is an in-person event! Registration is required to get in. GitHub will email you a form the day before the event, which you will need to complete to receive your access pass. Registration will close 2 days before the event.
Topic: Connecting your unstructured data with Generative LLMs
What we’ll do:
Have some food and refreshments. Hear three exciting talks about LLMs and unstructured data.
5:30 - 6:30 - Welcome/Networking/Registration
6:35 - 7:00 - Christy Bergman, Developer Advocate, Zilliz
7:05 - 7:30 - Alexy Khrabrov, OSS Community Director, IBM and Chair, Generative AI Commons at the Linux Foundation
7:35 - 8:00 - George Williams, Organizer, Big-ANN NeurIPS 2023
8:00 - 8:30 - Networking
Who should attend:
Anyone interested in talking and learning about Unstructured Data and LLM Apps.
When:
January 16th, 2024
5:30 PM
Tech Talk 1: Vector Search in the Age of OpenAI Assistants using Milvus
Speaker: Christy Bergman
Abstract: The OpenAI Assistants API allows you to build AI assistants within your own applications. An Assistant has instructions and can leverage models, tools, and knowledge to respond to user queries. The Assistants API currently supports three types of tools: Code Interpreter, Retrieval, and Function calling. I will talk about, and show a few examples of, what you can and cannot do by invoking the OpenAI Assistant directly to get final answers, versus using a custom execution loop for Retrieval.
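To make that contrast concrete, here is a minimal, illustrative Python sketch (not code from the talk): Option A hands document search to the Assistant's built-in Retrieval tool, while Option B runs a custom execution loop that retrieves context from Milvus and then calls a plain chat completion. The collection name "meetup_docs", its "text" field, and the model choices are assumptions for illustration only.

```python
# Illustrative sketch only (not code from the talk). It contrasts the
# Assistants API's built-in Retrieval tool with a custom execution loop
# that retrieves context from Milvus. The collection "meetup_docs", its
# "text" field, and the model names are assumptions for illustration.
from openai import OpenAI
from pymilvus import MilvusClient

openai_client = OpenAI()

# Option A: let the Assistant's built-in Retrieval tool search attached files.
# (To actually use it, you would create a thread, add a message, and start a run.)
assistant = openai_client.beta.assistants.create(
    name="docs-helper",
    instructions="Answer questions using the attached documents.",
    model="gpt-4-turbo-preview",
    tools=[{"type": "retrieval"}],
)

# Option B: custom execution loop -- embed the question, search Milvus,
# then pass the retrieved context to a regular chat completion.
milvus = MilvusClient(uri="http://localhost:19530")

def answer_with_custom_retrieval(question: str) -> str:
    # Embed the question.
    emb = openai_client.embeddings.create(
        model="text-embedding-ada-002", input=question
    ).data[0].embedding
    # Vector search in Milvus for the top matching chunks.
    hits = milvus.search(
        collection_name="meetup_docs",
        data=[emb],
        limit=3,
        output_fields=["text"],
    )
    context = "\n".join(hit["entity"]["text"] for hit in hits[0])
    # Generate the final answer from the retrieved context.
    resp = openai_client.chat.completions.create(
        model="gpt-4-turbo-preview",
        messages=[
            {"role": "system", "content": f"Answer using this context:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return resp.choices[0].message.content
```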
Tech Talk 2: theAlliance.ai: Defining AI Adoption as a Community
Speaker: Alexy Khrabrov
Abstract: The AI Alliance has more than 60 members, including Zilliz; its mission is to ensure and evaluate high-quality and safe AI through benchmarks, tools, and methodologies. The Alliance encompasses leading AI organizations, including Hugging Face, LangChain, and LlamaIndex, top research universities, hardware giants such as Dell, Intel, and AMD, foundations such as NumFOCUS and the Linux Foundation, and more.
Tech Talk 3: Are CPUs Enough? A Review Of Vector Search Running On Novel Hardware
Speaker: George Williams
Abstract: The Cambrian explosion of artificial intelligence has ushered in a new golden age for computer hardware. While Nvidia claims the lion's share of headlines thanks to its dominance in deep learning workloads, several other hardware alternatives are starting to appear on the market. AI-adjacent technologies like vector search will also reap the benefits of this hardware renaissance. In this talk, I'll cover the intersection of vector search and advanced hardware, hardware that goes far beyond the traditional CPU design that has dominated the computing landscape for over 60 years. I'll review the results from the first NeurIPS BigANN hardware competition, discuss the benchmarks that are key to differentiating hardware alternatives, and look at new chip architectures that will fuse vector search and transformer inference in one device. No prior knowledge about hardware is necessary for this talk. I will describe, at a high level, the fundamental differences among CPU alternatives such as GPUs, vector processors, compute-in-memory microprocessors, AI accelerators, neuromorphic chips, and Field Programmable Gate Arrays (FPGAs).
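As background for the benchmarking discussion, here is a minimal sketch of the kind of recall-versus-throughput measurement that BigANN-style benchmarks report. It uses a toy random dataset, and the "approximate" search (probing a random 10% of the vectors) is a deliberately crude stand-in for real indexes; it is not the competition's actual harness.

```python
# Illustrative sketch (not the competition harness): how BigANN-style
# benchmarks compare systems -- measure recall@k against exact ground
# truth, and throughput in queries per second. The dataset is random and
# the "approximate" search is a stand-in for real indexes (IVF, HNSW,
# quantization) running on whatever hardware is being evaluated.
import time
import numpy as np

rng = np.random.default_rng(0)
base = rng.standard_normal((50_000, 96)).astype(np.float32)   # indexed vectors
queries = rng.standard_normal((200, 96)).astype(np.float32)
k = 10

def knn(qs, xs, k):
    # Exact k-NN using the expanded squared-L2 distance.
    d = (qs**2).sum(1)[:, None] - 2.0 * qs @ xs.T + (xs**2).sum(1)[None, :]
    return np.argpartition(d, k, axis=1)[:, :k]

# Crude "approximate" search: only probe a random 10% of the base set.
probe = rng.choice(len(base), size=len(base) // 10, replace=False)

t0 = time.perf_counter()
approx = probe[knn(queries, base[probe], k)]   # map ids back to the full set
qps = len(queries) / (time.perf_counter() - t0)

truth = knn(queries, base, k)                  # exact ground truth
recall = np.mean([len(set(a) & set(t)) / k for a, t in zip(approx, truth)])
print(f"recall@{k} = {recall:.3f}, throughput ~ {qps:.0f} QPS")
```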
Where:
This is an in-person event! Registration is required to get in. Registration will close 2 days before the event.
Sponsored by Zilliz, maintainers of Milvus.