
Building Time Series Foundational Models: Past, Present and Future

Virtual
About Event

The Data Phoenix team invites you to our upcoming webinar, which will take place on July 18th at 10 a.m. PDT.

  • Topic: Building a time series foundational model: past, present, and future

  • Speaker: Leo Pekelis (Chief Scientist at Gradient)

  • Participation: free (but you’ll be required to register)

Time series data is ubiquitous across industries: a startup COO predicts customer demand; a clinician in the ICU reads medical charts; a stockbroker forecasts security prices. In the past, a technical and domain expert would build, train, and deploy a new model for each task in each industry's swim lane. This fragmentation is a massive intellectual bottleneck!

Luckily, transformer architectures, which enable zero-shot sequence modeling across modalities, are a natural fit. We introduce time series as a new transformer modality, in which massive amounts of domain knowledge are taught to large time series models (LTSMs), forming a universal prior across forecasting, imputation, classification, and anomaly detection tasks.
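
To make the idea concrete, here is a minimal, illustrative PyTorch sketch of the patch-based approach common to published large time series models: raw values are chunked into fixed-length patches that serve as tokens, a shared transformer backbone encodes them, and lightweight task heads reuse that shared representation for different tasks. This is a hypothetical sketch of the general technique, not Gradient’s actual architecture; all names and hyperparameters below are assumptions for illustration only.

import torch
import torch.nn as nn

class TinyLTSM(nn.Module):
    # Hypothetical toy model: patching + shared backbone + task-specific heads.
    def __init__(self, patch_len=16, d_model=64, n_heads=4, n_layers=2, horizon=16):
        super().__init__()
        self.patch_len = patch_len
        # Each fixed-length patch of raw values becomes one "token".
        self.embed = nn.Linear(patch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        # Task heads share the same backbone: the "universal prior".
        self.forecast_head = nn.Linear(d_model, horizon)  # predict next `horizon` values
        self.classify_head = nn.Linear(d_model, 2)        # e.g., anomaly vs. normal

    def forward(self, x):
        # x: (batch, length). Split each series into non-overlapping patches.
        patches = x.unfold(1, self.patch_len, self.patch_len)  # (batch, n_patches, patch_len)
        hidden = self.backbone(self.embed(patches))            # (batch, n_patches, d_model)
        summary = hidden[:, -1]                                # last patch summarizes the context
        return self.forecast_head(summary), self.classify_head(summary)

model = TinyLTSM()
series = torch.randn(8, 128)         # a batch of 8 example series of length 128
forecast, logits = model(series)
print(forecast.shape, logits.shape)  # torch.Size([8, 16]) torch.Size([8, 2])

In a real LTSM, the backbone is pretrained on massive, cross-domain time series data, so the same weights can be reused zero-shot across forecasting, imputation, classification, and anomaly detection.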

Join us as we review this next frontier of AI, showcasing Gradient’s LTSM: a novel architecture and a massive time series dataset that together achieve state-of-the-art performance on time series tasks. Our foundational model and datasets are fully open-sourced. Finally, we preview multimodal foundational time series models, which make working with time series data as easy as prompting ChatGPT.

Key Highlights of the Webinar:

  • Cross-Industry Time Series Analysis: Learn how time series data is used in diverse fields, from demand forecasting in retail and ICU patient monitoring in healthcare to stock-price prediction in finance.

  • Introduction to Transformer Modalities in Time Series: Discover how transformer architectures can be applied to time series data, and how these models perform zero-shot learning across different types of time series tasks.

  • State-of-the-Art Time Series Models: Get an in-depth look at Gradient’s LTSM (Large Time Series Model), a novel architecture paired with a massive time series dataset, achieving state-of-the-art performance on time series tasks.

  • Preview of Multimodal Foundational Models: Get an early look at upcoming advancements in multimodal foundational models that promise to make working with time series data as easy as prompting ChatGPT.

Speaker

Leo is the Chief Scientist at Gradient, a full-stack AI platform that enables businesses to build customized agents to power enterprise workloads, where he leads research and analytics. Prior to Gradient, Leo led CloudTruck’s ML and data science organizations, pioneering the application of ML to operational challenges. Before that, he held leadership roles at Opendoor, Optimizely, and Disney. Leo holds a bachelor’s degree in economics as well as a master’s and a PhD in statistics, all from Stanford.

Please join the Data Phoenix Discord and follow us on LinkedIn and YouTube to stay updated on our community events and the latest AI and data news.
