Learn How to Reliably Monitor Your Data and Model Quality in the Lakehouse

Databricks
Developing and maintaining production data engineering and machine learning pipelines is a challenging process for many data teams. Monitoring the quality of your data and models once they go into production is even more challenging. Building on untrustworthy data can cause many complications for data teams. Without a monitoring service, it is hard to proactively discover when your ML models degrade over time, or to find the root causes behind the degradation. And without lineage tracking, debugging errors in your models and data is more painful still. Databricks Lakehouse Monitoring offers a unified service to monitor the quality of all your data and ML assets.

In this session, you’ll learn how to:

- Use one unified tool to monitor the quality of any data product: data or AI
- Quickly diagnose errors in your data products with root cause analysis
- Set up a monitor with low friction, requiring only a button click or a single API call to start and automatically generate out-of-the-box metrics
- Enable self-serve experiences for data analysts by providing reliability status for every data asset
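As a rough illustration of the "single API call" setup described above, the sketch below assembles the arguments for a snapshot-profile monitor as they appear in the Databricks Python SDK (`databricks-sdk`); the table name, schema, and assets directory are hypothetical placeholders, not values from the talk.

```python
# Minimal sketch of configuring a Lakehouse Monitoring monitor.
# All names below (table, schema, assets dir) are illustrative assumptions.

def monitor_config(table_name: str, output_schema: str) -> dict:
    """Build the keyword arguments for creating a snapshot-profile monitor
    on a Unity Catalog table."""
    return {
        "table_name": table_name,                         # table to monitor
        "assets_dir": f"/Shared/monitors/{table_name}",   # where generated dashboards live
        "output_schema_name": output_schema,              # schema for the metric tables
    }

# With the SDK installed and workspace auth configured, this config would be
# passed to the monitor-creation call along with a profile type, e.g.:
#
#   from databricks.sdk import WorkspaceClient
#   from databricks.sdk.service.catalog import MonitorSnapshot
#   w = WorkspaceClient()
#   w.quality_monitors.create(snapshot=MonitorSnapshot(),
#                             **monitor_config("main.sales.orders", "main.sales"))
#
# The out-of-the-box metric tables and dashboard are then generated automatically.

cfg = monitor_config("main.sales.orders", "main.sales")
print(cfg["assets_dir"])
```

The config is kept as a plain dict so the same setup can be reused across tables; only the profile type (snapshot, time series, or inference) changes per asset.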

Talk by: Kasey Uhlenhuth and Alkis Polyzotis

Connect with us:
Website: https://databricks.com
Twitter: databricks
LinkedIn: databricks
Instagram: databricksinc
Facebook: databricksinc
Published 12 months ago, on 1402/05/03.