Azure AI Content Safety Image Moderation

Microsoft Developer
Azure AI Studio lets you quickly try out moderation tests on image content. The Moderate image content tool considers factors such as the type of content, the platform's policies, and the potential effect on users. You can run moderation tests on sample content, then configure filters and rerun the tests to further refine the results.
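Outside the Studio UI, the same kind of image moderation test can be run programmatically against the Azure AI Content Safety service. Below is a minimal sketch assuming the azure-ai-contentsafety Python SDK; the endpoint, key, and image path are placeholders, not values from the demo.

```python
# Minimal sketch: run a single image moderation test with the Azure AI Content
# Safety SDK for Python (azure-ai-contentsafety). Endpoint, key, and image path
# are placeholders.
from azure.ai.contentsafety import ContentSafetyClient
from azure.ai.contentsafety.models import AnalyzeImageOptions, ImageData
from azure.core.credentials import AzureKeyCredential

client = ContentSafetyClient(
    endpoint="https://<your-resource>.cognitiveservices.azure.com",  # placeholder
    credential=AzureKeyCredential("<your-key>"),                     # placeholder
)

# Read the image to test and submit it for analysis.
with open("sample-image.png", "rb") as image_file:                   # placeholder path
    request = AnalyzeImageOptions(image=ImageData(content=image_file.read()))

response = client.analyze_image(request)

# Each harm category (Hate, SelfHarm, Sexual, Violence) comes back with a severity score.
for result in response.categories_analysis:
    print(f"{result.category}: severity {result.severity}")
```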

In this demo, we'll show the basic functionality of running a simple test, and how you can configure the filters and adjust the harm category threshold levels to suit your needs.
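Adjusting the harm category threshold levels in the Studio amounts to deciding, per category (Hate, SelfHarm, Sexual, Violence), which severity scores your application tolerates. The sketch below shows that filtering step on the response object from the previous example; the threshold values and the helper function are illustrative, not part of the SDK.

```python
# Sketch: apply per-category severity thresholds to an analyze_image result.
# Threshold values are illustrative only; image severities are typically
# reported on a 0 / 2 / 4 / 6 scale, with higher meaning more severe.
BLOCK_THRESHOLDS = {
    "Hate": 2,
    "SelfHarm": 2,
    "Sexual": 2,
    "Violence": 2,
}

def severity_for(categories_analysis, category_name):
    """Return the reported severity for one harm category, or 0 if absent."""
    for item in categories_analysis:
        if item.category == category_name:  # works for plain strings or str enums
            return item.severity or 0
    return 0

blocked = any(
    severity_for(response.categories_analysis, name) >= level
    for name, level in BLOCK_THRESHOLDS.items()
)
print("Blocked" if blocked else "Allowed")
```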

Disclosure: This demo contains an AI-generated voice.

Chapters:
00:00 - Introduction
00:40 - Test safe content
01:22 - Test harmful content
01:42 - Test AI-generated image

Resources:
Azure AI Studio - https://ai.azure.com
Harm Categories - https://aka.ms/harm-categories
Learn Module - https://aka.ms/aacs-studio-workshop
Published 2 months ago, on 1403/03/15.
332 views