BSLIVE / Using ControlNet and Stable Diffusion AI for Animated Dance Video

Jimmy Gunawan
In this video, I am sharing an update on my experiments with ControlNet and OpenPose for the Stable Diffusion AI tool; this time I use it to generate a dance video, not just funny poses.

Ideally, every computer artist would have a motion capture tool and make their 3D avatar dance. You need a good-looking avatar, then you render it out using Blender or another renderer, or just have it dance inside a game engine, and finally publish the result as a video or in some other form.

But with AI tools, and with ControlNet in particular, we can almost simulate and generate a kind of augmentation layer from just a video, turning it into something else. It's really powerful. Of course, this has always been possible: you could always trace or hand-draw anything. But you cannot do that overnight; it usually takes many days or weeks.

But with AI, you might soon be able to take a character that you like, train it yourself, and make it dance or even act. It's almost like a realtime Instagram or TikTok filter: AI can turn you into a different representation. For now, it still takes some processing to get the best quality, but in the future? Maybe realtime!
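
Below is a minimal Python sketch of that per-frame idea, using Hugging Face's diffusers library with the lllyasviel/sd-controlnet-openpose checkpoint and the controlnet_aux OpenPose detector. The input file name, prompt, and settings are placeholders of my own, not the exact workflow from the video (see Olivio Sarikas's tutorial linked below for the actual setup).

import cv2
import torch
from PIL import Image
from controlnet_aux import OpenposeDetector
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

# Load the OpenPose preprocessor and a ControlNet trained on pose maps.
pose_detector = OpenposeDetector.from_pretrained("lllyasviel/ControlNet")
controlnet = ControlNetModel.from_pretrained(
    "lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=controlnet,
    torch_dtype=torch.float16,
).to("cuda")

# Walk through the dance video frame by frame: extract the pose skeleton,
# then generate a new character that follows the same pose.
video = cv2.VideoCapture("dance.mp4")  # hypothetical input clip
frame_idx = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    # OpenCV returns BGR arrays; the detector expects an RGB PIL image.
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    pose_map = pose_detector(Image.fromarray(rgb))
    result = pipe(
        "an anime character dancing on stage",  # example prompt
        image=pose_map,
        num_inference_steps=20,
        # Reusing one seed for every frame reduces (but does not remove)
        # the frame-to-frame flicker typical of this approach.
        generator=torch.Generator("cuda").manual_seed(42),
    ).images[0]
    result.save(f"frame_{frame_idx:05d}.png")
    frame_idx += 1
video.release()

The saved frames can then be reassembled into a clip, for example with ffmpeg -framerate 24 -i frame_%05d.png out.mp4.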

* Pose:
https://huggingface.co/spaces/jonigat...
https://avatarposemaker.deezein.com

* Tutorial by Olivio Sarikas on how to set up Stable Diffusion ControlNet to generate animated video:
ControlNET to Video - Stable Diffusio...