
▶️ Demo

▶️ AI-generated Elon Musk

Avatarify

Photorealistic avatars for Skype and Zoom. Democratized.

Based on First Order Motion Model.

Created by: Ali Aliev and Karim Iskakov.

Disclaimer: This project is unrelated to Samsung AI Center.

News

  • 17 April 2020. Created a Slack community. Please join via the invitation link.
  • 15 April 2020. Added StyleGAN-generated avatars. Just press Q and you'll drive a person who never existed. Every time you press the button, a new avatar is sampled.
  • 13 April 2020. Added Windows support (kudos to 9of9).

Requirements

To run Avatarify smoothly you need a CUDA-enabled (NVIDIA) video card. Otherwise it will fall back to the CPU and run very slowly. Here are performance figures for some hardware:

  • GeForce GTX 1080 Ti: 33 fps
  • GeForce GTX 1070: 15 fps
  • macOS (MacBook Pro 2018; no GPU): very slow, ~1 fps

Of course, you also need a webcam!

Install

Download network weights

Download the model weights from Dropbox, Mega, Yandex.Disk or Google Drive [716 MB, md5sum 46b26eabacbcf1533ac66dc5cf234c5e].
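
If you want to make sure the file downloaded correctly, you can check it against the md5sum above; a minimal check, assuming the archive is in your current directory (on Mac, use md5 instead of md5sum):

# Should print 46b26eabacbcf1533ac66dc5cf234c5e
md5sum vox-adv-cpk.pth.tar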

Linux

On Linux, Avatarify uses v4l2loopback to create a virtual camera.
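
The install and run scripts take care of this automatically; purely for reference, a virtual camera like the /dev/video9 device mentioned in the Run section can be created by hand roughly as follows (the module options shown are an assumption and may differ from what the scripts actually pass):

# Load the v4l2loopback kernel module and expose one virtual camera at /dev/video9
sudo modprobe v4l2loopback devices=1 video_nr=9 card_label="avatarify" exclusive_caps=1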

  1. Install CUDA.
  2. Download Miniconda Python 3.7 and install it with the command:
bash Miniconda3-latest-Linux-x86_64.sh
  3. Clone avatarify and install its dependencies (sudo privilege is required):
git clone https://github.com/alievk/avatarify.git
cd avatarify
bash scripts/install.sh
  4. Download network weights and place the vox-adv-cpk.pth.tar file in the avatarify directory (don't unpack it).

Mac

(!) Note: we found that Zoom disabled support for virtual cameras on Mac in versions after v4.6.8 (March 23, 2020). To use Avatarify in Zoom you have two options:

  • Install Zoom v4.6.8, the last version that supports virtual cameras
  • Use the latest version of Zoom, but disable library validation:
codesign --remove-signature /Applications/zoom.us.app

On Mac it's quite difficult to create a virtual camera, so we'll use the CamTwist app.

  1. Install Miniconda Python 3.7 or use Homebrew Cask: brew cask install miniconda.
  2. Clone avatarify and install its dependencies:
git clone https://github.com/alievk/avatarify.git
cd avatarify
bash scripts/install_mac.sh
  3. Download network weights and place the vox-adv-cpk.pth.tar file in the avatarify directory (don't unpack it).
  4. Download and install CamTwist from here. It's easy.

Windows

Video tutorial is coming!

This guide is tested for Windows 10.

  1. Install CUDA.
  2. Install Miniconda Python 3.7.
  3. Install Git.
  4. Press the Windows key and type "miniconda". Run the suggested Anaconda Prompt.
  5. Download and install Avatarify (please copy-paste these commands and don't change them):
git clone https://github.com/alievk/avatarify.git
cd avatarify
scripts\install_windows.bat
  6. Download network weights and place the vox-adv-cpk.pth.tar file in the avatarify directory (don't unpack it).
  7. Run run_windows.bat. If the installation was successful, two windows, "cam" and "avatarify", will appear. Leave these windows open for the next installation steps. If there are multiple cameras (including virtual ones) in the system, you may need to select the correct one: open scripts\settings_windows.bat and edit the CAMID variable. CAMID is the camera's index number (0, 1, 2, ...); see the sketch after this list.
  8. Install OBS Studio for capturing the Avatarify output.
  9. Install the VirtualCam plugin. Choose Install and register only 1 virtual camera.
  10. Run OBS Studio.
  11. In the Sources section, press the Add button ("+" sign), select Windows Capture and press OK. In the window that appears, choose "[python.exe]: avatarify" in the Window drop-down menu and press OK. Then select Edit -> Transform -> Fit to screen.
  12. In OBS Studio, go to Tools -> VirtualCam. Check AutoStart, set Buffered Frames to 0 and press Start.
  13. The OBS-Camera device should now be available in Zoom (or other videoconferencing software).
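
For step 7, the exact contents of scripts\settings_windows.bat may differ between versions, but the camera index is set with an ordinary batch assignment along these lines (the value 1 is just an example):

rem Hypothetical sketch of the relevant line in scripts\settings_windows.bat
set CAMID=1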

Steps 11-12 are required only once, during setup.

Setup avatars

Avatarify comes with a standard set of avatars of famous people, but you can extend this set by simply copying your own avatars into the avatars folder.
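
For example, adding a new picture is just a file copy into that folder (the file name here is hypothetical):

# Run from the avatarify directory; the new image will appear in the A/D avatar cycle
cp my_avatar.jpg avatars/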

Follow this advice for better visual quality:

  • Make a square crop of your avatar picture.
  • Crop the avatar's face so that it's neither too close nor too far. Use the standard avatars as a reference.
  • Prefer pictures with a uniform background; this reduces visual artifacts.

Run

Your webcam must be plugged in.

Note: run Skype or Zoom only after Avatarify is started.

Linux

It is assumed that only one webcam is connected to the computer, at /dev/video0. The run script will create a virtual camera at /dev/video9. You can change these settings in scripts/settings.sh.

You can use the command v4l2-ctl --list-devices to list all video devices in your system. For example, if the webcam is /dev/video1, then the device id is 1.
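
The exact layout of scripts/settings.sh may differ between versions, but following that example the relevant setting is a plain shell assignment along these lines:

# Hypothetical sketch: use the index of your physical webcam (here /dev/video1)
CAMID=1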

Run:

bash run.sh

The cam and avatarify windows will pop up. The cam window is for controlling your face position, and the avatarify window previews the avatar animation. Please follow these recommendations to drive your avatars.

Mac

  1. Run:
bash run_mac.sh
  2. Go to CamTwist.
  3. Choose Desktop+ and press Select.
  4. In the Settings section choose Confine to Application Window and select python (avatarify) from the drop-down menu.

The cam and avatarify windows will pop up. The cam window is for controlling your face position, and the avatarify window previews the avatar animation. Please follow these recommendations to drive your avatars.

Windows

If there are multiple cameras (including virtual ones) in your system, you may need to select the correct one in scripts\settings_windows.bat. Open this file and edit the CAMID variable. CAMID is the camera's index number (0, 1, 2, ...).

  1. In the Anaconda Prompt:
cd C:\path\to\avatarify
run_windows.bat
  2. Run OBS Studio. It should automatically start streaming video from Avatarify to OBS-Camera.

The cam and avatarify windows will pop up. The cam window is for controlling your face position, and the avatarify window previews the avatar animation. Please follow these recommendations to drive your avatars.

Note: To reduce video latency, in OBS Studio right click on the preview window and uncheck Enable Preview.

Controls

Keys    Controls
1-9     Immediately switch between the first 9 avatars.
Q       Turn on a StyleGAN-generated avatar. Every press samples a new avatar.
0       Toggle avatar display on and off.
A/D     Previous/next avatar in the folder.
W/S     Zoom camera in/out.
Z/C     Adjust avatar target overlay opacity.
X       Reset the reference frame.
F       Toggle reference frame search mode.
R       Mirror the reference window.
T       Mirror the output window.
I       Show FPS.
ESC     Quit.

Driving your avatar

These are the main principles for driving your avatar:

  • Align your face in the camera window as closely as possible in proportion and position to the target avatar. Use the zoom in/out function (W/S keys). Once aligned, hit X to use this frame as the reference for driving the rest of the animation.
  • Use the overlay function (Z/C keys) to match your facial expression to the avatar's as closely as possible.

Alternatively, you can hit 'F' for the software to attempt to find a better reference frame itself. This will slow down the framerate, but while this is happening, you can keep moving your head around: the preview window will flash green when it finds your facial pose is a closer match to the avatar than the one it is currently using. You will see two numbers displayed as well: the first number is how closely you are currently aligned to the avatar, and the second number is how closely the reference frame is aligned.

You want to get the first number as small as possible - around 10 is usually a good alignment. When you are done, press 'F' again to exit reference frame search mode.

You don't need to be exact, and some other configurations can yield better results still, but it's usually a good starting point.

Configure video meeting app

Skype

Go to Settings -> Audio & Video, choose avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) camera.

Zoom

Go to Settings -> Video and choose avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) from Camera drop-down menu.

Slack

Make a call, allow the browser to use your camera, click the Settings icon, and choose avatarify (Linux), CamTwist (Mac) or OBS-Camera (Windows) in the Video settings drop-down menu.

Contribution

Our goal is to democratize deepfake avatars. To make the technology even more accessible, we have to tackle two major problems:

  1. Add support for more platforms (Linux, Mac and Windows are already supported).
  2. Optimize the neural network runtime. Running the network in real time on a CPU is a high priority.

Please make pull requests if you have any improvements or bug-fixes.

Troubleshooting

  • My avatar is distorted: Please follow these recommendations for driving your avatar.
  • Zoom/Skype doesn't see the avatarify camera: Restart Zoom/Skype and try again.
  • Avatar image is frozen: In Zoom, try Stop and Start Video.
  • bash run_mac.sh crashes with "Cannot open camera": Try changing CAMID in run_mac.sh from 0 to 1, 2, ...
  • pipe:0: Invalid data found when processing input: Make sure CAMID in scripts/settings.sh is correct. Use v4l2-ctl --list-devices to query available devices.
  • ASSERT: "false" in file qasciikey.cpp, line 501: If you have several keyboard layouts, switch to the English layout.
  • No such file or directory: 'vox-adv-cpk.pth.tar': Please follow the instructions in Download network weights.
