cosmicBboy/movenet

movenet

Generate raw audio from dance videos

Background

movenet is a research project for generating music from dance. The idea is to turn the human body into an instrument, converting sequences of images into raw audio waveforms.

Dataset

This project trains its dance-to-audio generation model on the Kinetics dataset, which conveniently ships with a downloader that fetches both video and audio.

Environment

This repo uses miniconda to manage its virtual environment.

conda create -n movenet python=3.9
conda activate movenet
conda env update -n movenet -f env.yml
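The actual dependencies are pinned in the repo's env.yml; as a rough sketch of the shape such a file takes (the package list below is an assumption, not the repo's actual pins):

```yaml
name: movenet
channels:
  - pytorch
  - conda-forge
dependencies:
  - python=3.9
  - pytorch        # assumed: video-to-audio model training
  - torchaudio     # assumed: raw audio waveform handling
  - pip:
      - pytorch-lightning  # assumed: the trainer script below uses Lightning
```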

Install youtube-dl with whichever method suits your system.
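Before kicking off a large download, it can help to confirm the binary is actually reachable. A minimal check using only the Python standard library (the function name here is illustrative):

```python
import shutil

def youtube_dl_available() -> bool:
    """Return True if a youtube-dl binary is on the PATH."""
    return shutil.which("youtube-dl") is not None
```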


Download the Datasets

Creating the Kinetics Dataset

Clone the downloader

git clone https://github.com/hai-labs/kinetics-downloader

If you want to build the dataset from scratch, download it with:

cd kinetics-downloader
python download.py --categories "dancing" --num-workers <NUM_WORKERS> -v
cd ..
cp -R kinetics-downloader/dataset datasets/kinetics

Downloading the Data

You can also download the datasets from Google Drive. For example, you can dump the kinetics_debug directory into datasets/kinetics_debug.

Running the Trainer

python movenet/pytorch_lightning_trainer.py --dataset datasets/kinetics_debug --n_epochs 1

Running an Experiment on gridai

The experiments directory contains Makefiles for running jobs over various experimental setups.

source env/gridai
make -f experiments/<makefile> <target>
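Such a Makefile might look roughly like the following; the file name, variable, and target below are hypothetical, meant only to illustrate the `make -f experiments/<makefile> <target>` pattern:

```make
# experiments/example.mk -- hypothetical; real targets live in experiments/
DATASET ?= datasets/kinetics_debug

.PHONY: train
train:
	python movenet/pytorch_lightning_trainer.py --dataset $(DATASET) --n_epochs 1
```

With a file like this, the invocation would be `make -f experiments/example.mk train`.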
