Posted on August 12, 2022
Mariel Pettee brought her passions for machine learning and the arts together to create Beyond Imitation, an open-source tool for movement-based research. She uses dance and a motion capture studio to train a variational autoencoder (VAE) model that runs in the browser via TensorFlow.js. The model learns to generate new movements in the dancer's style that complement their own. She shares how the idea came to life and her future plans for the project.
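To give a flavor of how a VAE can produce novel-but-similar movement, here is a minimal sketch of the VAE sampling ("reparameterization") step in plain JavaScript. This is illustrative only: the dimensions and numbers are hypothetical and not taken from the Beyond Imitation codebase, which uses TensorFlow.js rather than hand-rolled math.

```javascript
// Draw one standard-normal sample using the Box-Muller transform.
function randn() {
  const u = 1 - Math.random(); // avoid log(0)
  const v = Math.random();
  return Math.sqrt(-2 * Math.log(u)) * Math.cos(2 * Math.PI * v);
}

// Given the encoder's outputs (per-dimension mean and log-variance),
// sample a latent vector z = mu + sigma * eps. Decoding z through the
// decoder network would yield a new pose in the dancer's style.
function sampleLatent(mu, logVar) {
  return mu.map((m, i) => m + Math.exp(0.5 * logVar[i]) * randn());
}

// Hypothetical 4-dimensional latent space: small variances keep the
// sampled movement close to (but not identical to) the encoded one.
const mu = [0.1, -0.3, 0.0, 0.5];
const logVar = [-2, -2, -2, -2];
console.log(sampleLatent(mu, logVar));
```

Sampling near the encoded pose, rather than decoding the mean exactly, is what lets the model vary the dancer's movements instead of merely replaying them.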
Try it for yourself:
Live demo and learn more → https://goo.gle/3FY0Ozd
Mirror exercise → https://goo.gle/3NwjRD7
Want to be on the show? Use #MadeWithTFJS or #WebML to share your own creations on social media and we may feature you in our next show.
Catch more #MadeWithTFJS interviews → http://goo.gle/made-with-tfjs
Subscribe to the TensorFlow channel → https://goo.gle/TensorFlow