
algorithmic rotoscope

Experimentation in applying texture to video using machine learning via APIs.

Highlighting Algorithmic Transparency Using My Algorotoscope Work

I started doing my algorotoscope work to better understand machine learning. I needed a hands-on project that would allow me to play with two overlapping slices of the machine learning pie: working with images and video. I wanted to understand what people meant when they said texture transfer or object recognition, and quantify the differences between machine learning providers. The goal was to pull the curtain back a little on one portion of machine learning, helping establish some transparency and observability.

Algorotoscope allows me to shine a light on machine learning, while also shining a light on the world of algorithms. I'm still learning what is possible, but the motivations behind my Ellis Island Nazi Poster reflection, and my White House Russian propaganda leaflet snapshot, are meant to help me understand machine learning texture transfer models, and apply them to images in a way that helps demonstrate how algorithms are obfuscating the physical and digital world around us. This work showcases how algorithms are being used to distort the immigration debate, our elections, and almost every other aspect of our professional and personal lives.

I understand technology by using it. Black box algorithms seem indistinguishable from magic to many folks, while they scare the hell out of me. Not because they contain magic, but because they contain exploitation, bias, and corruption, along with privacy and security nightmares. It is important to me that we understand the levers, knobs, dials, and gears behind algorithms. I am looking to use my algorotoscope work to help reduce the distortion field that often surrounds algorithms, and how their various incarnations are being marketed. I want my readers to understand that nothing they read, no image they see, or video they watch is free of algorithmic influence, and that algorithms are making the decisions about what they see, as well as what they do not see.

Algorotoscope is all about using machine learning to help us visualize the impact that algorithms are making on our world. I have no idea where the work is headed, except that I will keep working to generate relevant machine learning models trained on relevant images, then experiment with the application of these models as filters on images and video in a way that tells a story about how algorithms are distorting our world, and shifting how we view things both on and offline. I'm looking to move my API Evangelist storytelling to use 100% algorotoscope images, as I keep scratching the surface of how algorithms are invading our lives via the web, devices, and everyday objects.
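The workflow described above (train a texture-transfer model, then apply it as a filter across the frames of a video) can be sketched roughly as follows. This is a minimal illustration, not the author's actual pipeline: the `stylize` function is a hypothetical stand-in for whatever texture-transfer model or machine learning API is doing the real work, and the frames are represented abstractly.

```python
# Rough sketch of a per-frame texture-transfer ("algorotoscope") pipeline.
# stylize() is a hypothetical placeholder for a real style-transfer model
# or ML API call; here it just records which model touched each frame.

def stylize(frame, model):
    """Apply a texture-transfer model to a single frame (placeholder)."""
    # A real implementation would send the frame to a trained model or
    # an ML provider's API and receive a stylized frame back.
    return f"{frame}|styled-with:{model}"

def rotoscope(frames, model):
    """Apply the same model to every frame of a video, in order.

    Processing frame by frame keeps the approach simple; real video
    work often adds temporal smoothing so the texture does not flicker
    between consecutive frames.
    """
    return [stylize(frame, model) for frame in frames]

if __name__ == "__main__":
    # Hypothetical two-frame "video" run through a hypothetical model.
    result = rotoscope(["frame-000", "frame-001"], model="nazi-poster-style")
    print(result)
```

In practice the per-frame loop is the easy part; most of the effort in this kind of work goes into training the model on a meaningful set of source images so the resulting texture actually tells the intended story.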