I used RunwayML to reconstruct this scene from The Last Jedi by having DeepLab label the objects in each frame, and then having SPADE try to repaint the original frame from just those labels.
(if you're really sensitive to flashing images, you might want to play this frame-by-frame - the algorithm couldn't make up its mind about the background)
youtu.be/sEd1EO8eIfw

More on the reconstruction process here: aiweirdness.com/post/186539822
This is the first step, where DeepLab segments an image into objects. "Lightsaber" is not a category it has, so it improvises.
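
For the curious, here's a minimal sketch of that segmentation step. It uses torchvision's pretrained DeepLabV3 as a stand-in for the DeepLab model RunwayML provides (different weights and class list, same idea: one class label per pixel); the frame filename is hypothetical.

```python
# Rough sketch of step 1: per-pixel semantic segmentation of a movie frame.
import numpy as np
import torch
from torchvision import models, transforms
from PIL import Image

# Stand-in model; RunwayML's DeepLab checkpoint and categories differ.
model = models.segmentation.deeplabv3_resnet101(pretrained=True).eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

frame = Image.open("frame_0001.png").convert("RGB")   # hypothetical frame
batch = preprocess(frame).unsqueeze(0)

with torch.no_grad():
    logits = model(batch)["out"][0]                   # (num_classes, H, W)

# One class index per pixel - this label map is what gets handed to SPADE.
labels = logits.argmax(0).byte().cpu().numpy()        # (H, W)
np.save("frame_0001_labels.npy", labels)
```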

Next, SPADE has to paint a picture just from DeepLab's labeled image. It sees a person with a baseball bat, so it looks a lot like it tried to paint Luke in a baseball jersey.
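
The label map is the only thing SPADE gets to see. A small sketch of that handoff, assuming the grayscale class-index PNG format that SPADE-style generators trained on COCO-Stuff typically consume (file names are hypothetical, continuing from the previous sketch):

```python
# Sketch of the handoff: save the per-pixel class indices as a grayscale PNG
# for a SPADE-style generator to repaint.
import numpy as np
from PIL import Image

labels = np.load("frame_0001_labels.npy")              # (H, W) uint8 class indices
Image.fromarray(labels, mode="L").save("frame_0001_labels.png")

# SPADE one-hot encodes the map and invents plausible textures for each region,
# so "person" plus "baseball bat" comes out looking like a ballplayer, not a Jedi.
```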


Here's another scene from TLJ, segmented by DeepLab (which sees people with teddy bears hiding behind the rock) and reconstructed by SPADE (which dutifully has to draw the teddy bears).
