
Does the Internet suck? Or do just the parts we get to see suck?

In the new world order of robots and talking heads, a reality exists that’s part life, part cinema, and part algorithm. But as the algorithmic curators silently curate, an alternate reality also emerges, starring the humans who win at losing the social media game. A secret cinematic world, seen only by robots. Who do social media algorithms render invisible?

‘What the Butler Saw’ meets social media robots.

‘What the Robot Saw’ is a perpetually generated, live-streamed robot film. The Robot uses contrarian ranking algorithms to curate some of the least attention-grabbing new videos on social media. These videos are rendered largely invisible by commercial social media ranking algorithms — and so may only be seen by the robots that rank them. Using face and image analysis algorithms to curate videos and study their subjects, ‘What the Robot Saw’ assembles its film and identifies its “talking head” performers in a robofantastical cinematic style. It’s a Sunday drive through the awkward intersections of performance, surveillance, voyeurism, and robots in the age of the talking head.

Please turn on the sound. ‘What the Robot Saw’ streams in HD, so please check that the player is set for the highest resolution your network connection can handle at the moment. It’s intended for fullscreen viewing, if you can. If the stream isn’t live, or if you’d like to scrub through the live or recent videos, view them on the Twitch videos page.

Revealing a mix of underacknowledged media makers, performed selves, and obsessive surveillance algorithms, ‘What the Robot Saw’ enters into the complicated relationship between the world’s surveillant and curatorial AI robots and the humans who are both their subjects and their stars. It’s not a video about how robots actually see. It’s a response to processes of representation in the contemporary collision of performed selves, screen-centric perceptions — and robots.

Using computer vision, neural networks, and other robotic ways of seeing, hearing, and understanding, the Robot constantly generates its film, algorithmically curating, editing, and titling clips according to Amazon Rekognition’s demographically obsessed algorithms. The clips are algocurated from among the least viewed and least subscribed YouTube videos uploaded in recent hours. These videos are rendered nearly invisible by YouTube’s engagement-centered ranking algorithms, which favor attention-grabbing content. For less eye-catching, polished, search-optimized, or sensational videos, robots may be the primary audience.
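To make the “contrarian” curation concrete, here is a minimal, hypothetical sketch of how recently uploaded, barely watched videos could be surfaced with the YouTube Data API. The API key, the three-hour window, and the view-count cutoff are illustrative assumptions, not the project’s actual parameters or code.

```python
# Illustrative sketch only: surface recently uploaded, barely watched videos
# with the YouTube Data API v3. Thresholds are assumptions, not the project's
# actual curation logic.
from datetime import datetime, timedelta, timezone

from googleapiclient.discovery import build

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")  # hypothetical key

# Videos uploaded in the last few hours, ordered by date rather than relevance.
published_after = (datetime.now(timezone.utc) - timedelta(hours=3)).strftime(
    "%Y-%m-%dT%H:%M:%SZ"
)
search = youtube.search().list(
    part="snippet",
    type="video",
    order="date",
    publishedAfter=published_after,
    maxResults=50,
).execute()
video_ids = [item["id"]["videoId"] for item in search.get("items", [])]

# Keep only the videos almost nobody has seen -- the ones the ranking
# algorithms render invisible.
if video_ids:
    stats = youtube.videos().list(part="statistics", id=",".join(video_ids)).execute()
    least_viewed = [
        v["id"]
        for v in stats["items"]
        if int(v["statistics"].get("viewCount", 0)) < 5  # arbitrary cutoff
    ]
    print(least_viewed)
```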

Humans have a complicated relationship with robots. An invisible audience of software robots continually analyzes content on the Internet. Videos by non-“YouTube stars” that algorithms don’t promote to the top of the search rankings or the “recommended” sidebar may be seen by few or no human viewers. In ‘What the Robot Saw,’ the Robot is an AI voyeur turned director: classifying and magnifying the online personas of the subjects of a never-ending film. A loose, stream-of-AI-“consciousness” narrative develops as the Robot drifts through neural network-determined groupings.

[Screenshot gallery: WTRS Screenshot 01 (image 1 of 15)]

Robot meets Resting Bitch Face and Other Adventures.

As it makes its way through the film, the Robot adds lower-third supered titles. Occasional, vaguely poetic section titles are derived from the Robot’s image recognition-based groupings. More often, lower-third supers identify the many “talking heads” who appear in the film. The identifiers, labels like “Confused-Looking Female, age 22-34,” are generated using Amazon Rekognition, a popular commercial face recognition and analysis library. (The Robot precedes descriptions with somewhat weaker confidence scores with qualifiers like “Arguably.”)

Rekognition’s feature set offers a glimpse into how computer vision robots, marketers, and others drawn to off-the-shelf face analysis products often choose to categorize humans: age, gender, and appearance of emotion are key. The prominence of these software features suggests that in a robot-centric world, such attributes might identify us better than our names. (Of course, Amazon itself is quite a bit more granular in its identification techniques.) The subjects’ appearance and demeanor on video, as fully human beings, humorously defy Rekognition’s surveillance- and marketing-enamored perceptions.
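For illustration only, here is a minimal sketch of how a lower-third label in this spirit could be composed from Amazon Rekognition’s DetectFaces output. The confidence threshold, the file name, and the exact phrasing are assumptions, not the Robot’s actual logic.

```python
# Illustrative sketch only: compose a lower-third label from Amazon
# Rekognition face analysis. Threshold and phrasing are assumptions.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("frame.jpg", "rb") as f:  # hypothetical still frame from a clip
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()}, Attributes=["ALL"]
    )

for face in response["FaceDetails"]:
    gender = face["Gender"]          # e.g. {"Value": "Female", "Confidence": 93.2}
    age = face["AgeRange"]           # e.g. {"Low": 22, "High": 34}
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])

    # Hedge weaker guesses with a qualifier, in the spirit of the Robot's "Arguably."
    qualifier = "" if top_emotion["Confidence"] > 80 else "Arguably "
    label = (
        f'{qualifier}{top_emotion["Type"].capitalize()}-Looking '
        f'{gender["Value"]}, age {age["Low"]}-{age["High"]}'
    )
    print(label)  # e.g. "Confused-Looking Female, age 22-34"
```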

The title of ‘What the Robot Saw’ is a reference to the ‘What the Butler Saw’ films. Neither the butlers nor the Robot could really understand the objects of their obsession. Despite their self-satisfaction, all they had was a squinted glimpse at a peep show.

The live streams run throughout the day; there are brief “intermissions” every few hours (and as needed for maintenance). Extensive archives auto-documenting previous streams — and thus a piece of YouTube in real time — are available on the Videos page.

‘What the Robot Saw’ is a non-commercial project by Amy Alexander.