Home

‘What the Robot Saw’ (v 0.9) is a perpetually generated documentary, durational online performance, and archive by a social media robot. It uses contrarian ranking algorithms to make visible some of the people in online media whom commercial social media algorithms hide — and who may therefore usually be seen only by robots. Using face and image analysis algorithms, ‘What the Robot Saw’ assembles its film and identifies its performers — as the Robot saw them.

Revealing a mix of underacknowledged media makers, performed selves, and obsessive surveillance algorithms, ‘What the Robot Saw’ enters into the complicated relationship between the world’s surveillant and curatorial AI robots and the humans who are both their subjects and their stars.

On March 18, 2020, YouTube began blocking the Robot, so the Robot has moved to Facebook Live. Since the Robot thinks it’s particularly useful to document the current cultural moment in less-seen videos, it’s streaming live daily (though not yet 24/7) to Facebook Live. Newer archives are there too. The streaming setup is a bit rough on Facebook for now… please bear with the Robot…

‘What the Robot Saw’ streams in HD, so please check that the YouTube player is set for the highest resolution your network connection will handle (gear icon on the lower right of the player). Fullscreen or Theatre Mode is recommended (net conditions permitting). If the stream isn’t live, you can find recent archives here.

Algorithms asking nosy questions: what about us do algorithms (and their makers and users) obsess over? The Robot continuously makes its way through low-engagement online video, organizing and labeling the people and scenes it features as it live streams its film back to YouTube 24/7. Using computer vision, neural networks, and other robotic ways of seeing, hearing, and understanding, the Robot constantly generates its film, algorithmically curating, editing, and, most “culturalgorithmically” of all, titling clips from among the least viewed and least subscribed YouTube videos uploaded in recent hours. These videos are rendered nearly invisible by YouTube’s engagement-centered ranking algorithms, which favor attention-grabbing content. For less eye-catching, polished, search-optimized, or sensational videos, robots may be the primary audience.
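
The project’s own selection code isn’t published here. As a rough sketch of how that “contrarian ranking” could be approximated, the snippet below uses the YouTube Data API v3 to search recent uploads and keep only videos almost nobody has watched; the API key placeholder, query, time window, and view-count cutoff are all illustrative assumptions.

```python
# A hedged sketch, not the project's published code: approximating the
# "contrarian ranking" described above with the YouTube Data API v3 by
# searching recent uploads and keeping only near-unwatched videos. The
# API key, query, time window, and view cutoff are assumptions.
from datetime import datetime, timedelta, timezone

from googleapiclient.discovery import build

YOUTUBE_API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
MAX_VIEWS = 10                    # assumed "least viewed" cutoff


def find_low_engagement_videos(query="my day", hours=6):
    youtube = build("youtube", "v3", developerKey=YOUTUBE_API_KEY)

    # Restrict the search to videos uploaded in the last few hours.
    published_after = (
        datetime.now(timezone.utc) - timedelta(hours=hours)
    ).isoformat()

    search = youtube.search().list(
        part="id",
        q=query,
        type="video",
        order="date",  # newest first, rather than engagement-ranked
        publishedAfter=published_after,
        maxResults=50,
    ).execute()

    video_ids = [item["id"]["videoId"] for item in search.get("items", [])]
    if not video_ids:
        return []

    # Look up view counts and keep only the nearly invisible uploads,
    # sorted fewest views first.
    stats = youtube.videos().list(
        part="statistics", id=",".join(video_ids)
    ).execute()
    low_engagement = [
        v for v in stats.get("items", [])
        if int(v["statistics"].get("viewCount", 0)) <= MAX_VIEWS
    ]
    low_engagement.sort(
        key=lambda v: int(v["statistics"].get("viewCount", 0))
    )
    return low_engagement
```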

Humans have a complicated relationship with robots. Especially the ones that work in the media. An invisible audience of software robots continually analyzes content on the Internet. Videos by non-“YouTube stars” that algorithms don’t promote to the top of the search rankings or the “recommended” sidebar may be seen by few or no human viewers. In ‘What the Robot Saw,’ the Robot is an AI voyeur turned director: classifying and magnifying the online personas of the subjects of a never-ending film. A loose, stream-of-AI-“consciousness” narrative develops as the Robot drifts through neural network-determined groupings. In its surveillant cinematography, the Robot scans, edge detects, and stares into subjects’ eyes, periodically titling what it sees.
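
As a loose illustration of that surveillant cinematography (not the Robot’s documented pipeline), here is a minimal OpenCV sketch that edge-detects a frame and locates the eye regions a virtual camera could stare into:

```python
# A loose illustration, assuming OpenCV: Canny edge detection over a
# frame, plus Haar-cascade face and eye detection to find eye regions a
# virtual camera could slowly "stare into." Illustrative only; this is
# not the Robot's documented pipeline.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml"
)


def surveil_frame(frame):
    """Return an edge-detected version of a frame and detected eye boxes."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, threshold1=100, threshold2=200)

    eyes = []
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            # Eye box in full-frame coordinates; cropping and zooming
            # toward it would produce the "staring" effect.
            eyes.append((x + ex, y + ey, ew, eh))
    return edges, eyes
```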

Robot meets Resting Bitch Face and Other Adventures. As it makes its way through the film, the Robot adds lower-third supered titles. Occasional vaguely poetic section titles are derived from the Robot’s image recognition-based groupings. More often, lower-third supers identify the many “talking heads” who appear in the film. The identifiers – labels like “Confused-Looking Female, age 22-34” – are generated using Amazon Rekognition, a popular commercial face detection/recognition library. (The Robot prepends qualifiers like “Arguably” to descriptions with somewhat weaker confidence scores.) Rekognition’s feature set offers a glimpse into how computer vision robots, marketers, and others drawn to off-the-shelf face detection and recognition products often choose to categorize humans: age, gender, and appearance of emotion are key. The prominence of these software features suggests that in a robot-centric world, these attributes might identify us better than our names. (Of course, Amazon itself is quite a bit more granular in its identification techniques.) The subjects’ appearance and demeanor on video as fully human beings humorously defy Rekognition’s surveillance- and marketing-enamored perceptions.
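
Since the text names Amazon Rekognition and the “Arguably” qualifier, here is a hedged sketch of how such an identifier could be assembled from Rekognition’s detect_faces response via boto3; the confidence cutoff and exact wording below are assumptions:

```python
# A hedged sketch of assembling an identifier in the style of
# "Arguably Confused-Looking Female, age 22-34" from Amazon
# Rekognition's detect_faces response via boto3. The 90% confidence
# cutoff and label wording are assumptions, not the project's
# documented logic.
import boto3

rekognition = boto3.client("rekognition")


def label_faces(image_bytes, cutoff=90.0):
    response = rekognition.detect_faces(
        Image={"Bytes": image_bytes}, Attributes=["ALL"]
    )
    labels = []
    for face in response["FaceDetails"]:
        # Pick the strongest-scoring emotion, e.g. {"Type": "CONFUSED", ...}
        emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
        gender = face["Gender"]  # e.g. {"Value": "Female", "Confidence": 97.1}
        age = face["AgeRange"]   # e.g. {"Low": 22, "High": 34}

        label = (
            f"{emotion['Type'].capitalize()}-Looking "
            f"{gender['Value']}, age {age['Low']}-{age['High']}"
        )
        # Hedge weaker guesses, as the Robot does with "Arguably."
        if min(emotion["Confidence"], gender["Confidence"]) < cutoff:
            label = "Arguably " + label
        labels.append(label)
    return labels
```

Rekognition reports estimated age as a low/high span (AgeRange), which is where labels like “age 22-34” come from.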

Dated archives are generated on YouTube for each daypart livestream, offering a theoretically endless, on-demand archive of the videos few humans get to see, as robots might, and sometimes do, see them.

The live streams run throughout the day; there are brief “intermissions” every four hours (and as needed for maintenance). An extensive archive auto-documenting previous streams — and thus a piece of YouTube in real time — is available on the Videos page and the YouTube Channel.

Although the YouTube live stream is central to the project, the technical limitations of live streaming mean the image and sound quality are not ideal and may vary with network conditions. A high quality stream can be generated locally for art installations and screenings.

‘What the Robot Saw’ is a non-commercial project by Amy Alexander.