Home

‘What the Robot Saw’ is a perpetually generated documentary, durational online performance, and archive by a social media robot. It’s a drive through the awkward intersections of performance, representation, and robots. The Robot uses contrarian ranking algorithms to curate some of the less attention-grabbing people in online media, the people commercial social media algorithms hide, who may therefore usually be seen only by robots. Using face and image analysis algorithms to study its subjects, ‘What the Robot Saw’ edits its film and identifies its performers — as the Robot saw them.

Revealing a mix of underacknowledged media makers, performed selves, and obsessive surveillance algorithms, ‘What the Robot Saw’ enters into the complicated relationship between the world’s surveillant and curatorial AI robots and the humans who are both their subjects and their stars. It’s not a pedagogical video about what robots actually see; it’s a response to processes of representation in the contemporary collision of performed selves, perceptions of onscreen others, and robots.

Please turn on the sound. ‘What the Robot Saw’ streams in HD, so please check that the player is set for the highest resolution your network connection will handle. It’s intended for fullscreen viewing if possible. To scrub through the live or recent videos, view them on the Twitch videos page.

Using computer vision, neural networks, and other robotic ways of seeing, hearing, and understanding, the Robot constantly generates its film, algorithmically curating, editing, and titling clips according to Amazon Rekognition’s demographically obsessed algorithms. The clips are algocurated from among the least viewed and least subscribed YouTube videos uploaded in recent hours. These videos are rendered nearly invisible by YouTube’s engagement-centered ranking algorithms, which favor attention-grabbing content. For less eye-catching, polished, search-optimized, or sensational videos, robots may be the primary audience.
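As a rough illustration of the contrarian ranking idea — not the project’s actual code — one might invert engagement-driven ranking by sorting candidate uploads ascending by view and subscriber counts. The record shape and field names below are hypothetical stand-ins for whatever metadata a crawler might collect:

```python
# Illustrative sketch of "contrarian" algocuration: rank recent uploads by
# ascending view and subscriber counts, the opposite of engagement-driven
# ranking. Field names and sample values are hypothetical.

def algocurate(videos, k=3):
    """Return the k least-viewed, least-subscribed candidate videos."""
    return sorted(videos, key=lambda v: (v["views"], v["subscribers"]))[:k]

recent_uploads = [
    {"id": "a1", "views": 12843, "subscribers": 900},
    {"id": "b2", "views": 3,     "subscribers": 1},
    {"id": "c3", "views": 41,    "subscribers": 12},
    {"id": "d4", "views": 0,     "subscribers": 0},
]

picks = algocurate(recent_uploads, k=2)
print([v["id"] for v in picks])  # least-viewed first: ['d4', 'b2']
```

The same ascending sort generalizes to whatever "attention" signals are available; the point is simply that the comparator is flipped relative to a commercial recommender.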

Humans have a complicated relationship with robots, especially the ones that work in the media. An invisible audience of software robots continually analyzes content on the Internet. Videos by non-“YouTube stars” that algorithms don’t promote to the top of the search rankings or the “recommended” sidebar may be seen by few or no human viewers. In ‘What the Robot Saw,’ the Robot is an AI voyeur turned director, classifying and magnifying the online personas of the subjects of a never-ending film. A loose, stream-of-AI-“consciousness” narrative develops as the Robot drifts through neural network-determined groupings.

Robot meets Resting Bitch Face and Other Adventures. As it makes its way through the film, the Robot adds lower-third supered titles. Occasional vaguely poetic section titles are derived from the Robot’s image recognition-based groupings. More often, lower-third supers identify the many “talking heads” who appear in the film. The identifiers – labels like “Confused-Looking Female, age 22-34” – are generated using Amazon Rekognition, a popular commercial face detection/recognition service. (Descriptions with somewhat weaker confidence scores are prefixed with qualifiers like “Arguably.”)

Rekognition’s feature set offers a glimpse into how computer vision robots, marketers, and others drawn to off-the-shelf face detection and recognition products often choose to categorize humans: age, gender, and apparent emotion are key. The prominence of these software features suggests that in a robot-centric world, these attributes might identify us better than our names. (Of course, Amazon itself is quite a bit more granular in its identification techniques.) The subjects’ onscreen appearance and demeanor as fully human beings humorously defy Rekognition’s surveillance- and marketing-enamored perceptions.
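A minimal sketch of how such a lower third might be assembled from a Rekognition-style face record. The response shape (`AgeRange`, `Gender`, `Emotions`) follows Amazon Rekognition’s DetectFaces output, but the sample values and the confidence threshold for adding “Arguably” are illustrative guesses, not the project’s actual parameters:

```python
# Build a lower-third label like "Confused-Looking Female, age 22-34" from a
# Rekognition-style FaceDetail record. The dict shape mirrors Rekognition's
# DetectFaces response; values and the 90% threshold are illustrative only.

def lower_third(face, threshold=90.0):
    # Take the emotion Rekognition scored highest.
    top_emotion = max(face["Emotions"], key=lambda e: e["Confidence"])
    gender = face["Gender"]
    label = (f"{top_emotion['Type'].capitalize()}-Looking {gender['Value']}, "
             f"age {face['AgeRange']['Low']}-{face['AgeRange']['High']}")
    # Hedge weaker detections, as the Robot does with "Arguably."
    if min(top_emotion["Confidence"], gender["Confidence"]) < threshold:
        label = "Arguably " + label
    return label

sample_face = {
    "AgeRange": {"Low": 22, "High": 34},
    "Gender": {"Value": "Female", "Confidence": 97.1},
    "Emotions": [
        {"Type": "CONFUSED", "Confidence": 81.5},
        {"Type": "CALM", "Confidence": 11.2},
    ],
}

print(lower_third(sample_face))  # Arguably Confused-Looking Female, age 22-34
```

In a live pipeline the record would come from a `detect_faces` call on a video frame; here a hard-coded sample stands in so the hedging logic is visible on its own.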

The live streams run throughout the day; there are brief “intermissions” every few hours (and as needed for maintenance). Extensive archives auto-documenting previous streams — and thus a piece of YouTube in real time — are available on the Videos page.

Although the live stream is central to the project, the technical limitations of live streaming mean the image and sound quality are not ideal and may vary with network conditions. A high-quality stream can be generated locally for art installations and screenings.

‘What the Robot Saw’ is a non-commercial project by Amy Alexander.