
Does the Internet suck? Or do just the parts we get to see suck?
What if viral algorithms ran in reverse and showed us the least popular people?
Between algorithms and poetry: Humans still go on all the time, don’t they?

Who doesn’t get seen on the internet? In the new world order of robots and talking heads, a reality exists that’s part life, part cinema, and part algorithm. But as the algorithmic curators silently curate, an alternate reality also emerges, starring the humans who win at losing the social media game. A secret cinematic world, seen only by robots.

‘What the Robot Saw’ is a perpetually-generated robot documentary live stream, durational online performance, and archive. It’s a Sunday drive through the awkward intersections of performance, surveillance, voyeurism, and robots — in the age of the talking head. The Robot uses contrarian ranking algorithms to curate some of the least attention-grabbing new videos online — videos hidden by commercial social media ranking algorithms and often seen only by robots. Using face and image analysis algorithms to select videos and study their subjects, ‘What the Robot Saw’ assembles its film and identifies its performers — as the Robot saw them.

Revealing a mix of underacknowledged media makers, performed selves, and obsessive surveillance algorithms, ‘What the Robot Saw’ enters into the complicated relationship between the world’s surveillant and curatorial AI robots and the humans who are both their subjects and their stars. It’s not a video about how robots actually see. It’s a response to processes of representation in the contemporary collision of performed selves, screen-centric perceptions — and robots.

Please turn on the sound. ‘What the Robot Saw’ streams in HD, so please check that the player is set to the highest resolution your network connection will handle at the moment. It’s intended for fullscreen viewing, so watch it fullscreen if you can. If the stream isn’t live, or to scrub through the live or recent videos, view them on the Twitch videos page.

Using computer vision, neural networks, and other robotic ways of seeing, hearing, and understanding, the Robot constantly generates its film, algorithmically curating, editing, and titling clips according to Amazon Rekognition’s demographically obsessed algorithms. The clips are algocurated from among the least viewed and subscribed YouTube videos uploaded in recent hours. These videos are rendered nearly invisible by YouTube’s engagement-centered ranking algorithms, which favor attention-grabbing content. For less eye-catching, polished, search-optimized, or sensational videos, robots may be the primary audience.
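
For the technically curious, the sketch below shows one way this kind of contrarian selection could work against the YouTube Data API v3: search for videos uploaded in the past few hours, then keep only those with almost no views. It is a minimal illustration, not the project’s actual code; the API key placeholder, the six-hour window, and the five-view cutoff are assumptions, and checking subscriber counts would take an additional channels().list lookup.

```python
# Minimal sketch (not the project's code): surface recently uploaded,
# nearly unwatched YouTube videos with the YouTube Data API v3.
from datetime import datetime, timedelta, timezone
from googleapiclient.discovery import build

YOUTUBE_API_KEY = "YOUR_API_KEY"  # placeholder
MAX_VIEWS = 5                     # assumed "least viewed" cutoff
youtube = build("youtube", "v3", developerKey=YOUTUBE_API_KEY)

# Videos uploaded within the last six hours (assumed window).
published_after = (datetime.now(timezone.utc) - timedelta(hours=6)
                   ).strftime("%Y-%m-%dT%H:%M:%SZ")
search = youtube.search().list(
    part="id", type="video", order="date",
    publishedAfter=published_after, maxResults=50,
).execute()
video_ids = [item["id"]["videoId"] for item in search.get("items", [])]

# Look up view counts and keep only the nearly unseen ones.
stats = youtube.videos().list(
    part="statistics,snippet", id=",".join(video_ids),
).execute()
for video in stats.get("items", []):
    views = int(video["statistics"].get("viewCount", 0))
    if views <= MAX_VIEWS:
        print(views, video["snippet"]["title"])
```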

Humans have a complicated relationship with robots. An invisible audience of software robots continually analyzes content on the Internet. Videos by non-“YouTube stars” that algorithms don’t promote to the top of the search rankings or the “recommended” sidebar may be seen by few or no human viewers. In ‘What the Robot Saw,’ the Robot is an AI voyeur turned director: classifying and magnifying the online personas of the subjects of a never-ending film. A loose, stream-of-AI-“consciousness” narrative develops as the Robot drifts through neural network-determined groupings.

WTRS Screenshot 14

Robot meets Resting Bitch Face and Other Adventures. As it makes its way through the film, the Robot adds lower-third supered titles. Occasional, vaguely poetic section titles are derived from the Robot’s image recognition-based groupings. More often, lower-third supers identify the many “talking heads” who appear in the film. The identifiers, labels like “Confused-Looking Female, age 22-34,” are generated using Amazon Rekognition, a popular commercial face detection/recognition service. (The Robot precedes descriptions that have somewhat weaker confidence scores with qualifiers like “Arguably.”)

Rekognition’s feature set offers a glimpse into how computer vision robots, marketers, and others drawn to off-the-shelf face detection and recognition products often choose to categorize humans: age, gender, and the appearance of emotion are key. The prominence of these software features suggests that in a robot-centric world, these attributes might identify us better than our names. (Of course, Amazon itself is quite a bit more granular in its identification techniques.) The subjects’ appearance and demeanor on video, as fully human beings, humorously defy Rekognition’s surveillance- and marketing-enamored perceptions.

The title ‘What the Robot Saw’ is a reference to the What the Butler Saw films. Neither the butlers nor the Robot could really understand the objects of their obsession. Despite their self-satisfaction, all they had was a squinted glimpse at a peep show.
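
For readers curious about the mechanics, the sketch below shows one way such a label could be assembled from Rekognition’s DetectFaces response via boto3. It is a minimal illustration, not the project’s actual code; the confidence cutoff for the “Arguably” qualifier is an assumed value.

```python
# Minimal sketch (not the project's code): build a lower-third label in the
# spirit of "Confused-Looking Female, age 22-34" from Amazon Rekognition's
# DetectFaces response. HEDGE_BELOW is an assumed confidence threshold.
from typing import Optional
import boto3

rekognition = boto3.client("rekognition")
HEDGE_BELOW = 75.0  # assumed cutoff for prepending "Arguably"

def lower_third_label(image_bytes: bytes) -> Optional[str]:
    resp = rekognition.detect_faces(
        Image={"Bytes": image_bytes}, Attributes=["ALL"]
    )
    faces = resp.get("FaceDetails", [])
    if not faces:
        return None
    face = faces[0]

    gender = face["Gender"]   # e.g. {"Value": "Female", "Confidence": 99.1}
    age = face["AgeRange"]    # e.g. {"Low": 22, "High": 34}
    emotion = max(face["Emotions"], key=lambda e: e["Confidence"])  # e.g. CONFUSED

    label = (f"{emotion['Type'].capitalize()}-Looking {gender['Value']}, "
             f"age {age['Low']}-{age['High']}")

    # Hedge weaker guesses, as the Robot does with "Arguably."
    if min(gender["Confidence"], emotion["Confidence"]) < HEDGE_BELOW:
        label = "Arguably " + label
    return label
```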

The live streams run throughout the day; there are brief “intermissions” every few hours (and as needed for maintenance). Extensive archives auto-documenting previous streams — and thus a piece of YouTube in realtime — are available on the Videos page.

‘What the Robot Saw’ is a non-commercial project by Amy Alexander