Is there an essence to being human?
Philosophers and physicists have attempted to answer this elusive question throughout the ages, arguably with little success. But because digital minds hold a number of advantages over biological minds, especially in processing vast sets of data, recent developments in computational technology have made the topic subject to quantification by semi-autonomous processes, which measure opinion in the global currency of likes, and activity in the form of clicks and views.
If these novel methods show anything, it is that whatever it means to be human is seemingly changing, and that this change, as always, is inseparable from the means of measuring it. This brings forth new questions, such as: what are we left with when everything has been quantified?
If a true AI were to awaken, capable not only of seeing but of actually comprehending this data, how would humanity appear in the digital eyes of this alien other?
iLove Humans is a speculative video work and part of the transmedial storytelling project Po[e]litics From the Anthropocene. The content for iLove Humans has been created using a variety of techniques, according to a logic of form reflecting content. Much of it is drawn from footage found on social media, humanoid 3D models generated with the software MakeHuman, and other 3D models available for free download on sites such as TurboSquid and CGTrader.
The voice track is generated from cut-up texts that mix transcripts of various viral internet videos with Wikipedia entries on related topics. The resulting texts were then processed through text-to-speech software (some with built-in translation functionality) and mixed with other found audio. The composition and post-processing of the footage, audio and 3D material were done using Cinema 4D, Unity 3D, Photoshop, After Effects, Ableton Live and Max/MSP/Jitter.
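The cut-up method used for the voice track can be sketched as a simple procedure: split each source text into short fragments, shuffle them together, and rejoin them into a new composite. The following Python sketch illustrates the general idea only; the function name, fragment length, and sample strings are hypothetical placeholders, not the project's actual texts or pipeline.

```python
import random

def cut_up(texts, fragment_words=4, seed=7):
    """Split each source text into short word fragments,
    then shuffle and rejoin them into a composite text."""
    fragments = []
    for text in texts:
        words = text.split()
        for i in range(0, len(words), fragment_words):
            fragments.append(" ".join(words[i:i + fragment_words]))
    # Fixed seed keeps the shuffle reproducible for this sketch
    random.Random(seed).shuffle(fragments)
    return " ".join(fragments)

# Hypothetical stand-ins for a viral-video transcript and a Wikipedia entry
transcript = "oh my god look at the cat it is playing the piano"
wiki_entry = "domestic cats are valued by humans for companionship"
print(cut_up([transcript, wiki_entry]))
```

A text assembled this way could then be fed to any text-to-speech tool, giving the voice track its characteristic collaged quality.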