Bots, Cells, and Humans Watching
A speculative ecosystem where inorganic life forms and single-cell organisms come together in symbiosis. Humans are invited along, but only as observers.
The piece can be accessed through the estuary website symbiosis.live. Here the digital Internet meets the physical world: servers are connected to a bioreactor containing living cells. The result is an environment subject to both physical and digital influences, a peculiar habitat capable of supporting new forms of life and symbiotic relationships.
As inorganic living organisms (crawlers, fetchers, scrapers, spammers, hacking tools, etc.), bots for short, visit the server, their presence is recorded and their behavior studied. What were they looking for? What were they trying to do? This information is then used to turn the server into a more alluring honeypot, designed to attract them and, if possible, keep them coming back.
As bots move through the estuary, following links and filling out forms, their movement releases food into the paramecia bioreactor. Paramecia are single-cell eukaryotic organisms belonging to the kingdom Protista: neither plants, nor animals, nor fungi. They flourish in bodies of water and were among the first microorganisms seen through a microscope, observed by Antoni van Leeuwenhoek in the 1670s. The paramecia living inside the estuary website were collected from distinct areas of New York City, including puddles, fish tanks and Central Park.
Last come the Humans Watching. The website is designed to distinguish between Bots and Humans; in fact it looks very different to each. When a bot visits the website, a new page is generated to fulfill the bot's request. When humans visit, they are presented with a live video stream of the paramecia through a microscope; they can telepresently control the microscope and explore different parts of the microscopic environment. Humans can also see a live log of recent bots on the server and observe their behavior and intentions. Humans have no direct means of influencing the system.
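The bot-versus-human fork described above can be sketched very simply. One common signal is the User-Agent header a visitor sends; the signature list and function name below are illustrative assumptions, not the project's actual detection logic:

```python
# Minimal sketch: classify a visitor as bot or human from the
# User-Agent header. The signature substrings are an assumption,
# not the project's real detection rules.

BOT_SIGNATURES = ("bot", "crawler", "spider", "scraper", "curl", "python-requests")

def looks_like_bot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known bot signature."""
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

print(looks_like_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # True
print(looks_like_bot("Mozilla/5.0 (Macintosh; Intel Mac OS X)"))   # False
```

A real server would branch on this result: generated pages for bots, the live microscope stream for humans.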
What are bots?
You might know them as crawlers, fetchers, scrapers, spammers, hacking tools, etc. I call them bots for short. They are all over the internet; in fact there are more bots than humans out there[1-incapsula]. Incapsula goes to the trouble of differentiating them into good and bad bots. For this project I was not interested in the moral distinction; both are equally alive.
Different types of bots do behave differently. Search engine bots, for example, are attracted to content, following links and indexing whatever they encounter. Some hacking tools look for login pages and will attempt thousands of username and password combinations. This led me to ask: what would happen if they found whatever they were looking for?
Early in the development of the project, I started monitoring the visitor logs on multiple servers. Each log entry records a request: the visitor's IP address, the time, the path requested, and a user-agent string identifying the client.
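For illustration, an access log line in the common Apache combined format looks something like this (the IP address and visitor here are invented):

```
203.0.113.7 - - [12/Mar/2019:06:25:24 +0000] "GET /login HTTP/1.1" 200 1043 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```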
What I found interesting were the patterns that emerged. Some bots kept coming back over and over. I wasn't asking them to come, but they were already here. I decided to leverage this fact and develop a "honey pot" to attract them even further: keep them coming and strengthen the interaction.
Three of the bots that I decided to focus on were:
Search Engine Bots
These bots come looking for content: they follow a link, capture what's behind it, then move on to the next link and capture that content too. They continue until they have captured all the content available.
Why do they do this?
These bots have evolved to locate and retrieve information. This enables complex organisms like Google or DuckDuckGo to give you an appropriate result when you look for something online.
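The crawl loop described above, follow every link and capture what it finds until nothing is left, can be sketched as a breadth-first traversal. The toy site below is an in-memory stand-in for real pages (a real crawler would fetch them over HTTP and parse the links out of the HTML); all names here are illustrative:

```python
from collections import deque

# Toy site: each page maps to the links it contains.
SITE = {
    "/": ["/about", "/archive"],
    "/about": ["/"],
    "/archive": ["/post-1", "/post-2"],
    "/post-1": ["/archive"],
    "/post-2": [],
}

def crawl(start: str) -> list:
    """Follow every link breadth-first until all reachable pages are captured."""
    seen, queue, captured = {start}, deque([start]), []
    while queue:
        page = queue.popleft()
        captured.append(page)           # "capture" this page's content
        for link in SITE.get(page, []):
            if link not in seen:        # don't revisit pages
                seen.add(link)
                queue.append(link)
    return captured

print(crawl("/"))  # ['/', '/about', '/archive', '/post-1', '/post-2']
```

The `seen` set is what keeps a crawler from looping forever on pages that link back to each other, and it is exactly what a honeypot of endlessly generated fresh links defeats.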
In response to this particular type of bot, I wrote a script that generates a new HTML page every time a bot arrives. Each page contains random links, random names, random titles, etc., and also selects from different images and descriptions.
For this I used a combination of Corpora texts [link to corpora] and tracery.
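A minimal sketch of this page generator, using only the standard library: the tiny word list below stands in for the Corpora datasets and the tracery grammars the project actually used, and every name in it is an assumption for illustration.

```python
import random

# Stand-in vocabulary; the project drew its words from the Corpora
# datasets and expanded them with tracery grammars instead.
WORDS = ["estuary", "plankton", "signal", "drift", "bloom", "current", "tide"]

def random_title(rng: random.Random) -> str:
    """Three random words, capitalized into a plausible-looking title."""
    return " ".join(rng.choice(WORDS) for _ in range(3)).title()

def generate_page(rng: random.Random, n_links: int = 5) -> str:
    """Build a throwaway HTML page full of fresh links for a crawler to follow."""
    links = "\n".join(
        f'<a href="/{rng.choice(WORDS)}-{rng.randrange(10000)}">{random_title(rng)}</a>'
        for _ in range(n_links)
    )
    return (f"<html><head><title>{random_title(rng)}</title></head>"
            f"<body>{links}</body></html>")

page = generate_page(random.Random(42))
print(page)
```

Because every link points to a path that does not exist yet, each request from a crawler triggers another freshly generated page, and the bot never runs out of content to capture.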