Darwinian Spaces

The interactor enters the installation space wearing an EEG device. The interactor's presence lights up the wired LED screens, which visually represent the major sentiment activation points of the human brain. The screens then begin to display momentary representations of dreams and texts taken from social media. The EEG device triggers various sounds depending on the interactor's brain electrical activity, and it also affects which dream representations are displayed: if the interactor is anxious, the machine presents collective dream types with a negative sentiment on the screens; if they relax, calmer dreams and sounds follow. The result is a constant rewiring of the interrelationship between the individual's inner consciousness and the collective consciousness. As humans and artefacts become ever more closely connected through the Internet, it becomes more plausible to think that everything on Earth is rewiring into some kind of global brain. Why don't we let ourselves become aware of what is happening to us? That question led me to propose this installation, Darwinian Spaces, as a demonstration of our evolving connectedness.


Ants are collectively more intelligent and effective when they collaborate as a colony than as individuals. Each ant responds to signals from other ants and has the spontaneous ability to adapt its behaviour for the good of the colony. There is also a collective consciousness among ants: although we assume that individual ants have fixed roles and function automatically within them, in reality they have an awareness of the entire structure of the colony and begin to function as one entity. Unlike ants, unfortunately, we seem to have developed a mechanism that blocks our awareness of the collective we are part of. This raises the question of what we humans really are as collectives, not just as individuals.

What if our reality is in fact a collective consciousness, acting in an unconscious world as in a dream, and we are mere characters in other people’s dreams?

When we view the reality of our world as a shared dream, our thoughts are the universe's thoughts. Our collective dreams narrate something to us symbolically. The events and circumstances unfolding in the dream world and in physical reality can be read as a collective representation of our inner state of being. Similarly, events can be translated to the individual level to reflect a person's inner state, just as Darwinian Spaces maps a collective dream visualisation onto the participant's brain space, representing the collective map of people's dreams and consciousness functioning as one entity.

In science and technology, many scholars claim that we are in an age of information overload and that our brains are rewiring as a result. The Internet is accelerating connections among individuals and groups. As humans and artefacts become ever more closely connected, it becomes more plausible to think that everything on Earth is rewiring into some kind of global brain.


Technical Requirements

- 18 x 2.4 inch TFT screens or alternative (each TFT screen uses 7 digital pins)

- 36 x LED Matrices with MAX7219 (6 visual interfaces for each activation point, 12 total, see Figures)

- 12 x mini Arduino speakers, 8 ohm, at least 2 watts

- 5 x Arduino Mega (ATmega2560 MCU) boards

-  4 x PCF8574 I2C Port Expander (PCF8574AN)

- Arduino UNO R3 (UNO-R3)

- 4 x Arduino WiFi shields - Adafruit CC3000

- Bluetooth USB dongle, Bluetooth modem (BlueSMiRF Silver), bi-directional logic level converter

- Long electric wiring in various colours, about 30 meters total, plus heat shrink (for neat wiring)

- Soldering irons and tools

- 1 commercial EEG device, preferably NeuroSky MindWave with Bluetooth module

- High-speed Wireless Network

- Development environment, preferably Mac OS X


- Processing / Java or equivalent (for visualisation on the LED/TFT displays)

- Python / JavaScript for the Google Sentiment API

- Arduino (board programming and interactions)

- Pure Data for sounds/dynamic images (Jerobeam Fenderson is collaborating; see his work on YouTube)


Mapping Software Architecture:

  1. Sentiment app. Each dream story is fed to a sentiment API, which returns a sentiment score, a theme, and entities. There are a few open-source sentiment APIs available on the Internet; I listed some of them in my "Initial notes on development" section, and many of them provide these functions.
  2. Dream Codes Mapping app. Once we have the entities and the emotional tone of the dream theme, we try to find a matching word in a text file I will provide: the Hall-Van dream code categories. That file will list the words, the corresponding activation points in the brain, and the potential image(s) related to each word, if any (otherwise these are shared from social media).
  3. Dream Visualisation app. The Visualisation app is the Arduino code that displays the text (alternating between the story, theme, and entity), plays the sound, and shows the related images on the physical installation. The screens will be divided roughly according to the locations of the activation points in the actual brain; I will provide a detailed drawing/sketch of the divisions later.
  4. Brain-Computer Interface app. The BCI app gathers data from the user/interactor's brain and processes it as follows:

         (three threshold values that one can obtain from a commercial EEG device; there will be more, it is just a matter of setting the ranges, and my plan is to use 12 range values):

       - Theta (4-7 Hz) - drowsiness

       - Alpha (8-13 Hz) - relaxation

       - Beta (13-30 Hz) - alertness

The BCI simply determines whether the person is in a positive or negative emotional state and matches that state against the dream sentiment scores (a positive BCI score selects dreams within a similarly positive range).
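As a sketch of how the first two stages (sentiment scoring and dream code mapping) could fit together in Python: the `DREAM_CODES` table, its field layout, and the sentiment-result format below are hypothetical placeholders, since the actual Hall-Van dream code file and the chosen sentiment API's output format are still to be provided.

```python
# Sketch of stages 1-2: sentiment API output -> dream code lookup.
# DREAM_CODES is a hypothetical stand-in for the Hall-Van dream
# code categories file described above.

DREAM_CODES = {
    # word: (brain activation point, optional image file)
    "falling": ("amygdala", "falling.png"),
    "water":   ("hippocampus", None),   # no image -> pulled from social media
    "flying":  ("parietal_lobe", "flying.png"),
}

def map_dream(sentiment_result):
    """Match sentiment-API entities/theme against the dream code table.

    `sentiment_result` is assumed to look like:
        {"score": -0.6, "theme": "falling", "entities": ["water", "stairs"]}
    """
    candidates = [sentiment_result["theme"]] + sentiment_result["entities"]
    matches = []
    for word in candidates:
        if word in DREAM_CODES:
            point, image = DREAM_CODES[word]
            matches.append({"word": word,
                            "activation_point": point,
                            "image": image,
                            "score": sentiment_result["score"]})
    return matches
```

Each match tells the visualisation app which activation point of the brain-shaped screen layout to light up and which image, if any, to display there.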

Reference information and BCI implementation source code, here.
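A minimal Python sketch of the BCI matching step, assuming the band boundaries listed above and an illustrative mapping of the relaxed/drowsy bands to a positive state and the alert band to a negative one; the real implementation will read raw values from the EEG device and use 12 ranges rather than this binary split.

```python
# Sketch of stage 4: classify EEG band activity into an emotional
# state, then select dreams whose sentiment score matches.
# The positive/negative mapping is an assumption for illustration.

def dominant_band(freq_hz):
    """Map a dominant frequency (Hz) to its EEG band."""
    if 4 <= freq_hz < 8:
        return "theta"   # drowsiness
    if 8 <= freq_hz < 13:
        return "alpha"   # relaxation
    if 13 <= freq_hz <= 30:
        return "beta"    # alertness
    return "unknown"

def emotional_state(band):
    """Relaxed/drowsy bands count as positive, alert as negative."""
    return "positive" if band in ("theta", "alpha") else "negative"

def select_dreams(state, dreams):
    """Keep dreams whose sentiment score matches the interactor's state."""
    if state == "positive":
        return [d for d in dreams if d["score"] >= 0]
    return [d for d in dreams if d["score"] < 0]
```

So an interactor whose dominant activity sits in the alpha band would be classified as positive and served the calmer, positively scored dreams, closing the feedback loop described in the introduction.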




Prof. Van Wedeen's brain mapping set-up


List of References and Links


1. Multiple displays and multiple sound channels

Good references for making a multiple-display set-up work over a wireless connection:

- Arduino TFT LCD Screen + Arduino Wifi Shield

- 4 LCD displays on 1 Arduino

- TFT and ILI9325

- Controlling an Arduino wirelessly from Pd/Max (uses the Adafruit CC3000)

-  UTFT Library

- LED matrix wiring


2. Hacking MindWave Mobile


- Mindwave mobile open-source app


3. Semantic API

- SemantAPI: free and open source; it can gather semantic analysis results from several sources for comparison

- Google also provides a sentiment analysis platform, the Google Prediction API


Looking for collaborators!

1.) Developers with interest in coding Arduino (for electronics)

2.) Developer with interest in Java / C++ (for collective Internet data / semantic data processing). 

3.) Pure Data developer for Audio and Sound

4.) Someone with electrical and schematics wiring  experience particularly with multiple displays, shifters, TFT and bluetooth modules, BCI, etc. 

5.) Artists, sculptors, designers, crafters, and electronics hobbyists will be of great help with the physical installation. 

6.) Researcher/technologist interested in Semantic Data processing.

7.) Anyone interested in the project!


So far this is a partial list of collaborators, and more people are interested:

Javier Nieves - Database and Mapping Software

Arron Moore - Installation Structure Design

Victoria Brugada  - Scientific Facilitation

Diego Lopez  - Scientific Facilitation

Ivan Ferrer - Hardware Development

Alvaro Sanchez del Castillo - Installation Structure Design

Chema Blanco - Technical Facilitation

Jerobeam Fenderson  - Sounds


25 collaborators






