“DJ in the Jungle” is an interactive music installation built around collaborative creation: by interacting with a set of columns, people generate music and lighting effects through changes in distance, producing an immersive co-creation experience.
The system consists of two main parts: music synthesis and music visualisation. It comprises a varying number of columns hanging from trees, each equipped with ultrasonic sensors, a sound sensor, LED strips, an Arduino control board, and audio equipment. Users interact with ultrasonic sensors mounted at different heights on the columns by changing their distance from them, which triggers the system to improvise music. Different columns carry different melodies and EDM sound effects, so through multi-person cooperation an upbeat, complete song is produced. The light columns visualise the music as a supporting layer, creating lighting changes based on the intensity of the music, thus enhancing the atmosphere and giving participants a richer experience.
The project focuses on the domain of collaborative creation, in which users produce a novel result through cooperation. The diversity of participants and the cooperative interaction can lead to a variety of interesting outcomes. The installation is designed to respond with music and lighting effects, providing corresponding feedback to users' actions in order to create an attractive, immersive, cooperative, and fulfilling experience. During the interaction, the device displays different lighting and music effects according to different user behaviours. Given multi-person cooperation, the musical result is full of fun and uncertainty.
The installation is expected to strengthen the connections between music and light, and between people and music, and to increase human connection across the boundaries between strangers through a non-touch interaction mode suited to social-distancing conditions. The exhibit will run from 4pm to 8pm as part of the UQ Bloom Festival, so the transition from daylight to darkness will also shape how people use the device. It is worth mentioning that the final product is installed in a natural space surrounded by trees. The luminous pillars hanging from the branches read as part of the natural environment, giving users a more vivid feeling of being in a jungle and building a connection between humans and nature.
Technical Development & Construction
The whole installation consists of two parts, the Music Synthesis System and the Music Visualisation System, each controlled by its own Arduino Uno board. The two systems are connected by the sound sensor, so that they work at the same time and together create the installation's interactive environment.
• Music Synthesis System
This system can be divided into a hardware part and a software part. The hardware part realises the physical input of non-touch user behaviours; the software part realises the synthesis of music and the communication between the hardware and the music-synthesis software. The hardware consists of an Arduino Uno board and six ultrasonic sensors; we use Ableton Live to synthesise the music, and Max for Live to communicate between the hardware and the software.
When a user interacts with the device, the ultrasonic sensor converts the detected distance data into ASCII code and sends it to the computer's serial port. Max for Live reads the ASCII code from the serial port, converts it into binary data, and sends it to Ableton's MIDI input, so that data changes control each audio track. Within Ableton, we use our own pre-designed music clips to maintain aesthetic control during the co-creation experience.
• Music Visualisation System
This system visualises the music that users improvise together. We use changing LED lighting effects to display the rhythm of the music, hoping to make the installation more attractive and to create an immersive interactive experience. Within this system, the sound sensor picks up the music output by Ableton, and the Arduino control board converts the volume changes received by the sensor into different signals, which it transmits to the LED strips' pins.
Final Statement
As the picture of our installation shows, compared with the previous version we added a circle of LED strips on the ground whose lighting effects change with the music. After testing, we found that the ground was too dark at night, which hindered users' movement; the ground lights also mark the installation's range and help protect the columns.
In terms of user experience, our initial goal was to change the music and lighting effects through cooperation between strangers, achieving enchanting and open-ended results. Judging from the exhibition, we largely achieved this. Users were attracted by the installation and interested in using it. Each user stayed long enough to explore how it worked and to turn on different sound effects. People were happy to work together to explore different combinations of music and lighting, and after using the device they had strengthened their connections with strangers and enjoyed themselves. Because our sensors are mounted at different heights, some users jumped up to activate them, which was in line with our expectations.
The exhibition also revealed some shortcomings of the installation. First, although we marked the sensors with reflective tape, their locations were still not conspicuous at night, which made it difficult for some users to discover how to change the music in the first place. Second, because of the cooperative design, the device produced multiple sound effects at the same time, making it difficult for users to tell which sounds were caused by their own actions. In the future we would therefore add some feedback to the sensor on each column: for example, when a user activates a sensor, it could emit a bright light, both indicating the sensor's position and letting the user perceive that they have activated it.
While observing users, we found some interesting phenomena. For example, some users could touch two sensors at the same time to produce two sound effects, which we had not anticipated. In addition, children enjoyed the installation, but because its height is set for adults, parents needed to lift their children up to trigger the sound effects. These observations gave us some new inspiration.
It is worth mentioning that the external environment affected the installation. Unlike the quiet environment during testing, the event itself was very noisy, and the sound of our device was completely masked by the surrounding music. In addition, the installation was too concealed, making it hard for passers-by to notice during the day. Because of its location, many people passing by noticed the device but rarely approached to use it, perhaps because it was too far from the main path. After we started distributing brochures, however, this changed.