Music Forest

by Maximum Chill

Step. Sound. Sync.

Our project redefines music as an interactive, multi-sensory experience. Using pressure sensors, dynamic lighting, and feedback through touch, visuals, and sound, we let users see, feel, and shape music. The concept revolves around data physicalisation, enhancing rhythm perception through intuitive interaction, and augmented collaboration, enabling real-time co-creation. Even without musical experience, users can experiment, connect, and create together. Our goal is to make music-making more accessible, immersive, and social.

1. Physical Form and Material

The interactive sound device consists of two parts and forms a circular platform for multiple participants. The upper part is a central dome with six flexible sensors hanging down; the lower part is six vertically arranged transparent cylinders, each connected to a manual air pump at its base.

  • Cylinder part: Handmade from transparent PVC, with 3D-printed caps at the top and bottom that make it easy to mount the sensors and secure the air-pump connections.
  • Internal elements: A glowing ping-pong ball sits inside each cylinder and is pushed up and down by the air pump.
  • Layout form: The six cylinders are evenly arranged around the central device in a circular artificial-grass area, forming an immersive installation with a clear visual centre and room for interaction.

2. Technology Hardware

  • Bend Sensor × 6: Installed on the top dome; each responds to the bending angle of a hand gesture and is used to trigger different tracks or modulate music effects.
  • Distance sensor × 6: Installed at the top of each cylinder to detect the vertical position of the glowing ping-pong ball in real time and control the triggering of different pitches or parts.
  • LED light strip × 12: Six strips are embedded in the cylinders, and six run along the connections between the cylinders and the air pumps. They light up in sync with the track status, and their colour changes with the part (treble, midrange, bass) to enhance the visual effect.
  • Air pump × 6: Connected to each cylinder; the user moves the ping-pong ball inside the cylinder by stepping on the pump.
  • Arduino × 2 (Leonardo & Uno): Used as the main controllers to collect sensor data and drive the sound and light feedback.
  • Power supply system: An external regulated power supply powers the LEDs and Arduino boards; the air pumps are foot-driven and need no power.
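Because each distance sensor sits at the top of its cylinder and measures downward to the ball, the ball's height above the base is the cylinder length minus the sensor reading. A minimal sketch of that conversion (the cylinder length and function name are illustrative assumptions, not the installation's actual values):

```python
# Convert a top-mounted distance-sensor reading into ball height.
# CYLINDER_LENGTH_CM is an assumed dimension, not the real installation's.
CYLINDER_LENGTH_CM = 100.0

def ball_height(distance_reading_cm: float) -> float:
    """The sensor measures from the cylinder top down to the ball, so
    height above the base = cylinder length - reading (clamped to range)."""
    height = CYLINDER_LENGTH_CM - distance_reading_cm
    return max(0.0, min(CYLINDER_LENGTH_CM, height))

# Example: a reading of 30 cm means the ball is 70 cm up the tube.
print(ball_height(30.0))  # 70.0
```

Clamping guards against spurious readings when the ball briefly leaves the sensor's range.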

3. Technology Software

  • Sensor data processing: Distance-sensor data is mapped to pitch (or part), and the audio player is triggered to play specific notes according to the ball's real-time height. Bend Sensor data is mapped to different musical changes according to the degree of bending.
  • Sound control logic (Ableton Live + Leonardo): Different sensor inputs correspond to different parts (bass, middle, and high), forming a coordinated performance.
  • LED synchronous feedback: Each cylinder's LED strip changes colour in sync with its part status, such as red (bass), blue (middle), and green (high).
  • Multi-user concurrent logic: Multiple users can operate different air pumps and bend sensors at the same time; the system processes each channel's input in parallel to achieve a concerto effect.
  • Expandability: The system can communicate with a computer or mobile device over a serial port or Bluetooth to add a visual interface or record interaction data.
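To make the two mappings above concrete, here is a hedged Python sketch: ball height is quantised into a register and a MIDI-style pitch, and a raw bend-sensor reading is normalised into an effect amount. All thresholds, ranges, and function names are illustrative assumptions, not the project's actual calibration.

```python
# Illustrative mapping logic; thresholds and scales are assumptions,
# not the installation's actual calibration values.

BASS, MIDDLE, TREBLE = "bass", "middle", "treble"

def height_to_register(height: float, max_height: float = 100.0) -> str:
    """Split the cylinder into three equal zones: bass / middle / treble."""
    ratio = max(0.0, min(1.0, height / max_height))
    if ratio < 1 / 3:
        return BASS
    if ratio < 2 / 3:
        return MIDDLE
    return TREBLE

def height_to_pitch(height: float, max_height: float = 100.0,
                    low_note: int = 36, high_note: int = 84) -> int:
    """Map ball height linearly onto a MIDI-style note range."""
    ratio = max(0.0, min(1.0, height / max_height))
    return round(low_note + ratio * (high_note - low_note))

def bend_to_effect(bend_raw: int, lo: int = 200, hi: int = 800) -> float:
    """Normalise a raw bend-sensor reading into a 0..1 effect amount."""
    return max(0.0, min(1.0, (bend_raw - lo) / (hi - lo)))

print(height_to_register(10.0))  # bass
print(height_to_pitch(50.0))     # 60
print(bend_to_effect(500))       # 0.5
```

Running each channel's readings through functions like these independently is what allows the six inputs to be processed in parallel for the concerto effect.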

4. Function

  • Interactive audio function:
    • Real-time sound generation: The Bend Sensor controls melody and sound-effect changes; the air pump pushes the ping-pong ball, and the distance sensor maps its height to part and pitch.
    • Sound module control: Different sensor inputs correspond to different parts, such as drums, guitars, and melodies. For each instrument, the ball height detected by the sensor selects bass, midrange, or treble, forming a coordinated performance.
  • Visual feedback system: RGB LED lights change colour in real time according to the current part status and synchronize with the audio feedback to enhance immersion.
  • Music recording and playback: Each interaction's music clip is recorded in Ableton Live via the Leonardo (stored on an SD card or a server). After the interaction, the system automatically generates a QR code that users can scan to download their music file (in .wav or .mp3 format) as a souvenir or to share.
  • User feedback system: After scanning the code, the page guides users through a short experience-feedback form covering the following items:
    • Mood experience during use (such as excitement, calmness, confusion, etc.);
    • Ease of operation (air pump control, sensor response);
    • Clarity of function understanding;
    • Perceptual and tactile feedback;
    • Suggestions for improvement of device functions and interactions.
    Feedback will inform subsequent design iterations and optimization, and help the team evaluate the quality of the user experience.
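The per-channel part assignment and the register-to-colour mapping described in sections 3 and 4 can be sketched together as a simple lookup. The instrument-to-channel assignments and exact RGB values here are illustrative assumptions, not the installation's final configuration:

```python
# Sketch of the per-channel part and LED-colour logic described above.
# Instrument assignments and RGB values are illustrative assumptions.

CHANNEL_INSTRUMENT = {0: "drums", 1: "guitar", 2: "melody",
                      3: "drums", 4: "guitar", 5: "melody"}

REGISTER_COLOUR = {"bass": (255, 0, 0),     # red
                   "middle": (0, 0, 255),   # blue
                   "treble": (0, 255, 0)}   # green

def channel_state(channel: int, height: float, max_height: float = 100.0):
    """Resolve one cylinder's instrument, register, and LED colour
    from the ball height on that channel."""
    ratio = max(0.0, min(1.0, height / max_height))
    register = "bass" if ratio < 1 / 3 else "middle" if ratio < 2 / 3 else "treble"
    return CHANNEL_INSTRUMENT[channel], register, REGISTER_COLOUR[register]

print(channel_state(1, 80.0))  # ('guitar', 'treble', (0, 255, 0))
```

Keeping the mapping in tables like these makes it easy to re-assign instruments or colours between sessions without touching the control logic.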