Unique experiences and their sentimental value have become a major focus in our lives. People today often document the emotional value of an experience through photos and videos, but these capture only the visual dimension. We hope to give users a further dimension in which to experience and record: the olfactory one. This project gives users the opportunity to record environmental smells so that they can better retain, experience, and feel their surroundings.
The domain of this project is an extension of the senses, and our team designed a device around this theme. Users collect colours from the environment with a handheld controller, and the device generates and displays a unique animated image from the colours they select. Users can then print the image; each colour in it is mapped to a specific smell, and the device blends these smells and applies the mixture to the printed picture. The result is a picture carrying its own unique aroma. By encouraging users to sense the environment through smell, we further extend their sensory experience.
The Aromart device consists of two subsystems: the object capturer and the scent generator.
For the object capturer, users scan objects with a joystick, and the system visualizes each object's scent on the screen. The joystick is built from a 3D-printed frame, similar in shape to the HTC VIVE controller, with an Adafruit Circuit Playground inside that detects colour with its sensor. A Unity application running on a PC visualizes the scents users have captured, communicating with the joystick over a COM (serial) port. Each time users scan an object, a patch of colour appears on the PC screen.
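Before the Unity application can display a captured colour, the serial messages coming from the joystick have to be parsed and validated. A minimal sketch of that parsing step in Python (the comma-separated `R,G,B` line format and the function name are illustrative assumptions, not taken from the project's actual code):

```python
def parse_colour_line(line: str):
    """Parse one serial line from the joystick into an (r, g, b) tuple.

    Assumes (for illustration) that the Circuit Playground sends
    comma-separated 8-bit values, e.g. "182,64,210\n". Returns None
    for malformed lines so a read loop can simply skip them.
    """
    parts = line.strip().split(",")
    if len(parts) != 3:
        return None
    try:
        r, g, b = (int(p) for p in parts)
    except ValueError:
        return None
    # Reject out-of-range values (e.g. from a corrupted serial read).
    if all(0 <= v <= 255 for v in (r, g, b)):
        return (r, g, b)
    return None
```

Tolerating malformed lines matters here because serial reads can be interrupted mid-message when the user is waving the controller around.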
The scent generator starts working once users finish capturing objects. It is an acrylic box containing a Raspberry Pi 4 that controls the whole system. Through Python programs, the Pi handles buttons that trigger the system, a web camera that captures the image shown by the Unity application, and an Epson printer that prints that image. OpenCV is used to analyse the colours of the image: pixels are converted from RGB to HSV and the colours are sorted into seven groups. Meanwhile, a message is sent to the Unity application over a TCP socket (the PC and the Pi are on the same network) to trigger the processing animation on the screen, and two pixel strips controlled by the Pi indicate the different stages of processing. When users insert the printed image and press the scent button, the Pi rotates seven servos a number of times proportional to each colour's share of the image, pushing spray bottles that deposit the scents onto the inserted paper. The servos, buttons, and pixel strips are connected to the Pi through its GPIO pins, which makes them easy to control with Python libraries. Because the Pi drives a large number of servos and the board's 5 V rail can supply only limited current, a power bank is used as an external power source.
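The core of the colour analysis, grouping pixels into seven hue bins and converting each bin's share into a servo spray count, can be sketched in plain Python. This sketch uses the standard colorsys module rather than OpenCV so it is self-contained; the evenly spaced hue bins, the function names, and the spray-count mapping are illustrative assumptions rather than the project's exact scheme:

```python
import colorsys
from collections import Counter

def hue_group(rgb, groups=7):
    """Sort one (r, g, b) pixel (0-255 each) into one of `groups` hue bins.

    Mirrors the RGB-to-HSV conversion done with OpenCV in the real
    program; here the hue range [0, 1) is simply split evenly
    (an assumption for illustration).
    """
    h, _s, _v = colorsys.rgb_to_hsv(rgb[0] / 255, rgb[1] / 255, rgb[2] / 255)
    return min(int(h * groups), groups - 1)

def spray_counts(pixels, total_sprays=10, groups=7):
    """Map each colour group's share of the image to a servo spray count.

    Returns {group index: number of times that group's servo fires},
    proportional to how much of the image falls in that group.
    """
    counts = Counter(hue_group(p, groups) for p in pixels)
    n = len(pixels)
    return {g: round(total_sprays * c / n) for g, c in counts.items()}
```

On the device, each dictionary entry would then drive one of the seven GPIO-connected servos for the corresponding number of presses.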
For reusability, the programs on both Unity and the Pi run in loops and reset after each user finishes, so the system is automatically ready for the next user.