Lightbang is an interactive installation. People can use it at a home party, where it encourages them to interact with it, and with each other, through body language. It offers a novel take on traditional light and sound: users can interact together to produce different lighting and audio, and inviting friends to join yields richer outcomes, since the result depends on the relative positions of the users and the camera. We combined the Kinect driver, TouchDesigner, and the Arduino IDE as our software, and a Kinect camera, a laptop, speakers, and an Arduino board as our hardware, to create a novel interactive environment for a home party.
The project creates a social and fun experience driven by limb movement. As users move their bodies, the lights and sounds begin to change, and users can join their friends to produce different lights and sounds together. This encourages social interaction.
Lightbang is designed to run in a home-party environment, and its main goal is to encourage multiplayer interaction and deliver a striking experience. Beyond that, users can shape the party as they see fit, for example creating an intimate, confessional atmosphere or a cinematic techno feel through different light and sound effects. Users manipulate the lights and music through their movements, as if they had a 'Super Ability'.
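The mood presets described above could be modelled as a simple lookup from scene name to light and sound parameters. The sketch below is illustrative only: the scene names, parameter fields, and values are our assumptions, not the project's actual TouchDesigner configuration.

```python
# Illustrative sketch: mood presets mapping a scene name to light/sound settings.
# All names and values here are hypothetical examples, not the real configuration.

SCENES = {
    "confessional": {"hue": 0.08, "brightness": 0.35, "bpm": 70},   # warm, dim, slow
    "techno":       {"hue": 0.75, "brightness": 1.00, "bpm": 128},  # cool, bright, fast
}

def scene_params(name: str) -> dict:
    """Return the light/sound parameters for a named scene."""
    if name not in SCENES:
        raise KeyError(f"unknown scene: {name}")
    return SCENES[name]
```

In this scheme, switching the party's atmosphere is just a matter of swapping which preset drives the light and audio outputs.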
The project as shown on exhibition day still has many shortcomings, and a real-world deployment would differ considerably. First, the project places certain demands on its environment, so some construction was needed on the day; it is not a product that can be exhibited on its own. We added decorations, including a tarp tent, a sofa for resting, and decorative light strips, all to create a nightclub atmosphere.
The hardware consists of LED strips, a Kinect, an Arduino board, and speakers. People entering the frame are captured by the Kinect camera, and each person is tracked by the computer. When a user moves their body, the captured frame is processed on the computer and the result is sent to the Arduino via the software, so the lights and sound change according to the user's posture. When multiple people are in frame, new lights and music can also be unlocked through interaction. This encourages users to play together with their friends, creating a happy club atmosphere.
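The pipeline above (pose in, light/sound command out to the Arduino) can be sketched in miniature as follows. This is an assumption-laden illustration: we assume joint coordinates normalised to [0, 1], and the function names, effect names, and the one-byte serial protocol are invented for the example, not the project's exact TouchDesigner network or firmware.

```python
# Illustrative sketch of the pose-to-output mapping described above.
# Assumes Kinect joint heights normalised to [0, 1]; the effect names and
# the serial frame layout are hypothetical, not the project's real protocol.

def brightness_from_hand(hand_y: float) -> int:
    """Map a hand height (0 = lowest, 1 = highest) to an LED brightness byte."""
    hand_y = max(0.0, min(1.0, hand_y))  # clamp tracking noise outside the range
    return round(hand_y * 255)

def unlocked_effects(num_users: int) -> list:
    """More people in frame unlocks extra light/sound effects."""
    effects = ["solo_wave"]
    if num_users >= 2:
        effects.append("duo_chase")
    if num_users >= 3:
        effects.append("group_strobe")
    return effects

def serial_frame(brightness: int, effect_index: int) -> bytes:
    """Pack one command frame for the Arduino: [header, brightness, effect]."""
    return bytes([0xFF, brightness, effect_index])
```

In a real build, `serial_frame` output would be written to the Arduino's serial port each frame, and the firmware would decode it to drive the LED strips.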
The exhibition day was still a great success in terms of results. That day we saw the project the whole team had built over a semester running successfully. All of us were excited in that moment; we were delighted that it went as planned.
We had prepared extensively for the exhibition, but we still found a number of shortcomings once the project was actually running.
Firstly, regarding the user experience, we found that the project was quite demanding in terms of its environment: if passers-by entered the camera's view, the program could not recognise the intended user properly, which was different from what we had envisaged at the beginning. We also saw that the product imposed height and distance requirements between the camera and the user, forcing the user to move within a fixed area. The camera and sensors also introduce errors when users want very precise interaction, because the machines pick up every movement a user makes. Even so, the project offers our audience a novel, body-driven way to interact, in contrast to traditional products that must be controlled by buttons.
We found the public response to be more positive and engaged than expected. Many users enjoyed learning how the project works and then experiencing and interacting with it as they saw fit. Throughout the day we heard comments such as "Oh my God, this is so cool." It was great to see how well our product was received, and how users discovered that it was their own behaviour changing the output.
The first thing to fix if the project continues is to improve the system's recognition, so that the camera no longer needs to face an unoccupied area. Another area to fix is optimising movement capture to allow richer interactions when multiple people play at the same time. In the end, we had a memorable semester thanks to the combined efforts of everyone in our group.