Team Konpanions

A personal tidying and decluttering companion for young adults and students.

Intended Concept

UmNum is a personal tidying and decluttering companion for young adults and students. When users are in a rush and do not have time to put their items away, they can quickly store their things inside his bin rather than placing them on the floor, bed, or another convenient surface where they would create clutter. When a user removes items from UmNum, he provides a prompt or tip to help them declutter thoughtfully.

UmNum has various personalities and expresses emotion through voice, light, sound, and facial expressions. UmNum is predisposed to be happy when your room is tidy, which makes his shift into negative emotions much more impactful when he gets upset because your room is messy. He also becomes joyful whenever an item is added to him. He may remain happy while only a few objects are stored in him, but his emotional state begins to deteriorate if items are left inside him for a long time, as he wants to be emptied. This is intended to reflect the mental and physical health outcomes of living in clutter.

Users are intended to feel enticed to declutter their space through the bond they develop with UmNum. The intention was that the user would bond with their decluttering companion, and that when UmNum displayed anger, sadness, or other semi-negative emotions outside his normal happy state, the user would feel compelled to resolve whatever issue was causing the change.

While caring for their companion, users are passively learning healthy decluttering habits. The intention is that these habits will cement themselves, and that decluttering their personal space will become a regular occurrence.

Image depicting the main stressors in young adults caused by a lack of motivation
Image depicting the intended conceptual flow of UmNum

Technical Development

The development of UmNum passed through several stages, or breakpoints, in which different technologies were used to achieve certain effects or interactions. These breakpoints can be grouped into the following three stages:

  1. Conceptual Stage
  2. Prototype Stage
  3. Exhibit Stage

In the conceptual stage, UmNum was described and illustrated as a robotic companion who would move around your space, stealing your clothes and items from the floor and using them as ammunition to verbally berate you for your cluttered space. The technology proposed for this design came from a place of very limited understanding, as this was in the very early stages of UmNum's development. The concept required some method of movement, a way to 'steal' your items, and a means of interacting with the user through some form of speech. The group proposed using several servo motors and wheels for movement, hydraulics for the trap-door and scoop that would steal items, and a speaker with voice recognition powered by an on-board chip.

In the prototype stage, the concept of UmNum changed due to user feedback and technical limitations. It was no longer the intention to have him 'steal' the user's items, as users might become overly frustrated with their companion and stop using him. UmNum instead became a more helpful companion who could store items when the user is in a rush, as opposed to the user tossing things on the floor. The prototype was built around a load cell weight sensor to measure the amount of things stored in his head, two NeoPixel LED rings to display visual changes in emotion, and a speaker module to play voice lines to the user. This was all controlled via an Arduino Uno located beneath the base of the load cell weight sensor, with a simple cardboard tube holding the makeshift "stuff". In this prototype, when an item was removed from UmNum, he would ask whether the user wanted to "keep", "donate", or "discard" the item, and why. The team had to act as UmNum's brain, dictating via keypress which option the user chose and whether their reason was good enough.

In the final exhibit stage, major changes were made from the prototype. The core functionality of the load cell weight sensor acting as a scale remained; however, instead of detecting the number of objects inside UmNum by having a fixed object placed into him, UmNum now responds dynamically to objects of varying weights. The NeoPixel LED rings were deemed inadequate for showing emotional change, so they were replaced, first by a Lenovo tablet with hand-illustrated eye animations, alongside a body luminosity colour change using NeoPixel LED strips around UmNum's body. The tablet proved too difficult to use, so it was promptly swapped for a team member's phone, which joined a video call showing a shared screen from another team member's laptop (displaying the website hosting these eye animations). The speaker module was upgraded with voice recognition, so UmNum could process speech from the user and respond with an appropriate voice line. Finally, UmNum has an assortment of voice lines for each set interaction, randomised to provide a unique experience for each user.

Form & Function

Seen below are annotated diagrams explaining both the form and function of UmNum at his current, close-to-exhibition-ready stage.

Post-Exhibit | Technical Description

UmNum comprises several 'moving parts', which all come together to provide the baseline experience of forming healthy habits around decluttering a user's space. The most prominent component, and arguably the most integral to his design, is the load cell weight sensor. It works by measuring the bend of a metal bar under the force applied from above; this signal is then amplified by a load-cell amplifier and sent to one of our two Arduino Unos as a number between 0 and 250.
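The conversion from a raw amplifier reading to the 0–250 range can be sketched as a simple linear mapping. This is an illustrative Python sketch, not the team's actual Arduino code, and the calibration constants (`raw_empty`, `raw_full`) are hypothetical values standing in for whatever the real hardware reported when empty and fully loaded.

```python
def scale_reading(raw, raw_empty=8_400, raw_full=122_000, out_max=250):
    """Linearly map a raw load-cell amplifier count to a 0-250 value.

    raw_empty / raw_full are hypothetical calibration constants: the counts
    observed with the bin empty and at its maximum expected load.
    """
    span = raw_full - raw_empty
    value = (raw - raw_empty) * out_max / span
    # Clamp to the output range so sensor noise never escapes 0-250.
    return max(0, min(out_max, round(value)))
```

In practice the same clamping matters on the Arduino side too, since vibration and drift can push raw counts slightly below the empty-bin calibration point.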

As mentioned above, the team used two Arduino Unos connected to one another through pin-to-pin communication. This allowed the use of two laptops, so that one laptop could write to the serial monitor while the other read from it at the same time (an Arduino Uno board only has one active USB serial port). The load cell values were processed in the Arduino code and promptly translated into UmNum's new emotional state based on whether an item was added or removed. The state was expressed through an LED strip together with a speaker and an SD card reader module: the LED strip conveyed a colour change as visual stimulation for the user, while the speaker played files from the attached SD card. These files contained voice lines created with text-to-speech software, with the pitch artificially increased.
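The added/removed decision described above can be sketched as comparing successive scale readings against a noise threshold. This is a simplified Python illustration of that logic (the real version ran as Arduino C++); the threshold value and the emotion names are assumptions for the sketch, not the team's actual constants.

```python
THRESHOLD = 5  # assumed minimum change, in scale units, treated as a real item


def classify_change(previous, current, threshold=THRESHOLD):
    """Return 'added', 'removed', or None for noise-level changes."""
    delta = current - previous
    if delta > threshold:
        return "added"
    if delta < -threshold:
        return "removed"
    return None


def emotional_response(event):
    """Map a scale event to the cue played through the LEDs and speaker."""
    return {
        "added": "joyful",    # UmNum is initially happy to receive an item
        "removed": "prompt",  # removal triggers the keep/donate/discard prompt
    }.get(event, "idle")
```

Debouncing like this is what lets the exhibit version respond to objects of any weight rather than one fixed test object.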

After an emotional change or the removal of an item from UmNum, the user would be prompted (with a voice line) to decide what they wanted to do with their item: store it, discard it, and so on. This section used voice recognition through Windows' built-in speech software, combined with an external C# script run through Visual Studio that listened for specific words spoken into a microphone. The script matched phrases and individual hard-coded keywords and sent a corresponding keypress to the serial monitor of one of the Arduino Unos. As might be expected, these keypresses triggered a voice line and an emotional change based on the current state of the process.
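The keyword-to-keypress bridge can be sketched as a lookup over recognised words. The real version was a C# script using Windows speech recognition; this Python sketch is purely illustrative, and the specific key letters are hypothetical, not the team's actual mapping.

```python
# Hypothetical keyword -> serial keypress mapping (illustrative only).
KEYWORD_KEYS = {
    "keep": "k",
    "donate": "d",
    "discard": "x",
}


def keypress_for(phrase):
    """Return the keypress for the first known keyword in a spoken phrase.

    Returns None when no keyword is heard, so the caller can keep listening.
    """
    for word in phrase.lower().split():
        if word in KEYWORD_KEYS:
            return KEYWORD_KEYS[word]
    return None
```

Matching on individual keywords rather than full sentences is what made the interaction robust to the varied phrasing of exhibit visitors.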

This was paired with object recognition software coded to detect a specific colour within a space and, if detected, send a keypress to the serial monitor. The colour chosen was red, with a Python script using the OpenCV library to capture and analyse an image every 1/100th of a second, checking for any clusters of red in the frame. This wrote to the serial monitor constantly, allowing UmNum to enter an angry state when the user's space was neglected. Unfortunately, the changing light within the exhibit interfered with the detection, and with no quick fix available, the script was disabled for the second half of the exhibit for the sake of a working demonstration.
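The red-detection check can be sketched as a per-pixel threshold over an image array. The team's script used OpenCV on live camera frames; in this simplified sketch a NumPy array stands in for a captured frame, and the channel thresholds and clutter fraction are illustrative assumptions (a plain RGB threshold like this is exactly what varying exhibit lighting tends to break).

```python
import numpy as np


def red_fraction(image_rgb, r_min=150, gb_max=80):
    """Fraction of pixels that are strongly red (high R, low G and B)."""
    r = image_rgb[:, :, 0]
    g = image_rgb[:, :, 1]
    b = image_rgb[:, :, 2]
    mask = (r >= r_min) & (g <= gb_max) & (b <= gb_max)
    return float(mask.mean())


def clutter_detected(image_rgb, min_fraction=0.02):
    """True when enough red is visible to count as clutter on the floor."""
    return red_fraction(image_rgb) >= min_fraction
```

A more light-robust approach would threshold in HSV space (e.g. with OpenCV's `cv2.cvtColor` and `cv2.inRange`), since hue is less sensitive to brightness changes than raw RGB values.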

Finally, UmNum's physical form took the shape of a short, laundry-hamper-like standing 'bin', with velvet fabric forming the shell, or outer layer. Chicken wire and string were used to form the skeleton, with several different types of cardboard forming the core, or inner-bin, where the load-cell platform sat.

Post-Exhibit | Final Statement

The exhibit was overall considered a success by the team. Not only did the project work (mostly) as intended, but some of the people we demoed it to were really excited to see what we'd produced. Many members of the audience said they could use something like UmNum in their own lives, while others stated that, although they wouldn't use it themselves, it was certainly something they could see in every young person's home in future.

As for the outcomes of the exhibit, there are some obvious, and some not-so-obvious, results stemming from UmNum's public debut. Firstly, the elephant in the room: Team Konpanions placed third in the people's choice awards with close to 20 votes from those attending the exhibition! This is a massive achievement, and each member is very proud of the effort it took to get the team this far. Another clear outcome was that UmNum's proof of concept was considered a success not only by team members but also by those who viewed the exhibit on the day. This suggests that a decluttering companion could be useful to our target audience both now and in future, once further developments have been made. The less obvious outcomes included insights the team can use to improve the design, along with the general public's reactions to, and usability preferences for, UmNum.

Overall, the team is ecstatic that UmNum came together in the end, despite all of his attempts to stifle his own debut. Each member worked extremely hard on the proof of concept, and the exhibition was evidence enough not only that UmNum is a viable concept, but that we as a team could make something seen as so successful. The public response was overwhelmingly positive, with many general and technical questions asked and answered by attendees from similar industries and fields on the night.

As for what's next for this joyful companion of ours: the team identified several weak points from the exhibition, as well as some general insights that could factor into future designs. The speaker we used was rather quiet on the night, particularly amid the abundance of chatter and noise; in future it would be replaced with a Bluetooth speaker, or perhaps supplemented with subtitles for clarity. Another weak point was the object recognition script, meant to trigger UmNum's 'angry' emotion when clutter was detected on the ground, which was considered to be doing more harm than good on the night. A combination of scattered light and flaws in the code logic meant the interaction behaved incorrectly while the script was running; in future this would be polished to work correctly, so users could have the full intended experience.