Marine Learning

Team Zooju

Bringing immersive, hands-on marine education to the classroom

Marine Learning is an immersive educational system created to support and enhance the delivery of the marine biology curriculum for middle school children in an everyday classroom of the future. The concept aims to deliver a playful and meaningful learning experience that simulates hands-on experience whilst providing access to a vast range of marine life. By interacting with the system, users can engage with marine ecosystems and conservation through both exploratory and guided learning.

Marine Learning is an interactive system comprising two modules: the aquarium and the study desks. The aquarium module is a large malleable interface intended for collective study and demonstrations (typically led by a teacher or demonstrator). This component displays highly realistic visuals of a live marine ecosystem on which users can run simulations, such as overfishing, an oil spill or rising ocean temperatures. Students can also examine an organism in the aquarium ecosystem more closely by tapping it on the screen. This causes the organism to protrude from the screen as a self-contained 3D object that recreates the feel and movement of the live animal. The student can then physically grab this organism and take it to the second component, the study desk, for individual study and exploration.

The study desk component is a smaller representation of the aquarium, an isolated ecosystem built on the same malleable interface. Students add organisms from the aquarium by dropping them onto the desk surface, prompting the desk interface to absorb them. At the study desks, students engage in smaller-scale, focused exploration and study through physical interactions, such as pairing an organism with others to explore relationships within the desk's ecosystem, or peeling back its layers to reveal its anatomy. Throughout these interactions, educational content is delivered through visual, auditory and tactile feedback.

Technical Description

Below is a description of the materials, hardware and software used to build the prototype of the Marine Learning concept, covering both the wall aquarium and the study desk.

Wall Aquarium

The frame of the touch wall was constructed from non-structural pine held together by gold angle brackets, with a “screen” area in a 4:3 ratio. Six 2.2” flex sensors were slotted into corflute, which was stapled to the frame and then covered by stapling sheer white fabric to the back. The flex sensors were connected to an Arduino UNO. Unity3D was used to create an underwater scene, which was back-projected onto the screen with an EPSON short-throw projector.
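The exact wiring and sensing code are beyond the scope of this description, but as a rough sketch, the Arduino firmware could look like the following, assuming each flex sensor sits in a simple voltage divider feeding one of the UNO's six analog inputs and streams its readings to Unity over USB serial (the pin choices, baud rate and message format are illustrative, not our exact values):

    // Minimal sketch: read six flex sensors and stream the raw
    // values to Unity as one comma-separated line per update,
    // e.g. "512,498,730,501,505,499".
    const int NUM_SENSORS = 6;
    const int SENSOR_PINS[NUM_SENSORS] = {A0, A1, A2, A3, A4, A5};

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      for (int i = 0; i < NUM_SENSORS; i++) {
        Serial.print(analogRead(SENSOR_PINS[i]));
        Serial.print(i < NUM_SENSORS - 1 ? "," : "\n");
      }
      delay(50); // ~20 updates per second
    }

On the Unity side, a bend registers as a dip (or rise, depending on the divider) in one sensor's reading relative to its resting value, which can be mapped to a touch location on the fabric screen.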

Study Desk

The octagonal study desk enclosure was constructed from 3D-printed PLA+ plastic corners and connectors, and clear cast acrylic panels. NeoPixel strips were run underneath each panel to produce the blue LED lighting, controlled by an Arduino UNO running a simple script. The enclosure was decorated with purchased coral models, plastic aquarium rocks and flora, larger rocky structures fabricated from foam, and several bathroom mats fashioned into sea anemones and miscellaneous corals. Finally, a unique A4 paper-printed fiducial AR target marker was stuck to the inside of every second panel, with an additional marker centred on the floor of the enclosure, for a total of five markers.
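The lighting script itself is minimal. A sketch along the following lines, using Adafruit's NeoPixel library, would reproduce the effect (the data pin, LED count and colour values are assumptions for illustration):

    #include <Adafruit_NeoPixel.h>

    const int LED_PIN  = 6;   // illustrative data pin
    const int NUM_LEDS = 60;  // illustrative strip length

    Adafruit_NeoPixel strip(NUM_LEDS, LED_PIN, NEO_GRB + NEO_KHZ800);

    void setup() {
      strip.begin();
      // Fill the strip with a steady aquarium blue.
      for (int i = 0; i < NUM_LEDS; i++) {
        strip.setPixelColor(i, strip.Color(0, 60, 180));
      }
      strip.show();
    }

    void loop() {
      // Static lighting; nothing to update.
    }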

The physical clownfish and butterfly fish were both constructed from purchased toys that were cut open and modified to insert and attach hardware and mechanisms. Each fish carried a 3D-printed plastic tail attached to a servomotor, which was connected to an Adafruit HUZZAH ESP32 Feather microcontroller programmed to communicate with the Unity AR app via Bluetooth and control the position of the plastic tail, providing haptic feedback in the form of variable thrashing. This tail module was housed in a 3D-printed plastic frame, which kept the motor in a static position and mounted an AR tracking target on top of the fish. The AR target consisted of a small 3D-printed cube with two holes in the base for interfacing with the plastic frame, and custom Vuforia image targets stuck to the five remaining faces of the cube.
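As a hedged sketch of this firmware, the ESP32 could receive a thrash-intensity byte from the Unity app over Bluetooth serial and sweep the servo accordingly. The sketch below assumes the BluetoothSerial and ESP32Servo Arduino libraries; the servo pin, device name and one-byte protocol are illustrative stand-ins rather than our exact implementation:

    #include <BluetoothSerial.h>
    #include <ESP32Servo.h>  // assumed servo library for the ESP32

    BluetoothSerial SerialBT;
    Servo tail;

    const int SERVO_PIN = 13;  // illustrative pin choice
    int thrashSpeed = 0;       // 0 = still, 255 = fastest
    int angle = 90;
    int dir = 1;

    void setup() {
      SerialBT.begin("Clownfish");  // hypothetical device name
      tail.attach(SERVO_PIN);
    }

    void loop() {
      // The Unity app sends a single byte encoding the desired
      // thrash intensity (protocol details are an assumption).
      if (SerialBT.available()) {
        thrashSpeed = SerialBT.read();
      }
      if (thrashSpeed > 0) {
        // Sweep the tail between 60 and 120 degrees; a higher
        // intensity takes bigger steps, so the tail beats faster.
        angle += dir * (1 + thrashSpeed / 32);
        angle = constrain(angle, 60, 120);
        if (angle == 120 || angle == 60) dir = -dir;
        tail.write(angle);
      }
      delay(15);
    }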

The AR overlays and the behaviour of virtual objects were implemented in a Unity application built with the Vuforia AR development engine and deployed to a Samsung smartphone mounted in a Google Daydream VR headset. Vuforia uses the AR targets to track the positions of the study desk enclosure and the physical fish, placing a static water overlay and animated fish overlays respectively. The application was scripted to communicate with the ESP32 microcontrollers to control the tail-thrashing speed of each fish, and to detect the proximity of the fish to each other and to specific objects in the study desk (e.g. sea anemones, other corals), triggering events accordingly, such as playing audio feedback or executing an attack animation.
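The proximity detection amounts to a per-frame distance check between the tracked positions of two objects. The standalone C++ sketch below illustrates the same check outside Unity (the Unity version would compare tracked transforms instead); the positions and the 0.15 m trigger radius are invented for the example:

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    float distanceBetween(Vec3 a, Vec3 b) {
      float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
      return std::sqrt(dx * dx + dy * dy + dz * dz);
    }

    int main() {
      Vec3 clownfish = {0.00f, 0.05f, 0.10f};  // positions from AR tracking
      Vec3 anemone   = {0.05f, 0.05f, 0.12f};
      const float TRIGGER_RADIUS = 0.15f;      // metres, assumed

      if (distanceBetween(clownfish, anemone) < TRIGGER_RADIUS) {
        // In the app, this is where audio feedback or an attack
        // animation would play, and where the ESP32 would be
        // messaged to change the tail-thrashing speed.
        std::printf("Trigger: clownfish entered anemone zone\n");
      }
      return 0;
    }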

Final Statement

For our team, the exhibition was largely a success and a reward for our work over the semester. The prototype was mostly functional throughout the exhibition, and the minor faults that did occur, such as projections being washed out by sunlight, flex sensors registering false bends and audio clips cutting short, did not significantly detract from the intended experience. Considering we had a large crowd in our marquee at most times and no downtime during visitor hours, the user interaction with our prototype went better than expected. However, the limitation of only one visitor being able to interact with the study desk at a time did affect us: it was difficult to entertain a group of visitors while they all waited eagerly to see what was happening in the AR view. To accommodate this, we had planned to connect the phone running the AR application to an external monitor so that onlookers could see what the headset user was experiencing. On the day, however, the connection proved unstable and further drained the phone's battery. Had we tested this earlier, we could have screen-recorded the interactions and displayed the video on the monitor as a fallback.

Our greatest challenge on exhibition day was setting up on time and managing the battery consumption of all our components. A handful of times we had to politely ask visitors to visit other teams while we prepared the prototype, for example when creating new Unity builds or letting the projector rest. We should have managed our time better to avoid this, but fortunately we improved on it for the evening event.

Although we faced some minor hurdles, the public response to our prototype was overwhelming and largely positive. We were surprised to find that each visitor had a unique interaction with the prototype, and we observed a wide range of emotions throughout the day. Some users laughed or exclaimed when they first saw the AR fish animation; others threw the physical fish toy away, startled by the AR butterfly fish and clownfish interaction; and younger visitors looked on in awe as they petted the fish animations projected onto the canvas of our aquarium. A number of visitors commented on the potential of our concept and expressed an interest in using our work. One visitor mentioned that they were a primary educator and loved the idea of bringing a tool such as ours into the classroom, as it is far removed from current modes of delivery.

The next steps for our project would be to prototype the other interactions in our intended concept to create a more holistic teaching tool, such as running simulations on the aquarium and exploring food webs or anatomy at the study desks. After further iteration and user testing, we would hope to deploy the concept in a naturalistic environment such as a classroom.