Dev Blogs

Showcase Final Touches
April 19, 2024

This week we have a showcase for our EECS 440 class at the University of Michigan. We have spent most of the week polishing parts of the project so that it is ready for users this evening. A lot of that time went into creating more tasks for users to do in the ocean. We have also spent a lot of time improving player guidance. Prior to this week, there was very little guidance implemented for players, and the team relied heavily on one of us being there to explain the controls to whoever was playtesting at the time. To solve this, we implemented widgets with diagrams showing what each button on the controllers does.


The new tasks added this week are trash collection and a sunken ship. The trash collection task is about finding pieces of plastic bags in the ocean to pick up. The sunken ship task is about taking a picture at the sunken ship. Some other smaller changes were also made this week: when you move along the ocean floor, sand now gets kicked up around you, and there is a dive computer the user can bring up to view stats like their dive time, current depth, and movement speed.
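
The post doesn't include the dive computer's actual code, but for a rough idea of the bookkeeping behind those stats, here is a minimal, engine-agnostic C++ sketch. The names, the plain Vec3 struct, and the idea of measuring depth as distance below a surface height are illustrative assumptions, not the project's implementation.

```
#include <algorithm>
#include <chrono>
#include <cmath>

// Hypothetical sketch of the stats a dive computer widget could display.
struct Vec3 { float x, y, z; };

class DiveComputer {
public:
    explicit DiveComputer(float surfaceY)
        : surfaceY_(surfaceY), diveStart_(std::chrono::steady_clock::now()) {}

    // Elapsed dive time in seconds since the dive began.
    double DiveTimeSeconds() const {
        return std::chrono::duration<double>(
            std::chrono::steady_clock::now() - diveStart_).count();
    }

    // Depth is just how far the player sits below the water surface.
    float CurrentDepth(const Vec3& playerPos) const {
        return std::max(0.0f, surfaceY_ - playerPos.y);
    }

    // Speed from the distance covered since the previous frame.
    float CurrentSpeed(const Vec3& prevPos, const Vec3& pos, float dt) const {
        float dx = pos.x - prevPos.x, dy = pos.y - prevPos.y, dz = pos.z - prevPos.z;
        return dt > 0.0f ? std::sqrt(dx * dx + dy * dy + dz * dz) / dt : 0.0f;
    }

private:
    float surfaceY_;  // world-space height of the water surface
    std::chrono::steady_clock::time_point diveStart_;
};
```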

We Need To Go Deeper
April 15, 2024

This week the team put in a lot of effort to rework and improve the initial underwater experience we started on last week. First of all, the simulation now features new sounds for the waves on the beach, breathing underwater, and air bubbles rising through the water. The other main areas of focus were reworking player movement, reworking underwater tasks, and expanding the depths of the ocean.


The player now has its own collider and physics, and as a result the movement mechanics have also been reworked. Underwater movement now more accurately reflects what it would be like to move around with all of the equipment on. The player also collides with objects underwater, so the player will no longer be able to clip through the ocean floor.
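
As a sketch of one common way to get that heavy, equipment-laden feel, the per-frame velocity update below applies a gentle thrust, bleeds speed off with a drag term, and clamps the top speed. The constants and names are placeholders for illustration, not the project's actual movement code.

```
#include <algorithm>
#include <cmath>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

// One possible underwater velocity update: sluggish thrust, strong drag.
Vec3 UpdateUnderwaterVelocity(Vec3 velocity, Vec3 thrustDir, float dt) {
    const float kThrustAccel = 2.0f;  // m/s^2, gentle acceleration from input
    const float kLinearDrag  = 1.5f;  // 1/s, water resistance
    const float kMaxSpeed    = 1.2f;  // m/s, roughly a swimming pace

    velocity = velocity + thrustDir * (kThrustAccel * dt);            // stick input
    velocity = velocity * std::max(0.0f, 1.0f - kLinearDrag * dt);    // drag

    float speed = std::sqrt(velocity.x * velocity.x +
                            velocity.y * velocity.y +
                            velocity.z * velocity.z);
    if (speed > kMaxSpeed)                                            // clamp top speed
        velocity = velocity * (kMaxSpeed / speed);
    return velocity;  // collisions with the floor are left to the physics engine
}
```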


The underwater tasks also received a major overhaul. Tasks that require the player to go to a specific place or collect objects now have exclamation points over the target locations and objects to help guide the player. There is also a wider variety of tasks to complete: a new task for collecting trash caught in a patch of seaweed and another for taking a picture at a shipwreck.
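
For a rough picture of how a marker like that could be placed each frame, the sketch below hovers it above the target and bobs it so it stays noticeable. The hover and bob values are guesses for illustration, not the project's numbers.

```
#include <cmath>

struct Vec3 { float x, y, z; };

// Position an objective marker above its target with a gentle bob.
Vec3 MarkerPosition(const Vec3& targetPos, float timeSeconds) {
    const float kHoverHeight  = 0.75f;  // metres above the target object
    const float kBobAmplitude = 0.1f;   // size of the up/down bob
    const float kBobSpeed     = 2.0f;   // radians per second

    Vec3 pos = targetPos;
    pos.y += kHoverHeight + kBobAmplitude * std::sin(kBobSpeed * timeSeconds);
    return pos;  // the marker mesh would also be billboarded toward the player
}
```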


Lastly, we have begun work on implementing a deeper section of ocean to help simulate the difference between diving in shallow water and deep water. Most of the effects for deep water diving are not currently implemented and are a major task for the upcoming week.

Direction Change
April 7, 2024

This week we decided to pivot and change the direction of the project. We are no longer going to be making a VR exercise platform; instead, we are making an application to teach the fundamentals of scuba diving. The main idea behind the application is that we will teach users the basics of communication and movement while scuba diving, then give them common tasks that they will run into while diving in order to test their knowledge of the basics. We are doing this to make learning how to scuba dive more accessible to people who don't want to travel and buy a lot of equipment just to try scuba diving out.


This week we started to create a new tutorial level where the user will learn how to move and communicate while scuba diving. We are currently in the process of creating hand recognition software to recognize when the user has successfully made a communication gesture. This software is essential because the application is meant for one person, so there is no other person there to interpret the sign. The current tutorial level consists of an ocean-like environment free of obstacles for the user to move around in. Next week, we plan to add more decoration to make the tutorial environment feel more immersive. In addition, we plan to start creating the first couple of task levels, where the user can gain confidence in their basic scuba diving skills in practical situations.
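
As a sketch of one way such recognition could work, the hand tracking's per-finger curl values can be compared against a template for each dive signal. Everything below (the curl representation, the tolerance, and the example "OK" template values) is an illustrative assumption rather than the project's implementation.

```
#include <array>
#include <cmath>

// Per-finger curl values from hand tracking:
// 0 = fully extended, 1 = fully curled (thumb, index, middle, ring, pinky).
using FingerCurls = std::array<float, 5>;

// A gesture matches when every finger is within tolerance of the template.
bool MatchesGesture(const FingerCurls& hand, const FingerCurls& templ,
                    float tolerance = 0.2f) {
    for (size_t i = 0; i < hand.size(); ++i)
        if (std::fabs(hand[i] - templ[i]) > tolerance)
            return false;
    return true;
}

// Example template: the "OK" signal curls the thumb and index toward each
// other while the other fingers stay extended.
const FingerCurls kOkSignal = {0.6f, 0.6f, 0.0f, 0.0f, 0.0f};
```

In practice a check like this would run every frame and require the pose to be held for a short time before the tutorial counts the sign as completed.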

Project Start
March 31, 2024

Welcome to the first dev blog of the project! In this week's dev blog we are going to cover what the project is and what features the team has started to implement.

The project is a VR athletic training companion. Do away with expensive, bulky machines and instead pop on your VR headset and get all of the same training in one spot. The main focus of this project is to replace reaction, coordination, and speed workouts that don't require weights. These types of workouts are easy to recreate in VR and give the team a high degree of customization.

The first features the team started working on are a button reaction workout, a juggling workout, a main hub, and a color blind mode. We want the user to be able to navigate easily between workouts. To accomplish this, the team is developing a main hub from which all of the workouts can be accessed, so that the user can get a sense of what the different workouts are and pick whichever one they want to try out.

[Image: a button reaction training machine (image source)]
The button reaction workout is the first workout the team has focused on implementing and was the core inspiration for the project. It is inspired by machines like the one pictured above: different buttons light up and you have to hit them. The team is recreating this by creating buttons around the user that light up and play a sound when you need to hit them.
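
For a rough picture of the core loop (light a random button, wait for a hit, record the reaction time), here is a short illustrative C++ sketch. The ReactionButton interface and the timing details are stand-ins, not the project's code.

```
#include <chrono>
#include <random>
#include <vector>

// Hypothetical stand-in for whatever the in-world VR buttons expose.
struct ReactionButton {
    bool lit = false;
    void Light() { lit = true; }   // would also trigger the sound cue
    void Reset() { lit = false; }
};

class ReactionWorkout {
public:
    explicit ReactionWorkout(size_t buttonCount)
        : buttons_(buttonCount), rng_(std::random_device{}()) {}

    // Pick and light the next target, remembering when it lit up.
    void LightRandomButton() {
        std::uniform_int_distribution<size_t> pick(0, buttons_.size() - 1);
        active_ = pick(rng_);
        buttons_[active_].Light();
        litAt_ = std::chrono::steady_clock::now();
    }

    // Called when the player's hand touches a button; returns the reaction
    // time in seconds, or -1 if the wrong (or unlit) button was hit.
    double OnButtonHit(size_t index) {
        if (index != active_ || !buttons_[index].lit) return -1.0;
        buttons_[index].Reset();
        std::chrono::duration<double> elapsed =
            std::chrono::steady_clock::now() - litAt_;
        return elapsed.count();
    }

private:
    std::vector<ReactionButton> buttons_;
    std::mt19937 rng_;
    size_t active_ = 0;
    std::chrono::steady_clock::time_point litAt_;
};
```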

Next is the juggling workout. The team decided to work on juggling as the second workout for the project to provide a different type of exercise. The main goal of the juggling workout is to improve hand-eye coordination.

The last feature the team started work on this week is a color blindness mode. We wanted to work on accessibility throughout development to ensure that the final product is accessible to everyone. The main use of the color blindness mode at the moment is for the lights in the button reaction workout.
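
One simple way to wire a mode like this into the button lights is a per-mode palette lookup, so the "hit this now" color stays distinguishable for each type of color vision. The sketch below uses made-up mode names and color values purely for illustration; it is not the project's implementation.

```
// Hypothetical per-mode palette for the active button light.
struct Color { float r, g, b; };

enum class ColorVisionMode { Default, Deuteranopia, Protanopia, Tritanopia };

// Return the color used for the currently lit button in the chosen mode.
Color ActiveButtonColor(ColorVisionMode mode) {
    switch (mode) {
        case ColorVisionMode::Deuteranopia:
        case ColorVisionMode::Protanopia:  return {0.0f, 0.45f, 0.85f};  // blue
        case ColorVisionMode::Tritanopia:  return {0.9f, 0.3f, 0.3f};    // red
        default:                           return {0.1f, 0.8f, 0.2f};    // green
    }
}
```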

Thanks for tuning in to this week's dev blog. We will be writing these posts weekly throughout the development of the project.