Handling Input From Perception Neuron

Perception Neuron is a motion-capture (mocap) system built around 32 small sensors called neurons. I worked on a project in a team of six, alongside one other programmer, to make a game using the Neuron in its single-arm configuration with only 9 neurons. Check out the demo video of the game below.

Handling the input from the Neuron is a simple task in both Unreal and Unity. The plugins provided by the Neuron team for both engines are easy to understand and use. However, the issue a developer might face while developing for the Neuron is the accuracy of the hardware itself.

Some developers, including me, experienced slight issues due to drift in position and a decline in accuracy after rapid movements. If a player performs a certain action repeatedly, the Neuron might start behaving erratically after some time. Recalibrating the device solves the problem; the only issue is that you don't want a player to have to stop playing just so they can recalibrate their Neuron.

My approach to this issue was to rely on input from the neurons that behave more accurately. For instance, the neurons over the thumb and the little finger sometimes seemed to lose accuracy or behave erratically even after calibration.

I logged the rotations of different bones on the index, middle, and ring fingers to see how they vary as the player closes and opens their hand. Using the bones that gave the most reliable rotation data, I toggled booleans representing whether each of these fingers is closed or open. It is a good idea to build a margin into the rotation thresholds, since the reported closing of the hand degrades over time: after a long period of use without calibration, the hand no longer closes fully in the tracking data.
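Below is a minimal sketch of that per-finger check, written for Unity in C#. The bone references, the bend axis, and the threshold values are placeholders I'm assuming for illustration; the real numbers came from logging the rotations while opening and closing the hand.

```csharp
using UnityEngine;

// Hypothetical component: reads the local bend angle of one bone per finger
// from the Neuron-driven rig and turns it into an open/closed boolean.
public class FingerStateTracker : MonoBehaviour
{
    // Proximal bone transforms of the three reliable fingers, assigned in the Inspector.
    public Transform indexBone;
    public Transform middleBone;
    public Transform ringBone;

    // Thresholds in degrees, with a gap between them so the booleans still flip
    // once the reported closing angle degrades between calibrations.
    public float closeThreshold = 60f;  // placeholder: bend at which a finger counts as closed
    public float openThreshold = 40f;   // placeholder: bend below which it counts as open again

    public bool IndexClosed { get; private set; }
    public bool MiddleClosed { get; private set; }
    public bool RingClosed { get; private set; }

    void Update()
    {
        IndexClosed = UpdateFinger(indexBone, IndexClosed);
        MiddleClosed = UpdateFinger(middleBone, MiddleClosed);
        RingClosed = UpdateFinger(ringBone, RingClosed);
    }

    // Treats the bone's local X rotation as the bend angle (the axis is an assumption;
    // logging the rotations shows which axis actually varies on your rig).
    bool UpdateFinger(Transform bone, bool wasClosed)
    {
        if (bone == null) return false;

        float bend = NormalizeAngle(bone.localEulerAngles.x);

        if (!wasClosed && bend > closeThreshold) return true;
        if (wasClosed && bend < openThreshold) return false;
        return wasClosed;
    }

    // Maps Unity's 0..360 Euler readings to -180..180 so the thresholds behave intuitively.
    static float NormalizeAngle(float angle)
    {
        return angle > 180f ? angle - 360f : angle;
    }
}
```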

Using the input from these three fingers, I toggled a boolean representing whether the hand is closed: if any two of the three fingers are closed, the boolean is set. After testing for a while, this turned out to be a good way to handle the player input, since we could get through multiple playthroughs without having to recalibrate the hardware.
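And a short sketch of that two-out-of-three rule, building on the hypothetical FingerStateTracker component from the previous sketch:

```csharp
using UnityEngine;

// Combines the three finger booleans into a single "hand closed" state.
public class HandStateTracker : MonoBehaviour
{
    public FingerStateTracker fingers;  // the hypothetical component from the previous sketch

    public bool HandClosed { get; private set; }

    void Update()
    {
        // Count how many of the three reliable fingers currently report being closed.
        int closedCount = 0;
        if (fingers.IndexClosed) closedCount++;
        if (fingers.MiddleClosed) closedCount++;
        if (fingers.RingClosed) closedCount++;

        // Any two of the three is enough, which tolerates one finger drifting
        // out of its expected rotation range without recalibration.
        HandClosed = closedCount >= 2;
    }
}
```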

Contact me here if you would like to know more about this project.


Inventing A Game Mechanic Using Augmented Reality


Afterlife is a project I really enjoyed working on in my fall semester at FIEA, UCF. It is a 2.5D platformer for Android in which a ghost is trying to reach the grave it belongs to. The player has to reach the grave before the ghost fades out of existence, as in the picture below. To survive, the player has to collect the supernatural crystals along the path, which replenish the timer for the ghost's existence.
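As a rough illustration of that mechanic, here is a hedged sketch of the existence timer in Unity C#. The component name, the tag on the crystals, and the numbers are all assumptions made for the example, not the values from the shipped game.

```csharp
using UnityEngine;

// Hypothetical component on the ghost: counts down its remaining existence and
// refills the timer when a crystal is collected.
public class GhostExistenceTimer : MonoBehaviour
{
    public float maxExistenceTime = 30f;  // placeholder: seconds before the ghost fades out
    public float crystalRefill = 10f;     // placeholder: seconds restored per crystal

    float timeLeft;

    void Start()
    {
        timeLeft = maxExistenceTime;
    }

    void Update()
    {
        timeLeft -= Time.deltaTime;
        if (timeLeft <= 0f)
        {
            // The ghost has faded out of existence; the real game plays a fade
            // and restarts the level here.
            Debug.Log("Ghost faded out of existence");
            enabled = false;
        }
    }

    void OnTriggerEnter(Collider other)
    {
        // Crystals are assumed to be trigger colliders tagged "Crystal".
        if (other.CompareTag("Crystal"))
        {
            timeLeft = Mathf.Min(timeLeft + crystalRefill, maxExistenceTime);
            Destroy(other.gameObject);
        }
    }
}
```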


However, the above describes just the base gameplay, which gives the player a clear objective. On top of this, augmented reality adds more complexity and possibilities through the smartphone's camera. The camera can be used to generate platforms in the game by pointing it at specific images (using the Vuforia plugin for Unity); as soon as an image moves out of the camera's view, its platform disappears.
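Here is a minimal sketch of that behaviour, assuming the older Vuforia ITrackableEventHandler callback API from the Unity plugin (newer Vuforia releases expose the same idea through ObserverBehaviour events). The component sits on the image target, and the platform object is assumed to be assigned in the Inspector.

```csharp
using UnityEngine;
using Vuforia;

// Sits on an image target; shows its platform only while the image is in view.
public class PlatformOnImageTarget : MonoBehaviour, ITrackableEventHandler
{
    public GameObject platform;  // the platform this image spawns, assigned in the Inspector

    void Start()
    {
        var trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);

        platform.SetActive(false);  // hidden until the camera sees the image
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        // DETECTED / TRACKED mean the camera can currently see the image, so the
        // platform exists; any other status hides it again immediately.
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED;

        platform.SetActive(visible);
    }
}
```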

The images used for generating platforms in game are pasted over the six sides of a physical cube, which is held by another person acting as the player's companion. This creates a whole new game mechanic, since the player has to coordinate their jumps with the companion's rotation of the cube. Moreover, the levels are designed so that the cube has to be rotated during the jump in order to switch between platforms.

The platforms generated by different faces of the cube also give rise to multiple paths, which makes the player curious to replay a level, look for alternate routes, and find out what else is hidden there. After all, there is an Easter egg to find!

Check out the demo video below. Contact me here to know more about this project.