This week we continued working on Oz VR. As an animator, I figured there were not yet many opportunities to work on character animation, since we are still in the early stages of environment and character development. Because I also have an interest in Houdini, I decided to take up a small side project that might be usable in my team's section of the game.
As seen above, I started creating some animated growing vines using the tutorial posted below. If I can get these working correctly, I think they would look nice animating along the path of the Lion's section and would contribute to the eerie atmosphere. The vine shape is coming along nicely, but I am having issues manipulating the procedural workflow to randomly rotate the leaves, and I am also hoping to offset the animation so the vines do not all grow in sync. Next week I will continue working on this to make it look realistic.
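For the random leaf rotation and animation offset, the usual procedural trick is to seed a random value with each leaf's point or copy number, so every leaf gets its own stable rotation and start time that never flickers between frames. In Houdini this would normally live in a VEX wrangle or on the copy attributes, but the core idea can be sketched in plain Python (the function name and parameters here are illustrative, not any Houdini API):

```python
import random

def leaf_transform(leaf_id, base_time, max_angle=360.0, max_offset=0.5):
    """Deterministic per-leaf random rotation and animation time offset.

    Seeding the RNG with the leaf's id keeps the result stable across
    frames: each leaf gets its own yaw angle and a staggered start time,
    but re-evaluating the network never changes them.
    """
    rng = random.Random(leaf_id)           # stable seed per leaf
    angle = rng.uniform(0.0, max_angle)    # random rotation around the stem axis
    offset = rng.uniform(0.0, max_offset)  # stagger the growth animation
    # Shift this leaf's local animation time back by its offset,
    # clamped so it never goes negative before the animation starts.
    return angle, max(0.0, base_time - offset)

# The same leaf id always yields the same transform:
a1, t1 = leaf_transform(7, base_time=1.0)
a2, t2 = leaf_transform(7, base_time=1.0)
assert (a1, t1) == (a2, t2)
```

The same seeded-by-id pattern should translate directly to a point wrangle, with the random values driving the copied leaf's orientation and the offset subtracted from the growth parameter.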
I also started researching the Perception Neuron MOCAP system and its potential uses in our VR project. I attended the PN demo held in the MOCAP studio and learned how the equipment works, how it is set up, and how the system is calibrated.
The system detects motion using lightweight sensors that are barely 12mm square. These sensors capture your gestures and body movements and reflect them in the virtual world for an immersive experience. The makers of the motion capture suit boast that the sensors can be applied in an unlimited number of combinations, and sensors can also be removed if you want to capture less data.
An interesting thing I discovered about the PN is that multiple people can use it at the same time, which opens up some amazing collaboration and interaction possibilities. This makes me wonder whether we could start thinking about incorporating a multiplayer element into our Oz experience. We have discussed using puzzle mechanics to travel from one section to the next... perhaps some of them could require two people to complete, a sort of team-based situation.
Another issue we are encountering with this experience is travel and the break in immersion that teleporting causes. Since the PN system is wireless, I wonder whether the worlds could be designed to map onto a real-life space, so that players could actually walk through the entirety of our experience. Another way around this would be to bring in some type of treadmill for the final presentation of our piece; if the roads in the game are kept relatively straight, I feel that players could still get the proper experience while walking in a straight line.