Juan-Fernando Morales (juprmora)
Qingyang Sun (qsun20)
Rohan Nikhil Venkatapuram (rnvenkat)
Project 1B Writeup: Integrating Virtual Reality
Project Description
The goal of this project was to create a VR experience out of the Three.js projects we built in CSE 160. This was done in two phases: the first (done in part 1A) turned each project into a virtual reality space, and the second (done here) focused on adding features that let you explore and interact with that space.
Goals
Our plan was for each of us to individually add a shared set of features to our own project. Since our projects had fairly different code structures, we felt this was a solid approach. The primary feature we wanted to add was the ability to teleport around the virtual reality environment. This was crucial because all of our projects had spaces larger than a typical boundary-defined safe play area, so fully experiencing the scenes required this extra movement. An additional stretch feature was to add object interaction using the controllers.
Results
As we had three different projects, this section will be split into results by project.
Juan (juprmora)
My part of the VR project focused on implementing teleportation. To do that, I had to figure out how to find the intersection between a ray and the hitbox of an object. To keep things simple, I only allowed teleportation to the floor. My implementation adds a controller object to the three.js scene, with event listeners for the start and end of a trigger event. When the trigger is released, a ray is cast, and I only have to check whether it hits the floor; if it does, I translate and rotate the player matrix to that point. If I were to rewrite this assignment, I would decouple it from the render function. Right now the render function writes to a global variable every frame, updating the point intersected by a ray cast from each controller, and my trigger event handler then reads that variable to tell the game where to move the camera. This is serviceable, but I think it would be cleaner to chain functions directly from the button press and avoid relying on a global variable.
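A minimal sketch of this flow, modeled on the three.js webxr_vr_teleport example (names like floor, baseReferenceSpace, and the INTERSECTION global are placeholders, and renderer is assumed to be the WebXR-enabled WebGLRenderer, so this is not the exact project code):

    import * as THREE from 'three';

    let INTERSECTION = null;             // global that the render loop writes to
    const raycaster = new THREE.Raycaster();
    const tempMatrix = new THREE.Matrix4();

    // Called every frame from render(): cast a ray out of the controller
    // and remember where (if anywhere) it hits the floor.
    function updateIntersection(controller, floor) {
      tempMatrix.identity().extractRotation(controller.matrixWorld);
      raycaster.ray.origin.setFromMatrixPosition(controller.matrixWorld);
      raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
      const hits = raycaster.intersectObject(floor);
      INTERSECTION = hits.length > 0 ? hits[0].point : null;
    }

    // Trigger-release handler: move the player by offsetting the XR
    // reference space so the camera lands on the saved intersection point.
    function onSelectEnd() {
      if (!INTERSECTION) return;
      const offset = { x: -INTERSECTION.x, y: -INTERSECTION.y, z: -INTERSECTION.z, w: 1 };
      const transform = new XRRigidTransform(offset, { x: 0, y: 0, z: 0, w: 1 });
      renderer.xr.setReferenceSpace(
        baseReferenceSpace.getOffsetReferenceSpace(transform)
      );
    }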
Henry/Qingyang (qsun20)
My Project 1B work consisted of converting my CSE 160 project into VR and adding teleportation and cube-generation features, covering 3D scenes and models, dynamics, navigation, and interaction. In this world, players can move using the controllers and can also interact with the world through their hands.
The teleport feature creates two controller objects (controller1 and controller2) for interaction in the VR environment. Teleporting involves casting a ray, computing the intersection point between the ray and the ground, and moving the camera to that position. The onSelectStart() and onSelectEnd() functions handle the trigger start and end events: the select-start event fires when the user presses the controller button in VR, and the select-end event fires when the button is released. In the select-end handler, if there is an intersection, an offset is computed and a new reference space is set to perform the teleport. XRControllerModelFactory is used to render the controller models, and the teleport action is bound to the trigger on both controllers.
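A sketch of what this setup could look like, assuming renderer and scene are in scope and using placeholder handler names (the exact wiring in my project may differ):

    import { XRControllerModelFactory } from 'three/addons/webxr/XRControllerModelFactory.js';

    const controllerModelFactory = new XRControllerModelFactory();

    function setupController(index) {
      const controller = renderer.xr.getController(index);
      controller.addEventListener('selectstart', onSelectStart); // trigger pressed
      controller.addEventListener('selectend', onSelectEnd);     // trigger released
      scene.add(controller);

      // The grip space carries the rendered controller model.
      const grip = renderer.xr.getControllerGrip(index);
      grip.add(controllerModelFactory.createControllerModel(grip));
      scene.add(grip);
      return controller;
    }

    const controller1 = setupController(0);
    const controller2 = setupController(1);

    function onSelectStart() {
      this.userData.isSelecting = true;  // tells the render loop to start raycasting
    }

    function onSelectEnd() {
      this.userData.isSelecting = false;
      // If the ray currently hits the ground, compute the offset and set a
      // new reference space, as in the earlier teleport sketch.
    }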
The hand-generation feature is divided into two parts. Pinching the right index finger and thumb generates a cube at the fingertip, while the left hand can "grab" a generated cube when a pinch is detected between the left index finger and thumb. The collideObject() function detects whether the fingertip intersects a cube. These two features of Project 1B lay the foundation for our final project, since we will need a way to manipulate point cloud data.
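A simplified sketch of the pinch logic, using the three.js hand-input API (renderer.xr.getHand, OculusHandModel, and the 'pinchstart' event); the variable names, cube size, and which hand ends up at index 0 are assumptions:

    import * as THREE from 'three';
    import { OculusHandModel } from 'three/addons/webxr/OculusHandModel.js';

    // Assumed setup: renderer and scene already exist; the second hand is
    // created the same way with renderer.xr.getHand(1).
    const hand0 = renderer.xr.getHand(0);
    hand0.add(new OculusHandModel(hand0));
    scene.add(hand0);

    const cubes = [];
    const tmpA = new THREE.Vector3();
    const tmpB = new THREE.Vector3();

    // Right-hand pinch: spawn a cube at the index fingertip.
    function onPinchStartRight(event) {
      const tip = event.target.joints['index-finger-tip'];
      if (!tip) return;
      const cube = new THREE.Mesh(
        new THREE.BoxGeometry(0.05, 0.05, 0.05),
        new THREE.MeshStandardMaterial({ color: 0xffffff })
      );
      cube.geometry.computeBoundingSphere();
      tip.getWorldPosition(cube.position);
      scene.add(cube);
      cubes.push(cube);
    }

    // Distance test between a fingertip joint and each spawned cube.
    function collideObject(indexTip) {
      for (const cube of cubes) {
        const distance = indexTip.getWorldPosition(tmpA)
          .distanceTo(cube.getWorldPosition(tmpB));
        if (distance < cube.geometry.boundingSphere.radius * cube.scale.x) {
          return cube;
        }
      }
      return null;
    }

    // Left-hand pinch: grab whichever cube the fingertip is touching.
    function onPinchStartLeft(event) {
      const tip = event.target.joints['index-finger-tip'];
      const grabbed = tip && collideObject(tip);
      if (grabbed) tip.attach(grabbed); // cube now follows the fingertip
    }

    hand0.addEventListener('pinchstart', onPinchStartRight);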
Rohan (rnvenkat)
Similar to Henry, I wanted to add both teleportation and object interaction. Before doing that, however, I had to integrate controller support. Using the three.js examples as a reference, I added both the controllers and the hand models. I then changed the controller listener functions defined in the controller example to instead apply an offset to the reference space the player is in. The new listeners can be found in the “#setupControllerEvents” function in “VRSystem.js”, the file that contains all of the VR-related code for the project. The offset is determined in two steps. First, while the trigger on the controller is held down, a ray is cast from the controller to the ground; if the ray hits the ground, a marker appears showing where the user will teleport to. Upon release, the base reference space is offset by the difference between the player’s position and the point where the ray intersects the ground.
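A minimal sketch of the marker step (function and variable names are placeholders, not the actual VRSystem.js code), reusing the raycaster and tempMatrix pattern from the earlier teleport sketch:

    // Run every frame while the trigger is held: show the marker at the
    // spot the player would teleport to.
    function updateTeleportMarker(controller, ground, marker) {
      if (!controller.userData.isSelecting) {
        marker.visible = false;
        return;
      }
      tempMatrix.identity().extractRotation(controller.matrixWorld);
      raycaster.ray.origin.setFromMatrixPosition(controller.matrixWorld);
      raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
      const hits = raycaster.intersectObject(ground);
      marker.visible = hits.length > 0;
      if (marker.visible) marker.position.copy(hits[0].point); // preview spot
    }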
The second feature, object interaction, was implemented differently from Henry's. Rather than creating new objects, I wanted to manipulate objects already in my scene, namely the rocks. Similar raycasting strategies were used to determine which rock the controller was pointing at, this time with a distance limit so that only nearby rocks appear “grabbable”. When you are close enough to grab a rock (indicated by a red highlight on the rock), you can press the trigger to grab and move it. This is also handled in “#setupControllerEvents”, specifically in “controllerOneOnSelectStart” (the other controller handles teleportation). I originally planned to add physics to these rocks using the Ammo.js library, allowing them to be thrown around, but that feature ended up being dropped.
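A hypothetical sketch of the highlight-and-grab logic (the real version lives in “#setupControllerEvents”); GRAB_DISTANCE, the rocks array, and the emissive red highlight are assumptions, and the rocks are assumed to use a material with an emissive channel such as MeshStandardMaterial:

    const GRAB_DISTANCE = 2; // meters; the actual limit is a tuning choice

    // Per-frame: highlight the nearest rock within reach so it reads as grabbable.
    function updateRockHighlight(controller) {
      tempMatrix.identity().extractRotation(controller.matrixWorld);
      raycaster.ray.origin.setFromMatrixPosition(controller.matrixWorld);
      raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
      rocks.forEach((rock) => rock.material.emissive.setHex(0x000000));
      const hits = raycaster.intersectObjects(rocks)
        .filter((hit) => hit.distance < GRAB_DISTANCE);
      if (hits.length > 0) hits[0].object.material.emissive.setHex(0xff0000);
    }

    // Trigger on controller one: pick up the highlighted rock.
    function controllerOneOnSelectStart() {
      tempMatrix.identity().extractRotation(this.matrixWorld);
      raycaster.ray.origin.setFromMatrixPosition(this.matrixWorld);
      raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
      const hits = raycaster.intersectObjects(rocks)
        .filter((hit) => hit.distance < GRAB_DISTANCE);
      if (hits.length > 0) {
        this.userData.held = hits[0].object;
        this.attach(this.userData.held); // rock now follows the controller
      }
    }

    function controllerOneOnSelectEnd() {
      if (this.userData.held) {
        scene.attach(this.userData.held); // drop it back into the scene
        this.userData.held = null;
      }
    }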