RRCP Final

 

RRCP Final Post: Caroline, Soheyla, Olivia

 

We used virtual reality technology to design a cruise ship room. Our goal was to find out: does being in an immersive environment change people’s preferences for room color?

We used the Discovery System to rapidly prototype colors in the room and ran the experiment on 40 people. When participants first arrived at the study, we asked them to fill out a brief questionnaire asking about gender, age, and visual impairments (such as color blindness). They were then presented with a grid of paint swatches consisting of 20 colors, and we asked which color they would most likely paint a bedroom, according to their color preference.

They were then placed in the Discovery System room, to test whether their color preference changed when immersed in a virtual reality environment. We asked each participant to sit in a chair in the middle of the Discovery System monitors and to wear 3D glasses. We gave them a PlayStation controller and demonstrated that the D-pad buttons moved them through our room grid: left and right moved to rooms of different colors, while up and down moved to different saturations of those colors. The grid consisted of 20 different rooms, with 5 total colors and four saturations of each color. What we hoped to find was that people’s color preferences would change when immersed in a more realistic setting, versus relying solely on a paint swatch to choose a room color.

We found that participants’ color preferences did change from what they had selected from the color swatches. Out of the 40 participants in our study, only 15% chose the exact same color in the virtual reality system as they had using swatches. 27.5% chose the same color (red, blue, green, purple, or yellow) but a different saturation of that color. In contrast, 32.5% chose a different color entirely, but the same level of saturation they had originally chosen in the paint swatch. And lastly, 25% of participants chose an entirely different color at an entirely different saturation. Our data tell us that being immersed in a virtual reality system is actually helpful in choosing paint selections for interiors.
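To make the navigation concrete, here is a minimal C++ sketch of the grid logic. The names, index layout, and wrap-around behavior are our illustration only, not the actual Discovery System code:

    #include <cstdio>

    // Hypothetical sketch: 5 colors x 4 saturations = 20 rooms.
    // Left/right on the D-pad changes color; up/down changes saturation.
    const int NUM_COLORS = 5;       // red, blue, green, purple, yellow
    const int NUM_SATURATIONS = 4;

    struct RoomGrid {
        int color = 0;       // current color column (0..4)
        int saturation = 0;  // current saturation row (0..3)

        // Wrapping at the edges is an assumption for this sketch.
        void dpadLeft()  { color = (color + NUM_COLORS - 1) % NUM_COLORS; }
        void dpadRight() { color = (color + 1) % NUM_COLORS; }
        void dpadUp()    { saturation = (saturation + NUM_SATURATIONS - 1) % NUM_SATURATIONS; }
        void dpadDown()  { saturation = (saturation + 1) % NUM_SATURATIONS; }

        // Index of the room model to display (0..19).
        int roomIndex() const { return color * NUM_SATURATIONS + saturation; }
    };

    int main() {
        RoomGrid grid;
        grid.dpadRight();  // move to the second color
        grid.dpadDown();   // move to its second saturation
        std::printf("room %d\n", grid.roomIndex());  // prints "room 5"
        return 0;
    }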

We all did our best to bring our individual skill sets to the project. As a group and as individuals, we worked on different tasks:

 

Caroline was key in helping jump-start our project by creating a shared Google Drive where we could organize our information, make checklists of tasks, and add helpful information as we worked through the project. In addition to becoming IRB certified, she was fundamental in writing the IRB application. She was a great team member: she contributed to troubleshooting our SketchUp model and helped figure out light baking in 3DS Max.

 

Soheyla did great research into what had previously been studied around this topic. She enlisted the help of a professor in the field and met with them to figure out the best way to design our experiment. With prior knowledge of how to use 3DS Max for lighting, Soheyla was helpful in lighting our rooms, and she learned how to light-bake, making sure that our rooms looked more realistic to the viewer.

 

Olivia worked on modeling our rooms. She learned how to use SketchUp and constructed 20 rooms with furniture. She chose 20 colors by collecting color swatches and used those to color each room. Although a glitch prevented her from being added to the IRB protocol, she went through the IRB course, learning the appropriate ways to conduct a study.

 

As a group, we worked with each other across all areas of the project, meeting on several occasions to work toward creating our experiment.

We’re generally happy with the outcome of the experiment. We believe the result was really interesting, and it matched what we expected. Developing the experiment took longer than we had thought, and we wish we had been able to get into the Discovery System earlier to work out the kinks in our environment (e.g., floating furniture). Overall, we accomplished what we set out to do, so it’s peachy.

Our biggest hurdles came from being unfamiliar with the design tools we had to use to create our study, such as SketchUp and 3DS Max. We had also originally wanted to be in the CAVE, but scheduling difficulties delayed our schedule. Obtaining IRB certification was also a lengthy process, along with writing and submitting the application. We also played around with the Emotiv system, which turned out to be not as helpful to our experiment as we had hoped. Light baking took some time, as there was a learning curve in figuring out which lighting would work best and how to bake it. These, along with other software and coding bugs, proved to be our biggest difficulties.

We feel like we met our project goals to a T. We just wish we had been able to meet them in a tighter time frame, instead of the lengthy, drawn-out process our project became. Our deadlines were not always met.

If we had more time, we would do… a lot of things. We would work more on the programming side of the experiment, making sure that our space didn’t look like zero gravity (i.e., no floating furniture) and that everything lined up correctly. We would also like visual indicators in the system for when participants were not “in bounds” of our rooms. We would like to experiment more with color, and potentially wallpaper. And of course, testing more participants would be interesting.

Overall, we feel our experiment shows great potential for color prototyping for interior spaces.

Team VEX Final

1. What is our project and how does it work:
We created three Garry’s Mod gamemodes: a Hidden in Plain Sight mode, a Zombie Survival mode, and a Team Fight mode. All of the modes are written in Lua using Garry’s Mod functionality.

Our game modes can be viewed at: http://youtu.be/1qPHmMA1EAc

Hidden in Plain Sight:
There are two possible goals here. One is to find the other player and kill them; the catch is that the other player looks just like the other NPCs in the map, so finding them is a challenge. The other goal is for one of the players to try to kill all of the NPCs while the other player determines who the murderer is.

Zombie Survival mode:
There are two different goals here as well. In one map, you are simply trying to kill all of the zombies. In the other map, the one with a lighthouse, you are trying to escape the zombies by getting to the base at the top of the lighthouse. By reaching this base you summon a helicopter that is supposed to take you away. In this game mode, the zombies drop weapons, ammo, and health items that help the players kill more zombies.

Team Fight mode:
Here the players use NPCs as their own personal armies. The NPCs start at opposite ends of the map, and each side starts as part of a player’s army. You can convert NPCs by shooting them with the pistol, which means you can steal NPCs away from the other player. There isn’t a set goal here, but it is a fun mode to play.

2. Each Team member’s role and contributions:
We all worked on pretty much everything; there were no set roles. Everyone had to set up the TriDef software to run Garry’s Mod.

Cory: Cory did extensive research on Garry’s Mod and gamemode creation. He tested Oculus Rift compatibility with Garry’s Mod, eventually settling on the TriDef 3D Ignition software to use the Rift. Cory worked on various features for playing the game in multiplayer and for letting players and NPCs use weapons. He spent a lot of time programming NPC behavior, including making them act randomly, react to getting shot at, and making the NPCs and players look like each other. He also did a lot of work on minor issues with Garry’s Mod, like how to remove the info about a player that is shown when you target them. Cory focused on the Hidden in Plain Sight and Team Fight modes.

Jacob: Jacob came up with most of the different game modes that we decided between. He did research on Portal maps and Unity programming to help decide which platform we should program in, which included creating Portal maps to test, creating portals in Unity, and setting up a Unity example of an online FPS. He also did a lot of research into creating portal zones in Garry’s Mod, but we did not end up using those. Jacob also worked on NPC interactions, including having the NPCs drop items and having the NPCs like you. He did a lot of work on additional maps for the zombie mode and on its overall balance. Jacob focused mainly on the Zombie mode, but helped with all three.

Giang: Giang also did research on Portal maps as an option for creating our levels. He was able to create maps for Portal, but we did not end up using these. He looked into gamemode creation like Cory and Jacob did, and he looked into how to create a base system to use for goals in the zombie game mode; this was used for one of the maps in that mode. He also obtained another computer that lets us demo. Giang focused mainly on the Zombie mode, but helped with the Hidden in Plain Sight mode.

3. How do we feel about the project:
We are pretty happy with the final product we have produced. We would have liked a few more things to be refined, but overall we are happy with what we have.

4. Largest hurdles:
We discovered that portals were difficult to implement in Garry’s Mod. Making transparent portals is next to impossible. We overcame this by removing the portal element from our gamemodes.

The Garry’s Mod documentation is pretty poor. We had to rely heavily on how other people wrote code to solve their issues, and a lot of the time we just had to test things and see what they did. The Facepunch forums proved very useful for answering some small questions.

NPC behavior is inconsistent. Some of the NPCs act as you expect them to, but several NPC features have been deprecated, and the documentation is not properly updated to reflect this. This resulted in us going around in circles for a while. Eventually we worked out all of the issues with the NPCs.

TriDef 3D Ignition is limited to Microsoft Windows operating systems, and we need this software to use the Oculus Rift with Garry’s Mod. We solved this by limiting the demos to Windows-only computers.

The class period is not long enough to accomplish much, especially since we do a stand-up beforehand. We were often left with only 20 or 30 minutes, which is not really enough time to get things done in class.

5. Did our project meet our original description and goals?
Our general goal was to have two people in an environment using the Oculus Rift, and we accomplished this goal. However, the details have changed significantly since the project began. Initially we were going to create a game where the two players chased each other, using portals to make obstacles. We moved away from this completely and made the gamemodes described above.

6. What else would we do if we had more time?
We would probably add some UI elements to make the goals of the game clearer. This includes a waiting-room system at the start of the game, allowing other players to connect and play without anything happening beforehand, as well as an automatic win condition and a way to reset the game back to its initial state.

We would add more maps to the gamemodes. Right now we only have a few different maps.

We would also probably try to create more customized NPCs to get better control over what each NPC does.

Music Visual Final Post

1. For our project we developed a music visualizer for the song ‘Hack’ by Sam Armus. The experience is immersive, taking advantage of virtual reality to surround the user with the visualizations. The program combines recorded video with a basic spectrum analyzer generated from any music file the user chooses. The video files are played using the video object code provided by Kevin, the music analysis is done using FMOD, and the visuals are generated from the spectrum data and drawn using OpenGL. The rest of the program is based on the default Discovery system code.
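As a rough sketch of the analysis step, this is how spectrum data can be pulled with the older FMOD Ex C++ API and its Channel::getSpectrum call; the file name and buffer size here are illustrative, and our actual code (adapted from FMOD tutorial code) differs:

    #include <fmod.hpp>   // FMOD Ex C++ API

    // Minimal sketch: play a music file and pull spectrum data each frame.
    // Error checking is omitted for brevity; "hack.mp3" is a placeholder name.
    static const int SPECTRUM_SIZE = 512;

    FMOD::System  *fmodSystem = nullptr;
    FMOD::Sound   *sound      = nullptr;
    FMOD::Channel *channel    = nullptr;
    float spectrum[SPECTRUM_SIZE];

    void initAudio() {
        FMOD::System_Create(&fmodSystem);
        fmodSystem->init(32, FMOD_INIT_NORMAL, nullptr);
        fmodSystem->createSound("hack.mp3", FMOD_SOFTWARE, nullptr, &sound);
        fmodSystem->playSound(FMOD_CHANNEL_FREE, sound, false, &channel);
    }

    void updateAudio() {
        fmodSystem->update();
        // Fill 'spectrum' with SPECTRUM_SIZE frequency-bin magnitudes (0.0-1.0)
        // for channel 0, using a Hanning window for the FFT.
        channel->getSpectrum(spectrum, SPECTRUM_SIZE, 0, FMOD_DSP_FFT_WINDOW_HANNING);
    }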

The project uses the Discovery system’s design to create a 180-degree viewing field for the visualization. In the center of the screen, columns of squares represent the different frequencies: the longer the column, the more prominent that frequency.
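A minimal sketch of how such columns can be drawn with immediate-mode OpenGL follows; the bar count, color, and scale factor are made-up values, and our real drawing code runs inside the Discovery system framework:

    #include <GL/gl.h>

    // Draw one quad per frequency bin; column height tracks bin magnitude.
    void drawSpectrumBars(const float *spectrum, int numBars) {
        const float barWidth = 2.0f / numBars;  // fill x in [-1, 1]
        glBegin(GL_QUADS);
        for (int i = 0; i < numBars; ++i) {
            float x = -1.0f + i * barWidth;
            float h = spectrum[i] * 10.0f;      // exaggerate quiet bins
            glColor3f(0.2f, 0.8f, 1.0f);
            glVertex2f(x, 0.0f);
            glVertex2f(x + barWidth * 0.8f, 0.0f);  // 20% gap between columns
            glVertex2f(x + barWidth * 0.8f, h);
            glVertex2f(x, h);
        }
        glEnd();
    }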

In addition to the spectrum, we created a custom video for the track we selected, making this project more specific to one song. The video is a compilation of live footage from shows at Moogfest 2014. We played around with the coloring of the video, inverting its color spectrum at certain points to create a more immersive environment for the user.

2. Simon’s role on the project was to code up the program, including the graphical elements. This involved learning the basics of OpenGL and adapting some FMOD tutorial code to work with the rest of our program. He also helped come up with concepts for what the program would do near the start of the project.

Tim mainly dealt with trying to find different aspects of the song to analyze. He found a potential program to use, and while it had a lot of options for us, in the end it wasn’t exactly what we were looking for. Beyond this, he helped brainstorm ideas for the visualization at the beginning of the project.

Chelsi took live footage and compiled it into the background video for the visualizer. The video was created with iMovie, using the filters and transitions within the program to create something that captured the feeling we were trying to get across. The video is approximately 7:30 long and is separate from the spectrum and music tracking; it loops at the end, allowing the user to experience the visualizer for as long as they’d like.

3. We are happy with how the project turned out. Many elements of its design turned out to be more complicated than we had hoped, but we had planned for this and had many simpler ideas to fall back on. OpenGL and the music analysis were the trickiest aspects to implement successfully. We think the end result was not too tricky to design, but it is still effective in achieving our original goal: making a cool virtual reality music visualizer.

In addition, we think it’s great that we got to incorporate multiple aspects within the visualizer, including the song, the spectrum, and the video. We are extremely happy with how the video turned out, because that was something we were very unsure of at the start. Overall, the final product exceeded all of our expectations and we are proud of the end result.

4. Learning how to code with OpenGL was tricky at first, especially because it was hard to test the code at home. We couldn’t find a good C++ development environment to use on our home computers, and even with one that works, it’s necessary to test the code in the Discovery system itself to know whether it is effective. The music analysis also took a long time to figure out, but we found an effective method by the end of the project.

We also had a very difficult time trying to understand Sonic Visualiser, the program we were originally going to use to analyze the music. We think it is designed for something beyond what we needed; it had a lot of really interesting features that could have been implemented, but the program is aimed at someone with more background in this area, and FMOD worked very well as a replacement.

5. The final project did not have as much interactivity as the initial idea, but otherwise we think it remained relatively faithful. We did not have enough time or knowledge to implement music that changes as the user interacts with the virtual scene; additionally, without the project files for the music we used, this would have been even more difficult.

One addition we would have liked is letting the user move around in the space, with different locations causing different parts of the song to become more prominent. We would also have liked a greater variety of shapes within the spectrum, but we are happy to at least have color responses within it.

We also had some issues at the beginning with how to start off: none of us had ever worked on a project like this, and we were hesitant to begin without knowing much about the process. In the end we were all able to find a part in the project and put everything together successfully.

6. With more time, we think making the environment more interactive would be exciting. Additionally, making more types of visualizers and including more kinds of recorded video and music would give the program much more variety. We would also have liked to use some of the information we could get from Sonic Visualiser, but that would change the way the visualizer is set up significantly. Another plan would be adding more interactivity, with audio changes resulting from the user’s movement; for example, some change could happen as a result of a gesture or hand wave.

Overall, having more time to pay attention to details and perfect the visualizer would have been beneficial. We built our schedule around creating a simple version and adapting it as time allowed, so there is still the possibility to improve even after our final presentation.

Final Class Monday

We will have our final class on Monday 5/12/14, from 5:05 to 7:05 PM at 1125 Nancy Nicholas Hall. Remember, you need one final post and a short presentation (which your post can double as) due for class. Details can be found here: https://blogs.discovery.wisc.edu/designinvr-14/2014/05/01/about-the-end-of-the-semester/

We will split the time into 30-minute windows as follows:

5:15-5:45  Music Visual Team

5:45-6:15 Rapid Room Color Prototyping

6:15-6:45 Team VEX

Feel free to invite friends and family for the demonstrations.  Also, if you would like snack items, please comment on https://blogs.discovery.wisc.edu/designinvr-14/2014/05/07/food-for-monday/

 

Vex 5/9

[Image: VR 5/9 screenshot]

This week:

Cory – Made the player Alyx model look more like the NPC model, and possibly made it so that you do not see the names of players when you put the cursor on them, but that needs testing.

Giang –

Jacob – Downloaded and played with additional maps. Played around with the balance of the zombie gamemode.

Accomplishments:

We worked together to test multiplayer in class Wednesday, and it worked well.

Problems:

Putting the gun sights over a character will reveal whether the character is an NPC or a person.

Also, making an NPC like you does not seem to mean that it will defend you. This may affect the Player vs. Player Armies mode.

Still on schedule?

We plan on testing multiplayer with the Rift in class today. If that works, then we are on time; if it does not, then we are behind schedule.

Plans for next week:

Group – Get the Rift system working on multiple computers

Cory – Look into Team creation for the Player vs Player Armies mode. Currently, every player and NPC is independent of each other.

Jacob – Create maps for the different game modes. This means setting the spawn points for the zombies for a specific map. Make sure the TriDef 3D Ignition software works correctly on his computer.

Giang – Make sure the TriDef 3D Ignition software works correctly on his computer.

RRCP Last Week

[Images: Light Baking, Light Baking 2]

Accomplishments:

This week we did a lot of work:

– Olivia and Caroline checked that the system can run the file

– Caroline tested the light baking

– Soheyla and Olivia have been added to the IRB

– Soheyla tested the light baking

Problems:

– We tested the light baking to see how long it would take for all of the rooms. Based on our test, we anticipate it will take several hours

– We experienced some problems with file rendering. We weren’t able to save the file in the OSGB format as the tutorial wanted. We need to confirm that our file format works properly

Next Plan:

– Olivia will finalize the rooms in SketchUp

– Soheyla will do the final light baking

– Caroline will submit the IRB

 

Music Team week of 5/8

[Image: preview]

This picture is part of the video that plays in the background of the music visualizer.

Chelsi: For this week I compiled live video footage to run behind our visualizer. I also researched stereo systems and wireless headphones for the system.

Simon: This week I refined the spectrum analyzer, added beat detection, and combined the video player code with our project code (which mostly works).
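Beat detection can be as simple as watching for spikes in low-frequency energy relative to a running average; here is a hypothetical C++ sketch over FMOD-style spectrum bins (the bin range, threshold, and smoothing factor are made-up values, not our exact tuning):

    // Hypothetical energy-based beat detector over the low-frequency bins.
    bool detectBeat(const float *spectrum, int numBins) {
        static float average = 0.0f;   // running average of bass energy

        // Sum the lowest ~8 bins, where kick drums usually live.
        float bass = 0.0f;
        for (int i = 0; i < 8 && i < numBins; ++i)
            bass += spectrum[i];

        // A beat is a sudden spike well above the recent average.
        bool beat = (bass > average * 1.5f) && (bass > 0.1f);

        // An exponential moving average keeps the baseline adaptive.
        average = 0.9f * average + 0.1f * bass;
        return beat;
    }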

Timmy: This week I looked into some more information about textures, but we decided that we didn’t need anything more complicated than what we had. I also helped with demoing the spectrum visualizer and brainstorming ideas for the visualization.

We had a few problems getting the video and the music visualizer to run at the same time. Currently the program freezes up, but the audio continues to play.

We are currently close to on schedule, probably a workday or two behind at most. For Monday, our plan is to get everything running smoothly and to ensure the video, audio, and music visualizer are properly synced up.

Food for Monday

Post comments with suggestions for food items you would like for Monday’s presentations. No guarantees that I will be able to get all of the items, but I will see what I can do.

Course evals

As our class is not overly large, it is extremely important that everyone finishes their course evaluations. We will count this as an assignment, so you will get some credit for completing it. Once you have finished your course evaluation, comment on this post with “complete” and I will mark you off the list.