RRCP Final

 

RRCP Final Post: Caroline, Soheyla, Olivia

 

We used virtual reality technology to design a cruise ship room. Our goal was to find out: does being in an immersive environment change people’s preferences for room color?

We used the Discovery System to rapidly prototype colors in the room and ran the experiment on 40 people. When participants first arrived for the study, we asked them to fill out a brief questionnaire asking about gender, age, and visual impairments (e.g., color blindness). They were then presented with a grid of paint swatches consisting of 20 colors, and we asked which color they would most likely paint a bedroom according to their color preference. They were then placed in a room with the Discovery System to test whether their color preference changed when immersed in a virtual reality environment.

In the Discovery room, we asked each participant to sit in a chair in the middle of the Discovery System monitors and to wear 3D glasses. We gave them a PlayStation controller and demonstrated that the D-pad buttons moved them through our room grid: left and right moved to rooms of different colors, while up and down moved to different saturations of those colors. The grid consisted of 20 different rooms: five colors, each at four saturations.

We hypothesized that people’s color preferences would change when immersed in a more realistic setting versus relying solely on a paint swatch to choose a room color. We found that participants’ color preferences did change from what they had selected from the color swatches. Out of the 40 participants in our study, only 15% chose the exact same color in the virtual reality system as they had using swatches. 27.5% chose the same color (red, blue, green, purple, or yellow) but a different saturation of that color. In contrast, 32.5% chose a different color entirely, but at the same level of saturation they had originally chosen from the paint swatches. And lastly, 25% of participants chose an entirely different color at an entirely different saturation.
Our data tells us that being immersed in a virtual reality system is actually helpful in choosing paint selections for interiors.
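The D-pad navigation through the 5-color by 4-saturation grid can be sketched as a tiny state machine. This is a hypothetical reconstruction for illustration, not our actual Discovery System code; the color names come from the study, but the saturation values are assumed placeholders.

```python
# Hypothetical sketch of the room-grid navigation described above:
# 5 colors (left/right) x 4 saturations (up/down) = 20 rooms.
COLORS = ["red", "blue", "green", "purple", "yellow"]
SATURATIONS = [0.25, 0.5, 0.75, 1.0]  # assumed saturation levels

class RoomGrid:
    def __init__(self):
        self.color_idx = 0
        self.sat_idx = 0

    def press(self, button):
        # Left/right cycle through colors; up/down step through saturations.
        if button == "left":
            self.color_idx = (self.color_idx - 1) % len(COLORS)
        elif button == "right":
            self.color_idx = (self.color_idx + 1) % len(COLORS)
        elif button == "up":
            self.sat_idx = min(self.sat_idx + 1, len(SATURATIONS) - 1)
        elif button == "down":
            self.sat_idx = max(self.sat_idx - 1, 0)

    @property
    def room(self):
        """The (color, saturation) pair currently shown to the participant."""
        return COLORS[self.color_idx], SATURATIONS[self.sat_idx]
```

Whether colors wrap around at the edges (as in this sketch) or stop is a design choice the post doesn’t specify.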

We all did our best to bring our individual skill sets to the project. As a group and as individuals, we worked on different tasks:

 

Caroline was key in jump-starting our project by creating a shared Google Drive in which we could organize our information, make checklists of tasks, and add helpful information as we worked through the project. In addition to becoming IRB certified, she was fundamental in writing the IRB application. She was a great team member, contributing to troubleshooting our SketchUp model and helping figure out light baking in 3DS Max.

 

Soheyla researched what had previously been studied on this topic. She enlisted the help of a current professor in the field and met with them to figure out the best way to design our experiment. With prior knowledge of how to use 3DS Max for lighting, Soheyla was helpful in lighting our rooms, and she learned how to light-bake, making sure that our rooms looked more realistic to the viewer.

 

Olivia worked on modeling our rooms. She learned how to use SketchUp and constructed 20 rooms with furniture. She chose 20 colors by collecting color swatches and used those to color each room. Although an issue prevented her from being added to the IRB protocol, she went through the IRB course, learning the appropriate ways to conduct a study.

 

As a group, we worked with each other across all areas of the project, meeting on different occasions to work toward creating our experiment.

We’re generally happy with the outcome of the experiment. We believe the result was really interesting, and it matched what we expected. Developing the experiment took longer than we had thought, and we wish we had been able to get into the Discovery System earlier to work out the kinks in our environment (e.g., floating furniture). Overall, we accomplished what we set out to do, so it’s peachy.

Our biggest hurdles were being unfamiliar with the design tools we had to use to create our study, such as SketchUp and 3DS Max. We had also originally wanted to be in the CAVE, but scheduling difficulties delayed us. Obtaining IRB certification was also a lengthy process, along with writing and submitting the application. We also played around with the Emotiv system, which turned out to be less helpful to our experiment than we had hoped. Light baking took some time, as there was a learning curve in figuring out which lighting would work best and how to bake it. These, along with other software and coding bugs, proved to be our biggest difficulties.

We feel like we met our project goals to a T. We just wish we were able to meet them in a tighter time frame, instead of the lengthy, drawn out process that was our project. Our goals were not always met when it came to deadlines.

If we had more time, we would do…a lot of things. We would work more on the programming side of the experiment, making sure our space didn’t look zero-gravity (i.e., that the furniture wasn’t floating and was lined up correctly). We would also like visual indicators in the system for when participants were not “in bounds” of our rooms. We would like to experiment more with color, and potentially wallpaper, if we had more time. And of course, more participants would be interesting to test.

Overall, we feel our experiment shows great potential for color prototyping of interior spaces.

RRCP Last Week

[Images: Light Baking, Light Baking 2]

Accomplishments:

We got a lot of work done this week.

– Olivia and Caroline checked that the system can run the file

– Caroline tested the light baking

– Soheyla and Olivia have been added to the IRB

– Soheyla tested the light baking

Problems:

– We tested the light baking to see how long it would take for all the rooms; based on our test, we anticipate it will take several hours

– We experienced some problems with file rendering. We weren’t able to save the file in OSGB format as the tutorial instructed. We need to confirm that our file format works properly

Next Plan:

– Olivia will finalize the rooms in SketchUp

– Soheyla will do the final light baking

– Caroline will submit the IRB

 

RRCP 4/25

[Images: RRPCFri2b, RRPCFri3b]

Rapid Room Color Prototyping 4/25

Caroline: Worked collaboratively in 3DS Max to implement lighting in the room. Was instrumental in pushing forward the IRB application, which is on its way to being completed soon.

Soheyla: Used prior knowledge of 3DS Max to implement lighting into the room.

Olivia: Finished modeling the virtual room in SketchUp and exported it to 3DS Max for lighting. Tried to help with lighting.

Accomplishments: This week, we made great strides by solidifying the fundamentals of our experiment (i.e., room design and modeling). As a group, we worked together on the wording of the experiment in the IRB application and on incorporating lighting and color into the models.

Problems: The main thing we need to make sure of is that the rooms have realistic lighting with correct coloring; this will take some tweaking in 3DS Max to produce our different colored rooms at different saturations.

Schedule: We are feeling OK about where we are right now. There are elements of the experiment that definitely need to be finalized, but we know what those are and are working through the details.

Next week: Submit the IRB application, complete all of the room prototypes with lighting, complete the survey, and mock-test the experiment in all of the rooms.

Music Team 4/25

[Image: IMG_3684]

 

Simon: This week I coded some new shapes into the virtual environment and worked on different ways to change the colors and shapes. I also investigated some ways to use OpenAL in our program.

Tim: This week I have been trying to find a way to export data from a spectrogram of the song into a CSV. However, I haven’t been able to find anything in the current program that I am using, and I tried unsuccessfully to find another program for the job. Next week I plan on finding more information about Sonic Visualiser and other music analysis software.
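One fallback for the spectrogram-to-CSV export, if no existing program fits, is to compute a short-time Fourier transform directly and write the magnitudes out ourselves. This is a hedged sketch using NumPy and the standard library, not anything we actually tried; the window and hop sizes are assumed placeholders.

```python
import csv
import numpy as np

def spectrogram_to_csv(samples, sample_rate, out_path, win=1024, hop=512):
    """Write an STFT magnitude spectrogram to CSV, one row per time frame."""
    window = np.hanning(win)  # taper each frame to reduce spectral leakage
    freqs = np.fft.rfftfreq(win, 1.0 / sample_rate)
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s"] + ["%.1fHz" % fq for fq in freqs])
        for start in range(0, len(samples) - win + 1, hop):
            frame = samples[start:start + win] * window
            mags = np.abs(np.fft.rfft(frame))  # magnitude per frequency bin
            writer.writerow([start / sample_rate] + mags.tolist())
```

Each row is one time slice, so the CSV can then be scanned for whichever frequency bands drive the visuals.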

Chelsi: Chelsi is gone this week but is researching what type of sound system to get for our experiment. She also got some good background footage to use and is getting more this week.

Accomplishments this week: This week we implemented some more prototype environments and did color testing. Additionally, we tested some more music analysis features and decided on how we want to play recorded video in the environment.

Problems: The methods we have found for music analysis so far don’t really work. We still need to find a good program to use.

Schedule: We are behind schedule because the music features have yet to be implemented. All the other major parts of our program are mostly working.

Next week: Implement recorded video and some form of music analysis.

RRCP April 18

Photo of Joy Dohr.

This week we made great progress on finalizing our study design. Soheyla consulted with a research design expert and got good feedback on how we should set up the study. Caroline consulted with a color theory expert and got many resources to help us pick out our test colors. She also tested the Emotiv more and thinks it won’t fit the scope of this project. Olivia continued work on preparing the rooms in SketchUp.

Problems encountered included realizing we needed to switch from the CAVE to the Discovery System in order to get the requisite number of people, and needing to use 3DS Max for proper color saturation display.

The project is on schedule, but we have to move our schedule up in order to leverage the people at the fashion show.

This next week we will finish the room modeling, finalize the study design details, and hopefully test the system on ourselves.

 


Music Visual

[Image: graphics on screen]

This week:

Simon: Mostly I researched how best to implement timing for beat detection and similar features. I also did some more testing on what we already have in place. There are still some issues I’m not sure how to fix, like the 3D and the keyboard input, but I think the plan we have in place is on the right track.
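A common starting point for the beat-detection timing mentioned above is energy-based onset detection: flag any short window whose energy jumps well above the trailing average. This is a generic sketch under assumed parameters, not our actual visualizer code.

```python
import numpy as np

def detect_beats(samples, sample_rate, win=1024, history=43, threshold=1.5):
    """Return times (in seconds) of windows whose energy spikes above
    `threshold` times the average energy of the previous `history` windows."""
    n = len(samples) // win
    # Energy of each non-overlapping window of `win` samples.
    energies = np.array(
        [np.sum(samples[i * win:(i + 1) * win] ** 2) for i in range(n)]
    )
    beats = []
    for i in range(history, n):
        avg = energies[i - history:i].mean()
        if avg > 0 and energies[i] > threshold * avg:
            beats.append(i * win / sample_rate)
    return beats
```

The defaults (`history=43`, `win=1024`) assume roughly one second of trailing context at 44.1 kHz; the threshold would need tuning per track.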

Tim: I am still trying to figure out what to analyze, i.e., how to extract actually useful information from the song to visualize.

Chelsi: Working on rendering video for the background; attempting to become familiar with OpenGL and find code that allows us to render real-time video. I’ll be at a music festival next week, where I plan to get actual footage (lasers, lights, etc.) that will fit cohesively with what we are building.

Overall, we have the sphere moving and the ability to change its shapes and colors. We are looking at the best way to analyze the music (BPM, frequency, etc.) for simplicity and effectiveness. Simon has prepared the sphere for music data, so it is ready to adapt to that input.

By next week we hope to have the music analysis approach decided. We are pretty much on track with our original plan and essentially halfway done.

Music Visuals Updates

[Image: sphere2]

This week:

Simon – I attached all the colors in the sphere to an array, in this case initialized to random values. The values in the array can be changed in groups, so, for example, they could change in patterns with the music. I’d like to do testing in the actual lab so I can get to work on the more complicated parts.
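The grouped color-array idea might look something like the following NumPy sketch. This is a hypothetical illustration (the real version lives in our OpenGL code); the vertex count and the four-group banding are assumptions.

```python
import numpy as np

# One RGB color per sphere vertex, initialized to random values,
# mirroring the array described above.
rng = np.random.default_rng(0)
num_vertices = 120
colors = rng.random((num_vertices, 3), dtype=np.float32)

# Assign each vertex to one of four groups (e.g., bands of the sphere)
# so whole groups can be recolored together in time with the music.
groups = np.arange(num_vertices) % 4

def set_group_color(colors, groups, group_id, rgb):
    """Recolor every vertex belonging to `group_id` at once."""
    colors[groups == group_id] = rgb

set_group_color(colors, groups, 2, (1.0, 0.0, 0.0))  # flash group 2 red
```

Updating whole groups with one masked assignment keeps the per-frame work cheap before the array is re-uploaded to the GPU.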

Tim – This week I have been familiarizing myself with music analysis and different features of the program. By next week I hope to start finding different options for analyzing the song we selected.

Chelsi – Researched geometric inspiration now that we have a set shape (videos below). Also contacted the artist about getting individual track pieces for easier analysis, and should receive them over the weekend. I downloaded a sound analyzer and started to work with it; I have a few questions about how this will translate into OpenGL. I also briefly looked into using real video footage in OpenGL, and it looks like rendering is a possibility (example: https://www.youtube.com/watch?v=2AVh1x-Uqjs).

Visualizers created in OpenGL:

Simple Geometric visualization:
https://vimeo.com/47085682

Another based on 3D triangles:
https://vimeo.com/90972800

This one offers a way to include movement:
https://vimeo.com/67248097

We are a bit behind but hope to catch up within the next week. By next Friday we hope to have the sphere moving and our approach to the music analysis figured out.

RRCP

This Week: April 4th

 

Caroline: Finish IRB training, dig into the fundamentals of the Emotiv software (Affectiv), some SketchUp?

Soheyla: Continue researching similar color experiments; find a potentially adoptable procedural standard for the experiment

Olivia: Finish IRB training as well, find potential Google 3D Warehouse spaces, familiarize self with SketchUp

 

As a group, we treated this week as a learning curve: we continued to familiarize ourselves with the tools and software needed to conduct our experiment so we can dive into the nitty-gritty starting next week.

 

Accomplishments: Finishing the fun IRB training, and further familiarizing ourselves with the software, will make it possible to leave the learning phase and really start getting into our project.

 

Problems: We initially thought SketchUp’s free trial allowed only eight hours of work time, meaning that without paying for the software we would get 24 hours between the three of us, plus any extra computers we could acquire to build our 3D space. It turns out the 8-hour limit applies only to the SketchUp Pro trial; the basic SketchUp Make lasts forever.

 

On Schedule?:

Roughly. We could comfortably be further ahead; however, being unfamiliar with our tools, it was important to take the time to build fundamental knowledge of the programs we are using for our experiment. This will allow us to forge ahead in the coming weeks. More concrete imagery to come!

 

Next Week’s Plans:

  • Finalize room/color decisions
  • Use SketchUp to build/make adjustments to spaces (so much fun)
  • Test run the Emotiv system and software
  • Draft experiment design
  • Draft survey questions
  • Solidify interactive activity of subject while in environment

Team Vex Prototype

Team Vex:

Before break, we split up the task of researching the different possibilities we were considering for our project.

Giang – was tasked with researching Portal 2 level creation as an option

Corey – was tasked with researching Garry’s Mod level creation as an option

Jacob – followed up on the research to create a fully custom game

Accomplishments:

Due to existing commitments, the time left in the semester, and a desire to complete the project on time, we decided to prototype a few levels in the Portal 2 level creator to test the game concept, and then move on to creating a full Garry’s Mod Gamemode using custom scripts.

Giang was unable to research Portal 2 level creation as the level editor requires the game. Jacob downloaded the level editor and made an example game.

Problems:

Garry’s Mod may not actually allow the portals to be transparent.

Are we still on schedule? Yes. The project is still on schedule.

Media:

http://steamcommunity.com/id/guitarboy667/myworkshopfiles?appid=620

Revisions from plan from March 14th:

At this point, it seems unlikely that we will complete a fully custom game, but considering that we already completed a Portal 2 level, there may be some hope.

Weekly Plans:

4/4: Team members spend time learning to create Portal maps and
familiarizing themselves with Garry’s Mod Gamemodes. The latter involves
learning Lua.

4/11: Set up a basic Gamemode with the correct rules. Set up Oculus Rift
compatibility.

4/18: Work on including portals into the Gamemode.

4/25: Finish working on portals. Design and include different maps for the
game.

5/2: Finishing touches on the game.

5/9:

Contingency plans:

We already finished a Portal 2 level, so just turning that in would be a contingency plan. Ultimately, we can pivot to any interesting game on the Garry’s Mod platform.

Team Vex

Team name: Vex (Virtual EXperience. Clever, I know)
Jacob Hanshaw, Corey Groom, Giang Nguyen

Equipment: Oculus Rift

General Description:

We want to build a multi-player virtual experience. Creating immersion through story and emotion is likely too complex for this class and requires more assets. As such, our experience will likely focus on action.

Skills:

Jacob- General Programming, Unity and C# Experience

“I’m confident that I can design a good game, create environments, and use general physics.”

Corey- General Programming
Giang- General Programming

Challenges:

Creating realistic character models is a monumental task in and of itself, and animating them correctly while avoiding the uncanny valley is a million-dollar industry.

Online real-time multiplayer is one of the biggest challenges in computer science today. It involves keeping multiple complex physics systems in perfect synchronization.

Creating and using portals properly is something that a lot of the smartest programmers in the world (at Valve) spent a lot of time doing and released research papers about. Making a seamless transition from one place to another is hard. The portal must show the proper perspective to potentially multiple people looking at it. Without perspective differences it will be a painting instead of a window. Also, the transportation must be seamless or other players will notice the portal. This means that the player must be in two places at once while transitioning through the portal. If in one place or the other, then there will be a discontinuity in the player’s body.

Creating an FPS by itself means unique gun holding physics, gun animations and reloads, and shooting physics. Shooters in particular are hard for real-time online multiplayer games as they require precise timing.

What is your first step:

Research!

I researched these concepts and found that there are existing libraries that may help abstract some of the harder details of this project. However, experts with one of the tools said it would take two weeks to make a simple offline FPS level, so this project may still be significantly out of scope.

Option B:  A God Among Men game without portals
Option C: A God Among Men game without portals or internet
Option D: Create levels for Portal 2 and play them on the Rift (Likely actually possible!)

Games:

Runner and Gunner

[Image: RunnerAndGunner]

Runner and Gunner takes inspiration from Mirror’s Edge, Portal, The Matrix, and Inception to create a novel gameplay experience.

The idea is that there are two players:

A runner, whose goal is to get to a predetermined location as soon as possible. The runner can see, and turn on or off, pre-existing portals placed around the world.

A gunner whose goal is to shoot the runner. The gunner may get trapped in a certain area due to portal placement. It is the gunner’s job to notice when they are trapped and take control of another person around the runner.

Potential features and difficulty adjustments could involve allowing players to place portals, making the runner’s goal a moving target, giving the runner multiple goals, limiting the number of portals a runner can use, changing the runner’s speed, adding camera shake to the gunner’s gun when moving, limiting the gunner’s ammo, showing the gunner the runner’s target(s) or not, allowing the gunner to destroy the runner’s target(s), etc.

Chill AKA Fiend AKA Horror Game
or
A God Amongst Men

[Image: HorrorGame]

Another idea is to create a game (Chill) to scare the Oculus Rift user. This design is more iterative: a scary game could be created first, then another player controlling the scariness locally could be added, and finally the game could be made online multiplayer.

The final idea is that Chill could be slightly altered into a mythology-style game where players fight ginormous enemies. The optional extra player could then control the enemy, add extra enemies, or provide aid in the form of power-ups.