RRCP April 11


Accomplishments:

As a group, we really started getting into our project this week.

– We tested the Emotiv system and software (so much fun). We also started to become more familiar with the software by reading the software instructions
– We tried to finalize the number of rooms and make color decisions
– Furniture or none? We’ll test both on ourselves first. We’re leaning towards neutral furniture such as in a cruise ship cabin.
– Furniture will be the same color as the walls
– We’ll design four rooms off of a central room and allow the participant to walk between the rooms at will. We’ll measure how much time they spend in each room (a rough sketch of the timing is shown after this list).
– White, red, blue, and green will be the colors we’ll test. We discussed trying yellow to cover the primary colors, but decided it was too unusual a room color given the saturation we wish to test.
– Using SketchUp, we made our sample room.
– We made an appointment with a research methods professor for next week in order to get help drafting the right survey questions, and reached out to the emeritus professor of the class “Color Theory: Environmental Context”
– We came up with some ideas to solidify the subject’s interactive activity while in the environment:
– Tell participants the task at hand in advance:
– “Which hotel room would you most enjoy having for your vacation?”
– Integrate controller training in the lobby
– Participants will verbally state their preference by room number
– They will also offer an adjective for their favorite room
– We will use a standard prompt if they are indecisive
– Finally, we’ll compare the results against a paper survey we give to people showing color pictures of the rooms
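
Below is a minimal sketch of how the time-in-each-room measurement could be logged, assuming the virtual environment can report once per frame which room the participant is currently standing in. The names here (DwellTimer, update, secondsPerRoom) are hypothetical placeholders, not part of our actual tools.

    #include <chrono>
    #include <map>
    #include <string>

    // Hypothetical dwell-time logger: call update() once per frame with the name
    // of the room the participant is currently in; seconds accumulate per room.
    class DwellTimer {
    public:
        void update(const std::string& currentRoom) {
            auto now = std::chrono::steady_clock::now();
            if (!lastRoom_.empty()) {
                totals_[lastRoom_] +=
                    std::chrono::duration<double>(now - lastTime_).count();
            }
            lastRoom_ = currentRoom;
            lastTime_ = now;
        }

        // Total seconds spent in each room, e.g. for the end-of-session comparison.
        const std::map<std::string, double>& secondsPerRoom() const { return totals_; }

    private:
        std::map<std::string, double> totals_;
        std::string lastRoom_;
        std::chrono::steady_clock::time_point lastTime_;
    };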

We hope to find out: does being in the immersive environment change people’s preferences for room color?

Next Week’s Plans:
– Test the Emotiv system – what does it add to our study? How would we make it work with test subjects?
– Complete a draft of the experiment design
– Complete draft survey questions
– Work on the interactive activity of the subject while in the environment

Note: post written by Soheila Mohamadi

Team VEX 4/11



This week:
Cory – Set up the Oculus Rift to work with Garry’s Mod using the Tridef 3D Ignition software. Went through the gamemode creation tutorials.
Giang – Went through the gamemode creation tutorials and learned Lua.
Jacob – Experimented with an example online FPS in Unity.

Accomplishments:

We are now able to run Garry’s Mod using the Oculus Rift, and we can start creating a gamemode.

Problems:
We didn’t have any specific problems this week.

Still on schedule?
We think we are still on schedule.

Plans for next week:
For next week, we will be creating a basic game for our game mode.

Music Visuals Updates

[Image: sphere2]

This week:

Simon – I attached all the colors in the sphere to an array, in this case initialized to random values. The values in the array can be changed in groups, so, for example, they could change in patterns with the music. I’d like to do testing in the actual lab so I can get to work on the more complicated parts.
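
As a rough illustration of what Simon describes (a flat color array that can be updated group by group and then re-uploaded to the GPU), here is a minimal, hypothetical C++ sketch; the structure and names are assumptions, not the actual prototype code.

    #include <cstdlib>
    #include <vector>

    // Sketch: per-vertex RGB colors for the sphere kept in one flat array so the
    // whole thing can be re-uploaded to an OpenGL vertex buffer each frame.
    struct Color { float r, g, b; };

    struct SphereColors {
        std::vector<Color> colors;             // one entry per sphere vertex
        std::vector<std::vector<int>> groups;  // vertex indices belonging to each group
                                               // (filled in when the mesh is built)

        // Initialize every vertex color to a random value, as in the current prototype.
        explicit SphereColors(int vertexCount) : colors(vertexCount) {
            for (Color& c : colors) {
                c = { std::rand() / float(RAND_MAX),
                      std::rand() / float(RAND_MAX),
                      std::rand() / float(RAND_MAX) };
            }
        }

        // Change all vertices in one group at once, e.g. driven by a music feature
        // (a value in [0, 1] such as the energy of a frequency band).
        void updateGroup(int groupIndex, float musicValue) {
            for (int vertex : groups[groupIndex]) {
                colors[vertex] = { musicValue, 0.2f, 1.0f - musicValue };
            }
            // The modified array would then be pushed to the GPU, e.g. with
            // glBufferSubData, before drawing the sphere.
        }
    };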

Tim – This week I have been familiarizing myself with music analysis and the different features of the program. By next week I hope to start looking at different options for analyzing the song we selected.
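
One possible option for the song analysis (just a sketch, not our decided method) is to slice the audio into short windows and compute the energy in a few frequency bands per window; those band values could then drive the sphere’s color groups. A naive, dependency-free version:

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Naive DFT over one window of audio samples, returning the energy in a few
    // coarse frequency bands. Real code would use an FFT library; this only shows
    // the idea of turning a window of samples into per-band values for the visuals.
    std::vector<double> bandEnergies(const std::vector<double>& window, int numBands) {
        const double kPi = 3.14159265358979323846;
        const int n = static_cast<int>(window.size());
        std::vector<double> magnitudes(n / 2, 0.0);
        for (int k = 0; k < n / 2; ++k) {          // frequency bins up to Nyquist
            double re = 0.0, im = 0.0;
            for (int t = 0; t < n; ++t) {
                double angle = 2.0 * kPi * k * t / n;
                re += window[t] * std::cos(angle);
                im -= window[t] * std::sin(angle);
            }
            magnitudes[k] = std::sqrt(re * re + im * im);
        }
        // Sum the bin magnitudes into numBands equal-width bands.
        std::vector<double> bands(numBands, 0.0);
        const int binsPerBand =
            std::max(1, static_cast<int>(magnitudes.size()) / numBands);
        for (int k = 0; k < static_cast<int>(magnitudes.size()); ++k) {
            int b = std::min(numBands - 1, k / binsPerBand);
            bands[b] += magnitudes[k];
        }
        return bands;
    }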

Chelsi – Researched geometric inspiration now that we have a set shape (videos below). Also contacted the artist about getting the individual track pieces for easier analysis and should receive them over the weekend. I downloaded a sound analyzer and started to work with it – I have a few questions about how this will translate into OpenGL. I also briefly looked into using real video footage in OpenGL, and it looks like rendering it is a possibility (example: https://www.youtube.com/watch?v=2AVh1x-Uqjs)

Examples of visualizers created in OpenGL:

Simple Geometric visualization:
https://vimeo.com/47085682

Another based on 3D triangles:
https://vimeo.com/90972800

This one offers a way to include movement:
https://vimeo.com/67248097

We are a bit behind, but hope to catch up within the next week. By next Friday we hope to have the sphere moving and to have our approach to the music analysis figured out.

Reading 9

For class 4/14/14
Post Comments by 11:59 pm on 4/13/14

Your avatar, your guide
S Murphy – Scientific American Mind, 2011 – nature.com
http://vhil.stanford.edu/news/2011/sciam-your-avatar.pdf

AI seduces Stanford students
K Poulsen – Wired News, 2005 – vhil.stanford.edu
http://vhil.stanford.edu/news/2005/wired-aiseduces.pdf

Why digital avatars make the best teachers
J Bailenson – The Chronicle of Higher Education, 2008 – vhil.stanford.edu
http://vhil.stanford.edu/news/2008/che-avatar-teachers.pdf

Discussion Article 1 (Pick 3)

  1. What are your overall impressions of this piece?  Do you find the approaches mentioned more promising or concerning?  Describe why.
  2. What are your thoughts on Doppelgänger avatar therapy?  Does this model make sense to you?  What kind of fields could you see this be useful for besides those mentioned in the piece?
  3. Discuss the results of the “whale” experiment.  Do the results surprise you?  Do you feel like the same effect would happen on adults (to a lesser degree)?
  4. Discuss the concepts of self in relation to this article.
  5. You may find a term or topic that you are unfamiliar with.  If you would like to, do a little research on the subject (wikipedia would be a good place to start). Briefly describe your understanding of this concept and how confident you are about this understanding.

Discussion Articles 2 and 3

Formulate a brief response to each of these articles (1 to 2 paragraphs).  Whether your response is positive or negative is up to you.


Some news on Diminished Reality

Just a quick follow-up on Diminished Reality,

Land Rover just announced a prototype featuring an ‘invisible bonnet’: http://www.telegraph.co.uk/motoring/car-manufacturers/land-rover/10752488/Land-Rovers-invisible-bonnet-technology.html which allows the driver to see the road or track conditions through the hood of the car.

This reminded me of the next-generation HMDs in fighter jets, such as the F-35. Northrop Grumman developed a ‘Distributed Aperture System’, which provides full 360-degree (or more correctly: full hemispherical) vision through distributed cameras. The images are displayed on the helmet’s visor and depend on the pilot’s head position, so the pilot is able to see through the floor of the aircraft. However, integrating this system is quite complex, and it has been named one of the major risk factors for success in this already way-over-budget project.
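
For intuition only, here is a tiny, purely conceptual sketch of view-dependent camera selection (how a system like this might pick which feed to show for a given head pose); it is an assumption for illustration and has nothing to do with the actual DAS implementation, which stitches and blends imagery.

    #include <array>
    #include <cstddef>

    // Conceptual sketch: given the pilot's gaze direction, pick the distributed
    // camera whose viewing axis is most closely aligned with it. All vectors are
    // assumed to be unit length.
    struct Vec3 { float x, y, z; };

    float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    template <std::size_t N>
    std::size_t bestCamera(const std::array<Vec3, N>& cameraAxes, const Vec3& gaze) {
        std::size_t best = 0;
        float bestAlignment = -2.0f;                 // dot products lie in [-1, 1]
        for (std::size_t i = 0; i < N; ++i) {
            float alignment = dot(cameraAxes[i], gaze);
            if (alignment > bestAlignment) { bestAlignment = alignment; best = i; }
        }
        return best;   // index of the camera feed to display on the visor
    }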


RRCP

This Week: April 4th


Caroline – Finish IRB training, dig into the fundamentals of the Emotiv software (Affectiv), some SketchUp?

Soheyla – Continue researching similar color experiments, find a potentially adoptable procedural standard for the experiment

Olivia – Finish IRB training as well, find potential Google 3D Warehouse spaces, familiarize herself with SketchUp


As a group, we used this week as a learning curve: continuing to familiarize ourselves with the tools and software needed to conduct our experiment so we can dive into the nitty-gritty starting next week.


Accomplishments: Finishing the fun IRB training and further familiarizing ourselves with the software will make it possible to leave the learning phase and really start getting into our project.


Problems: SketchUp’s free trial allows for eight hours of work time. Without paying for the software, that gives us 24 hours of work time between the three of us, plus any extra computers we can acquire if needed to build our 3D space. Just kidding: you only receive 8 hours of SketchUp Pro for free; the basic SketchUp Make lasts forever.


On Schedule?:

Roughly. We could be more comfortably ahead; however, being unfamiliar with our tools, it was important to take the time to build fundamental knowledge of the programs we are using in order to accomplish our experiment. This will allow us to forge ahead in the coming weeks. More concrete imagery to come!


Next Week’s Plans:

  • Finalize room/color decisions
  • Use SketchUp to build/make adjustments to the spaces – so much fun
  • Test run the Emotiv system and software
  • Draft experiment design
  • Draft survey questions
  • Solidify the interactive activity of the subject while in the environment

Music Team, 4/4

[Image: sphere1]


Simon: worked on prototyping the virtual environment in OpenGL

Chelsi: worked on concepts and the music we will use

Tim: researched different ways to analyze the music

Accomplishments this week: A basic prototype of the environment was created. I wasn’t able to get as much done as I hoped because I couldn’t test in the lab, so many aspects of the program are still waiting on that. The final music file was acquired and we are on track to start analyzing the music.

Problems encountered: The lab computer crashed when I went to use it, and I couldn’t test the code I wanted to. Testing had to be done at home, which had many limitations. This means that many of the more complicated parts of the program are still up in the air until we can test in the lab.

Schedule: We are slightly behind schedule because our program is not running in the lab yet. The music analysis is still on schedule; we will decide on a method of analysis and implement it next week.

Next week: Beginning of music analysis. Testing in lab and adding more parts to the environment. More experimentation in OpenGL.

Vex 4/4

[Image: Fight map]
This week:
Cory – Set up Oculus Rift. Set up computer to test Jacob’s Portal 2 co-op map. Initial testing of Garry’s Mod native Oculus Rift support.
Giang – Spent some time learning to create maps in Portal 2.
Jacob – Set up computer to test Jacob’s Portal 2 co-op map.

Accomplishments:
We were able to successfully test Jacob’s Portal 2 co-op map. This involved setting up Portal 2 to work with the Rift using the Vireio Perception software. The end result was that we were able to play against each other online while both using the Oculus Rift.

Problems:
Garry’s Mod’s native Oculus Rift support crashes the game. We suspected that this might be a problem, but I still wanted to try my own tests before moving on to alternatives. Next I will be testing Garry’s Mod using the Tridef 3D Ignition software, which is supposed to do a good job of bringing Oculus Rift support to Garry’s Mod.

Still on schedule?
We may be a little behind schedule, as our team was not able to start learning to create Garry’s Mod Gamemodes this week. However, we have been working on issues that come later in our plan, such as setting up Oculus Rift compatibility. I think we will be able to make up the small amount of lost time.

Plans for next week:
Familiarize ourselves with creating Garry’s Mod gamemodes and finish setting up Oculus Rift compatibility.

Reading 8

For class 4/7/14
Post Comments by 11:59 pm on 4/6/14

This week we will be reading two papers from IEEE VR 2014.  These papers can be found at:

https://uwmadison.box.com/s/7rerpo60yioc9618hyqj

You will need to use your campus ID to get access to them.  There is also a video included for the first article.

Discussion

This week we will do an unstructured discussion.  For each paper, write 3-4 paragraphs (6-8 paragraphs total).  Items you can discuss include:

  • Your overall impressions of the article
  • Strengths of the work
  • Weaknesses of the work
  • Questions you have and/or items you would like clarified
  • Implications of the work
  • What you think the next study should be based on this work