This week:
Simon: Mostly I researched how best to implement timing for beat detection and related features. I also did some more testing on what we already have in place. There are still some issues I'm not sure how to fix, like the 3D rendering and the keyboard input, but I think the plan we have in place is on the right track.
Tim: I am still trying to figure out what to analyze, i.e., which information we can actually extract from the song that is useful to visualize.
Chelsi: Working on rendering video for the background. Getting familiar with OpenGL and looking for code that lets us render video in real time. I'll be at a music festival next week, where I plan to get actual footage (lasers, lights, etc.) that will fit cohesively with what we are building.
Overall, we have the sphere moving and the ability to change its shape and color. We are looking at the best way to analyze the music (BPM, frequency, etc.) for simplicity and effectiveness. Simon has prepared the sphere to accept music data, so it is ready to adapt to it.
By next week we hope to have decided on the music analysis approach. We are on track with our original plan and roughly halfway done.