{"id":969,"date":"2016-12-18T23:43:10","date_gmt":"2016-12-19T05:43:10","guid":{"rendered":"http:\/\/blogs.discovery.wisc.edu\/vr2016\/?p=969"},"modified":"2016-12-19T15:16:25","modified_gmt":"2016-12-19T21:16:25","slug":"prock-final-post","status":"publish","type":"post","link":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/2016\/12\/18\/prock-final-post\/","title":{"rendered":"Prock Final Post"},"content":{"rendered":"<p><img loading=\"lazy\" class=\"alignnone size-medium wp-image-972\" src=\"http:\/\/blogs.discovery.wisc.edu\/vr2016\/files\/2016\/12\/Screen-Shot-2016-12-18-at-11.42.02-PM-300x282.png\" alt=\"screen-shot-2016-12-18-at-11-42-02-pm\" width=\"300\" height=\"282\" srcset=\"https:\/\/blogs.discovery.wisc.edu\/vr2016\/files\/2016\/12\/Screen-Shot-2016-12-18-at-11.42.02-PM-300x282.png 300w, https:\/\/blogs.discovery.wisc.edu\/vr2016\/files\/2016\/12\/Screen-Shot-2016-12-18-at-11.42.02-PM.png 420w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/p>\n<p><b>Prock VR IDE<\/b><\/p>\n<p><span style=\"font-weight: 400\">Final Project Post<\/span><\/p>\n<p><span style=\"font-weight: 400\">Mickey Barboi \u00a0| \u00a0Ken Sun \u00a0| \u00a0Logan Dirkx<\/span><\/p>\n<p><span style=\"font-weight: 400\">12 . 18 . 16<\/span><\/p>\n<p><b>Motivation<\/b><\/p>\n<p><span style=\"font-weight: 400\">Humans are wired to interact with their physical surroundings efficiently through learned and innate spatial reasoning skills. \u00a0Programming languages communicate a programmer\u2019s intent through syntax that is far removed from both the resulting operation and the level at which the programmer reasons. \u00a0Modern, high-level languages offer increasingly powerful abstractions and tools with which to program; however, these languages are still far removed from the world of mocks, flowcharts, and specifications that drives software development. 
<\/span><\/p>\n<p><span style=\"font-weight: 400\">Before any project, many programmers begin with a whiteboard sketch of the class hierarchy and interactions they will need to implement. \u00a0We believe that physical representations of the intangible abstractions in a program help designers reason about functionality. \u00a0Instead of being forced to switch from an intuitively drawn model to complex syntax, we want to allow programmers to use the same pseudo-physical objects to manipulate code in virtual reality. \u00a0Specifically, we believe this offers distinct benefits when writing, reading, or running code. <\/span><\/p>\n<p><b>Contributions<\/b><\/p>\n<p><b>Mickey Barboi: <\/b><\/p>\n<p><span style=\"font-weight: 400\">Started off by embedding a Python 2.7 runtime into Unreal Engine 4, using a metaprogrammer to generate the cpp classes that wrap python objects. Set up physics, threw together some basic meshes, and spawned it all in the world. Worked on the debugger and fancy splines, but didn\u2019t finish them in time for the demo. \u00a0<\/span><\/p>\n<p><b>Ken Sun: <\/b><\/p>\n<p><span style=\"font-weight: 400\">Designed input mappings and implemented teleportation, box and line grabbing, and box spawning. Mickey and I had a long discussion about how the world should look. Should we use the full room scale or just a tabletop model? For example, should we encourage the user to sit in a chair while operating, or to walk around? We committed to the latter idea and decided to pack as much information into the world as we could, so we implemented teleportation. By pressing the trackpad on the Vive or tilting the joystick on Oculus Touch, the user can initiate teleportation and change their facing direction. 
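<\/span><\/p>\n<p><span style=\"font-weight: 400\">As a rough illustration of that input mapping (a hypothetical helper, not our actual Unreal code), the thumb\u2019s position on the trackpad or stick can be turned into the yaw the user will face after teleporting:<\/span><\/p>

```python
# Hypothetical sketch of the teleport facing control described above:
# the thumb position on the trackpad picks the yaw the user will face
# after teleporting. Not our actual Unreal Engine input code.
import math

def facing_yaw_degrees(touch_x, touch_y):
    # trackpad axes are assumed to be in [-1, 1]; +y is forward, +x is right
    return math.degrees(math.atan2(touch_x, touch_y))
```

<p><span style=\"font-weight: 400\">Pushing the thumb straight forward gives a yaw of 0 degrees, straight right gives 90, and so on, so the user can both pick a destination and reorient in a single gesture.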
<\/span><\/p>\n<p><b>Logan Dirkx: <\/b><\/p>\n<p><span style=\"font-weight: 400\">Before starting this project, we knew that we\u2019d be wrestling with profound design and ergonomic challenges. \u00a0Code is the near-definition of abstraction. \u00a0We interpret the physical processes of a computer in a series of increasingly abstract languages that, once combined, allow us to control its operations. \u00a0Over time, languages have become more human-friendly. \u00a0However, programming still requires extensive knowledge in the mind in order to manipulate the design of a program. \u00a0VR allows us to create an immersive platform to manipulate virtual objects in intuitive ways. \u00a0What\u2019s more, clever design allows us to embed additional knowledge into the virtual world to make controls and manipulations align with real-world expectations.<\/span><\/p>\n<p><span style=\"font-weight: 400\">Natural interfaces and intuitive designs have increased the accessibility and usability of tools for the masses. \u00a0Computers used to be a specialized tool for individuals trained in a litany of command-line commands. \u00a0Then, Apple popularized the GUI, which made computing power accessible to non-specialist users. \u00a0Simultaneously, the GUI made human-computer interactions more streamlined for specialist users. \u00a0With this in mind, I worked on designing an interface that maps onto real-world interactions. \u00a0Through intuitive, error-proofed design, I hoped to create an environment that was not only more useful to a veteran programmer, but also accessible to those without extensive programming knowledge. 
<\/span><\/p>\n<p><span style=\"font-weight: 400\">Altogether, I wanted to design a space that accomplished the following:<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">A human-centered 3DUI with natural mappings<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Physical representations of abstract objects<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">A tiered hierarchy of code specificity<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Auto-complete functionality to streamline development<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400\">First, I wanted to ensure that we captured the unique advantage of an immersive world: the interactive 3D space. \u00a0So, I designed the \u2018code track\u2019 to wrap around the user. \u00a0In this way, the user can focus on what is immediately relevant directly in front of them, while being able to glance to their periphery and see the previous \/ upcoming code. \u00a0The 3DUI also enables users to pull the appropriate code into their field of view. \u00a0We feel this is a more natural way of traversing the thread of execution in a program. \u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">One of the grand challenges of coding in VR is the lack of a keyboard. \u00a0Keyboards allow users to access any function, variable, or command by simply typing it. \u00a0In VR, we needed to find a way to display only the most relevant commands \/ objects to the user. \u00a0Our preliminary design included auto-populating boards that show only the physical representations of valid code to the user (similar to the autocomplete or input validation offered by many IDEs). \u00a0Furthermore, we designed the shape &amp; color of each object to create physical limitations on invalid user input. 
\u00a0In the example environment, the d.insert function is orange with circular holes. \u00a0The valid inputs, the int(a) and int(b) spheres, auto-populate the backboard in the same color, informing the user that these arguments are valid. \u00a0What\u2019s more, the list(z) object is a cylinder of a different color, suggesting that the input could be valid but needs to be modified (with a get function, for instance). \u00a0By physically error-proofing incorrect input, the VR IDE reduces the knowledge in the mind required of developers. \u00a0We also identified that written confirmation is sometimes helpful to a user to ensure that the physical representations match their expectations. \u00a0So, we included terminal windows to show this information to the user. <\/span><\/p>\n<p><span style=\"font-weight: 400\">Finally, we realized that most projects are far too large to navigate via an extremely precise pull mechanism. \u00a0So, we also designed an expandable network of increasing granularity to allow users to select the scope of the program they wish to view. \u00a0In this way, users can visualize the connections between classes or libraries. \u00a0Then, they can selectively pull themselves into any class \u2018orb\u2019 for further analysis of the variables &amp; methods in that class. \u00a0Once the desired section of the code is found, users simply walk into the control room discussed above to edit and modify the code. \u00a0We feel that the \u2018big picture\u2019 view grants users a more comprehensive understanding of the interactions within the web of classes. \u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Altogether, I feel this design offers a first step into physical programming in a virtual environment. \u00a0I anticipate that the main limitation we will experience during implementation is the auto-completing backboards. 
\u00a0Determining which objects are most relevant to the user and populating them in real time seems like a massive technical challenge. \u00a0Also, finding meaningful shapes and colors for increasingly complex programs without creating confusion will not be trivial. \u00a0Finally, programming languages use nesting \/ indentation to represent layering. \u00a0So, handling local vs. global variables of the same name adds an additional level of complexity to visual representations.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><b>Outcomes<\/b><\/p>\n<p><span style=\"font-weight: 400\">At the start, we had three functionality goals we wanted to implement in our IDE:<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Reading: Read in source code from python and populate a VR environment with meaningful objects<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Writing: Make changes to the environment \/ objects and save the new code back out<\/span><\/li>\n<li style=\"font-weight: 400\"><span style=\"font-weight: 400\">Running: Run the code in the environment in real time and watch the interactions occur via animations. \u00a0Tie in a debugger to physically see where errors occur.<\/span><\/li>\n<\/ol>\n<p><span style=\"font-weight: 400\">These goals are definitely not trivial, but we were able to make significant progress towards all three. 
\u00a0The initial setup of dissecting an AST and transforming the objects into VR was the critical backbone we needed to perfect before moving on to modifications in VR. \u00a0By the end of the semester, we were able to read in basic arithmetic from a python script and visualize the code as boxes and lines in VR.<\/span><\/p>\n<p><b>Problems encountered<\/b><\/p>\n<p><span style=\"font-weight: 400\">We have to deal with an arbitrarily complex AST, in a language-agnostic way, without implementing a programming language or syntax checking. Deciding what information is most valuable to show to the user, then representing it in a meaningful way, is far from trivial. \u00a0Though we started with simple arithmetic, more interesting programs become exponentially more complex once control flow and nesting are intertwined with simple function calls and operations. \u00a0As complexity increases, simple questions like what to show and where to put objects quickly become roadblocks for implementation. \u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400\">Additionally, we realized that there must be two separate representations of the code: one for writing &amp; editing, and one for observing execution. \u00a0The same design and interactions are unlikely to be valuable for both circumstances. \u00a0Similarly, the way that people \u2018white-board\u2019 can differ dramatically. \u00a0So, creating a one-size-fits-all physical representation might only aid those who share the same mental model of code. \u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"font-weight: 400\">From a technical perspective, we know that text input is not a reasonable way to name objects or issue commands. \u00a0So, all new objects will likely be named incrementally. \u00a0At this time, we cannot think of a way to make variable naming more intuitive in VR than with a keyboard. \u00a0The best choice for now is incrementally assigned letters. 
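<\/span><\/p>\n<p><span style=\"font-weight: 400\">As a minimal sketch of the two ideas above \u2013 reading code in as \u2018boxes and lines\u2019 and naming new objects with incremental letters \u2013 the plain-Python helpers below are illustrative only; our real pipeline goes through generated cpp wrappers rather than functions like these:<\/span><\/p>

```python
# Hypothetical sketch: dissect a snippet into 'boxes' (AST nodes) and
# 'lines' (parent-child edges), the structure Prock spawns as meshes
# and splines, plus an incremental letter namer for new objects.
import ast
import itertools
import string

def dissect(source):
    boxes, lines = [], []          # node labels, (parent, child) index pairs
    def walk(node, parent=None):
        index = len(boxes)
        boxes.append(type(node).__name__)
        if parent is not None:
            lines.append((parent, index))
        for child in ast.iter_child_nodes(node):
            walk(child, index)
    walk(ast.parse(source))
    return boxes, lines

def letter_names():
    # a, b, ..., z, aa, ab, ... in place of keyboard-typed identifiers
    for length in itertools.count(1):
        for letters in itertools.product(string.ascii_lowercase, repeat=length):
            yield ''.join(letters)

boxes, lines = dissect('x = 1 + 2')
# boxes[0] is 'Module'; every other node gets exactly one incoming line,
# so len(lines) == len(boxes) - 1
```

<p><span style=\"font-weight: 400\">Even this toy version shows why layout matters: a one-line assignment already produces a small tree of nodes, and every node needs a position in the world.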
<\/span><\/p>\n<p><span style=\"font-weight: 400\">Organizing the visible nodes in the game world in an extensible way is tricky. We\u2019re not sure if there\u2019s a good static way to do it, but for now we\u2019re relying on a crude attraction\/repulsion physics model. The point is to avoid having to assign coordinates to each in-game node while still having the nodes space themselves appropriately. It might not scale in terms of performance, and it doesn\u2019t work while nodes are being scaled. Our best bet here is a very simple home-grown physics simulation instead of relying on the engine\u2019s. <\/span><\/p>\n<p><span style=\"font-weight: 400\">Lastly, game engines are not designed to zoom in. \u00a0So, it becomes very challenging to create functionality to \u2018blow up\u2019 a block of code and see increasingly granular information. Specifically, we\u2019d love to be able to leverage the LOD settings already built into the engine to control actor culling when zoomed way outside the scope of a given actor. This is going to have to be done in-house. <\/span><\/p>\n<p>&nbsp;<\/p>\n<p><b>Next Steps<\/b><\/p>\n<p><span style=\"font-weight: 400\">Now that we can read code into a virtual environment as boxes and lines, we want to be able to create new boxes and lines to add to the existing code. \u00a0Then, we can connect the newly created code objects to the existing thread and write the code back out to a python script via the AST. This could get tricky as far as checking for \u201cvalid\u201d changes to the AST, not to mention the 3 or 4 layers that the AST ends up being pulled into (Python AST node, cpp wrapper, in-game actor, and higher-order representation). <\/span><\/p>\n<p><span style=\"font-weight: 400\">After we\u2019ve implemented writing basic operations, we can continue to expand the \u2018vocabulary\u2019 of our IDE to handle control flow, classes, instances, collections, and invocation. 
\u00a0To do this, we\u2019ll need to design meaningful ways of representing these constructs in VR that make intuitive sense to the user. \u00a0This involves designing and creating meshes and customizing existing functionality. <\/span><\/p>\n<p><span style=\"font-weight: 400\">Finally, we want to allow the user to execute the code and watch the objects physically interact in VR. \u00a0The culmination of all these efforts will allow the user to create a \u2018Hello, World!\u2019 program and watch it execute in real time, all from VR. This means hooking into a running python runtime, which turned out to be surprisingly easy. When running, we\u2019ll move the code graph to the background and only show the variables in scope. Then, just like a debugger, we\u2019ll step through each statement in our AST graph. Where the static code has lines connecting nodes, we\u2019re instead going to animate the nodes: making them fly together, stepping the real debugger and updating the values represented in the game world, and finally animating the nodes apart. <\/span><\/p>\n<p>&nbsp;<\/p>\n<p><b>Video Content:<\/b><\/p>\n<p><span style=\"font-weight: 400\">Prock Project Demo: \u00a0<br \/>\n<iframe loading=\"lazy\" width=\"1170\" height=\"878\" src=\"https:\/\/www.youtube.com\/embed\/RdVaZoD8ptc?feature=oembed\" frameborder=\"0\" allowfullscreen><\/iframe><br \/>\n<\/span><\/p>\n<p><span style=\"font-weight: 400\">Prock Physics Demo: \u00a0<br \/>\n<iframe loading=\"lazy\" width=\"1170\" height=\"878\" src=\"https:\/\/www.youtube.com\/embed\/SrJamnDWqHo?feature=oembed\" frameborder=\"0\" allowfullscreen><\/iframe><br \/>\n<\/span><a href=\"https:\/\/youtu.be\/SrJamnDWqHo\"><span style=\"font-weight: 400\">https:\/\/youtu.be\/SrJamnDWqHo<\/span><\/a><\/p>\n<p><span style=\"font-weight: 400\">Future State Design: 
\u00a0https:\/\/drive.google.com\/file\/d\/0B9mWTz97pl0hY3N1ajRTSjBTcU0\/view?usp=sharing\u00a0<\/span><\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Prock VR IDE Final Project Post Mickey Barboi \u00a0| \u00a0Ken Sun \u00a0| \u00a0Logan Dirkx 12 . 18 . 16 Motivation Humans are wired to interact with their physical surroundings efficiently through learned and innate spatial reasoning skills. \u00a0Programming languages use syntax to communicate a programmer\u2019s intent that is far removed from either the resulting operation [&hellip;]<\/p>\n","protected":false},"author":165,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":[],"categories":[41],"tags":[],"_links":{"self":[{"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/posts\/969"}],"collection":[{"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/users\/165"}],"replies":[{"embeddable":true,"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/comments?post=969"}],"version-history":[{"count":2,"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/posts\/969\/revisions"}],"predecessor-version":[{"id":983,"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/posts\/969\/revisions\/983"}],"wp:attachment":[{"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/media?parent=969"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/categories?post=969"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/blogs.discovery.wisc.edu\/vr2016\/wp-json\/wp\/v2\/tags?post=969"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}