Michael Chabin

Visiting Lecturer

Contact Information

mschabin@indiana.edu
Informatics West 202

Courses Taught at SICE

  • Programming Virtual Reality
  • Introduction to Virtual Reality Design

Biography

I've developed interactive animations for the National Air and Space Museum, the American Bankers Association, the Harvard-Smithsonian Center for Astrophysics, and others. I've also built an alumni/donor tracking and accounting system at UC Santa Cruz and a sea-going data acquisition system for the Institute for Geophysics at the University of Hawaii (where I never got over being seasick), and I did system-level design of automated fingerprint matching systems for SAGEM MORPHO in Fontainebleau, France. I've also taught for a number of years.

Now I'm interested in non-game applications of virtual reality (VR) and augmented reality (AR). By VR, I mean immersive, interactive, computer-generated experiences that depend on a head-mounted display, headphones, and hand controllers of some kind. By AR, I mean tools that can add arbitrary information to the user's experience of the ordinary world.

I'm especially interested in connecting the virtual and real worlds through robotics and using VR and AR as assistive technology for the visually impaired.

My classes focus on creating non-game immersive and interactive virtual spaces that can be explored with the HTC Vive. We begin creating spaces on the first day of class and continue with projects that vary in length from two to five weeks. Examples include a Place of Solitude, a City in the Sky (inspired by Miyazaki's Castle in the Sky), a Texture Garden, an Underwater World, habitats on one of the newly discovered exoplanets, and a good deal more. Students choose the subjects of their final projects.

Students with good programming skills may, if they prefer, create their virtual worlds entirely from code; the language used is C# (a brief sketch of what that can look like appears below).

The two most important tools we'll use are the game engine Unity 3D (to create the spaces, control physics, and so on) and Blender (to create 3D assets to place in our spaces). We'll also make some use of Audacity for sound editing, MakeHuman for creating human-like characters to animate, and Photoshop for a variety of image-editing tasks.

With the exception of Photoshop, all these are free.
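
To give a rough sense of what "entirely from code" means, here is a minimal sketch, not taken from any course assignment, of a Unity C# script that builds a small scene at runtime: a ground plane, a directional light, and a ring of colored pillars. The script name WorldBuilder and all of its particulars are just illustrative.

    // WorldBuilder.cs -- a hypothetical, minimal example of building a scene
    // entirely from C# in Unity. Attach it to an empty GameObject; when you
    // press Play, it creates a ground plane, a "sun", and a ring of pillars.
    using UnityEngine;

    public class WorldBuilder : MonoBehaviour
    {
        void Start()
        {
            // Ground: a single stretched plane primitive.
            GameObject ground = GameObject.CreatePrimitive(PrimitiveType.Plane);
            ground.transform.localScale = new Vector3(10f, 1f, 10f);

            // A simple "sun": a directional light angled down at the scene.
            var sun = new GameObject("Sun").AddComponent<Light>();
            sun.type = LightType.Directional;
            sun.transform.rotation = Quaternion.Euler(50f, -30f, 0f);

            // A ring of pillars with varied heights and colors.
            const int count = 12;
            const float radius = 8f;
            for (int i = 0; i < count; i++)
            {
                float angle = i * Mathf.PI * 2f / count;
                float height = Random.Range(2f, 5f);

                GameObject pillar = GameObject.CreatePrimitive(PrimitiveType.Cube);
                pillar.transform.localScale = new Vector3(1f, height, 1f);
                pillar.transform.position = new Vector3(
                    Mathf.Cos(angle) * radius, height / 2f, Mathf.Sin(angle) * radius);
                pillar.GetComponent<Renderer>().material.color =
                    Color.HSVToRGB(i / (float)count, 0.6f, 0.9f);
            }
        }
    }

A full project would add materials, lighting settings, and the SteamVR camera rig for the Vive, but even this much shows the basic pattern: a script creates and positions objects instead of placing them by hand in the editor.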

I assume students have no prior experience with any of these tools and that they have no programming background.

When you finish one of my classes, you'll have a clear idea of how virtual reality is produced and how 3D assets are created and animated. You should have a better sense of how animated features like Pixar's Cars were created and what drives their costs. You'll have at least 40 hours' experience creating virtual reality content and, if you are careful, you'll have a remarkable addition to your professional portfolio.

My personal sense is that no one understands virtual reality. We don't understand its emotional impact or how it affects judgment. We don't understand how to link virtual experiences to the real world or how to teach with it. We don't even know how to tell stories in it, at least partly because VR makes it difficult to control the visitor's point of view.

The VR I'm interested in is an experiment that began a year ago with the release of the Oculus Rift. To date, something like 1.5 million people have bought headsets in order to experience a kind of digital dream. No one knows exactly how all those dreams will affect all those users.

I certainly don't. What you'll learn in my classes is how to create the dreams.