From: John Stone (johns_at_ks.uiuc.edu)
Date: Fri Oct 13 2017 - 15:05:33 CDT

Hi,
  No, in this case "prototypical" means that it works with a particular
open source driver on Linux, and with a specific back-end rendering approach
(parallel GPU-accelerated ray tracing on remote compute nodes), but that
it does not presently support any of the proprietary HMD APIs such as the
Oculus or Vive commercial SDKs. I have had a few people experiment with
the prototype implementation that's in VMD now, but I'm holding off on
going further than the prototype until a little more HMD software
standardization occurs.

The plan I am following is to implement the "production" approach to HMDs
in VMD on top of upcoming cross-platform APIs. In particular, I am hoping
that the new Khronos OpenXR API will be well suited to the needs of VMD,
and that it will provide the ability to support HMD hardware from multiple
vendors on multiple operating systems, which is something not presently
available from the existing APIs.

Here is basic information about OpenXR:
  https://www.khronos.org/openxr

The OpenXR standard is in-development, but I expect that a formal
specification and initial API will become available in the coming year,
perhaps already in early 2018. I have volunteered to become a member
of the Khronos OpenXR advisory panel, which would allow me to get early
access to OpenXR API specs, SDKs, and drivers when they become available
for alpha/beta testing and feedback.

Optimistically, I would expect to have adapted VMD for OpenXR very soon
after I get hands-on access to early SDKs and drivers, maybe within just
a few weeks. Timing of releases will depend on many factors, but I
will definitely invest much more energy in HMD VR support once I have
a single API to code for that has a chance of running on every major
OS with good hardware support.

Best,
  John Stone

On Fri, Oct 13, 2017 at 02:40:44PM -0500, Dr. Eddie wrote:
> Pardon my ignorance, but does a prototypical implementation mean that if I
> purchase a VR computer with an Oculus Rift or Vive, I can get it to
> visualize in 3-D?
> Sorry for the simple question and thanks!
> Eddie
> On Thu, Aug 24, 2017 at 3:37 PM, John Stone <[1]johns_at_ks.uiuc.edu> wrote:
>
> Hi,
> I have prototypical implementations of interactive VR rendering
> in VMD already, but they are based on interactive ray tracing and
> they only work with the OpenHMD toolkit thus far.
>
> Up to now I have been waiting for a bit of cross-platform
> cross-HMD-vendor standardization to occur, but it has been slow in
> arriving.
> At Siggraph last month I met with members of the nascent OpenXR VR
> standardization effort, and I'm hoping to be able to join the
> OpenXR advisory panel which would let me get early access to the
> in-development standard and early test implementations thereof.
> I expect to use OpenXR as the basis for the more general implementation
> of VR HMD support in VMD going forward, which will alleviate the
> considerable development costs that would be associated with having to
> write different implementations for the HTC Vive, Oculus Rift, and
> various other existing and forthcoming HMDs. It is clear that this is
> the right way to go when you see some of the upcoming HMD designs that
> are quite different from the existing ones, needing APIs that are a bit
> broader and more general.
>
> As a hedge, if OpenXR ends up taking too long to become available to the
> community, I might consider doing some kind of lightweight implementation,
> again using either OpenHMD or perhaps the "OpenVR" code by Valve, but I'm
> really hoping to only have to write the code once, and to base it on
> OpenXR if things go the way I hope they do.
>
> Best regards,
> John Stone
> [2]vmd_at_ks.uiuc.edu
> On Sat, Aug 12, 2017 at 04:29:49PM +0000, Christopher Neale wrote:
> >   Dear all:
> >   I am excited about the VMD developments that permit interaction with
> >   the Oculus Rift (or similar VR tech), but the only text that I can
> >   find (copied below) is vague enough that I don't really understand
> >   the breadth of what can be done with VMD and the rift.
> >   The site:
> >   [3]http://www.ks.uiuc.edu/Research/vmd/minitutorials/vrmovies/
> >   has the text: One of the advanced features provided by VMD versions
> >   1.9.3 and later is the ability to render omnidirectional stereoscopic
> >   3-D images and movies, useful to create so-called "VR" movies on
> >   YouTube and for VR HMD movie players on devices such as GearVR,
> >   Oculus Rift, and others.
> >   Am I right to understand that one has to first "render" an
> >   interactive world that can then later be interacted with via the
> >   rift (rotating and zooming the viewpoint interactively)? Or is this
> >   simply a stereoscopic movie?
> >   Can the rift be used at all without extensive pre-rendering? If so,
> >   is it simply a passive stereoscopic viewer or can it be more
> >   immersive?
> >   Finally, since a stereoscopic view has two views, do I need two
> >   video cards, or just one?
> >   Thanks for the help,
> >   Chris.
>
> --
> NIH Center for Macromolecular Modeling and Bioinformatics
> Beckman Institute for Advanced Science and Technology
> University of Illinois, 405 N. Mathews Ave, Urbana, IL 61801
> [4]http://www.ks.uiuc.edu/~johns/           Phone: [5]217-244-3349
> [6]http://www.ks.uiuc.edu/Research/vmd/
>
> --
> Eddie
>
> References
>
> Visible links
> 1. mailto:johns_at_ks.uiuc.edu
> 2. mailto:vmd_at_ks.uiuc.edu
> 3. http://www.ks.uiuc.edu/Research/vmd/minitutorials/vrmovies/
> 4. http://www.ks.uiuc.edu/~johns/
> 5. tel:217-244-3349
> 6. http://www.ks.uiuc.edu/Research/vmd/

-- 
NIH Center for Macromolecular Modeling and Bioinformatics
Beckman Institute for Advanced Science and Technology
University of Illinois, 405 N. Mathews Ave, Urbana, IL 61801
http://www.ks.uiuc.edu/~johns/           Phone: 217-244-3349
http://www.ks.uiuc.edu/Research/vmd/