From: John Stone (johns_at_ks.uiuc.edu)
Date: Thu Mar 15 2018 - 00:03:16 CDT

Hi,
  Dallas is correct that the spatial media tags are required by
YouTube and don't necessarily benefit any other VR movie players.
The particular details of the required movie resolution and
stereo pair layout will differ among VR players since this isn't
really standardized officially, but most players can play 360 movies
that use so-called equirectangular 360 projections, and they typically
do support over/under or side-by-side type arrangements for each eye.
If your player has trouble, you may want to try a different one.
On phones in particular there are many players, but only some of
them actually do a good job.
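If you do end up targeting YouTube, the injection step can be sketched
roughly like this. This is a dry run with placeholder filenames, assuming
Google's spatial-media tool is on hand; it only echoes the command so you
can inspect it before running it for real:

```shell
# Dry-run sketch (hypothetical filenames): Google's spatial-media
# injector tags an .mp4 as a 360 movie with top-bottom stereo, which
# matches the over/under layout VMD's OptiX renderer writes out.
# Remove the leading "echo" to actually run the injector.
IN=movie.mp4
OUT=movie_injected.mp4
CMD="python spatialmedia -i --stereo=top-bottom $IN $OUT"
echo "$CMD"
```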

Best,
  John Stone
  vmd_at_ks.uiuc.edu

On Thu, Mar 08, 2018 at 09:10:17AM +1100, Dallas Warren wrote:
> I have never used the Spatial Media Metadata Injector from Google,
> never needed to (suspect you only need to do that if uploading to
> YouTube). I have generated a significant number of 360 3D animations
> that I have then viewed using a Samsung Galaxy S8 with Gear VR, works
> great. After I have generated the video with ffmpeg, I simply copy it
> over to the phone and load it with the Samsung VR app. The .mp4
> filename should include "3dpv"; that tells the app the correct way to
> process and display it.
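A minimal sketch of that naming step (the base name here is just a
placeholder):

```shell
# Sketch: the Samsung VR app infers the stereo/360 layout from tags
# embedded in the filename; per the note above, a movie of this kind
# should carry "3dpv" in its name.
base=trajectory              # placeholder base name
out="${base}_3dpv.mp4"
echo "$out"
```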
> Catch ya,
>
> Dr. Dallas Warren
> Drug Delivery, Disposition and Dynamics
> Monash Institute of Pharmaceutical Sciences, Monash University
> 381 Royal Parade, Parkville VIC 3052
> dallas.warren_at_monash.edu
> ---------------------------------
> When the only tool you own is a hammer, every problem begins to resemble a nail.
>
>
> On 8 March 2018 at 07:24, Per Larsson <larsson.r.per_at_gmail.com> wrote:
> > Hi again
> >
> > Thanks - this is helpful indeed.
> >
> > One more thing though: I can't seem to produce proper 360 videos when I view them on the Samsung Gear VR.
> >
> > I have a molecular dynamics trajectory that I load into VMD and process with representations etc.
> > Then I render with OptiX, after having set the four environment variables specified here:
> > http://www.ks.uiuc.edu/Research/vmd/minitutorials/vrmovies/
> >
> > my rendering command in the script is simply:
> > render TachyonLOptiXInternal $filename
> >
> > And VMD starts rendering, and outputs information to stdout:
> > Info) Rendering current scene to 'snap.0000.tga' ...
> > Info) Overriding VMD camera projection mode with spherical equirectangular projection
> > Info) Overriding VMD camera, enabling stereo
> > OptiXDisplayDevice) Total rendering time: 259.31 sec
> >
> > etc. for all frames
> >
> > I then post-process with ffmpeg to turn the individual .tga images into an .mp4 movie, and then inject metadata into that movie using the Spatial Media Metadata Injector from Google, to turn it into 360.
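My ffmpeg step, roughly sketched as a dry run (the frame pattern follows
the snap.0000.tga names above, while the frame rate and encoder settings
here are just assumptions):

```shell
# Dry-run sketch: assemble snap.0000.tga, snap.0001.tga, ... into an
# H.264 .mp4; yuv420p keeps the result playable on most phone and VR
# players.  Remove the leading "echo" to actually encode.
FPS=30
CMD="ffmpeg -framerate $FPS -i snap.%04d.tga -c:v libx264 -pix_fmt yuv420p movie.mp4"
echo "$CMD"
```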
> >
> > The final video looks neat, but it is clearly not 360, although it stretches some of the way there. There must be something I am missing, either in how I visualize my trajectory initially in VMD, how I use the OptiX renderer, or how I post-process the top-bottom images.
> >
> > Any help or hints are much appreciated
> > /Per
> >
> >
> >
> > On 7 Mar 2018, at 19:20, "Vermaas, Joshua" <Joshua.Vermaas_at_nrel.gov> wrote:
> >
> >> Hi Per,
> >>
> >> Rotations, scaling, and translation of the camera are stored in three
> >> matrices: center_matrix, rotate_matrix, and scale_matrix, which are
> >> accessible using the molinfo command. I *think* the saved visualization
> >> state retains these, though, so it may be that you have changed the
> >> screen height variable instead between your two machines (the default is
> >> 6, and you may have changed it in your VMDrc file to make one look closer
> >> than the other). The command to change it is under "display" in the user
> >> guide.
> >>
> >> -Josh
> >>
> >> On 03/07/2018 08:03 AM, Per Larsson wrote:
> >>> Thanks Joshua and Dallas
> >>>
> >>> Much appreciated!
> >>>
> >>> @Joshua: Saving the state to another machine works fine. Could you maybe elaborate a bit on how I can control the viewport when I make my movie? Currently, looking at the trajectory in VMD, it is quite zoomed in, but the movie (after rendering and processing with ffmpeg) does not come out quite the same; it is more zoomed out than I'd like it to be. I assume this is something I can control, but I don't see how exactly.
> >>>
> >>> @Dallas: I'm sticking with the OptiX renderer for now, but for another time, maybe: how can I make any other renderer produce side-by-side images (currently what comes out of OptiX is top-bottom)?
> >>>
> >>> Cheers
> >>> /Per
> >>>
> >>> On 6 Mar 2018, at 21:23, Dallas Warren <Dallas.Warren_at_monash.edu> wrote:
> >>>
> >>>> To get the 360 view you need to use the OptiX renderer.
> >>>>
> >>>> If you are just after 3D SBS or TB, then you can use any renderer:
> >>>> assemble the generated frames into L / R videos (do this separately,
> >>>> since the Movie Generator only makes .mpg, and .mp4 is much better
> >>>> quality, especially for VR), then stitch those together as
> >>>> appropriate (SBS or TB).
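A rough dry-run sketch of the stitching step using ffmpeg's stack
filters (the filenames are hypothetical, and the commands are echoed
rather than executed):

```shell
# Dry-run sketch (hypothetical filenames): stitch separately assembled
# left/right eye videos side-by-side (hstack) or top/bottom (vstack).
# Both inputs must have matching dimensions and frame rates.
# Remove the leading "echo"s to actually run ffmpeg.
SBS="ffmpeg -i left.mp4 -i right.mp4 -filter_complex hstack sbs.mp4"
TB="ffmpeg -i left.mp4 -i right.mp4 -filter_complex vstack tb.mp4"
echo "$SBS"
echo "$TB"
```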
> >>>>
> >>>> The former is a much easier process; the latter works, but takes
> >>>> some work and tweaking.
> >>>> Catch ya,
> >>>>
> >>>> Dr. Dallas Warren
> >>>>
> >>>>
> >>>> On 6 March 2018 at 23:52, Per Larsson <larsson.r.per_at_gmail.com> wrote:
> >>>>> Hi VMD-users
> >>>>>
> >>>>> I am trying to create movies from simulation trajectories for the Samsung
> >>>>> Gear VR system, and tried to follow the tutorial on this page:
> >>>>>
> >>>>> http://www.ks.uiuc.edu/Research/vmd/minitutorials/vrmovies/
> >>>>>
> >>>>> However, it is unclear to me whether I am forced to use the TachyonL-OptiX
> >>>>> renderer to achieve this. I am currently working with VMD version 1.9.3
> >>>>> on macOS 10.13.3 with the built-in Iris Plus Graphics 640, but no CUDA.
> >>>>>
> >>>>> Is it then not possible to create VR movies on this system, since it
> >>>>> specifically requires the OptiX renderer?
> >>>>>
> >>>>> Is the alternative then to do something like what is discussed here, using
> >>>>> the VMD GUI to set colors, representations etc, save that state and copy
> >>>>> everything to the linux machine and render there?
> >>>>>
> >>>>> http://www.ks.uiuc.edu/Research/vmd/mailing_list/vmd-l/27865.html
> >>>>>
> >>>>> Many thanks
> >>>>> /Per Larsson

-- 
NIH Center for Macromolecular Modeling and Bioinformatics
Beckman Institute for Advanced Science and Technology
University of Illinois, 405 N. Mathews Ave, Urbana, IL 61801
http://www.ks.uiuc.edu/~johns/           Phone: 217-244-3349
http://www.ks.uiuc.edu/Research/vmd/