Magic Leap Creator Program Assets

Thanks for considering RadVyXR for funding under the Magic Leap Creator Program!

This page will be updated periodically to include the latest work-in-progress build of the application, as well as screenshots & additional supporting documentation.

If a copy of this page was automatically scraped at the time the grant application was submitted, please revisit this page (http://radvyxr.com/magicleap) prior to making your final decision. I spent the week after Christmas experimenting with particle and shader-based effects to render the radar gates as translucent vibrating mist. It looks good in 3D, but I’m still not happy with the way it appears in screenshots.


Here’s an older preview (made before I started my particle/shader experiments) that completely fails to convey what it looks like in 3D. It shows Hurricane Michael when the eye was close to the radar site near Mobile, Alabama. It’s actually showing multiple sweeps of the radar volume scan… higher sweeps are rendered more translucent. The effect is far more dramatic when viewed in stereo with a proper VR headset that has 6DoF tracking, so you can move around and view it from different angles and vantage points.
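The “higher sweeps are more translucent” effect boils down to mapping each sweep’s elevation index to an alpha value. Here’s a minimal sketch of that idea (the function name and falloff values are my own illustration, not RadVyXR’s actual code):

```python
def sweep_alpha(sweep_index, num_sweeps, base_alpha=0.9, falloff=0.6):
    """Fade higher radar sweeps: sweep 0 (lowest tilt) is most opaque."""
    if num_sweeps <= 1:
        return base_alpha
    t = sweep_index / (num_sweeps - 1)   # 0.0 at lowest sweep, 1.0 at highest
    return base_alpha * (1.0 - falloff * t)

# Alpha for each of 5 sweeps, fading as elevation increases:
alphas = [round(sweep_alpha(i, 5), 3) for i in range(5)]
print(alphas)
```

A linear fade like this keeps the lowest (and usually most interesting) sweep readable while still hinting at the storm’s vertical structure above it.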

Also, please note that this view of the hurricane, as dramatic as it is, represents a vastly larger area than a user would typically view (a several-hundred-mile radius spanning the entire Florida Panhandle, most of Georgia, and a large chunk of Alabama). It was rendered from lower-resolution long-range radar data that was further reduced in resolution to keep the total number of vertices below ~6 million, so that head-tracking framerates stay acceptable on my Nexus 6P.
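The resolution reduction described above amounts to picking a decimation stride that brings the vertex count under a budget. A rough sketch of that arithmetic (the ~6 million ceiling is from the text; the gate/ray counts and stride logic are my own illustrative assumptions):

```python
import math

VERTEX_BUDGET = 6_000_000  # rough ceiling for acceptable framerates on a Nexus 6P

def decimation_stride(gates_per_ray, rays_per_sweep, num_sweeps,
                      verts_per_gate=4, budget=VERTEX_BUDGET):
    """Smallest integer stride (applied along both the gate and ray axes)
    that keeps the total vertex count at or under the budget."""
    total = gates_per_ray * rays_per_sweep * num_sweeps * verts_per_gate
    if total <= budget:
        return 1
    # Keeping every Nth gate and every Nth ray cuts vertices by roughly N^2.
    return math.ceil(math.sqrt(total / budget))

# Hypothetical long-range scan: 1832 gates x 720 rays x 5 sweeps, 4 verts per gate
print(decimation_stride(1832, 720, 5))
```

In this example the full scan would need ~26 million vertices, so a stride of 3 along each axis is required to fit the budget.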

If the user were viewing TDWR data for an area the size of Broward County, it would be much higher resolution. As much as I’d love to be able to render hurricanes in their entirety at maximum resolution & detail with 90fps head tracking, it’s just not a realistic expectation for current mobile hardware.

Check back here in a couple of days to see better examples of RadVyXR at work.


For another example of my work, please check out the virtual reality chess game I recently published for Android — Pawnslaught VR Chess. I submitted a separate grant request related to Pawnslaught last week.

For what it’s worth, if my grant for RadVyXR gets approved, I’m planning to spend my first week or two porting Pawnslaught to Magic Leap as a warm-up exercise and publish it to MagicVerse as a free app… so it will be kind of like a “buy one (RadVyXR), get one free (Pawnslaught)” deal for Magic Leap.

Since the primary focus would be on RadVyXR, the work on Pawnslaught would be fairly minimal: edit the scene to remove background graphics that are pointless in a mixed-reality app, add enough code to let the user position the chessboard in his/her environment (and walk around to view it from different angles), re-color the dark assets to make them visible, and implement a control scheme that combines gaze-tracking for selection with a button on the ML controller for triggering.
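The gaze-plus-trigger scheme above is essentially a small state machine: gaze hovers over a square, and the controller button commits either a piece selection or a move. A sketch of that logic (class and method names are my own illustration, not Magic Leap SDK calls):

```python
class GazeChessControl:
    """Gaze hovers over a square; the controller trigger commits the action."""

    def __init__(self):
        self.hovered = None   # square currently under gaze, e.g. "e2"
        self.selected = None  # piece square awaiting a destination

    def on_gaze(self, square):
        self.hovered = square

    def on_trigger(self):
        """Returns a (from, to) move once both a piece and destination are chosen."""
        if self.hovered is None:
            return None
        if self.selected is None:
            self.selected = self.hovered      # first press selects a piece
            return None
        move = (self.selected, self.hovered)  # second press commits the move
        self.selected = None
        return move

ctl = GazeChessControl()
ctl.on_gaze("e2"); ctl.on_trigger()   # select the pawn on e2
ctl.on_gaze("e4")
print(ctl.on_trigger())               # commits the move ('e2', 'e4')
```

Keeping selection state out of the gaze handler means stray glances between presses can’t accidentally change which piece is picked up.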

Even without the extensive enhancements I had planned for my original multi-month proposal, I think Pawnslaught would be a fun, worthwhile addition to the MagicVerse.

Another future project I have in mind (API permitting) would be implementing my “MouseCode Keyboard” as an alternate IME for Magic Leap (possibly just as a library app developers could include to enable one-handed, two-finger, eyes-free text-command entry using the ML controller).

I have no illusions about MouseCode’s potential as a general-purpose text entry method (approximately “none”), but I do think it’s a handy way to handle AutoCAD-like keyboard commands. In fact, that’s how MouseCode originally came about: I got tired of constantly moving my right hand from the mouse to the keyboard when using EagleCAD (which follows the same AutoCAD-like paradigm) for circuit board design, so I spent a few days hacking a quick & dirty extension that let me do text entry using only the mouse buttons, and later decided to use it as the basis of an Android IME.
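I won’t reproduce MouseCode’s actual encoding here, but the two-finger, eyes-free idea can be illustrated with a toy scheme where each character is a short sequence of left/right button presses. This table and decoder are entirely my own hypothetical example, not the real MouseCode mapping:

```python
# Hypothetical chord table: sequences of Left/Right presses -> characters.
# NOT the real MouseCode encoding -- just an illustration of the idea.
CHORDS = {
    "L": "a", "R": "e", "LL": "t", "LR": "o",
    "RL": "i", "RR": "n", "LLL": "s", "LLR": "h",
}

def decode(presses, gap="."):
    """Decode a press stream like 'L.RR.LR', where '.' marks a pause
    between characters (the eyes-free equivalent of releasing a chord)."""
    return "".join(CHORDS.get(chord, "?") for chord in presses.split(gap))

print(decode("L.RR.LR"))  # -> "ano" under this toy table
```

The appeal for command-driven apps like EagleCAD is that short, memorized press patterns cover the small command vocabulary without ever looking away from the pointer.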