The March issue of Canadian Healthcare Technology contains an article about the work that Quantum Capture and I did with two doctors at Toronto’s Sunnybrook and SickKids hospitals. The project was a VR experience to help test and train doctors on performing a fibreoptic bronchoscope intubation on a patient in the trauma centre.
You can read this article as a blog post on the Canadian Healthcare Technology site, or go to page 20 of the digital version of the magazine here.
For more background, I wrote a short article about this work back in December 2016, when Bloomberg TV Canada filmed a segment at the Quantum Capture offices. You can read that article here.
Today we released a first look at our new game, The Field. The Field (which may not be the final name) is a WW2 VR FPS inspired by Battlefield 1942. This means flag captures, soldiers in varying kits, and vehicles. In this video, we drive a jeep between the hedgerows of Normandy and capture a base.
This video was posted to /r/oculus, where it spent the entire day in the number one spot. It was great to see that so many people are as interested in BF1942 in VR as we are. More videos are coming as development progresses.
For some time now, I’ve been helping the team at Quantum Capture build out a virtual human toolkit for Unreal Engine 4. This includes head and eye movement, facial emotion display, and lip syncing driven by audio clips. The latest demonstration of this tech is a collaboration between Quantum Capture and Emteq.
Emteq produces a number of devices that allow for the wearer’s facial muscles to be tracked and used in a variety of applications, such as healthcare and entertainment. At this time their hardware is in the prototype stage, but it is great to see them procedurally drive one of Quantum Capture’s virtual humans (in the above image it is Alyssa) through the tech I worked on.
At AWE Europe 2017, Emteq formally announced the collaboration with Quantum Capture, and publicly showed off their tech. You can see their full demonstration video below.
May has been a busy month. First a demo for CVR2017, and now working with Quantum Capture to put together a demo for AWE2017.
The team at Quantum Capture turned AWE co-founder and CEO Ori Inbar into a virtual human. We then put him into a number of situations depending on whether the player dons an Oculus Rift or a Microsoft HoloLens. In the house pictured above, when you put on the HoloLens, Ori is surrounded by fish as part of a simulated augmented reality environment.
That’s right, we put AR inside VR. That’s how we roll.
This demo, for the Oculus Rift, walks the player through assembling their own virtual human. You start off with a collection of heads to choose from, some of whom are easily distracted by bright objects. You may pick up each head and inspect it, all while the head and eyes stare into your soul.
MorganBot then has you choose a body and clothes for your head. Feel free to place and remove those heads at will! All in all, a fun little demo to show off virtual humans in VR.
Over the last couple of months, I’ve been working with Quantum Capture on a virtual reality training application for trauma centre doctors at Sunnybrook and SickKids here in Toronto. In this proof-of-concept simulation, you perform a fibreoptic intubation on a patient, using your hands to feed the scope down the patient’s throat while watching a monitor to see what the scope sees.
There has been a lot more going on at Quantum Capture that I’ve been involved in, with a focus on programming virtual humans. Bloomberg TV Canada dropped by the Quantum Capture office, and their video provides a great overview:
As part of the TorontoVR community group, David Wyand presented Turtle VR to attendees on April 18 at FITC 2016. For the entire afternoon, a constant stream of people enjoyed trying the HTC Vive while creating art within Turtle VR.
Showing Turtle VR at FITC 2016
Turtle VR will be available for the HTC Vive on Steam in the Summer of 2016.
Last night, David Wyand gave a talk on Turtle VR and the technology behind it at TorontoVR. He summarized how UE4, Coherent UI, Google Blockly, the Oculus DK1, and the Razer Hydra were used to produce the code-block-programmable drawing application.
Photo by UnrealEngineTO
The event was packed as Oculus came by to demo the consumer Rift and Oculus Touch. It looked like most people played Bullet Train, a demo made by Epic using UE4.
The Vive preview version of Turtle VR was also available for attendees to try out thanks to Globacore.