Last night, David Wyand gave a presentation to the Toronto UE4 community about his approach to building a character’s locomotion art and IK for our multiplayer VR FPS, The Field. He talked about driving a game character’s full body given only head and hand input, while the character walks, runs, climbs, crouches, and goes prone, and doing all of this with minimal network traffic. The solution he presented is based on a combination of art, Blueprints, and code.
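The general idea behind the low-traffic approach (a minimal sketch, not David's actual Blueprint setup, with illustrative function names and a centimetre quantization chosen here) is to replicate only the three tracked transforms, head and two hands, and let each client reconstruct the full body pose locally with IK:

```python
import struct

def pack_tracking(head, lhand, rhand, scale=100.0):
    """Pack three (x, y, z) positions into 18 bytes of int16 centimetres.

    Only the head and hand positions cross the wire; every client runs
    the same IK locally to derive the rest of the body, which keeps
    per-player network traffic tiny.
    """
    vals = [int(round(c * scale)) for p in (head, lhand, rhand) for c in p]
    return struct.pack("<9h", *vals)  # little-endian, nine signed shorts

def unpack_tracking(data, scale=100.0):
    """Inverse of pack_tracking: returns (head, left hand, right hand)."""
    vals = struct.unpack("<9h", data)
    return [tuple(v / scale for v in vals[i:i + 3]) for i in (0, 3, 6)]
```

Eighteen bytes per update (plus any rotation data you choose to send) compares favourably to replicating a full skeleton, which is presumably why the full-body animation is driven client-side.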
Today we released a third look at our new game, The Field, a WW2 VR FPS inspired by Battlefield 1942. This means flag captures, soldiers in varying kits, and vehicles. In this video, we head to the Highlands Obstacle Course to take a look at how other players look and move. Run, crouch, belly to the ground, climb, and hand-over-hand. We’ve got it all.
Oculus sent us some nice goodies, including an Oculus Go and Rift, as part of their Start program. They also requested that everyone send in an unboxing video once the package was received. Well, rather than doing a plain, ordinary unboxing video, I did mine in VR within The Field’s game environment (our WW2 VR FPS). Enjoy!
Amazingly, both Hugo Barra and Jason Rubin (both at Oculus) tweeted about my video. What a day!
Today we released a second look at our new game, The Field. The Field (may not be the final name) is a WW2 VR FPS inspired by Battlefield 1942. This means flag captures, soldiers in varying kits, and vehicles. In this video, we head out to the Greenwood Range to test out some of the American rifles and talk about some of the available kits. An important safety lesson is presented following the credits.
This video was posted to /r/oculus and it looks like once again the game is holding the top position on the subreddit. Woohoo! It is great to see so many people as excited for a BF1942-like game in VR as we are.
The March issue of Canadian Healthcare Technology contains an article about the work that Quantum Capture and I did with two doctors at Toronto’s Sunnybrook and SickKids hospitals. The project was a VR experience to help test and train doctors on performing a fibreoptic bronchoscope intubation on a patient in the trauma centre.
You can read this article as a blog post on the Canadian Healthcare Technology site, or go to page 20 of the digital version of the magazine here.
For more info, I wrote a short article about this work back in December 2016, when Bloomberg TV Canada filmed a segment at the Quantum Capture offices. You can find that article here.
Today we released a first look at our new game, The Field. The Field (may not be the final name) is a WW2 VR FPS inspired by Battlefield 1942. This means flag captures, soldiers in varying kits, and vehicles. In this video, we drive a jeep between the hedgerows of Normandy and capture a base.
This video was posted to /r/oculus where it spent the entire day in the number one spot. It was great to see that so many people are as interested in BF1942 in VR as we are. More videos are coming as progress continues to be made.
For some time now, I’ve been helping the guys at Quantum Capture build out a virtual human toolkit for Unreal Engine 4. This includes head and eye movement, facial emotion display, and lip syncing based on audio clips. The latest demonstration of this tech is a collaboration between Quantum Capture and Emteq.
Emteq produces a number of devices that allow for the wearer’s facial muscles to be tracked and used in a variety of applications, such as healthcare and entertainment. At this time their hardware is in the prototype stage, but it is great to see them procedurally drive one of Quantum Capture’s virtual humans (in the above image it is Alyssa) through the tech I worked on.
At AWE Europe 2017, Emteq formally announced the collaboration with Quantum Capture, and publicly showed off their tech. You can see their full demonstration video below.
May has been a busy month. First a demo for CVR 2017, and now I’m working with Quantum Capture to put together a demo for AWE 2017.
The guys at Quantum Capture turned AWE co-founder and CEO, Ori Inbar, into a virtual human. We then put him into a number of situations depending on whether the demoing player puts on an Oculus Rift or a Microsoft HoloLens. In the house pictured above, when you put on the HoloLens, Ori is surrounded by fish as part of a simulated, augmented reality environment.
That’s right, we put AR inside VR. That’s how we roll.
This demo, for the Oculus Rift, walks the player through assembling their own virtual human. You start off with a collection of heads to choose from, some of whom are easily distracted by bright objects. You may pick up each head and inspect it, all while the head and eyes stare into your soul.
MorganBot then has you choose a body and clothes for your head. Feel free to place and remove those heads at will! All in all, a fun little demo to show off virtual humans in VR.
Over the last couple of months I’ve been working with Quantum Capture on a virtual reality training application for trauma centre doctors at Sunnybrook and SickKids here in Toronto. This proof-of-concept simulation has you go through a fibreoptic intubation on a patient, using your hands to feed the scope down the patient’s throat while you look at a monitor to see what the scope sees.
There has been a lot more going on at Quantum Capture that I’ve been involved in, with a focus on programming virtual humans. Bloomberg TV Canada dropped by the Quantum Capture office, and their video provides a great overview: