Category: Event

Last night, David Wyand gave a presentation to the Toronto VR community about his approach to building a character’s locomotion art and IK for our multiplayer VR FPS, The Field. This talk was similar to the one he gave to the Toronto UE4 community back in September.


Photo by Blair Renaud

A copy of the slides is available on Google Slides: David Wyand’s Toronto UE4 Presentation. Enjoy!

Last night, David Wyand gave a presentation to the Toronto UE4 community about his approach to building a character’s locomotion art and IK for our multiplayer VR FPS, The Field. He talked about driving a game character’s full body given only head and hands input, while the character walks, runs, climbs, crouches, and goes prone, and doing all of this with minimal network traffic. The solution he presented combines art, Blueprints, and code.
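For readers who want a concrete picture of the networking idea, here is a minimal UE4 C++ sketch of one way to replicate only the head and hand transforms and leave the full-body solve to each client. The AVRTrackingCharacter class and FTrackedPose struct are hypothetical names for illustration only; the approach David actually used is covered in the slides linked below.

```cpp
// Hypothetical sketch (not the code from the talk): replicate only the three
// tracked transforms -- head and both hands -- and let each client solve the
// rest of the body locally with IK, keeping network traffic to a minimum.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Net/UnrealNetwork.h"
#include "VRTrackingCharacter.generated.h"

// Compact pose payload filled from the HMD and motion controllers.
USTRUCT(BlueprintType)
struct FTrackedPose
{
    GENERATED_BODY()

    UPROPERTY() FVector  HeadLocation;
    UPROPERTY() FRotator HeadRotation;
    UPROPERTY() FVector  LeftHandLocation;
    UPROPERTY() FRotator LeftHandRotation;
    UPROPERTY() FVector  RightHandLocation;
    UPROPERTY() FRotator RightHandRotation;
};

UCLASS()
class AVRTrackingCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    // The owning client updates this each tick and sends it to the server,
    // which relays it to the other clients. Nothing else about the pose
    // crosses the network.
    UPROPERTY(Replicated, BlueprintReadOnly, Category = "Tracking")
    FTrackedPose TrackedPose;

    virtual void GetLifetimeReplicatedProps(TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AVRTrackingCharacter, TrackedPose);
    }
};
```

An Animation Blueprint on each machine could then read TrackedPose and feed it to IK nodes (for example, Two Bone IK or FABRIK for the arms) to pose the full skeleton, with locomotion states like walk, run, climb, crouch, and prone derived locally rather than replicated bone by bone.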

A copy of the slides is available on Google Slides: David Wyand’s Toronto UE4 Presentation. Enjoy!

For some time now, I’ve been helping the guys at Quantum Capture build out a virtual human toolkit for Unreal Engine 4. This includes head and eye movement, facial emotion display, and lip syncing based on audio clips. The latest demonstration of this tech is a collaboration between Quantum Capture and Emteq.

Emteq produces a number of devices that track the wearer’s facial muscles for use in a variety of applications, such as healthcare and entertainment. Their hardware is currently at the prototype stage, but it is great to see it procedurally drive one of Quantum Capture’s virtual humans (Alyssa, in the above image) through the tech I worked on.

At AWE Europe 2017, Emteq formally announced the collaboration with Quantum Capture, and publicly showed off their tech. You can see their full demonstration video below.

May has been a busy month. First a demo for CVR2017, and now working with Quantum Capture to put together a demo for AWE2017.

The guys at Quantum Capture turned AWE co-founder and CEO, Ori Inbar, into a virtual human. We then put him into a number of situations depending on whether the player dons an Oculus Rift or a Microsoft HoloLens. In the house pictured above, when you put on the HoloLens, Ori is surrounded by fish as part of a simulated, augmented reality environment.

That’s right, we put AR inside VR. That’s how we roll.

I worked once again with the fine folks at Quantum Capture to put together a virtual reality demo for CVR2017.

This demo, for the Oculus Rift, walks the player through assembling their own virtual human. You start off with a collection of heads to choose from, some of whom are easily distracted by bright objects. You may pick up each head and inspect it, all while the head and eyes stare into your soul.

MorganBot then has you choose a body and clothes for your head. Feel free to place and remove those heads at will! All in all, a fun little demo to show off virtual humans in VR.

A game prototype I’ve been working on with Quantum Capture will be on display at tonight’s Wearables 2016 AR/VR mega-event.


In Vega you are a convict doing time on a space elevator, ensuring that the cargo containers destined for the mothership are safe. It was made with Unreal Engine 4 and is played on the HTC Vive.

At Wearables 2016 you’ll be able to play through the 10-minute experience, leading up to its dramatic ending.