Last night, David Wyand gave a presentation to the Toronto VR community about his approach to building a character’s locomotion art and IK for our multiplayer VR FPS, The Field. This talk was similar to the one he gave to the Toronto UE4 community back in September.
Last night, David Wyand gave a presentation to the Toronto UE4 community about his approach to building a character’s locomotion art and IK for our multiplayer VR FPS, The Field. He talked about driving a game character’s full body given only head and hand inputs while the character walks, runs, climbs, crouches, and goes prone, and about doing all of this with minimal network traffic. The solution combines art, Blueprints, and code.
For some time now, I’ve been helping the guys at Quantum Capture build out a virtual human toolkit for Unreal Engine 4. This includes head and eye movement, facial emotion display, and lip syncing based on audio clips. The latest demonstration of this tech is a collaboration between Quantum Capture and Emteq.
Emteq produces a number of devices that allow the wearer’s facial muscles to be tracked and used in a variety of applications, such as healthcare and entertainment. Their hardware is currently at the prototype stage, but it is great to see them procedurally drive one of Quantum Capture’s virtual humans (Alyssa, in the above image) through the tech I worked on.
At AWE Europe 2017, Emteq formally announced the collaboration with Quantum Capture, and publicly showed off their tech. You can see their full demonstration video below.
May has been a busy month. First a demo for CVR2017, and now working with Quantum Capture to put together a demo for AWE2017.
The guys at Quantum Capture turned AWE co-founder and CEO, Ori Inbar, into a virtual human. We then put him into a number of situations depending on whether the demoing player puts on an Oculus Rift or a Microsoft HoloLens. In the house pictured above, when you put on the HoloLens, Ori is surrounded by fish as part of a simulated augmented reality environment.
That’s right, we put AR inside VR. That’s how we roll.
This demo, for the Oculus Rift, walks the player through assembling their own virtual human. You start off with a collection of heads to choose from, some of whom are easily distracted by bright objects. You may pick up each head and inspect it, all while the head and eyes stare into your soul.
MorganBot then has you choose a body and clothes for your head. Feel free to place and remove those heads at will! All in all, a fun little demo to show off virtual humans in VR.
As part of the TorontoVR community group, David Wyand presented Turtle VR to attendees on April 18 at FITC 2016. For the entire afternoon, a constant stream of people enjoyed trying the HTC Vive while creating art within Turtle VR.
Showing Turtle VR at FITC 2016
Turtle VR will be available for the HTC Vive on Steam in the Summer of 2016.
Last night, David Wyand gave a talk on Turtle VR and the technology behind it at TorontoVR. He gave a summary of using UE4, Coherent UI, Google Blockly, Oculus DK1, and Razer Hydra in producing the code block programmable drawing application.
Photo by UnrealEngineTO
The event was packed as Oculus came by to demo their consumer Rift and Oculus Touch. It looked like most people played Bullet Train, a demo made by Epic using UE4.
The Vive preview version of Turtle VR was also available for attendees to try out thanks to Globacore.
Last night, David Wyand gave a talk on Circumpaint, UE4, and the Oculus Mobile VR Jam at TorontoVR. He gave a summary of the VR Jam, and talked about the challenges in creating a Finalist VR Jam entry using Unreal Engine 4.
Photo by Stephan Tanguay
About 60 people attended the event at the Globacore headquarters, which included a talk by Denis Lirette about Globacore’s newest game, Power Core VR.