Have you ever wondered how to create your own virtual humans using the Unity engine?
The videos on this page are from a tutorial I presented with Dr. Kangsoo Kim (University of Calgary) and Dr. Nahal Norouzi (Meta) at the 2020 IEEE Conference on Virtual Reality and 3D User Interfaces.
There are a total of six videos in the series:
- The first video provides an overview of virtual humans in the context of human-subjects research.
- The second video kicks off the tutorial by demonstrating how to load a virtual human into the Unity scene and make it speak using blendshapes and the Rogo Digital LipSync plugin (see the lip-sync sketch after this list).
- The third video demonstrates how to use Unity's animation controllers to let the virtual human perform movements and gestures in the virtual scene (see the Animator sketch below).
- The fourth video demonstrates how to use Unity's NavMesh system to let the virtual human walk around its environment (see the NavMesh sketch below).
- The fifth video demonstrates how to set up basic networking scripts to control the actions of the virtual human remotely (see the networking sketch below).
- The last video demonstrates how to get data from Arduino devices into the Unity scene so that virtual objects behave as if they were in the same physical environment as the user (see the serial-port sketch below).
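To give a feel for the blendshape approach from the second video, here is a minimal, amplitude-based lip-sync sketch. This is not the Rogo Digital LipSync plugin's API (which maps phonemes to blendshapes); it simply drives a single "mouth open" blendshape from the loudness of the playing audio, and the `faceMesh`, `mouthOpenIndex`, and `gain` fields are placeholders you would adapt to your own character.

```csharp
using UnityEngine;

// Minimal amplitude-driven lip sync sketch (not the Rogo Digital LipSync API):
// drives a single "mouth open" blendshape from the loudness of the currently
// playing AudioSource. Assumes the character's SkinnedMeshRenderer exposes a
// suitable blendshape at mouthOpenIndex.
[RequireComponent(typeof(AudioSource))]
public class SimpleLipSync : MonoBehaviour
{
    public SkinnedMeshRenderer faceMesh;   // mesh that holds the blendshape
    public int mouthOpenIndex = 0;         // blendshape index (placeholder)
    public float gain = 400f;              // maps RMS loudness to a 0-100 weight

    AudioSource source;
    readonly float[] samples = new float[256];

    void Start() => source = GetComponent<AudioSource>();

    void Update()
    {
        // Sample the current audio output and compute its RMS loudness.
        source.GetOutputData(samples, 0);
        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        float rms = Mathf.Sqrt(sum / samples.Length);

        // Blendshape weights in Unity run from 0 to 100.
        float weight = Mathf.Clamp(rms * gain, 0f, 100f);
        faceMesh.SetBlendShapeWeight(mouthOpenIndex, weight);
    }
}
```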
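The gesture work in the third video boils down to setting parameters on an Animator Controller. The sketch below assumes a controller with a "Wave" trigger and a "Speed" float feeding a locomotion blend tree; those parameter names are examples rather than the tutorial's exact setup.

```csharp
using UnityEngine;

// Sketch of triggering gestures through an Animator Controller. Assumes the
// controller defines a "Wave" trigger and a "Speed" float used by a
// locomotion blend tree (both names are examples).
[RequireComponent(typeof(Animator))]
public class GestureController : MonoBehaviour
{
    Animator animator;

    void Start() => animator = GetComponent<Animator>();

    void Update()
    {
        // Fire the wave gesture when the user presses G.
        if (Input.GetKeyDown(KeyCode.G))
            animator.SetTrigger("Wave");
    }

    // Call this from movement code to blend between idle and walking.
    public void SetLocomotionSpeed(float speed) =>
        animator.SetFloat("Speed", speed);
}
```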
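For the fourth video's topic, the core of NavMesh locomotion is a NavMeshAgent plus a destination. This sketch, which assumes the scene's walkable geometry has already been baked into a NavMesh, sends the character to wherever the user clicks.

```csharp
using UnityEngine;
using UnityEngine.AI;

// Sketch of NavMesh-based walking: send the virtual human to a clicked point
// on a baked NavMesh. Assumes the character has a NavMeshAgent component and
// the scene's walkable geometry has been baked.
[RequireComponent(typeof(NavMeshAgent))]
public class WalkToClick : MonoBehaviour
{
    NavMeshAgent agent;

    void Start() => agent = GetComponent<NavMeshAgent>();

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit))
                agent.SetDestination(hit.point); // agent plans a path and walks
        }
    }
}
```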
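The remote control in the fifth video can be approximated with plain sockets. The sketch below is one way to do it, not necessarily the tutorial's own scripts: a background thread listens for UDP text commands (the port number and command names are arbitrary) and the main thread applies them, since Unity's API is not thread-safe.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;
using UnityEngine;

// Sketch of remote control over UDP: a background thread receives text
// commands such as "wave" and the main thread applies them to the character.
public class RemoteCommandListener : MonoBehaviour
{
    public int port = 9000;        // example port
    public Animator animator;

    UdpClient client;
    Thread listenThread;
    volatile string pendingCommand;

    void Start()
    {
        client = new UdpClient(port);
        listenThread = new Thread(Listen) { IsBackground = true };
        listenThread.Start();
    }

    void Listen()
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (true)
        {
            byte[] data = client.Receive(ref remote); // blocks on this thread
            pendingCommand = Encoding.UTF8.GetString(data).Trim();
        }
    }

    void Update()
    {
        // Unity APIs must be called from the main thread, so apply here.
        if (pendingCommand == null) return;
        if (pendingCommand == "wave") animator.SetTrigger("Wave");
        pendingCommand = null;
    }

    void OnDestroy() => client?.Close();
}
```

With the scene running, you can test it from a terminal with something like `echo wave | nc -u 127.0.0.1 9000`.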
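Finally, for the Arduino video, Unity can read serial data through System.IO.Ports (note that this namespace requires the project's API Compatibility Level to be set to .NET 4.x rather than .NET Standard). The port name and the one-integer-per-line message format below are assumptions; adapt them to whatever your Arduino sketch actually prints.

```csharp
using System.IO.Ports;
using UnityEngine;

// Sketch of reading Arduino sensor data over a serial port (port name and
// message format are assumptions). An Arduino printing one integer per line,
// e.g. from analogRead, drives a virtual object so that it mirrors the
// physical sensor.
public class ArduinoSensorReader : MonoBehaviour
{
    public string portName = "COM3";   // adjust for your machine
    public int baudRate = 9600;
    public Transform targetObject;     // virtual object to drive

    SerialPort port;

    void Start()
    {
        port = new SerialPort(portName, baudRate) { ReadTimeout = 10 };
        port.Open();
    }

    void Update()
    {
        try
        {
            // Expect lines like "512" (0-1023 from analogRead).
            string line = port.ReadLine();
            if (int.TryParse(line, out int value))
            {
                // Map the sensor reading to the object's height.
                Vector3 p = targetObject.position;
                targetObject.position = new Vector3(p.x, value / 1023f, p.z);
            }
        }
        catch (System.TimeoutException) { /* no new data this frame */ }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```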
The slides for the presentations and example Unity projects can be found here.
Please don’t hesitate to reach out to me if you have any questions or comments!
K. Kim, A. Erickson and N. Norouzi, “Developing Embodied Interactive Virtual Characters for Human-Subjects Studies,” 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Atlanta, GA, USA, 2020, pp. 1-1, doi: 10.1109/VRW50115.2020.00291.