Update: Chalktalk VR/AR was presented at SIGGRAPH Asia 2017.
Update: Now open-source! https://github.com/kenperlin/chalktalk. We are inviting people to contribute topics and collaborate on the project. Feel free to contact us!
Our research is sponsored by Oculus to support future Chalktalk-based classroom education.
Multiplayer VR/AR is trending, and platforms such as SteamVR and Oculus now support it. However, because VR/AR environments lack the fine detail of face-to-face interaction, communication between players in VR/AR has become a new challenge.
Today, people rely on body language, voice, and shared virtual objects to communicate in these environments. Each has obvious limitations:
- Avatars lack the detail of real human behavior
- People can be unwilling to talk, especially when not physically co-located
- Communication via shared virtual objects is limited by the kinds of objects people can share
Here we propose a new way to communicate in VR/AR, called Chalktalk. It is an intelligent, dynamic, and interactive tool based on drawing, as shown in the video above.
Its main features are:
- Dynamically generated content based on drawing
- Procedurally generated content based on style
- Interaction among multiple objects
The technical implementation is clean and lightweight:
- Web-based application
- Cross-platform, and can be combined with WebVR/WebAR
- A Node.js server with broadcast capability to support multiplayer experiences (see the relay sketch after this list)
- Simple AI for recognizing input patterns (see the recognizer sketch after this list); a self-learning system will be added in future work
- Simple GUI-based editor for adding new features
- High-performance WebGL rendering for real-time display of procedurally generated content (see the render-loop sketch after this list)
- A Unity plugin is also provided
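
To illustrate the server + broadcast item above, here is a minimal sketch of a Node.js relay, assuming the popular `ws` WebSocket package. The port and message handling are assumptions for illustration, not Chalktalk's actual protocol:

```typescript
// Minimal broadcast relay sketch (hypothetical; not Chalktalk's actual protocol).
// Assumes the `ws` package: npm install ws @types/ws
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 }); // port is an assumption

wss.on("connection", (socket: WebSocket) => {
  socket.on("message", (data) => {
    // Relay every drawing/update message to all other connected clients,
    // so each participant sees the same sketches in real time.
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```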
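For the pattern-recognition item, a common baseline for sketch recognition is nearest-template matching of resampled strokes, in the spirit of the $1 recognizer. The sketch below is an illustrative assumption of that idea, not Chalktalk's actual recognizer:

```typescript
// Illustrative stroke matcher (an assumption; not Chalktalk's actual recognizer).
type Point = { x: number; y: number };

const dist = (a: Point, b: Point) => Math.hypot(a.x - b.x, a.y - b.y);

function pathLength(stroke: Point[]): number {
  let len = 0;
  for (let i = 1; i < stroke.length; i++) len += dist(stroke[i - 1], stroke[i]);
  return len;
}

// Resample a stroke to n evenly spaced points so strokes of different
// lengths can be compared point by point.
function resample(stroke: Point[], n: number): Point[] {
  const step = pathLength(stroke) / (n - 1);
  const pts = stroke.slice();
  const out: Point[] = [pts[0]];
  let acc = 0;
  for (let i = 1; i < pts.length; i++) {
    const d = dist(pts[i - 1], pts[i]);
    if (d === 0) continue;
    if (acc + d >= step) {
      const t = (step - acc) / d;
      const q = {
        x: pts[i - 1].x + t * (pts[i].x - pts[i - 1].x),
        y: pts[i - 1].y + t * (pts[i].y - pts[i - 1].y),
      };
      out.push(q);
      pts.splice(i, 0, q); // continue walking from the interpolated point
      acc = 0;
    } else {
      acc += d;
    }
  }
  while (out.length < n) out.push(pts[pts.length - 1]); // rounding guard
  return out;
}

// Nearest-template classification: lowest mean point-to-point distance wins.
function recognize(stroke: Point[], templates: Map<string, Point[]>): string {
  const n = 32;
  const s = resample(stroke, n);
  let best = "";
  let bestScore = Infinity;
  for (const [name, tpl] of templates) {
    const t = resample(tpl, n);
    const score = s.reduce((sum, p, i) => sum + dist(p, t[i]), 0) / n;
    if (score < bestScore) { bestScore = score; best = name; }
  }
  return best;
}
```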
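For the WebGL rendering item, the essence is a requestAnimationFrame loop that regenerates and redraws procedural geometry every frame. Below is a minimal browser-side sketch; the canvas id, shaders, and the animated wave are assumptions for illustration (error handling omitted):

```typescript
// Minimal real-time procedural rendering loop (a sketch; ids and shaders are assumptions).
const canvas = document.getElementById("chalktalk") as HTMLCanvasElement;
const gl = canvas.getContext("webgl")!;

const vsSrc = `
attribute vec2 pos;
void main() { gl_Position = vec4(pos, 0.0, 1.0); }`;
const fsSrc = `
precision mediump float;
void main() { gl_FragColor = vec4(1.0); }`;

function compile(type: number, src: string): WebGLShader {
  const s = gl.createShader(type)!;
  gl.shaderSource(s, src);
  gl.compileShader(s);
  return s;
}

const prog = gl.createProgram()!;
gl.attachShader(prog, compile(gl.VERTEX_SHADER, vsSrc));
gl.attachShader(prog, compile(gl.FRAGMENT_SHADER, fsSrc));
gl.linkProgram(prog);
gl.useProgram(prog);

gl.bindBuffer(gl.ARRAY_BUFFER, gl.createBuffer());
const loc = gl.getAttribLocation(prog, "pos");
gl.enableVertexAttribArray(loc);
gl.vertexAttribPointer(loc, 2, gl.FLOAT, false, 0, 0);

// Regenerate the vertex data procedurally each frame (here, an animated
// sine wave), so the displayed content stays dynamic rather than a static mesh.
function frame(timeMs: number) {
  const t = timeMs / 1000;
  const pts: number[] = [];
  for (let i = 0; i <= 100; i++) {
    const x = -1 + (2 * i) / 100;
    pts.push(x, 0.5 * Math.sin(6 * x + t));
  }
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(pts), gl.DYNAMIC_DRAW);
  gl.clear(gl.COLOR_BUFFER_BIT);
  gl.drawArrays(gl.LINE_STRIP, 0, 101);
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```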