3D on the Web & WebXR
Table of Contents
Introduction
Ayşegül Yönet begins the course by sharing examples of WebXR and augmented reality being used in the real world. The applications range from architectural and medical uses to automotive and agricultural ones.
Immersive Web Technologies
Ayşegül explains the scope of the W3C's work on the Immersive Web specification. Community groups contribute to a number of technologies including navigation, lighting estimation, geo-alignment, and performance.
Creating a 3D Scene Using Babylon.js
Ayşegül describes the objects required to create a basic 3D scene. A scene contains a camera, a light source, and one or more mesh objects. These concepts are demonstrated in the Babylon.js online playground.
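The basic scene described above can be sketched as follows. This is a minimal sketch for the Babylon.js playground, which supplies the `engine` variable; the object names ("camera", "light", "box") are illustrative:

```javascript
// Babylon.js playground sketch: a scene needs a camera, a light, and a mesh.
const createScene = () => {
  const scene = new BABYLON.Scene(engine);

  // An ArcRotateCamera orbits around a target point (here, the origin).
  const camera = new BABYLON.ArcRotateCamera(
    "camera", -Math.PI / 2, Math.PI / 2.5, 3, BABYLON.Vector3.Zero(), scene);

  // A hemispheric light simulates ambient light coming from the sky.
  const light = new BABYLON.HemisphericLight(
    "light", new BABYLON.Vector3(0, 1, 0), scene);

  // A simple box mesh gives the camera something to render.
  const box = BABYLON.MeshBuilder.CreateBox("box", { size: 1 }, scene);

  return scene;
};
```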
Babylon.js Practice and Libraries Q&A
Students are instructed to add mouse controls to a 3D scene. This segment also discusses differences between Babylon.js and Three.js.
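In Babylon.js, mouse and touch controls can be added by attaching the camera to the rendering canvas; a one-line sketch, assuming the `camera` and `canvas` variables from a playground scene:

```javascript
// Let the user orbit and zoom the camera with mouse and touch input.
// The second argument keeps default browser event behavior intact.
camera.attachControl(canvas, true);
```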
Creating a Scene Using Three.js
Ayşegül demonstrates how a 3D scene is created using Three.js. As with Babylon.js, the Three.js example contains a scene, a camera, and a mesh with a basic material. A WebGL renderer draws the scene on a canvas element. Animations are added using the requestAnimationFrame method.
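A minimal Three.js scene along these lines might look like the following sketch (the rotating cube and its color are illustrative, not taken from the course):

```javascript
import * as THREE from 'three';

// Scene, camera, and WebGL renderer drawing to a canvas element.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

// A mesh combines a geometry with a material.
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshBasicMaterial({ color: 0x00ff00 }));
scene.add(cube);

// Animate with requestAnimationFrame: update state, then re-render.
function animate() {
  requestAnimationFrame(animate);
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();
```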
Creating a Scene Using A-Frame
Ayşegül demonstrates the A-Frame library. Web components are used to compose a scene in HTML. VR functionality is included by default in the framework.
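An A-Frame scene is composed declaratively from HTML elements; a minimal sketch (the specific shapes and colors are illustrative):

```html
<!-- A-Frame ships as a single script include; a VR button is added automatically. -->
<script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
<a-scene>
  <!-- Each element is a web component describing an entity in the scene. -->
  <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
  <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
  <a-sky color="#ECECEC"></a-sky>
</a-scene>
```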
3D Libraries Discussion & Resources
Ayşegül answers questions about Three.js, Babylon.js, and A-Frame. Resources for downloading and using 3D models are also covered in this segment.
Building 3D Scenes
Creating a Local Development Environment
Ayşegül explains the tools required for working in a local environment and walks through the project files. Once the GitHub repository is cloned, TypeScript and the other npm packages can be installed. The dev script compiles the TypeScript code and starts a local development server.
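The setup steps above amount to roughly the following commands; the repository URL and directory are placeholders since the course repo is not named here, and the `dev` script name is an assumption based on the description:

```shell
# Clone the course repository (URL omitted here; use the one from the course).
git clone <repository-url>
cd <project-directory>

# Install TypeScript and the other npm dependencies listed in package.json.
npm install

# Compile the TypeScript code and start the local development server.
npm run dev
```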
Animated 3D Earth Scene
Ayşegül demonstrates how the 3D Earth scene is created with TypeScript in a local development environment. The TextureLoader class loads a texture and a bump map. The textures are combined into a material that is applied to a SphereGeometry mesh.
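The texture-loading step might be sketched as below; the file paths are hypothetical, and a `scene` with a light source (required by Phong shading) is assumed to exist from the earlier setup:

```javascript
import * as THREE from 'three';

// TextureLoader loads image files as textures usable in materials.
const loader = new THREE.TextureLoader();
const map = loader.load('/textures/earth.jpg');        // hypothetical path
const bumpMap = loader.load('/textures/earth_bump.jpg'); // hypothetical path

// Combine the color map and bump map in a material applied to a sphere.
const earth = new THREE.Mesh(
  new THREE.SphereGeometry(1, 32, 32),
  new THREE.MeshPhongMaterial({ map, bumpMap, bumpScale: 0.05 }));
scene.add(earth);
```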
Animated Clouds Exercise
Students are instructed to create a new sphere with a cloud texture. The new sphere should be slightly larger than the Earth and rotate at a slightly faster rate.
Animated Clouds Solution
Ayşegül live codes the solution to the Animated Clouds exercise.
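One possible solution sketch, assuming the Earth sphere has radius 1, a `cloudTexture` has been loaded with TextureLoader, and `earth` and `scene` exist from the previous lesson (this is not necessarily the live-coded solution):

```javascript
// A slightly larger transparent sphere layered over the Earth.
const clouds = new THREE.Mesh(
  new THREE.SphereGeometry(1.02, 32, 32), // 2% larger than the Earth
  new THREE.MeshPhongMaterial({
    map: cloudTexture,
    transparent: true,
    opacity: 0.8,
  }));
scene.add(clouds);

// Inside the animation loop: rotate the clouds slightly faster than the Earth.
function tick() {
  earth.rotation.y += 0.001;
  clouds.rotation.y += 0.0015;
}
```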
Virtual & Augmented Reality (VR & AR)
AR HoloLens Demo
Ayşegül demonstrates the HoloLens device. Multiple cameras around the HoloLens headset track the user's hand movements. A menu for accessing applications appears on the user's wrist.
Creating a VR Scene
Ayşegül creates a VRButton component to launch the VR experience. The Chrome WebXR extension can be used to simulate the VR experience in the browser or inspect and debug the experience on a connected mobile device.
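Wiring VRButton into an existing Three.js setup is brief; a sketch assuming `renderer`, `scene`, and `camera` already exist:

```javascript
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

// Enable WebXR on the renderer and add the "Enter VR" button to the page.
renderer.xr.enabled = true;
document.body.appendChild(VRButton.createButton(renderer));

// In XR, frames are driven by the renderer's animation loop.
renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});
```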
Creating an AR Scene
Ayşegül converts the VR scene to an AR scene by using the ARButton component and updating the renderer. AR experiences use the setAnimationLoop method on the renderer object because frames must be driven by the WebXR session's own animation loop rather than by the page's requestAnimationFrame function.
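Converting the scene is mostly a matter of swapping the button component; a sketch assuming the same `renderer`, `scene`, and `camera` as before:

```javascript
import { ARButton } from 'three/examples/jsm/webxr/ARButton.js';

// ARButton requests an immersive-ar session instead of immersive-vr.
renderer.xr.enabled = true;
document.body.appendChild(ARButton.createButton(renderer));

// setAnimationLoop lets the XR session drive the frame timing.
renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});
```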
Using AR Hit Test
Ayşegül adds hit test functionality to the AR scene. When the camera detects a real-world surface, a reticle mesh is added to the scene as a target where a model can be placed. The model remains stationary at that position throughout the AR session.
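The hit-test flow can be sketched with the WebXR hit-test API as follows; it assumes a `reticle` mesh with `matrixAutoUpdate = false` has been added to the scene, and that the AR session was requested with `'hit-test'` as a required feature:

```javascript
let hitTestSource = null;

renderer.setAnimationLoop((timestamp, frame) => {
  if (frame) {
    const session = renderer.xr.getSession();

    // Request a hit-test source once, relative to the viewer's position.
    if (!hitTestSource) {
      session.requestReferenceSpace('viewer').then((viewerSpace) => {
        session.requestHitTestSource({ space: viewerSpace }).then((source) => {
          hitTestSource = source;
        });
      });
    } else {
      // Each frame, check whether the ray hit a detected real-world surface.
      const hits = frame.getHitTestResults(hitTestSource);
      if (hits.length > 0) {
        const referenceSpace = renderer.xr.getReferenceSpace();
        const pose = hits[0].getPose(referenceSpace);
        // Place the reticle on the surface (matrixAutoUpdate must be false).
        reticle.visible = true;
        reticle.matrix.fromArray(pose.transform.matrix);
      }
    }
  }
  renderer.render(scene, camera);
});
```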
Loading Models to AR
Ayşegül demonstrates how to load a 3D model into the AR scene. Users can explore the model by walking around it and moving the camera.
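Loading a glTF model (a common format in the three.js examples) might look like this sketch; the model path is a placeholder, not a file named in the course:

```javascript
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Load the model asynchronously and add its scene graph to our scene.
const loader = new GLTFLoader();
loader.load(
  '/models/example.gltf', // hypothetical path
  (gltf) => {
    scene.add(gltf.scene);
  },
  undefined,
  (error) => console.error('Failed to load model:', error));
```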
Loading Models Practice
Students are instructed to try the other models and loaders located in the three.js/examples/models directory. Sample code for loading these models can be found on the threejs.org website.
Accessibility
Ayşegül describes different ways VR and AR experiences can be made more accessible. One technique is to allow multiple forms of input such as hand gestures, eye tracking, or vocal cues. The upcoming layers feature of the WebXR API will sync HTML elements with the canvas to make the experience more visually accessible.