visionOS Course

Full day of visionOS training

YLabZ
8 min read · Feb 16, 2024

Dive into Spatial Computing

This full-day seminar is designed to teach the fundamentals of visionOS, Apple’s platform for spatial computing.

  • The core of the training is code!

Apple sample apps for visionOS, with short descriptions, are linked throughout the schedule below.

Overview

visionOS adds to the Shared Tech Stack (iOS & watchOS):
Swift · SwiftUI · SF Symbols · SwiftData · Swift Structured Concurrency · Swift Package Manager · XCTest

Development & Data:

  • Swift & SwiftUI: Modern, expressive language and declarative UI framework for building interfaces and app logic.
  • Swift structured concurrency: async/await and structured tasks for handling concurrent work safely and efficiently.
  • SwiftData: Swift-native, declarative persistence for modeling and querying app data (a minimal model is sketched below).
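For instance, a minimal SwiftData model might look like the sketch below (the Lesson type and its fields are purely illustrative, not from any Apple sample):

import SwiftData

// A minimal SwiftData model: the @Model macro generates the persistence
// schema, so instances can be stored and queried without extra boilerplate.
@Model
final class Lesson {
    var title: String
    var completed: Bool

    init(title: String, completed: Bool = false) {
        self.title = title
        self.completed = completed
    }
}

Attaching .modelContainer(for: Lesson.self) to a scene makes the model available to SwiftUI views through @Query and the model context.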

<Github Link>

Spatial Computing & AR:

  • RealityKit: Apple’s 3D rendering and simulation framework for building immersive experiences, used together with ARKit’s tracking.
  • RealityView: Integrates RealityKit content into SwiftUI apps.
  • ARKit: Tracks the environment and places virtual objects realistically.

Materials & Content Creation:

  • MaterialX: Open standard for describing materials.
  • Reality Composer Pro (MaterialX): Visual editor for creating 3D content.
  • USDZ: Efficient file format for storing and sharing 3D assets.

Development Environment:

  • Xcode: Integrated development environment for building visionOS apps.

Target Audience: Developers familiar with Swift and iOS/iPadOS development concepts.

Learning Objectives:

  • Understand the core concepts of spatial computing and its applications.
  • Explore the visionOS development environment and tools.
  • Build basic interactions and experiences using SwiftUI and RealityKit.
  • Deep dive into advanced topics like physics, animations, spatial audio, and Reality Composer Pro.

Schedule:

Morning Session (3 hours):

9:00 AM — 9:30 AM: Welcome & Introduction

  • Brief overview of the seminar agenda and learning objectives.
  • Introduction to spatial computing and its potential.
  • Overview of visionOS and its key features.

<Github Link>

9:30 AM — 10:20 AM: Setting Up Your Development Environment

Introduction to spatial computing, visionOS overview, and development environment setup.

  • Downloading and installing Xcode with visionOS tools.
  • Creating a new visionOS project.
  • Exploring the development workflow and interface.

<Github Link>

Key points covered: Learn the basics of spatial computing and visionOS, set up your development environment with Xcode and tools.
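For orientation, a newly created visionOS project is organized around an App entry point roughly like the sketch below (names and the immersive-space identifier are placeholders, not exact template output):

import SwiftUI

// Entry point of a visionOS app: a 2D window plus an optional immersive
// space that the app can open at runtime.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        // The main window shown at launch.
        WindowGroup {
            Text("Hello, visionOS")
                .padding(40)
        }

        // An immersive space, opened with the openImmersiveSpace action.
        ImmersiveSpace(id: "Immersive") {
            // RealityView content typically goes here.
            EmptyView()
        }
    }
}

Views open the space at runtime through the openImmersiveSpace environment action, using the same identifier.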

10:20 AM — 10:30 AM: Coffee Break

10:30 AM — 12:00 PM: Building Your First Spatial UI with SwiftUI

Building your first spatial UI with SwiftUI.

  • Understanding SwiftUI syntax and its application in visionOS.
  • Creating basic UI elements like buttons, text, and 3D objects.
  • Layering UI elements in the spatial environment.

<Github Link>

Key points covered: Learn SwiftUI syntax, create basic UI elements, and layer them in the spatial environment.
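As an illustration, a minimal spatial UI along these lines might combine standard SwiftUI controls with a 3D model; the asset name "Toy" below is a placeholder for a USDZ file in your app bundle:

import SwiftUI
import RealityKit

// A window-based visionOS view that mixes 2D controls with a 3D model.
struct GreetingView: View {
    @State private var taps = 0

    var body: some View {
        VStack(spacing: 24) {
            Text("Hello, spatial world")
                .font(.largeTitle)

            Button("Tapped \(taps) times") {
                taps += 1
            }

            // Model3D loads a USDZ asset from the app bundle.
            Model3D(named: "Toy") { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
            .frame(depth: 200) // give the model room along the z-axis
        }
        .padding(40)
        .glassBackgroundEffect() // the standard visionOS window material
    }
}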

12:00 PM — 1:00 PM: Lunch Break

Afternoon Session (4 hours):

1:00 PM — 2:00 PM: Introduction to RealityKit

Introduction to RealityKit/RealityView, entities, anchors, and placing virtual objects.

  • Understanding the core concepts of RealityKit/RealityView and its role in visionOS.
  • Exploring entities, anchors, and spatial mapping.
  • Placing virtual objects in the real world.

<Github Link>

Key points covered: Understand RealityKit fundamentals, create spatial experiences with anchors and objects.
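A minimal sketch of these ideas: a RealityView that builds a sphere entity and places it in front of the viewer (sizes and positions are illustrative):

import SwiftUI
import RealityKit

// A RealityView that creates a sphere entity and places it in the scene.
struct FloatingSphereView: View {
    var body: some View {
        RealityView { content in
            // A blue, metallic sphere with a 10 cm radius.
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: true)]
            )

            // Place it half a meter in front of the view's origin.
            sphere.position = [0, 0, -0.5]
            content.add(sphere)
        }
    }
}

In an immersive space, an AnchorEntity (for example, one targeting a horizontal plane) can be added to the content so child entities follow real-world surfaces.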

2:00 PM — 3:00 PM: Adding Interactivity with Gestures and Physics (RealityKit & ARKit)

  • Implementing tap gestures (e.g., SpatialTapGesture) and other gesture interactions.
  • Applying physics simulations for realistic object behavior.
  • Creating animations for dynamic experiences.

<Github Link>
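A sketch of the pattern, combining a tap gesture targeted at an entity with RealityKit physics (the cube, its placement, and the impulse values are illustrative):

import SwiftUI
import RealityKit

// Tap a cube to nudge it upward using RealityKit's physics engine.
struct PhysicsTapView: View {
    var body: some View {
        RealityView { content in
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .orange, isMetallic: false)]
            )
            cube.position = [0, 1.2, -1]

            // Input + collision components let gestures target the entity.
            cube.components.set(InputTargetComponent())
            cube.generateCollisionShapes(recursive: false)

            // A dynamic physics body so impulses and gravity affect the cube.
            cube.components.set(
                PhysicsBodyComponent(massProperties: .default,
                                     material: nil,
                                     mode: .dynamic)
            )
            content.add(cube)
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Apply an upward impulse to whichever entity was tapped.
                    if let body = value.entity as? ModelEntity {
                        body.applyLinearImpulse([0, 0.5, 0], relativeTo: nil)
                    }
                }
        )
    }
}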

3:00 PM — 3:10 PM: Coffee Break

3:10 PM — 4:30 PM: Deep Dive: Advanced Topics

Enhancing Your Experience with Reality Composer Pro

  • Understanding spatial audio concepts and implementation.
  • Understanding MaterialX.
  • Use Reality Composer Pro to add materials and spatial audio to your project.

Key points covered: Learn about spatial audio, MaterialX, and use Reality Composer Pro to add them to your project.

<Github Link>
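For the spatial audio piece, a hedged sketch of attaching positional sound to an entity with RealityKit; the file name "chime.wav" is a placeholder for an audio asset in your bundle or Reality Composer Pro package:

import RealityKit

// Attach spatial audio to an entity so the sound appears to come from
// the entity's position in the scene.
@discardableResult
func addSpatialAudio(to entity: Entity) async throws -> AudioPlaybackController {
    // Load an audio file bundled with the app (placeholder file name).
    let resource = try await AudioFileResource(named: "chime.wav")

    // SpatialAudioComponent makes playback positional; gain is in decibels.
    entity.components.set(SpatialAudioComponent(gain: -6))

    // Start playback from the entity's location.
    return entity.playAudio(resource)
}

Reality Composer Pro can also author the same spatial audio settings visually, with the result loaded through the project’s generated RealityKit content package.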

4:30 PM — 5:00 PM: Q&A and Wrap-up

  • Review Apple visionOS source code examples.
  • Open forum for questions and discussion.
  • Sharing resources and next steps for learning visionOS.

Additional Resources:

visionOS:

Apple Vision Pro - visionOS Development


Learn to build a VisionOStore.

Half-day Session

Nvidia Omniverse is a platform designed for building and collaborating on 3D projects. It allows developers to connect various 3D design tools and create realistic simulations. Those simulations can be used in different industries, like designing products or training robots in a safe virtual environment. Another key feature is the ability to build digital twins, which are computerized copies of real-world systems that can be used for testing and optimization.

Please note that Omniverse runs only on Windows and Linux; it will not run natively on macOS (a limitation users have raised for years without a solution), though you can use a VM client instead.

Building with Omniverse USD Composer

Nvidia Omniverse USD Composer vs. Apple Reality Composer Pro (bundled with Xcode)

Apple Reality Composer Pro

Here’s a breakdown of the key differences between Nvidia Omniverse USD Composer and Apple Reality Composer Pro.

Omniverse USD Composer vs. Reality Composer Pro

While both tools cater to 3D content creation, their underlying toolchains differ significantly in philosophy and target audience. Here’s a more technical breakdown:

Nvidia Omniverse USD Composer:

  • Core Engine: Utilizes Nvidia’s Omniverse Nucleus, a scalable scene description database that facilitates real-time collaboration and data streaming across geographically distributed teams.
  • Scene Description: Relies on Universal Scene Description (USD), an open-layered format that allows for modular scene assembly and editing. USD enables efficient management of complex scenes with references to external assets and layered edits.
  • 3D Editing and Authoring: Integrates with various 3D authoring tools like Maya, Houdini, Blender, and Adobe Substance through USD plugins. This allows artists to leverage their preferred tools for modeling, texturing, and animation, with final assets exported as USD files.
  • Rendering and Simulation: Integrates with a variety of third-party renderers like Pixar RenderMan, Nvidia RTX, and physically-based simulation tools like Nvidia PhysX. This flexibility allows for high-fidelity rendering and realistic simulations tailored to the specific project needs.
  • Material Authoring: Offers support for industry-standard Physically Based Shading (PBS) workflows, allowing artists to create realistic materials with advanced lighting and texturing capabilities. Integration with tools like Substance Designer and Mari enables advanced material creation and editing.
  • Animation Tools: Provides a timeline-based animation system with keyframe editing, character rigging capabilities, and integration with motion capture data.

Apple Reality Composer Pro:

  • Core Engine: Leverages Apple’s RealityKit framework, a high-performance 3D engine optimized for real-time rendering on Apple devices. RealityKit offers a streamlined approach for building 3D scenes for AR and spatial experiences.
  • Scene Description: Primarily uses USD/USDZ packaged in a Reality Composer Pro project, which is efficient for AR development within the Apple ecosystem but less interoperable with broader 3D pipelines than Omniverse’s layered USD workflow.
  • 3D Editing and Authoring: Provides built-in tools for assembling scenes and editing basic geometry. However, for complex models, integration with external tools like Maya or Blender through USD export might be necessary.
  • Rendering and Simulation: Real-time rendering is handled by RealityKit, leveraging the Metal graphics API for efficient performance on Apple devices. Physics and particle simulations are also supported through RealityKit’s built-in physics engine.
  • Material Authoring: Offers a node-based Shader Graph editor built on MaterialX alongside simple physically based material properties (base color, roughness, emissive). For advanced texturing, external tools like Substance Designer might still be needed, with the results exported for use in Reality Composer Pro.
  • Animation Tools: Provides a timeline-based animation system with keyframe editing capabilities. While not as robust as professional animation tools, it allows for basic animations and character rigging for AR experiences.

In Conclusion:

Omniverse USD Composer offers a powerful and highly customizable toolchain, ideal for professional 3D artists working with complex scenes and requiring interoperability across platforms. Reality Composer Pro prioritizes ease of use and real-time performance for AR development within the Apple ecosystem. Its streamlined toolchain might not be suitable for highly detailed 3D assets but excels at rapid prototyping and iteration for AR applications. Both build on USD(Z) as a common foundation.

https://developer.nvidia.com/omniverse

Nvidia Omniverse & Apple Vision Pro
Nvidia Omniverse is a platform for creating realistic 3D simulations. Apple Vision Pro is a high-resolution headset for augmented reality. Now, with new software, designers can use Omniverse to create digital twins (computerized copies of real things) and stream them directly to the Vision Pro. This lets designers see high-fidelity 3D models on the headset, which can be helpful for things like product design and factory planning.

NVIDIA Omniverse Cloud APIs and the technologies involved:

  • Universal Scene Description (OpenUSD)
  • NVIDIA Graphics Delivery Network (GDN)
  • Apple Vision Pro
  • RTX Enterprise Cloud Rendering
  • SwiftUI
  • RealityKit

More tools for designers are coming soon:

Nvidia Omniverse USD Composer

Modular pieces from Omniverse Kit are used to build USD Composer.

Apple’s impressive entry into photogrammetry: the Object Capture API

Turn photos from your iPhone or iPad into high‑quality 3D models that are optimized for AR using the new Object Capture API on macOS Monterey. Object Capture uses photogrammetry to turn a series of pictures taken on your iPhone or iPad into USDZ files that can be viewed in AR Quick Look, seamlessly integrated into your Xcode project, or used in professional 3D content workflows.
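A hedged sketch of that workflow on macOS; the folder and output URLs are placeholders, and the chosen detail level is just one of several options:

import RealityKit

// Reconstruct a USDZ model from a folder of photos with PhotogrammetrySession.
func buildModel(from photos: URL, writingTo output: URL) async throws {
    var configuration = PhotogrammetrySession.Configuration()
    configuration.featureSensitivity = .normal

    let session = try PhotogrammetrySession(input: photos,
                                            configuration: configuration)

    // Ask for a reduced-detail USDZ, suitable for AR Quick Look.
    try session.process(requests: [.modelFile(url: output, detail: .reduced)])

    // Stream progress and completion messages from the session.
    for try await message in session.outputs {
        switch message {
        case .requestProgress(_, let fraction):
            print("Progress: \(Int(fraction * 100))%")
        case .processingComplete:
            print("Finished: \(output.path)")
        case .requestError(_, let error):
            print("Failed: \(error)")
        default:
            break
        }
    }
}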

Cinema 4D will not work as well as Reality Converter!

RealityKit and RoomPlan

  • USDZ
  • Reality Converter
  • Reality Composer

Reality Composer is great for prototyping but not suitable for production.

You need to use Reality Converter to make a USDZ file.

Thanks :-)

~Ash
