Android XR Quick Dive
HoverBike is an immersive Android XR application that demonstrates how to create engaging, spatialized user experiences using the Jetpack XR SDK and Jetpack Compose. In this project, a 3D hover bike model is loaded and rendered in an XR environment, complete with continuous rotation, dynamic animations, and a fallback 2D interface for devices that do not support full XR functionality.
Android XR represents the future of immersive experiences on mobile devices, enabling developers to seamlessly integrate augmented reality (AR) and virtual reality (VR) into their apps. In this article, we will touch on the core components of this new technology and learn how one can unlock its full potential for creating engaging, spatialized user experiences.
The Jetpack XR SDK consists of the following libraries. Together, they provide a comprehensive toolkit for building rich and diverse spatialized experiences and are designed to work seamlessly with each other.
Libraries: Jetpack Compose for XR • Material Design for XR • Jetpack SceneCore & ARCore for Jetpack XR.
By reviewing each library’s features and practical use cases, you’ll learn practices for designing intuitive XR interfaces and blending virtual content with the real world.
Whether you’re new to AR/VR or an experienced developer, this article will equip you with the knowledge to build basic Android XR (Compose) applications.
In our previous intro article we compared and contrasted Android XR, visionOS, and Meta Quest, providing insight into the evolving competitive landscape of extended reality platforms.
Resources
Documentation: Jetpack XR SDK
Examples: XR Samples on GitHub
Codelabs: XR Codelabs
Git Repo: Source code
If you have any questions … here is the working source code!
Website:
If you like AR/VR/XR, please give us a visit.
Jetpack Compose for XR
The Jetpack XR SDK lets you build immersive XR experiences using modern tools like Kotlin and Compose, as well as previous generation tools such as Java and Views. You can spatialize your UI, load and render 3D models and semantically understand the real world.
- Jetpack Compose for XR: Declaratively build spatial UI layouts that take advantage of Android XR’s spatial capabilities.
- Material Design for XR: Build with Material components and layouts that adapt for XR.
- Jetpack SceneCore: Build and manipulate the Android XR scene graph with 3D content.
- ARCore for Jetpack XR: Bring digital content into the real world with perception capabilities. (next article)
The sections below walk you through building an Android XR app using the Jetpack XR SDK, based on the official documentation and best practices.
Building Immersive Android XR Apps with Jetpack XR SDK
The world of extended reality (XR) is evolving rapidly, and Google’s Jetpack XR SDK makes it easier than ever to build immersive experiences for Android devices. Whether you’re interested in augmented reality (AR), virtual reality (VR), or mixed reality (MR), this toolkit integrates with modern Android development practices — including Kotlin and Jetpack Compose — to let you create compelling XR applications with less boilerplate and more focus on creativity.
XR Session Management: Detect and manage XR sessions with adaptive spatial and fallback 2D UIs.
3D Model Loading & Rendering: Asynchronously load a glTF model (hover_bike/scene.gltf) and render it with Filament (under the hood). The model is animated and spins continuously to showcase its 3D properties.
Dynamic Spatial UI: Leverage SpatialPanel, Volume, and Orbiters to design an immersive, customizable user interface.
Animation & Pose Control: Set up a continuous rotation (spin) for the 3D model using custom pose and quaternion logic.
Fallback 2D Experience: Provide a 2D UI for devices without full XR support, ensuring a consistent experience across hardware.
Overview of the Jetpack XR SDK
The Jetpack XR SDK is Google’s toolkit for building XR applications on Android. It provides a high-level, declarative API that abstracts much of the underlying complexity (such as sensor management, session lifecycles, and rendering) so you can focus on your app’s unique immersive experiences.
Key Features
Lifecycle Awareness: The SDK integrates with Android’s lifecycle components, ensuring your XR session automatically handles pause/resume events.
Spatial vs. 2D UI: You can easily switch between spatial (3D) content and a fallback 2D layout. This makes your app adaptable to devices that either support XR or are running in a non-immersive mode.
Compose-Based UI: Built with Jetpack Compose, the SDK enables you to create declarative and reactive UIs that integrate smoothly with your XR content.
3D Model Integration: The SDK offers APIs for loading glTF models asynchronously (using methods like createGltfResourceAsync() on your XR session), which you can then render in your scene.
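The spatial-vs-2D switch described above ultimately comes down to a single capability check. Here is a plain-Kotlin sketch of that branching logic; the UiMode enum and chooseUiMode helper are illustrative stand-ins, not SDK types, and in a real app the flag comes from LocalSpatialCapabilities.current.isSpatialUiEnabled:

```kotlin
// Illustrative model of the spatial-vs-2D decision; UiMode is not an SDK type.
enum class UiMode { SPATIAL, FALLBACK_2D }

// In the app, isSpatialUiEnabled would come from LocalSpatialCapabilities.current.
fun chooseUiMode(isSpatialUiEnabled: Boolean): UiMode =
    if (isSpatialUiEnabled) UiMode.SPATIAL else UiMode.FALLBACK_2D

fun main() {
    println(chooseUiMode(true))   // spatial UI path
    println(chooseUiMode(false))  // 2D fallback path
}
```

Keeping this decision in one place makes it easy to render the same content tree in either mode, as the MainActivity example later in this article does.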
Prerequisites
- Android Studio Meerkat (or later)
- Minimum API Level: 35 or higher
- Supported Device: A device (or emulator) with XR capabilities (standard emulators may have limited XR support)
- Dependencies:
- Jetpack XR SDK libraries
- Kotlin Coroutines with Guava support
Clone the Repository:
git clone https://github.com/developerY/HoverBike.git
cd HoverBike
Open the Project in Android Studio:
Launch Android Studio and open the HoverBike project.
Setting Up Your Project
Adding Dependencies
Before you start coding, add the necessary dependencies via your Gradle version catalog. (See Git Repo)
For example, your libs.versions.toml might include:
[versions]
agp = "8.10.0-alpha06"
kotlin = "2.0.21"
coreKtx = "1.15.0"
junit = "4.13.2"
junitVersion = "1.2.1"
espressoCore = "3.6.1"
lifecycleRuntimeKtx = "2.8.7"
lifecycleRuntimeCompose = "2.8.7"
lifecycleViewmodelCompose = "2.8.7"
activityCompose = "1.10.0"
runtime = "1.8.0-alpha06"
composeBom = "2024.09.00"
ui = "1.8.0-alpha06"
compose = "1.0.0-alpha01"
runtimeVersion = "1.0.0-alpha01"
scenecore = "1.0.0-alpha01"
kotlinxCoroutinesGuava = "1.9.0" # <-- the only one you need to add when starting an XR project
In your module’s build.gradle.kts, reference these libraries:
dependencies {
implementation(libs.androidx.core.ktx)
implementation(libs.androidx.lifecycle.runtime.ktx)
implementation(libs.androidx.lifecycle.runtime.compose)
implementation(libs.androidx.lifecycle.viewmodel.compose)
implementation(libs.androidx.activity.compose)
implementation(libs.androidx.runtime)
implementation(platform(libs.androidx.compose.bom))
implementation(libs.androidx.ui)
implementation(libs.androidx.ui.graphics)
implementation(libs.androidx.ui.tooling.preview)
implementation(libs.androidx.material3)
implementation(libs.androidx.compose)
implementation(libs.runtime)
implementation(libs.androidx.scenecore)
implementation(libs.kotlinx.coroutines.guava)
}
Understanding the Core Concepts
XR Session and Composition Locals
The SDK provides a composition local, LocalSession, which gives you access to the active XR session. Note that composition locals can only be read inside a composable function. For example:
import androidx.xr.compose.platform.LocalSession

@Composable
fun MyXrScreen() {
    // LocalSession.current is only valid during composition.
    val xrSession = checkNotNull(LocalSession.current)
}
This session is your gateway to XR functionality — from managing the immersive experience to loading 3D assets.
Or — creating spatialized entities with a session from the SceneCore library:
import androidx.xr.scenecore.Session

class MyActivity : ComponentActivity() {
    private lateinit var xrSession: Session

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        xrSession = Session.create(this)
    }
}
Spatial UI Components
SpatialPanel — A composable that represents a movable, resizable panel in 3D space. You can configure its dimensions using a DpVolumeSize and apply modifiers (like .resizable() and .movable()).
Summary — Spatial panels are the foundational building blocks of Android XR applications. They allow you to design experiences that extend beyond the confines of a traditional rectangular screen, enabling content to expand dynamically across the user’s surrounding space.
Here’s a deeper look at spatial panels and how they enhance XR experiences:
Immersive Containers: Spatial panels act as containers for your UI elements, interactive components, and immersive content. Instead of being confined to a flat display, your content can be anchored in 3D space, giving users the sensation of an “unlimited” display that surrounds them.
Flexible Layouts: With spatial panels, you can design layouts that adapt to the environment. They allow you to position, resize, and move content freely, ensuring that important elements remain accessible regardless of where the user is looking. This flexibility is essential for creating truly immersive, context-aware experiences.
User-Centric Design: Because spatial panels integrate directly with the XR environment, they can follow the user’s perspective or be anchored to real-world objects. This means you can design interactions that feel natural and intuitive — users can reach out and interact with UI elements as if they were physical objects in their space.
Layering and Overlays: Spatial panels can be layered to create depth in your UI. For example, you might use a primary spatial panel for your main content and overlay additional panels (or orbiters) that provide navigational controls or supplementary information. This layered approach helps prevent clutter and ensures that the immersive experience remains engaging and easy to navigate.
Adaptive to Context: Depending on the device or environment, spatial panels can adapt their behavior. On devices that support full spatial computing, they may offer rich 3D interactions, while on more traditional screens, they can fall back to a simpler 2D representation. This adaptability helps ensure a consistent user experience across a variety of hardware configurations.
Integration with Other XR Components: Spatial panels work hand-in-hand with other XR UI elements like orbiters. While orbiters provide floating controls and quick access to features, spatial panels serve as the main canvas where content is displayed. Together, they enable developers to create seamless and engaging XR interactions.
In summary, spatial panels are at the heart of Android XR app development. They empower you to craft experiences that extend beyond traditional displays, allowing your content to live and interact in the real world. This paves the way for more natural, engaging, and contextually aware applications in the realm of extended reality.
Orbiter — A helper composable that lets you overlay UI elements (like mode-switch buttons) in the spatial scene without interfering with your primary immersive content.
Summary — Orbiters are floating UI elements anchored to spatial panels or other entities, providing quick access to controls without cluttering the main content. They enable users to interact with key features while keeping the primary view visible. However, they should be used sparingly to avoid overwhelming users with too many spatialized elements. A best practice is to adapt only a few essential navigational components (like a navigation rail or bar) for orbiters, ensuring a streamlined and user-friendly XR experience.
Orbiters are dynamic, floating UI elements that enhance the XR experience by providing contextual controls and interactions without interfering with the primary immersive content.
Here’s an in-depth look at orbiters and how they contribute to Android XR apps:
Floating Interaction Points: Orbiters are designed to “float” in 3D space, meaning they aren’t tied to the boundaries of a fixed screen. This allows them to appear as if they are part of the real-world environment, offering users immediate access to functionality without breaking immersion.
Contextual and Accessible Controls: Because orbiters are anchored to spatial panels or other XR entities, they serve as dedicated access points for key features. This ensures that important navigational or control elements are always within reach, even when the main content occupies the full extent of the user’s field of view.
Enhanced Usability Without Clutter: Orbiters help maintain a clean and focused interface by offloading secondary actions from the main content. They provide a way to access additional options — like mode switching, settings, or navigation — without crowding the primary visual space.
Flexible Placement: You can position orbiters strategically (for example, at the edges or corners of a spatial panel) so they complement the overall design. Their placement can adapt based on the user’s perspective, ensuring they’re always in an ergonomically favorable position.
Interactivity and Feedback: Orbiters are designed for quick interaction. Their floating nature often makes them more noticeable, and they can be animated or highlighted to provide visual feedback when selected. This contributes to a more intuitive and responsive user experience.
Minimalist Approach: While orbiters can be a powerful addition, it’s important to use them sparingly. Too many floating elements can lead to cognitive overload. A best practice is to reserve orbiters for the most critical functions — such as navigation or mode switching — ensuring that users aren’t overwhelmed by excessive controls.
Integration with Spatial Panels: Orbiters work in tandem with spatial panels to create a layered, depth-rich UI. While the spatial panel serves as the main canvas for immersive content, orbiters float on top of or around it, providing contextual shortcuts and controls without detracting from the primary experience.
In summary, orbiters bring both functionality and visual elegance to Android XR apps. By offering accessible, context-sensitive controls in a non-intrusive manner, they help maintain a clean, immersive environment that allows users to interact naturally with extended reality content.
Use Material Design
Leverage Material Design components and adaptive layouts to accelerate development of your Android XR app. Material Design for XR builds on Material 3 by adding spatial UI behaviors, ensuring your app feels native while optimizing the use of space. Additionally, you can spatialize existing UI components — placing them in orbiters and applying spatial elevation — to create an immersive, context-aware experience.
Code Overview
MainActivity sets up an immersive edge-to-edge UI, applies a gradient background, and decides whether to present the spatial UI or a fallback 2D interface based on device capabilities.
ObjectInVolume & Spinning Model: The core logic for loading, spinning, and rendering the 3D model is encapsulated in the ObjectInVolume composable. This component:
- Loads the glTF model asynchronously using createGltfResourceAsync("hover_bike/scene.gltf").
- Creates a 3D entity from the loaded model.
- Sets the initial scale and starts an animation.
- Continuously updates the model’s rotation (spin) in a LaunchedEffect loop.
- Uses helper functions like axisAngleQuaternion and the Pose class for managing transformations.
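The axisAngleQuaternion helper referenced above is not shown in this article. Below is a standalone sketch of the standard axis-angle-to-quaternion formula, using local Vec3/Quat data classes instead of the SDK’s Vector3/Quaternion types. This is an assumption: the repo’s actual implementation may differ, for instance in whether the angle is in degrees or radians (degrees are assumed here, matching the spin loop’s angle += 1f).

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Local stand-ins for the SDK math types, for illustration only.
data class Vec3(val x: Float, val y: Float, val z: Float)
data class Quat(val x: Float, val y: Float, val z: Float, val w: Float)

// Standard axis-angle to quaternion: q = (axis * sin(theta/2), cos(theta/2)).
// The angle is assumed to be in degrees.
fun axisAngleQuaternion(angleDegrees: Float, axis: Vec3): Quat {
    val half = Math.toRadians(angleDegrees.toDouble()) / 2.0
    // Normalize the axis so the result is a unit quaternion.
    val len = sqrt((axis.x * axis.x + axis.y * axis.y + axis.z * axis.z).toDouble())
    val s = (sin(half) / len).toFloat()
    return Quat(axis.x * s, axis.y * s, axis.z * s, cos(half).toFloat())
}

fun main() {
    // A 90-degree turn around Y: expect roughly (0, 0.7071, 0, 0.7071).
    println(axisAngleQuaternion(90f, Vec3(0f, 1f, 0f)))
}
```

In the real project you would return the SDK’s Quaternion type instead, so the result can be fed directly into a Pose.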
Additional components include:
- SpatialPanel & Volume: Provide the XR spatial context.
- FlashyBikeInfoPanel: Displays biking performance metrics.
- Orbiters and Mode Buttons: Offer controls for switching between spatial and home/full modes.
- Fallback 2D UI: Displays a simpler layout when the device doesn’t support full XR.
Possible Future Enhancements
- Interactive HUD Overlays: Display real-time ride metrics and navigation cues.
- Gesture-Based Controls: Enable pinch-to-zoom or swipe-to-rotate interactions with the 3D model.
- Dynamic Environmental Effects: Integrate lighting and weather effects that respond to the real world.
- Social Integration: Add features like leaderboards, achievements, or social sharing.
The next sections walk you through the code that loads a glTF model, creates a 3D entity, and applies a continuous rotation to it in an XR environment.
- Loads and spins the model continuously in a LaunchedEffect loop.
- Checks for device capability before creating the entity.
- Uses a modular design to separate the model rendering from UI concerns.
With this structure in place, you have a robust foundation for further expanding your XR biking app. You could add additional features like user interactions, dynamic animations, and even gesture controls as you continue developing your immersive experience.
Feel free to build upon this foundation and let me know if you have any further questions or need additional explanations!
Overview of how the app leverages the Jetpack XR SDK to:
- Create an XR session and check for 3D content capability.
- Load the glTF model asynchronously with a call to createGltfResourceAsync("hover_bike/scene.gltf") and wait for its completion.
- Create a 3D model entity using a method such as createGltfEntity(gltfModel) if the device supports 3D content.
- Set the entity’s initial scale and start a predefined animation.
- Continuously update the model’s rotation by recalculating its pose and setting it in a simple spin loop.
The rendered 3D model is displayed within a SpatialPanel, which is part of the XR Compose UI. If the device does not support spatial (3D) UI, the app falls back to a 2D layout.
class MainActivity : ComponentActivity() {
    @SuppressLint("RestrictedApi")
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        enableEdgeToEdge() // Helper for an immersive UI
        setContent {
            BikeTheme {
                // Full-screen background with gradient.
                Box(modifier = Modifier.fillMaxSize()) {
                    BrushBackground()
                    // Retrieve the current XR session.
                    val session = LocalSession.current
                    // Check if the device supports spatial UI.
                    val isSpatialEnabled = LocalSpatialCapabilities.current.isSpatialUiEnabled
                    Log.d(TAG, "LocalSession.current = $session, isSpatialUiEnabled = $isSpatialEnabled")
                    if (isSpatialEnabled) {
                        // Use spatial UI.
                        Subspace {
                            MySpatialContent(onRequestHomeSpaceMode = { session?.requestHomeSpaceMode() })
                        }
                    } else {
                        // Fallback 2D content.
                        My2DContent(onRequestFullSpaceMode = { session?.requestFullSpaceMode() })
                    }
                }
            }
        }
    }
}
Key Points:
- Edge-to-edge display: The helper function enableEdgeToEdge() configures the activity for an immersive layout.
- Session and Capabilities: LocalSession.current and LocalSpatialCapabilities.current are used to decide which UI to present.
- Theme and Background: A custom theme (BikeTheme) and a gradient background (BrushBackground) create an immersive look.
Fallback 2D Content
If the device does not support spatial UI, My2DContent provides a fallback 2D layout with standard components.
Let’s define two example composables:
- MySpatialContent: For devices with spatial UI support.
- My2DContent: For fallback devices.
Spatial Content
Inside your spatial UI, you can display 3D models and overlay controls.
For example:
@Composable
fun MySpatialContent(onRequestHomeSpaceMode: () -> Unit) {
    SpatialPanel(
        modifier = SubspaceModifier
            .size(DpVolumeSize(width = 1280.dp, height = 800.dp, depth = 7.dp))
            .resizable()
            .movable()
    ) {
        Orbiter(
            position = OrbiterEdge.Top,
            offset = EdgeOffset.inner(offset = 20.dp),
            alignment = Alignment.End,
            shape = SpatialRoundedCornerShape(CornerSize(28.dp))
        ) {
            HomeSpaceModeIconButton(
                onClick = onRequestHomeSpaceMode,
            )
        }
    }
}
2D Fallback
The fallback UI might be a simple layout with text and a button:
@Composable
fun My2DContent(onRequestFullSpaceMode: () -> Unit) {
    Surface(
        modifier = Modifier.fillMaxSize(),
        color = MaterialTheme.colorScheme.background
    ) {
        Row(
            modifier = Modifier
                .fillMaxSize()
                .padding(16.dp),
            horizontalArrangement = Arrangement.SpaceBetween,
            verticalAlignment = Alignment.CenterVertically
        ) {
            Text(
                text = "Hello Android XR",
                style = MaterialTheme.typography.headlineMedium,
                modifier = Modifier.padding(48.dp)
            )
            if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
                FullSpaceModeIconButton(
                    onClick = onRequestFullSpaceMode,
                    modifier = Modifier.padding(32.dp)
                )
            }
        }
    }
}
This allows seamless operation across a variety of devices.
Loading, Creating, and Spinning the 3D Model
One of the most exciting aspects of building XR apps is the ability to integrate rich 3D content. The official Jetpack XR SDK provides an API to load glTF models asynchronously using the XR session.
The core functionality is encapsulated in the ObjectInVolume composable. (The version shown here is simplified relative to the Git repo.)
This component:
- Loads the glTF model asynchronously.
- Creates a 3D model entity.
- Sets its initial scale and starts an animation.
- Uses a continuous loop (inside a LaunchedEffect) to update the entity’s pose so it spins.
@Composable
fun ObjectInVolume(show3DObject: Boolean) {
    if (!show3DObject) return
    val xrCoreSession = checkNotNull(LocalSession.current)
    // State to track the spin angle.
    var angle by remember { mutableStateOf(0f) }
    LaunchedEffect(Unit) {
        // 1. Load the model asynchronously.
        val gltfModel = xrCoreSession.createGltfResourceAsync("hover_bike/scene.gltf").await()
        // 2. Check if 3D content is supported.
        if (xrCoreSession.getSpatialCapabilities().hasCapability(SpatialCapabilities.SPATIAL_CAPABILITY_3D_CONTENT)) {
            // 3. Create the glTF entity.
            val entity = xrCoreSession.createGltfEntity(gltfModel)
            // Set initial properties.
            entity.setScale(0.001f)
            entity.startAnimation(loop = true, animationName = "Hovering")
            // 4. Continuously update the rotation in a ~60 FPS loop.
            while (true) {
                angle += 1f // Increment the spin angle.
                // Compute a rotation quaternion around the Y-axis.
                val rotation = axisAngleQuaternion(angle, Vector3(0f, 1f, 0f))
                // Construct the new pose with a constant translation.
                val pose = Pose(
                    translation = Vector3(0f, 0f, 0.5f), // 0.5 meters in front
                    rotation = rotation
                )
                entity.setPose(pose)
                delay(16L) // Approximately 60 FPS.
            }
        }
    }
}
Explanation
Asynchronous Model Loading: We call createGltfResourceAsync("hover_bike/scene.gltf") and await() its result.
3D Capability Check: We verify that the device supports 3D content by checking hasCapability(SpatialCapabilities.SPATIAL_CAPABILITY_3D_CONTENT).
Entity Creation: Using createGltfEntity(gltfModel), we create the entity. Then we set its scale and start an animation named "Hovering".
Spin Loop: A while (true) loop inside LaunchedEffect continuously updates the rotation. We use a helper function, axisAngleQuaternion(angle, Vector3(0f, 1f, 0f)), to generate a quaternion representing a rotation around the Y-axis. The new pose is then applied with entity.setPose(pose).
Volume Display: The Subspace and Volume components provide the XR spatial context for the model. While the model spins in the background, we display a simple text placeholder.
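As a quick sanity check on the loop’s timing (an illustrative calculation, not code from the repo): incrementing the angle by 1 degree every ~16 ms works out to 62.5 degrees per second, or one full revolution roughly every 5.76 seconds.

```kotlin
// Derives the effective spin rate from the per-frame increment and frame delay.
fun spinRateDegreesPerSecond(degreesPerFrame: Double, frameMillis: Double): Double =
    degreesPerFrame * (1000.0 / frameMillis)

fun main() {
    val rate = spinRateDegreesPerSecond(1.0, 16.0) // 62.5 deg/s
    val secondsPerTurn = 360.0 / rate              // 5.76 s per revolution
    println("spin rate: $rate deg/s, full turn every $secondsPerTurn s")
}
```

Tuning either the increment or the delay changes the spin speed; for example, halving the increment to 0.5 degrees per frame doubles the time per revolution.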
We can build upon this foundation. Let me know if you have any further questions or need additional explanations!
~~~
As an aside, visionOS also has an Entity-Component-System (ECS) architecture.
Testing and Deployment
Real Device Testing — Testing XR experiences requires a compatible XR headset. Emulators may not provide accurate sensor data or proper 3D rendering, but they are excellent during development.
Performance and Battery Considerations — XR experiences can be resource-intensive. Optimize your app by minimizing render overhead and testing under real-world conditions.
Publishing — Once your XR app is polished and tested, deploy it via the Play Store. Google recommends providing clear instructions for users on how to use the XR features safely.
Final Thoughts
The Jetpack XR SDK offers a powerful and modern framework for building immersive XR apps on Android. By leveraging Android’s latest development practices — such as Jetpack Compose and lifecycle-aware components — you can focus on crafting compelling experiences rather than wrestling with low-level details.
This article provided a first look at setting up your project, creating adaptive spatial and 2D UIs, loading and rendering 3D models, and finally integrating everything into a cohesive XR experience. As the XR ecosystem continues to evolve, keeping an eye on official documentation and community best practices will help you stay ahead of the curve.
Happy coding, and welcome to the future of immersive Android experiences!
Working Source Code:
And if you get a chance, please visit us @ ZoeWear.com
~Ash
Thanks to the https://www.youtube.com/@whynotcode videos for guidance when things got complex to understand.