Please find the details about Apple visionOS here:
Now that we understand the details, let's look at the Apple Diorama app.
Download source here:
These are the source files for the project:
1. DioramaApp: The main app struct that configures the app’s lifecycle events, initializes components, and sets up the main SwiftUI scenes.
2. ContentView: A SwiftUI view responsible for managing UI controls, toggles, and sliders for adjusting various aspects of the immersive experience.
3. ViewModel: A class that manages various properties and behaviors of the app, like scaling, opacity, audio, and terrain material. It interacts with the app’s entities and components to control their behavior based on user input.
4. DioramaView: A SwiftUI view representing the main interface of the app. It integrates RealityKit’s RealityView and manages interactions with the immersive content.
5. FlockingComponent: A component that defines properties for simulating flocking behavior, such as velocity and seeking positions, for entities like birds.
6. AttachmentsProvider: A class that manages attachments associated with tags, facilitating the display of additional information in the app.
7. FlockingSystem: A RealityKit system that implements flocking behavior for multiple entities. It calculates and applies forces like separation, alignment, and cohesion to create coordinated motion.
8. LearnMoreView: A SwiftUI view that presents detailed information about entities. It includes animations and interactions for expanding and collapsing additional information.
These components collectively build an app that provides an interactive and immersive augmented reality experience using SwiftUI and RealityKit. The app enables users to explore and manipulate 3D scenes, control the display of immersive content, adjust scale and opacity, and observe dynamic behaviors like flocking.
DioramaApp.swift
import SwiftUI
@main
struct DioramaApp: App {
    @UIApplicationDelegateAdaptor private var appDelegate: AppDelegate
    private let immersiveSpaceIdentifier = "Immersive" // illustrative value
    @State private var viewModel = ViewModel()
    var body: some Scene {
        WindowGroup { ContentView(spaceId: immersiveSpaceIdentifier, viewModel: viewModel) }
        ImmersiveSpace(id: immersiveSpaceIdentifier) { DioramaView(viewModel: viewModel) }
    }
}
DioramaApp.swift defines the main entry point of the visionOS app, `DioramaApp`, using the `@main` attribute. It includes the use of `@UIApplicationDelegateAdaptor`, app initialization, and the construction of the main SwiftUI scene. Here’s a summary of the key components and their roles:
1. `DioramaApp` Struct:
- The `@main` attribute indicates that `DioramaApp` is the app’s entry point.
- The `@UIApplicationDelegateAdaptor` attribute associates the custom `AppDelegate` class, allowing it to manage app lifecycle events.
- The private property `immersiveSpaceIdentifier` holds an identifier used for immersive spaces.
- The `@State` property `viewModel` holds an instance of `ViewModel`.
- The `init()` block registers the custom components and systems that the RealityKit content requires.
- The `body` property returns a SwiftUI scene, specifying the main user interface content.
2. Custom `AppDelegate` Class:
- `AppDelegate` is a custom class conforming to the `UIApplicationDelegate` protocol.
- The `applicationShouldTerminateAfterLastWindowClosed(_:)` method determines whether the app should terminate when its last window is closed.
3. Main SwiftUI Scene:
- The `body` property returns a SwiftUI scene that defines the app’s main user interface.
- A `WindowGroup` is created to define the app’s main window. Inside the group, the `ContentView` is set as the root view.
- An `ImmersiveSpace` is added, providing immersive content for the app using the specified identifier. Inside this, the `DioramaView` is presented using the `viewModel`.
In summary, the `DioramaApp` struct serves as the entry point for the visionOS app. It uses the `@UIApplicationDelegateAdaptor` attribute to connect the custom `AppDelegate` class, which manages app lifecycle events. The app’s main user interface is defined within a SwiftUI scene, including the main window’s content and the immersive space.
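For reference, here is a minimal sketch of the `AppDelegate` described above; the exact body is an assumption based on that description:

import UIKit

class AppDelegate: NSObject, UIApplicationDelegate {
    // Quit the app once the user closes its last window.
    func applicationShouldTerminateAfterLastWindowClosed(_ application: UIApplication) -> Bool {
        true
    }
}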
ContentView.swift
- Shows the main window that controls the diorama:
- Show/Hide, Scale, Morph
- Setup the ViewModel
ContentView.swift defines the SwiftUI `ContentView` for the app’s user interface. It primarily focuses on controlling the display of immersive AR content using various SwiftUI features. Here’s a summarized explanation of the code:
1. `ContentView` Struct:
- `ContentView` is a SwiftUI view responsible for rendering the main content of the app’s user interface.
- It takes two parameters: `spaceId`, which is a unique identifier for immersive spaces, and `viewModel`, an instance of the `ViewModel`.
- The `@Environment` property wrappers are used to access environment values related to immersive AR spaces: `openImmersiveSpace` for opening immersive experiences and `dismissImmersiveSpace` for dismissing them.
- The `@State` property `isExpanded` tracks whether a certain part of the content is expanded or collapsed.
- The computed property `areControlsShowing` checks if the app’s root entity exists and if the immersive content is set to be shown.
- The `body` property contains the layout and UI elements of the view.
2. UI Elements:
- The content is organized using a `Grid` view, which arranges the UI controls into structured rows.
- The `Toggle` element allows the user to control whether the immersive content is shown. The `.onChange(of:)` modifier observes changes to this toggle and performs actions accordingly.
- Two `GridRow` elements contain sliders for adjusting “Morph” and “Scale” properties. The sliders’ values are bound to properties in the `viewModel`. The `Slider` values are monitored using `.onChange(of:)`.
- The UI elements are conditionally disabled and their opacity is adjusted based on whether the immersive content is being displayed.
- The `glassBackgroundEffect()` modifier applies a visual effect to the content.
3. Private Function:
- The `update()` function is called when slider values are changed. It triggers updates related to terrain material and region-specific opacity in the `viewModel`.
In summary, the `ContentView` defines the UI layout for the app, with interactive elements that control the display of immersive content and sliders to adjust certain properties. The code effectively uses SwiftUI’s features to create an engaging and interactive user experience for managing immersive AR experiences.
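To make that control flow concrete, here is a condensed sketch of the pattern described above; the layout is simplified and the label strings are assumptions:

import SwiftUI

struct ContentView: View {
    let spaceId: String
    @Bindable var viewModel: ViewModel

    @Environment(\.openImmersiveSpace) private var openImmersiveSpace
    @Environment(\.dismissImmersiveSpace) private var dismissImmersiveSpace

    var body: some View {
        Grid {
            Toggle("Show Diorama", isOn: $viewModel.showImmersiveContent)
                .onChange(of: viewModel.showImmersiveContent) { _, show in
                    Task {
                        // Opening and dismissing immersive spaces is asynchronous.
                        if show {
                            await openImmersiveSpace(id: spaceId)
                        } else {
                            await dismissImmersiveSpace()
                        }
                    }
                }
            GridRow {
                Text("Morph")
                Slider(value: $viewModel.sliderValue)
            }
            GridRow {
                Text("Scale")
                Slider(value: $viewModel.contentScaleSliderValue)
            }
        }
        .padding()
        .glassBackgroundEffect()
    }
}

Because `ViewModel` is `@Observable`, the `@Bindable` wrapper is what lets the toggle and sliders bind directly to its properties.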
ViewModel.swift
Contains all the variables used by the views.
- Update scale
- Update morph
- Setup / Control audio
The `ViewModel` class has various properties and methods that manage the behavior and interactions of the app’s immersive AR content.
1. `ViewModel` Class:
- The `ViewModel` class represents the logic and data used to control the immersive AR content and its behavior within the app.
- It’s marked `@Observable` so that SwiftUI views can observe changes to its state.
- Properties include:
  - `materialParameterName`: A constant representing the name of a material parameter.
  - `rootEntity`: An optional entity representing the root entity of the AR scene.
  - `showImmersiveContent`: A boolean indicating whether the immersive AR content is being shown.
  - `sliderValue`: A float value representing the Morph slider’s position.
  - `contentScaleSliderValue`: A float value representing the Scale slider’s position.
- The `init` initializes the `ViewModel` with default values and updates region-specific opacity.
- Several static queries (`query` and `audioQuery`) are defined for querying entities with specific components.
2. Methods:
- `updateScale()`: Adjusts the scale of the root entity based on the `contentScaleSliderValue`.
- `updateRegionSpecificOpacity()`: Updates the opacity of region-specific components based on the `sliderValue`.
- `setupAudio()`: Sets up audio for different regions based on the `RegionSpecificComponent`. It plays audio files specific to different regions.
- `terrainMaterialValue`: Retrieves the value of a material parameter from the terrain’s material.
- `resetAudio()`: Resets audio playback controllers.
- `updateTerrainMaterial()`: Updates the terrain material’s shader graph based on the `sliderValue`.
3. Computed Properties:
- `terrainMaterial`: Retrieves the shader graph material associated with the terrain entity.
4. Extension:
- The `fileprivate extension Entity` defines a computed property `terrain`, which retrieves the terrain entity from the root entity.
In summary, the `ViewModel` class orchestrates the behavior of the immersive AR content by adjusting its scale, opacity, and audio based on user interactions and slider values. This class is crucial for managing the interactive and dynamic aspects of the AR experience within the app.
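A thin slice of the class, showing the scale path end to end; the default values here are assumptions:

import RealityKit
import Observation

@Observable
final class ViewModel {
    var rootEntity: Entity? = nil
    var showImmersiveContent: Bool = false
    var sliderValue: Float = 0.0
    var contentScaleSliderValue: Float = 0.5

    // Scale the whole diorama to match the Scale slider.
    func updateScale() {
        rootEntity?.scale = SIMD3<Float>(repeating: contentScaleSliderValue)
    }
}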
DioramaView.swift
This builds the main view
It represents a scene rendered using RealityKit and handles interactions with various components of the scene. Here’s a summary of what the code does:
1. Environment Variables and State Properties:
- The `@Environment(\.dismiss)` property wrapper allows the view to dismiss itself.
- Two static properties, `markersQuery` and `runtimeQuery`, are defined as `EntityQuery` instances.
- The `subscriptions` and `attachmentsProvider` state properties are initialized to hold event subscriptions and attachment entities.
2. Body of the View:
- The view’s body contains a `RealityView` that renders the augmented reality content.
- Inside the `RealityView`’s `make` closure, the code loads the Reality Composer Pro scene named “DioramaAssembled”. It then sets this entity as the `rootEntity` in the `viewModel` and adds it to the content.
- The `update` closure is responsible for updating the content and managing attachment entities in the RealityKit environment.
- The code sets up audio-related configurations by invoking `setupAudio` method from the `viewModel`.
- Attachment entities are added to marked entities. The code iterates through entities with a `PointOfInterestRuntimeComponent` and adds attachments based on tags. These attachments are region-specific and have opacity components for fading in and out.
- Region-specific opacity is updated, and the terrain material is updated using the `viewModel`.
- The `attachments` closure generates the views associated with attachment entities and displays them using a `ForEach` loop.
3. Creating LearnMoreView:
- The `createLearnMoreView(for entity: Entity)` function creates a `LearnMoreView` associated with a given entity.
- If the entity already has a `PointOfInterestRuntimeComponent`, no additional component is added.
- A unique tag is generated for the entity, and a `LearnMoreView` is created based on the entity’s details.
- A `PointOfInterestRuntimeComponent` is added to the entity, and the `LearnMoreView` is stored in the `attachmentsProvider` using the generated tag.
4. Setup Birds:
- The `setupBirds(rootEntity entity: Entity)` function sets up bird entities with flocking components and animations.
- The function finds the entity named “Birds” and assigns a `FlockingComponent` to each bird.
- Animations are played on the birds, and the animation speed is set randomly.
In summary, the `DioramaView` represents an augmented reality scene using RealityKit. It handles loading and managing entities, setting up audio, attaching components, and rendering associated views for attachment entities. The view provides a way to interact with the augmented reality content using the RealityView component.
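A skeleton of that flow; `realityKitContentBundle` comes from the project’s Reality Composer Pro package, and `sortedTagViewPairs` matches the hypothetical provider sketched in the AttachmentsProvider section below:

import SwiftUI
import RealityKit
import RealityKitContent

struct DioramaView: View {
    let viewModel: ViewModel
    @StateObject private var attachmentsProvider = AttachmentsProvider()

    var body: some View {
        RealityView { content, _ in
            // Load the assembled scene from the Reality Composer Pro bundle.
            if let entity = try? await Entity(named: "DioramaAssembled", in: realityKitContentBundle) {
                viewModel.rootEntity = entity
                content.add(entity)
            }
        } update: { _, _ in
            // Re-apply slider-driven state each pass; the full sample also
            // calls setupAudio and positions the tagged attachment entities here.
            viewModel.updateRegionSpecificOpacity()
            viewModel.updateTerrainMaterial()
        } attachments: {
            // One SwiftUI attachment per tagged point of interest.
            ForEach(attachmentsProvider.sortedTagViewPairs) { pair in
                Attachment(id: pair.tag) { pair.view }
            }
        }
    }
}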
Component.swift
The provided code is part of an augmented reality application built using Apple’s RealityKit framework. It defines two custom components that control and modify the behavior of entities within the AR scene. Here’s a breakdown of the code:
1. `PointOfInterestRuntimeComponent`:
- This is a custom component that conforms to the `Component` protocol, indicating that it can be attached to entities in the AR scene.
- It has a single property: `attachmentTag` of type `ObjectIdentifier`.
- This component is designed to associate an attachment tag with an entity during runtime. The `attachmentTag` acts as an identifier to recognize attachments associated with the entity during the application’s runtime.
2. `ControlledOpacityComponent`:
- Another custom component conforming to the `Component` protocol.
- It’s used to control the opacity of an entity based on a condition.
- Contains a boolean property `shouldShow`, indicating whether the component should make the entity visible or invisible.
- It includes a method `opacity(forSliderValue:)` which calculates the opacity value to be applied to the entity based on the input slider value.
- If `shouldShow` is `false`, the opacity is set to `0.0`, making the entity invisible.
- If `shouldShow` is `true`, the opacity is set to the input `sliderValue`, thus modifying the entity’s opacity based on the slider’s value.
These components can be added to entities in the AR scene to control their behavior. The `PointOfInterestRuntimeComponent` helps identify and manage attachments associated with entities during runtime. The `ControlledOpacityComponent` allows dynamic opacity adjustments of entities based on conditions and user interactions, such as slider value changes. Overall, these components contribute to creating interactive and customizable experiences within the augmented reality application.
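Grounded in the description above, the two components might look like this in code:

import RealityKit

// Associates a runtime attachment tag with an entity.
struct PointOfInterestRuntimeComponent: Component {
    let attachmentTag: ObjectIdentifier
}

// Gates an entity's opacity behind a visibility flag and a slider value.
struct ControlledOpacityComponent: Component {
    var shouldShow: Bool = false

    // Hidden entities stay fully transparent; visible ones track the slider.
    func opacity(forSliderValue sliderValue: Float) -> Float {
        shouldShow ? sliderValue : 0.0
    }
}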
AttachmentsProvider.swift
`AttachmentsProvider` is a class that conforms to the `ObservableObject` protocol. It manages a collection of attachments associated with specific tags in the SwiftUI-based app. Its stored attachments and its sorted computed property make it easy to manage and order attachments for display in the app’s user interface.
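A plausible shape for the class; everything beyond the name, the `ObservableObject` conformance, and the tag-keyed storage is an assumption:

import SwiftUI

// One tag/view pairing, made Identifiable so ForEach can consume it directly.
struct TagViewPair: Identifiable {
    let tag: ObjectIdentifier
    let view: AnyView
    var id: ObjectIdentifier { tag }
}

final class AttachmentsProvider: ObservableObject {
    // Attachment views keyed by the tags stored in PointOfInterestRuntimeComponent.
    @Published var attachments: [ObjectIdentifier: AnyView] = [:]

    // A stable ordering for the RealityView attachments builder.
    var sortedTagViewPairs: [TagViewPair] {
        attachments
            .map { TagViewPair(tag: $0.key, view: $0.value) }
            .sorted { $0.tag < $1.tag }
    }
}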
LearnMoreView.swift
This view is intended to display additional information about a specific location.
1. `LearnMoreView`:
- This view is designed to display more details about a specific location.
- It takes several parameters such as the location’s name, description, image names, an optional `Entity` representing a trail, and the app’s `ViewModel`.
2. `@State private var showingMoreInfo`:
- This state variable manages whether the additional information is currently displayed.
3. `@Namespace private var animation`:
- This private namespace is used for the matched geometry effect to ensure smooth transitions.
4. `imagesFrame`:
- This property dynamically determines the height of the image frame based on whether the additional information is being shown.
5. `titleFont` and `descriptionFont`:
- These properties define the fonts for the title and description text.
6. `body`:
- The main body of the `LearnMoreView`.
- The view starts with a `ZStack` that contains two text elements: one for displaying the name (visible when the additional information is hidden), and another for displaying more detailed information (visible when the additional information is shown).
7. `ImagesView`:
- This subview is responsible for displaying a horizontal scroll view of images related to the location.
- It uses the provided `imageNames` to create a set of images.
Overall, the `LearnMoreView` provides an interactive way to display more information about a location, including images, and can be integrated into a `RealityView` in a SwiftUI-based application.
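A condensed sketch of the expand/collapse pattern described above, assuming simplified parameters; the full view also shows the images and trail entity:

import SwiftUI

struct LearnMoreView: View {
    let name: String
    let description: String

    @State private var showingMoreInfo = false
    @Namespace private var animation

    var body: some View {
        ZStack(alignment: .center) {
            if !showingMoreInfo {
                Text(name)
                    .font(.title)
                    .matchedGeometryEffect(id: "Name", in: animation)
            } else {
                VStack(alignment: .leading, spacing: 8) {
                    Text(name)
                        .font(.title)
                        .matchedGeometryEffect(id: "Name", in: animation)
                    Text(description)
                        .font(.body)
                    // The full sample also shows an ImagesView scroll view here.
                }
            }
        }
        .padding()
        .glassBackgroundEffect()
        .onTapGesture {
            // Animate between the collapsed title and the expanded card.
            withAnimation(.spring()) {
                showingMoreInfo.toggle()
            }
        }
    }
}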
FlockingComponent.swift
In summary, the `FlockingComponent` struct represents a component that manages flocking behavior, where entities move in groups while maintaining certain positions and velocities. It conforms to the `Component` protocol, allowing it to be attached to entities in a RealityKit-based environment, and to the `Codable` protocol, making it encodable and decodable for serialization.
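Based on that description, the component itself can be quite small; the property names below are assumptions:

import RealityKit

// Per-bird motion state consumed by the flocking system each frame.
struct FlockingComponent: Component, Codable {
    var velocity: SIMD3<Float> = .zero
    var seekPosition: SIMD3<Float> = .zero
}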
FlockingSystem.swift
The provided code defines a system called `FlockingSystem` within RealityKit. This system simulates collective behavior, like the movement of a group of birds. It calculates forces for entities based on rules like separation, alignment, and cohesion. These forces are then used to update the entities’ positions, velocities, and orientations, creating a coordinated flocking motion. The system also periodically changes the entities’ target positions to simulate realistic movement patterns.
The file declares a `FlockingSystem` struct that conforms to the `System` protocol, representing a system within the RealityKit framework that simulates flocking behavior. Here’s a summary of the code:
1. `public struct FlockingSystem: System`:
- This declares a struct named `FlockingSystem` that conforms to the `System` protocol.
2. System Parameters and Constants:
- Several parameters and constants are defined to control various aspects of the flocking behavior, such as separation, alignment, cohesion, seeking, rotation, and speed.
3. `public init(scene: RealityKit.Scene)`:
- This is the initializer of the `FlockingSystem`.
- It takes a `RealityKit.Scene` as a parameter but doesn’t perform any specific initialization.
4. `public func update(context: SceneUpdateContext)`:
- This method is required by the `System` protocol and is used to update the flocking simulation.
- It takes a `SceneUpdateContext` as a parameter.
5. Flocking Simulation Logic:
- The update method simulates the flocking behavior of entities in the scene based on the defined parameters.
- It calculates acceleration forces for each entity based on separation, alignment, cohesion, and seeking.
- The forces are accumulated and applied to update the velocity and position of the entities.
- The method also updates the orientation of the entities based on the calculated values.
6. Choosing a New Seek Position:
- The update method periodically chooses a new seek position for the entities to keep their movement varied and lifelike.
In summary, the `FlockingSystem` struct defines a system in RealityKit that implements flocking behavior, where entities move together like a school of fish or a swarm of birds. The system calculates acceleration forces based on separation, alignment, cohesion, and seeking rules, and then updates the velocity and position of the entities. This simulates the coordinated movement of entities in a scene.
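A stripped-down skeleton of such a system; the seek-only steering below stands in for the full separation/alignment/cohesion math:

import RealityKit

public struct FlockingSystem: System {
    // Only entities carrying a FlockingComponent take part in the simulation.
    static let query = EntityQuery(where: .has(FlockingComponent.self))

    public init(scene: RealityKit.Scene) { }

    public func update(context: SceneUpdateContext) {
        let dt = Float(context.deltaTime)
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard var flocking = entity.components[FlockingComponent.self] else { continue }
            // Accumulate a seek force toward the target; the full system adds
            // separation, alignment, and cohesion forces here as well.
            let seekForce = flocking.seekPosition - entity.position
            flocking.velocity += seekForce * dt
            entity.position += flocking.velocity * dt
            entity.components.set(flocking)
        }
    }
}

For the system to run, both types are registered once at launch, e.g. `FlockingComponent.registerComponent()` and `FlockingSystem.registerSystem()` in the app’s `init()`.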
The rest of the code has to do with Reality Composer Pro.
Please watch for part 2 of this series.
Thanks,
~Ash
Please learn more about us at …