Get started with building apps for spatial computing

WWDC23

    Get ready to develop apps and games for visionOS! Discover the fundamental building blocks that make up spatial computing — windows, volumes, and spaces — and find out how you can use these elements to build engaging and immersive experiences.

    Resources

      • HD Video
      • SD Video

    Related Videos

    WWDC23

    • Build spatial experiences with RealityKit
    • Create immersive Unity apps
    • Develop your first immersive app
    • Meet Reality Composer Pro
    • Meet RealityKit Trace
    • Meet SwiftUI for spatial computing
    • Principles of spatial design
Transcript

♪ Mellow instrumental hip-hop ♪

Jim Tilander: Hi, I'm Jim, an engineer on the RealityKit team. Today, my colleague Christopher from the ARKit team will join me in guiding you through how to get started with building apps for spatial computing. Let's dive in!

We are excited about our new platform for spatial computing. This platform is built on familiar foundations for people to use and for you to develop apps on. It opens up new and exciting possibilities to blend real and virtual content, and to use natural input to interact with your app - and the whole system has been designed to protect people's privacy, giving you the peace of mind to focus on your app's experience. Let's talk a bit about the fundamentals to build up our vocabulary and concepts of spatial computing. After that, we will go over the different ways to get started with your app. Then, my colleague Christopher will walk us through how to build your app, diving deeper into the details of spatial computing.

Now, let's take a look at some of the fundamentals. First, let's cover what both familiar and new UI concepts mean in spatial computing. By default, apps launch into the Shared Space. This is where apps exist side by side, much like multiple apps on a Mac desktop. People remain connected to their surroundings through passthrough. Each app can have one or more windows. These are SwiftUI scenes that can be resized and reflowed, as you would expect of a normal macOS window. They can contain traditional views and controls, as well as 3D content, allowing you to mix and match 2D and 3D. People can reposition a window to their liking in their current space, just as one might expect.

Volumes allow an app to display 3D content in defined bounds, sharing the space with other apps. Volumes are great for showcasing 3D content - for example, a chess board. People can reposition volumes in space, and they can be viewed from different angles. Volumes are SwiftUI scenes, allowing you to do layout in familiar ways, and they use the power of RealityKit to display your 3D content.

Sometimes you might want more control over the level of immersion in your app - maybe to focus while watching a video or to play a game. You can do this by opening a dedicated Full Space, where your app's windows, volumes, and 3D objects are the only ones appearing across the view. In a Full Space, you can also take advantage of ARKit's APIs. For example, in addition to system-provided gestures, you can get more detailed Skeletal Hand Tracking to really incorporate the structure of people's hands into your experience. Your app can use a Full Space in different ways. You can use passthrough to ground content in the real world and keep people connected with their surroundings. And when you play Spatial Audio and render 3D content through RealityKit, you automatically take advantage of the fact that the device continually updates its understanding of the room to blend visuals and sound into people's surroundings, making virtual objects feel like they really belong in the room. You can also choose to render to a fully immersive space to fill up the entire field of view. This gives your app the flexibility to deliver on your creative intent by customizing the lighting of virtual objects, as well as choosing audio characteristics.

These are the foundational elements of spatial computing: windows, volumes, and spaces. They give you a flexible toolset to build apps that can span the continuum of immersion.
Christopher will talk more about this later. Now that we've introduced the foundational elements of spatial computing, let's explore the ways we can interact with windows, volumes, and spaces.

On this platform, we can interact with apps by simply using our eyes and hands. People can, for example, interact with a button by looking at it and tapping their fingers together to select. People can also reach out and physically touch the same button in 3D space. For both of these kinds of interactions, a variety of gestures are possible, like taps, long presses, drags, rotations, zooms, and a lot more. The system detects these automatically and generates touch events for your app to respond to. Gestures are integrated well with SwiftUI, and the same gesture API works seamlessly with RealityKit entities. This allows people to easily interact directly with your 3D scene elements. For example, this could be useful to place a flag directly onto this 3D model, to control a virtual zipper, or perhaps to interact with and pick up virtual chess pieces.

Now, if you want to build a game of bowling or transform people's hands into a virtual club, you can do this through ARKit's Skeletal Hand Tracking. Here we can see an example of how you can stack cubes on a table using taps and then smash them onto the table with your hands. This is a powerful way to bring app-specific hand input into the experience. And finally, the system automatically brings input from wireless keyboards, trackpads, and accessibility hardware right into your app, and the Game Controller framework lets you add support for wireless game controllers as well.

Collaborating and exploring things together is a fundamental part of spatial computing. We do this through SharePlay and the Group Activities framework. On this platform, as on macOS, people can share any window, like this Quick Look experience. When people share a Quick Look 3D model, we sync the orientation, scale, and animations between participants, making it easy to collaborate while being in different locations. When people are collaborating on something that is shown in their space and that they physically point at, it is important that everyone in the SharePlay session have the same experience. This enables natural references, such as gesturing to an object, and reinforces the feeling of being physically together. We've added the concept of shared context to the system. The system manages this shared context for you, helping make sure that participants in a SharePlay session can all experience content in the same way. You can use Spatial Persona Templates to further customize how people experience your content. To learn more, watch our sessions about designing and building spatial SharePlay experiences for this platform.

Given that the device has a lot of intimate knowledge of people and their surroundings, we've put a lot of architecture in place to protect people's privacy. Let's dive into that.
Privacy is a core principle guiding the design of this platform, while making it easy for you as a developer to leverage APIs and take advantage of the many capabilities of the device. Instead of allowing apps to access data from the sensors directly, the system does that for you and provides apps with events and visual cues. For example, the system knows the eye position and the gestures of somebody's hands in 3D space and delivers those as touch events. Also, the system will render a hover effect on a view when it is the focus of attention, but it does not communicate to the app where the person is looking. For many situations, the system-provided behaviors are sufficient for your app to respond to interactions. In cases where you actually do need access to more sensitive information, the system will ask for the person's permission first. An example would be asking for permission to access scene understanding to detect walls and furniture, or to access Skeletal Hand Tracking to bring custom interactions into your app.

Now that we've seen some of the capabilities available to apps, let's move on to exploring how we develop those apps. Everything starts with Xcode, Apple's integrated development environment. Xcode offers a complete set of tools for developing apps, including project management support, visual editors for your UI, debugging tools, a Simulator, and much more. And most importantly, Xcode also comes with the platform SDK, which provides the complete set of frameworks and APIs you'll use for developing your app.

If your source file contains a SwiftUI preview provider, the preview canvas will automatically open up in Xcode. The preview canvas has been extended to support 3D, allowing you to visualize RealityKit code for your scene, including animations and custom code. This enables shorter iteration times, letting you find the right look and feel for your app as you edit live code and see the results of changes and tweaks directly. Let's experiment a little here with how the satellite looks orbiting the Earth by changing the orbital speed and the size of the satellite. Notice the preview reflects the code changes, making it easy to see the results of quick experimentation in the code. Xcode Previews also has an object mode that allows for quick previews of 3D layouts - for example, seeing if your layout fits inside the bounds of the view. This is great for building tightly integrated scenes with both traditional UI and new 3D visuals. Xcode Previews gives you a fantastic way to get the layout right before you run your app.

The Simulator is a great way of testing interactivity with your app. You can move and look around in the scene using a keyboard, mouse, or compatible game controller. And it's easy to interact with your app by using simulated system gestures. The Simulator comes with three different simulated scenes, each with day and night lighting. This makes it easy to see your app under different conditions. The Simulator is a great way to run and debug most apps and to quickly iterate during development with a very predictable environment. We've also extended Xcode to support a number of runtime visualizations while you are debugging, to help you quickly understand and track down bugs by simply looking at the scene. Here we have plane estimation visible, including the semantic meaning of those planes and the collision shapes in the scene. It's easy to toggle the visualizations you would like to focus on from the debugger in Xcode. These visualizations work great both in the Simulator and on the device.
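As a quick illustration of the preview workflow mentioned above, here is a minimal sketch of a preview provider; the view name is a hypothetical stand-in, not code from the session.

```swift
import SwiftUI

// A hypothetical view standing in for the session's satellite scene.
struct SatelliteView: View {
    var body: some View {
        Text("Satellite orbiting Earth")
    }
}

// The presence of a preview provider in a source file is what makes
// Xcode open the preview canvas automatically, rendering the view live
// as you edit code.
struct SatelliteView_Previews: PreviewProvider {
    static var previews: some View {
        SatelliteView()
    }
}
```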
When it becomes time to polish your application's performance and responsiveness, we've got familiar tools like Instruments. Instruments is a powerful performance analysis tool included with Xcode, and you can use it to get actionable insights into your running app. And for spatial computing, Instruments 15 includes a new template, RealityKit Trace, providing even deeper insights into behaviors that are new to the platform. The RealityKit Trace template has new instruments that let you understand the GPU, CPU, and system power impact of your app and identify performance hotspots. You can easily observe and understand frame bottlenecks and trace them back to vital metrics like total triangles submitted or the number of RealityKit entities simulated. This lets you quickly find and address potential performance issues. For more details, check out the session "Meet RealityKit Trace."

We've also introduced a new developer tool called Reality Composer Pro. It allows you to preview and prepare 3D content for your apps. Reality Composer Pro helps you get an overview of all your assets and how they fit together in your scene. A new feature that we added to RealityKit is particles, and you can use a workflow in Reality Composer Pro to author and preview them. Adding particles to your scene provides movement, life, and endless possibilities. Clouds, rain, and sparks are just a few effects that you can build in a short amount of time. Adding audio to your scenes and associating it with objects is a breeze. You can also spatially preview audio, which takes into account the shape and context of your entire scene.

Most virtual objects will use RealityKit's physically based material to represent a variety of real-world materials. RealityKit uses sensor data to feed real-world lighting information into these materials, grounding them in people's surroundings. RealityKit also has a couple of additional standard materials available for your app to use in common scenarios. For those times when you have a very specific need, perhaps to convey a creative intent, you can author custom materials in Reality Composer Pro with the open standard MaterialX. You can do this through an easy-to-use node graph, without touching any code, and quickly preview the materials directly in the viewport. You can learn more about this in the session "Explore materials in Reality Composer Pro." When you're feeling good about your 3D content, you can send your scenes to your device and test your content directly. This is great for iteration times, since you don't even have to build an app. To learn more, watch the session "Meet Reality Composer Pro."

Another option that is available is Unity. Unity is bringing you the ability to write apps for spatial computing with familiar workflows and without any plugins required. You can bring your existing content over to power new immersive experiences. To learn more, watch these sessions covering how to write immersive apps with Unity.

Now that we understand some of the fundamental concepts and tools available to us, let's see how we can start building apps. There are two ways to get started: either you design a brand-new app from the ground up to be spatial, or perhaps you have an existing app that you want to bring to this new spatial platform. Let's explore how we build a new app. Designing an application from the ground up to be spatial helps you quickly embrace the unique capabilities of spatial computing. To get started, you can use the new app template for this platform. The app template has two new important options. First, you can choose your Initial Scene Type to be either a 'Window' or a 'Volume'. This generates the initial starting code for you, and it's easy to add additional scenes later.
The second option lets you add an entry point for an immersive space to your app. By default, your app will launch into the Shared Space. If you set the Immersive Scene Type to 'Space', a second scene will be added to your app, along with an example button showing how to launch into this Full Space. And when you finish the assistant, you are presented with an initial working app in SwiftUI that shows familiar buttons mixed in with a 3D object rendered with RealityKit. To learn more, watch the session "Develop your first immersive app."

We are also publishing code samples, each one illustrating different topics to quickly get you up to speed. Destination Video shows how to build a shared, immersive playback experience that incorporates 3D video and Spatial Audio. Happy Beam is an example of how you can create a game that leverages an Immersive Space, including custom hand gestures, to create a fun game with friends. And Hello World shows how to transition between different visual modes with a 3D globe. Christopher will talk more in detail about Hello World later.

Building and designing your app from the ground up on this platform offers opportunities to easily embrace spatial computing concepts. However, some of you might have existing apps that you want to bring to spatial computing. From the start, iPad and iPhone apps look and feel great. If your app supports iPad, that variant will be preferred over iPhone, though iPhone-only apps are fully supported. Let's take a look at the recipes app shown here in the Simulator. While this platform has its own darker style, iPad and iPhone apps retain a light-mode style. Windows can scale for ease of use, and rotations are handled, allowing you to see different layouts. To find out more, watch the session "Run your iPad and iPhone apps in the Shared Space" to learn about the system's built-in behaviors, functional differences, and how to test with the Simulator. However, running an existing iPad or iPhone app is just the beginning. It's easy to add a destination for this platform in your Xcode project with just a click. And after that, we can simply select our target device, recompile, and run.

Once you recompile, you get native spacing, sizing, and relayout. Your windows and materials will all automatically adopt the platform's look and feel, ensuring legibility in any light condition, and your app can take advantage of built-in capabilities like highlighting for your custom controls. Now here's Christopher to show us how we can evolve our apps further using the concepts we've covered so far.

Christopher: Thanks, Jim. I'm going to walk you through how to build an application that incorporates the elements you've learned about previously. Let's start with Hello World to explore some of the great functionality you can integrate into your app. Here's our sample in action.

Upon running the app in the Simulator, Hello World launches with a window into the Shared Space, right in front of us. This is a familiar-looking window made in SwiftUI, and it contains different elements such as text, images, and buttons. Tap gestures allow navigation within the app. Observe how our new view has embedded 3D content. SwiftUI and 3D content now work together seamlessly. Going back to our main window and selecting Planet Earth brings us to a new view. A new element appears. This is a volume. It contains a 3D model of the Earth, alongside a few UI elements. By moving the window bar, the volume's position can be adjusted anywhere in the surroundings.

    Going back to our main window again and selecting View Outer Space brings up an invitation for us to enter the solar system.

From here, we can enter space, which is shown here with an immersion style of 'full'. Our example renders Planet Earth and dims passthrough, allowing us to focus on the content with no distractions from the surroundings. Now that we have seen how this looks in action, let's break down some of the functionality of Hello World and show you how to use these concepts in your own apps.

As you've learned from Jim, there are multiple elements: windows, volumes, and spaces. You can look at this as a spectrum that your app can use to flex up and down, depending on what is best for people using your app in a specific moment. You can choose to present one or several windows in the Shared Space, allowing people to be more present. They can see passthrough and have the choice to have other apps side by side. Or you can choose to increase the immersion level by having your app take over the space entirely. Finding the most suitable elements for your app's experience in a given moment, and flexing between them, is an important consideration when you design your app for spatial computing.

Next, let's look further into how to use windows as part of your experience. Windows serve as a starting point for your app. They are built with SwiftUI using scenes, and they contain traditional views and controls. Windows on this platform support mixing 2D and 3D content. This means that your 3D content can be presented alongside 2D UI in a window. Windows can be resized and repositioned in space, and people can arrange them to their liking.

Let's go back to our example. In Hello World, the content view holds our SwiftUI images, text, and buttons, along with a call to action to get more immersive content. Creating a window is as easy as adding a WindowGroup to a scene. Inside the WindowGroup, we will display our ContentView. Our ContentView can add 3D content, bringing a new dimension of depth to your app. To do that, you can use the new Model3D view. Model3D is similar to an image, making it easy to load and display beautiful 3D content in your app that is rendered by RealityKit. To add Model3D to your view, we initialize Model3D by passing the name of the satellite model. With this, Model3D will find and load the model and place it into your view hierarchy. Now this window has the satellite embedded into the view, extending out along the z-axis and adding a new dimension of depth to your app.

Now that we have added a satellite, we can add interactions. Interactions are fundamentally built into the system and provided by SwiftUI. SwiftUI provides the gesture recognizers you are already familiar with on Apple platforms, such as tap, onHover, and RotateGesture. The platform also provides new gesture recognizers made for 3D interactions, like rotations in 3D space, taps on 3D objects, and more. Let's look at the code that enables interactions with the satellite. We are going to enable a spatial drag gesture so we can grab and move the satellite around. Starting from Model3D, we can now add a gesture. Inside, we add a DragGesture targeted to the satellite entity. We can then use the values passed into the update closure to move the satellite. Let's see what that looks like. Back in our satellite view, where our satellite is rendered, note that the DragGesture allows me to tap and drag the model, which moves with my interactions. As we've just seen, it's easy to mix 2D and 3D content together with Model3D. These are just a few things you can do with a window.
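Putting those pieces together, here is a minimal sketch of the steps described above; it is not the session's exact code, and "Satellite" is a hypothetical USDZ asset assumed to be in the app bundle.

```swift
import SwiftUI
import RealityKit

// A window view that mixes 2D controls with a 3D model.
struct SatelliteView: View {
    var body: some View {
        VStack {
            Text("Satellite")
                .font(.title)

            // Model3D loads and displays a RealityKit-rendered model,
            // much like AsyncImage does for images.
            Model3D(named: "Satellite") { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
            // A drag gesture targeted at RealityKit entities; the values
            // delivered to onChanged drive the entity's position. The
            // entity needs input-target and collision components to
            // receive gestures.
            .gesture(
                DragGesture()
                    .targetedToAnyEntity()
                    .onChanged { value in
                        // Convert the 3D gesture location into the entity's
                        // parent space and move the entity there.
                        value.entity.position = value.convert(
                            value.location3D,
                            from: .local,
                            to: value.entity.parent!
                        )
                    }
            )
        }
        .padding()
    }
}
```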
Now let's look at another type of element: the volume. Let's see what a volume has to offer. A volume is an extension of a window, giving you similar functionality. It is a new style of window that is ideal for 3D content. Volumes can host multiple SwiftUI views containing your 2D or 3D content. Although volumes can be used in a Full Space, they are really built for the Shared Space, so content must remain within the bounds of the volume.

Let's look at how to add a volume to your scene. You start by creating a new WindowGroup and setting its windowStyle to volumetric. Then, you give it a defaultSize with width, height, and depth properties. The dimensions of a volume can be specified in points or meters. Let's look at this running in the Simulator. When the application is presented, the volume is placed in front of the person. This volume has the dimensions we specified, along with the platform controls: the application title bar, which displays our app name, making it easy to identify which app this volume belongs to; the window bar, enabling the volume to be positioned; and the close button, which suspends the app and closes the volume when tapped.

Currently, our volume renders the 3D model of the Earth, but you might want to start adding more content and different behaviors. To do this, you can adopt RealityView in your app. RealityView is a new view that can be added to your scene, allowing any number of entities to be managed directly within SwiftUI. SwiftUI and RealityView let you easily integrate your app by connecting entity properties to SwiftUI's managed state. This makes it easy to drive the behavior of 3D models from a source of truth in your app's data model. Conversion between coordinate spaces is easy with the conversion functions provided by RealityView, and RealityView offers a way to position SwiftUI elements inside your 3D scene through attachments.

Let's take a moment to look at how we can use attachments inside RealityView. The RealityView initializer that we're going to use takes three parameters: a make closure, an update closure, and an attachments ViewBuilder. The make closure allows you to create entities and attach them to the root entity. The update closure is called whenever the state of the view changes. And lastly, the attachments closure is where we add our SwiftUI views, with a tag property that allows RealityView to translate our views into entities.

Now, let's work through an example of how to use attachments with RealityView. Adding an attachment is as easy as putting your SwiftUI view inside the attachments closure of RealityView. Let's use this icon of a delicious pastry to represent a location on our 3D globe. For each attachment, you must add a tag that gives the attachment a name. I'll name this one 'pin'. To display the attachment, I'll add it to the content of my RealityView. I'll do that in the update closure by adding it to the root entity of the scene. Here, we can see the attachment we made previously, rendering on the globe above my favorite bakery location. As we've just seen, RealityKit unleashes powerful features such as Model3D, RealityView, attachments, and many more, and these can be easily integrated into your app. This is only scratching the surface of what RealityKit can do. If you want to know more, I encourage you to go and watch "Build spatial experiences with RealityKit" and "Enhance your spatial computing app with RealityKit."
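Here is a minimal sketch combining the two ideas above - a volumetric window hosting a RealityView with an attachment - written against the attachment API as it shipped in the visionOS SDK; "Globe", the app name, and the pin's position are hypothetical stand-ins.

```swift
import SwiftUI
import RealityKit

@main
struct WorldApp: App {
    var body: some Scene {
        // A volume is a window with the volumetric style; its size is
        // fixed at creation time (points here; meters are also possible
        // via defaultSize(width:height:depth:in:)).
        WindowGroup(id: "globe") {
            GlobeView()
        }
        .windowStyle(.volumetric)
        .defaultSize(width: 600, height: 600, depth: 600)
    }
}

struct GlobeView: View {
    var body: some View {
        RealityView { content, attachments in
            // Make closure: create entities and add them to the content.
            if let globe = try? await Entity(named: "Globe") {
                content.add(globe)
            }
        } update: { content, attachments in
            // Update closure: runs when SwiftUI state changes; look up
            // the attachment entity by its tag and place it in the scene.
            if let pin = attachments.entity(for: "pin") {
                pin.position = [0, 0.25, 0.25] // hypothetical spot on the globe
                content.add(pin)
            }
        } attachments: {
            // Attachments ViewBuilder: SwiftUI views that RealityView
            // turns into entities, identified by tag.
            Attachment(id: "pin") {
                Label("My favorite bakery", systemImage: "mappin")
                    .padding()
                    .glassBackgroundEffect()
            }
        }
    }
}
```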
Let's recap what we went through so far. A volume is a container that is ideal for 2D and 3D content. Volumes are built for the Shared Space, can coexist with windows, and are bounded to specified dimensions.

Next, let's dive into our last type of element: spaces. Once your app opens a dedicated Full Space, the system hides all other apps, leaving only your app visible. Now you can place your app's windows, volumes, and content anywhere around you. Thanks to ARKit and RealityKit, your virtual content can even interact with the surroundings. You could throw a virtual ball into the room and watch as it bounces off the wall and then rolls on the floor. And with the addition of hand tracking, you can build custom gestures and interactions, or place content relative to people's hands. Many of these capabilities come from ARKit. To go into more depth and learn how you can leverage them in your app, be sure to check out the "Meet ARKit for spatial computing" session.

With spaces, your app can also offer different levels of immersion, depending on which style is chosen at creation time. Jim talked a bit about the spectrum of immersion available in a Full Space. Let's dive in and learn more about how you can add more immersion to your app. Immersion style is a parameter that can be passed to your Full Space. There are two basic styles, called .mixed and .full. The mixed style layers your app's content on top of passthrough. The full style hides passthrough and displays your content only. You can also combine the two by choosing .progressive. This style allows some passthrough initially, but the person can change the level of immersion all the way up to full by turning the Digital Crown located on the top of the device.

Let's go back to our example to explore immersion style. I'll start with the mixed style and see how that looks. And because a Full Space is a SwiftUI scene, I can use RealityView to display the Earth. Here's the Earth viewed from high orbit, and here's how I displayed the scene in my app. Notice I didn't actually specify the immersion style. That's because when you create an immersive space, SwiftUI assumes the mixed style by default. Let's also take your app completely immersive by adding a different immersion style. This time, I'll use the full immersion style. Adding an immersion style to the end of our ImmersiveSpace is easy. We store the immersion style in a state variable and then set the type to full. Because we want to give people the choice of when they enter an immersive experience, it's a good idea to add a button that lets the person decide if they want to enter this immersive style.
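A minimal sketch of that flow follows; the scene identifier "solar-system" and the view names are hypothetical stand-ins, not the session's exact code.

```swift
import SwiftUI

@main
struct WorldApp: App {
    // The immersion style lives in SwiftUI state so it can flex at runtime.
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        ImmersiveSpace(id: "solar-system") {
            SolarSystemView()
        }
        // Without this modifier, SwiftUI assumes the .mixed style.
        .immersionStyle(selection: $immersionStyle, in: .mixed, .full)
    }
}

struct ContentView: View {
    // Opening an immersive space is asynchronous, driven by an
    // environment action.
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        // Give people the choice of when to enter the immersive experience.
        Button("View Outer Space") {
            Task {
                await openImmersiveSpace(id: "solar-system")
            }
        }
    }
}

// A stand-in for the RealityView scene that renders the Earth.
struct SolarSystemView: View {
    var body: some View {
        Text("Planet Earth goes here")
    }
}
```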
Now let's see the new immersive style in action. Back in our app, I've taken Hello World from a single window to fully immersed, allowing us to view Planet Earth from any angle. And that's just the beginning of what you can do with your spatial app. Let's see where you can go from here.

In this session, we've covered the fundamentals of how to get started and taken you through the basics of building an app. We have some great sessions that should be your next stop - about the principles of spatial design, building apps with SwiftUI and RealityKit, or beginning to create your 3D content. With spatial computing, your app creation can venture into new, exciting avenues guided by your ingenuity. Thanks for watching!

♪