Apple Augmented Reality by Tutorials

First Edition - Early Access 3 · iOS 14 · Swift 5.1 · Xcode 12

Before You Begin — 3 chapters
Section I: Reality Composer — 5 chapters
Section VI: ARKit & SceneKit — 2 chapters
Section VII: ECS & Collaborative Experiences (Bonus Section) — 2 chapters

2. AR Quick Look
Written by Chris Language

The message from Apple is crystal clear: Augmented reality (AR) is here to stay, and it’s going to play a big part in the future of the iPhone. Ever since the release of ARKit 2.0 and iOS 12 at WWDC 2018, Apple has deeply integrated AR into the core of all its operating systems.

Even apps like iMessage, Mail, Notes, News, Safari and Files now have support for AR.

This is thanks to AR Quick Look, which is the simplest way to present AR content on mobile devices. In this chapter, you’ll learn about AR Quick Look. You’ll see how easy it is to integrate it into your own apps to give them some cool AR superpowers.

What is AR Quick Look?

You’re probably already familiar with Quick Look, which lets you quickly peek at images, PDFs and spreadsheets in apps like Mail and Safari. Quick Look is a framework that does the heavy lifting for you, giving your app superpowers that let it support a wide selection of universal file formats.

Here’s the best part: Quick Look now offers support for USDZ and Reality file formats via its AR Quick Look feature.

AR Quick Look lets you showcase a virtual 3D model of a physical product within your local space. The model appears grounded in your environment, giving you a good sense of how the physical product looks.

Imagine you want to buy a new sofa. A shopping app with this technology lets you check out how various sofas actually look in your living room.

AR Quick Look achieves a high degree of realism by mimicking realistic lighting conditions in your local environment. It combines this lighting with soft shadows and physically-based rendering (PBR) materials that shine and reflect the local environment, just like the real thing.

Using AR Quick Look is as simple as providing it with the path to your USDZ or Reality content and letting it do its magic. And there are lots of nifty things you can do with it, too.

AR Quick Look features

At face value, AR Quick Look seems simple enough. When you dig deeper, however, you’ll notice that it comes with a bucket-load of insanely cool features.

Here’s a look at what’s inside:

  • Anchors: Anchors allow you to anchor virtual content to various real-world surfaces. With the release of iOS 13, AR Quick Look supports horizontal surfaces like floors, ceilings, tables and chairs; vertical surfaces like walls; images including photos and posters; and faces and objects like toys and consumer products.

  • Occlusion: Occlusion allows the physical world to obscure virtual content based on its depth relative to the real world. AR Quick Look currently offers occlusion for people and faces. This feature works only on certain devices.

  • Physics, Forces and Collisions: Virtual content responds to the laws of physics. Objects can fall due to gravity and bounce and collide with one another.

  • Triggers and Behaviors: Users can reach into AR and interact with objects to trigger events, animations and sounds.

  • Realtime Shadows: Virtual content casts realistic-looking shadows onto real-world surfaces. The quality of the shadows depends on the device’s capabilities. Low-end devices project shadows, while high-end devices use ray-traced shadows.

  • High Dynamic Range, Tone Mapping and Color Correction: AR Quick Look samples the local environment in real time and uses the results to control the virtual content’s brightness, color and tone. This makes objects seem to blend naturally with their surroundings.

  • Camera Grain, Motion Blur and Depth of Field: Post-processing camera effects push the visual fidelity to the next level. Fast-moving objects blur, distant objects appear out of focus and adding a grain effect to crisp-looking virtual content makes it blend in with a typical grainy camera feed.

  • Multi-Sampling and Specular Anti-aliasing: AR Quick Look anti-aliases the virtual content’s edges to smooth out pixelation. It also anti-aliases specular reflections to prevent flickering.

  • Physically Based Rendering Clear Coat Materials: Apply super-realistic materials to virtual content so your objects look exactly like their real-life counterparts.

  • Ambient and Spatial Audio: Ambient sounds add another level of realism to virtual content. Objects produce spatially-accurate sound effects based on their location in physical space and their position relative to the camera.

  • Integration and Customization: You can easily integrate AR Quick Look into Web, iOS, macOS and tvOS apps.

  • Apple Pay: Apple Pay is fully integrated into AR Quick Look. Users can impulse buy your products without leaving the AR experience.

As you can see, AR Quick Look gives you the flexibility to make your products shine in AR. However, there are a few limitations to keep in mind while you work with it.

AR Quick Look limitations

Although AR Quick Look offers plenty of features, it’s important to note that the AR experience scales back some effects based on the capabilities of the user’s device. Only the latest and greatest high-end devices are capable of offering the full experience.

This might seem obvious, but it’s worth mentioning that AR Quick Look is only available in the Apple ecosystem. You can’t view AR Quick Look content on devices that run Android, Windows or any other non-Apple operating system.

AR Quick Look experiences are also somewhat limited due to the lack of any kind of scriptable or codable pipeline. More intelligent AR experiences require you to create apps for them.

Experiencing AR Quick Look

Apple offers a fantastic gallery of 3D models that you can use to explore AR Quick Look. If you’re running iOS 12 or newer on a device, you can try it for yourself.

Open the following link in Safari: https://apple.co/2C5362d

These models use the USDZ format, and thanks to AR Quick Look, Safari now has built-in support for that format.

Did you notice that tiny cube on each of the model images?

That’s Apple’s signature icon to indicate that the model is viewable in AR.

So, what are you waiting for? Pick an option and try it for yourself.

AR mode

When you pick a model, Safari launches AR Quick Look, which loads the referenced USDZ file from a URL and presents it to you.

It launches directly into AR mode to get the user into AR as quickly as possible.

Wait, what the duck! Is that Launchpad McQuack?

As soon as AR Quick Look detects the desired surface, it automatically places the 3D model on top of that surface. The experience is seamless and, with quality virtual content, you can easily believe you’re seeing the real thing.

There are a few things you can do while in AR mode:

  • Positioning: You can easily position the 3D model with a tap, hold and drag gesture to place the model wherever you want. AR Quick Look understands both horizontal and vertical surfaces. So if there’s a wall behind the model, you can simply drag the model onto the wall, and it’ll stick.

  • Scaling: Scale the 3D model larger or smaller with a pinch-in or -out gesture. Reset the scale to 100% by double-tapping the 3D model.

  • Rotating: Rotate the 3D model by placing two fingers on the screen and moving them in a circular motion. Again, a double-tap gesture will reset the rotation.

  • Levitating: Defy gravity and levitate the 3D model with a two-finger upwards-drag gesture.

  • Snapshots: Take cool pictures of your AR experience by quickly tapping the camera shutter button once. This will save a snapshot to your photos.

  • Videos: You can even make a video recording of your AR experience by holding down the camera shutter button for a short time. As soon as you let go, AR Quick Look will automatically save the video clip to your photos. Excellent!

  • Sharing: Select the share button at the top-right and you’ll get a list of apps that let you share the current model. How about AirDropping it to a nearby friend?

Once you’re done playing, you can close AR Quick Look with the X button at the top-left corner. You’ll return to the webpage, where you can explore some of the other cool 3D models.

Object mode

Switch into Object mode by selecting the Object tab in AR mode. Here, you can inspect the 3D model with the same basic gestures to manipulate it, like pinch to scale and swipe to rotate.

With Object mode, you’re able to see the object’s details without the distractions of the real world around it.

Once you’ve finished looking at the models, you’re ready to move on to learning how to add augmented reality to your websites.

AR Quick Look for web

As of iOS 12, Safari has built-in support for previewing USDZ and Reality files, thanks to AR Quick Look. In this section, you’ll learn how to integrate USDZ file support into your own websites.

Open the starter_web folder and double-click index.html. This launches Safari and loads the following web page:

This is an example web page with two USDZ models. When you inspect the files in the folder, you’ll observe three images along with three USDZ files.

Your next step is to go through the process of adding another USDZ model to your AR gallery.

Open index.html using a plain text editor.

Note: To edit HTML files, you need to use a plain text editor; TextEdit tends to render the file rather than give you access to the underlying HTML code. If you do not have a plain text editor, you can use Xcode to edit the file.

Add the following HTML markup to the bottom of the file, just above the </body> tag:

```html
<a href="pig.usdz" rel="ar">
</a>
```

This adds a standard <a> tag, which creates references to URLs. Look at the provided attributes:

  • href: This is set to pig.usdz. It points to the USDZ file that you’re referencing, which is in the same location as index.html.

  • rel: This attribute specifies the relationship between the current document and the linked document. In this case, you’re setting the relationship to ar, indicating that the referenced document is an AR model.

Now, add the following line of code just before the previously-added </a> tag:

```html
<img src="pig.jpg" width="250" height="250">
```

Up to this point, the reference to the USDZ file was invisible on the webpage. This line of code adds an image to the reference, giving the user something to tap.
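Putting the two edits together, the complete gallery entry is an anchor that wraps a tappable preview image:

```html
<a href="pig.usdz" rel="ar">
  <img src="pig.jpg" width="250" height="250">
</a>
```

When the user taps the image, Safari sees the `rel="ar"` attribute and opens the linked USDZ file in AR Quick Look instead of navigating to it.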

Finally, open .htaccess with a text editor and add the following line:

```
AddType model/vnd.usdz+zip .usdz
```

This adds the required MIME type so Safari knows what to do with the USDZ file type.

Note: To support Reality files, use the following MIME type:

```
AddType model/vnd.reality .reality
```

Save your changes and test. Once again, open index.html in Safari.

That’s it; you just added another USDZ file to your webpage. Fantastic!

Note: You can only experience AR Quick Look on an actual device running iOS 12 or newer. You also need to deploy your webpage to an actual web server to browse to it from your device. Setting up a local web server falls outside the scope of this book.

If you’re wondering how to add AR Quick Look support to existing apps, you’ve come to the right place. You’re going to do that next.

AR Quick Look for apps

Your first step to add AR Quick Look to an app is to open the starter project from the starter folder. It’s a basic single-view app with a UITableView and a custom cell that shows a small image and a name.

Do a quick build and run to test it.

When you select a row, nothing happens yet. All you’re doing at this point is storing the selected row index in a variable named modelIndex. You’ll use this variable later.

Next, you’ll load the images using an array of strings named modelNames, which links directly to the images stored in Assets.xcassets.
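The starter project’s plumbing isn’t reproduced here, but based on the description above, it might look roughly like this sketch. The model names are placeholders, not the starter’s actual values:

```swift
import UIKit

class ViewController: UIViewController, UITableViewDelegate {
  // Names that link table rows to images in Assets.xcassets
  // (placeholder values; the starter project defines its own).
  let modelNames = ["toy_drummer", "toy_robot", "pig"]

  // Index of the row the user most recently selected.
  var modelIndex = 0

  func tableView(_ tableView: UITableView,
                 didSelectRowAt indexPath: IndexPath) {
    // Remember the selection; the preview will use it later.
    modelIndex = indexPath.row
  }
}
```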

Import the USDZ files into the project by dragging and dropping the Models folder, found inside resources, into the project.

Make sure you’ve checked Add to targets, then click Finish to complete the process.

You’ll now see a new Models group inside the project; you can preview the USDZ files within Xcode.

Open ViewController.swift and add the following to the top of the file:

```swift
import QuickLook
```

This imports the QuickLook framework, which is required to implement the AR Quick Look functionality within your app.

Next, add the following protocols to ViewController:

```swift
QLPreviewControllerDelegate, QLPreviewControllerDataSource
```

Here’s a closer look at these protocols:

  • QLPreviewControllerDelegate: This protocol lets you provide a zoom animation for the Quick Look preview, decide whether your app should open a given URL, and respond to the preview opening and closing.

  • QLPreviewControllerDataSource: This protocol lets the data source tell the QLPreviewController how many items to include in a preview item navigation list.
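Assuming the starter’s ViewController is a plain UIViewController subclass, the class declaration with both conformances added might read:

```swift
class ViewController: UIViewController,
                      QLPreviewControllerDelegate,
                      QLPreviewControllerDataSource {
  // ...
}
```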

Implement the protocols by adding the following below QLPreviewControllerDataSource:

```swift
func numberOfPreviewItems(in controller: QLPreviewController) -> Int {
  return 1
}
```


When previewing AR content, you’re _always_ going to preview only one object at a time. So when the data source queries the number of preview items, you tell it that there’s only one item available for preview.



Add the following function below the previously-added function:

```swift
func previewController(
  _ controller: QLPreviewController,
  previewItemAt index: Int) -> QLPreviewItem {
  let url = Bundle.main.url(
    forResource: modelNames[modelIndex],
    withExtension: "usdz")!
  return url as QLPreviewItem
}
```

So what’s going on in the code above? When the preview controller asks for the item to preview, you construct a URL for the selected resource, using the name at modelIndex in the modelNames array and the usdz extension. Finally, the code returns that URL to the controller as a QLPreviewItem, which works because NSURL adopts the QLPreviewItem protocol.
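Note that the force unwrap will crash with an unhelpful message if the file isn’t in the bundle. If you’d prefer a clearer diagnostic while developing, a defensive variant (a sketch, not the book’s code) could look like this:

```swift
func previewController(
  _ controller: QLPreviewController,
  previewItemAt index: Int) -> QLPreviewItem {
  guard let url = Bundle.main.url(
    forResource: modelNames[modelIndex],
    withExtension: "usdz") else {
    // The USDZ file wasn't bundled; fail with a message that
    // names the missing resource instead of a bare crash.
    fatalError("Missing USDZ resource: \(modelNames[modelIndex])")
  }
  return url as QLPreviewItem
}
```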

You’ve now implemented all the Quick Look protocols. The only thing left to do is present the preview.

Add the following lines of code to the bottom of tableView(_:didSelectRowAt:):

```swift
// 1
let previewController = QLPreviewController()
// 2
previewController.dataSource = self
previewController.delegate = self
// 3
present(previewController, animated: false)
```

Finally, you’re ready to present the AR Quick Look preview to the user. With this code:

  1. You create an instance of QLPreviewController.
  2. Next, you nominate the ViewController class as dataSource and delegate for the preview controller.
  3. Finally, you present the preview controller to the user.

That’s it, you’re done! Your app can use AR Quick Look now. Build and run to test it.

Note: Make sure your device is running iOS 12 or newer, or you won’t be able to see it.

You’re about to witness Uncle Scrooge banking a coin!

Key points

Well done, you’ve reached the end of this chapter.

Here’s what you learned:

  • Apple has deeply integrated AR into iOS, macOS and tvOS. Many commonly-used apps provide AR support with AR Quick Look.

  • AR Quick Look is feature rich and provides a premier augmented reality experience out of the box. Users instinctively know what to do.

  • AR Quick Look uses USDZ and Reality files.

  • You can use USDZ and Reality content on the web. Upload your file and create a stock-standard reference to it. Thanks to its built-in support, Safari’s smart enough to know it can view the content with AR Quick Look. It automatically gives the user that mind-blowing AR experience.

  • Need to add AR support to some of your existing apps? AR Quick Look has your back. Simply add your USDZ and Reality files to your existing project along with some sampled images. Then use QLPreviewController to do the heavy lifting for you.

Where to go from here?

Now that you know all there is to know about AR Quick Look, you might wonder how you create your own USDZ and Reality files. Well, continue to the next chapter to find out!

Have a technical question? Want to report a bug? You can ask questions and report bugs to the book authors in our official book forum here.
© 2024 Kodeco Inc.