New Scanning and Text Capabilities with VisionKit

VisionKit comes with new scanning and text capabilities. In this tutorial, you’ll learn how to use Apple’s latest VisionKit update to take advantage of them. By Warren Burton.


Providing User Feedback

When you scanned the book barcode and a text fragment, you probably weren’t sure whether the item saved. In this section, you’ll provide some haptic feedback to the user when they save an item.

Providing Haptic Feedback

Haptic feedback is the vibration your device produces in response to an action. You’ll use the Core Haptics framework to generate these vibrations.

Add this import at the top of ScannerViewController.swift, below import SwiftUI:

import CoreHaptics

Then, add this property to the top of ScannerViewController:

let hapticEngine: CHHapticEngine? = {
  do {
    let engine = try CHHapticEngine()
    // Stop the engine once all pattern players finish playing.
    engine.notifyWhenPlayersFinished { _ in
      return .stopEngine
    }
    return engine
  } catch {
    print("haptics are not working - because \(error)")
    return nil
  }
}()

Here, you create a CHHapticEngine and configure it to stop running once all patterns have finished playing, by returning .stopEngine from notifyWhenPlayersFinished. The documentation recommends stopping the engine when it’s not in use.
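The system can also reset the engine, for example after an audio interruption. As a minimal sketch, assuming you want the engine to recover on its own, you could assign a resetHandler inside the do block above:

// Hypothetical addition: restart the engine after a system-initiated
// reset, such as an audio interruption. [weak engine] avoids a
// retain cycle between the engine and its own handler.
engine.resetHandler = { [weak engine] in
  try? engine?.start()
}

You’ll also start the engine before playing each pattern later in this tutorial, so treat this handler as optional hardening rather than a requirement.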

Add this extension to ScannerViewController.swift:

extension ScannerViewController {
  func hapticPattern() throws -> CHHapticPattern {
    let events = [
      // First tap.
      CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [],
        relativeTime: 0,
        duration: 0.25
      ),
      // Second tap, 0.25 seconds after the first.
      CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [],
        relativeTime: 0.25,
        duration: 0.5
      )
    ]
    let pattern = try CHHapticPattern(events: events, parameters: [])
    return pattern
  }

  func playHapticClick() {
    // Do nothing if the engine failed to initialize.
    guard let hapticEngine else {
      return
    }

    // Haptics are only available on iPhone.
    guard UIDevice.current.userInterfaceIdiom == .phone else {
      return
    }

    do {
      try hapticEngine.start()
      let pattern = try hapticPattern()
      let player = try hapticEngine.makePlayer(with: pattern)
      try player.start(atTime: 0)
    } catch {
      print("haptics are not working - because \(error)")
    }
  }
}

In hapticPattern, you build a CHHapticPattern that describes a double-tap pattern. CHHapticPattern has a rich API that’s worth exploring beyond this tutorial.
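For instance, each event can carry parameters that shape how it feels. This hypothetical softHapticPattern, with illustrative names and values, produces a single, gentler tap by lowering the intensity and sharpness:

// Hypothetical example: a single, softer tap. The parameter
// values are illustrative; experiment to find a feel you like.
func softHapticPattern() throws -> CHHapticPattern {
  let intensity = CHHapticEventParameter(
    parameterID: .hapticIntensity,
    value: 0.5
  )
  let sharpness = CHHapticEventParameter(
    parameterID: .hapticSharpness,
    value: 0.3
  )
  let event = CHHapticEvent(
    eventType: .hapticTransient,
    parameters: [intensity, sharpness],
    relativeTime: 0
  )
  return try CHHapticPattern(events: [event], parameters: [])
}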

playHapticClick plays your hapticPattern. Haptics are only available on iPhone, so on iPad, you return early and do nothing. You’ll soon do something else for iPad.
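If you’d rather ask the hardware directly instead of checking the device idiom, Core Haptics can report its own capabilities. A minimal alternative guard, which you could swap in for the idiom check, might look like this:

// Alternative check: ask Core Haptics whether this device
// supports haptic playback at all.
guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else {
  return
}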

You start CHHapticEngine just before you play the pattern. This pairs with the .stopEngine value you returned from notifyWhenPlayersFinished earlier: the engine runs only while a pattern is playing.

Finally, locate extension ScannerViewController: DataScannerViewControllerDelegate and add this line at the end of dataScanner(_:didTapOn:):

playHapticClick()

Build and run to feel the haptic pattern when you tap a recognized item. In the next section, you’ll add a recognition sound for people using an iPad.

Adding a Feedback Sound

To play a sound, you need a sound file. You could make your own, but for this tutorial, you’ll use a sound that’s included in the starter project.

Go to the top of ScannerViewController.swift and add this import below the other imports:

import AVFoundation

Add this property inside ScannerViewController. Storing the player in a property keeps it alive until the sound finishes playing:

var feedbackPlayer: AVAudioPlayer?

Finally, add this extension to ScannerViewController.swift:

extension ScannerViewController {
  func playFeedbackSound() {
    // Locate the sound file bundled with the starter project.
    guard let url = Bundle.main.url(
      forResource: "WAV_Jinja",
      withExtension: "wav"
    ) else {
      return
    }
    do {
      // Assign the player to the property so it lives long
      // enough to finish playing.
      feedbackPlayer = try AVAudioPlayer(contentsOf: url)
      feedbackPlayer?.play()
    } catch {
      print("Error playing sound - \(error)!")
    }
  }
}

Inside dataScanner(_:didTapOn:), below playHapticClick(), add this call:

playFeedbackSound()
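With both calls in place, the end of dataScanner(_:didTapOn:) should look roughly like this. The item-handling code above the calls comes from your project, so treat this as a sketch rather than a literal listing:

func dataScanner(
  _ dataScanner: DataScannerViewController,
  didTapOn item: RecognizedItem
) {
  // ... your existing code that saves the tapped item ...
  playHapticClick()
  playFeedbackSound()
}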

Build and run. Make sure the device isn’t muted and the volume isn’t at zero. When you tap a recognized item, a sound will ring out.

Congratulations! You made a barcode and text scanner focused on students or librarians who want to collect ISBNs and text fragments.

That’s all the material for this tutorial, but you can browse the documentation for DataScannerViewController to see other elements of this API.

In this tutorial, you used a UIKit-based project. If you want to use DataScannerViewController in a SwiftUI project, you’ll need to host it in a SwiftUI view that conforms to UIViewControllerRepresentable; you’ll find a brief sketch after the list below. UIViewControllerRepresentable is the mirror API of UIHostingController.

  • Use SwiftUI views in UIKit with UIHostingController.
  • Use UIKit view controllers in SwiftUI with UIViewControllerRepresentable.
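
Here’s a minimal sketch of such a wrapper. The scanner options, type names and delegate handling are illustrative, and a real app should confirm DataScannerViewController.isSupported and isAvailable before scanning:

import SwiftUI
import VisionKit

// A sketch of a SwiftUI wrapper for DataScannerViewController.
// DataScannerView and its configuration are illustrative choices.
struct DataScannerView: UIViewControllerRepresentable {
  func makeUIViewController(context: Context) -> DataScannerViewController {
    let scanner = DataScannerViewController(
      recognizedDataTypes: [.barcode(), .text()],
      isHighlightingEnabled: true
    )
    scanner.delegate = context.coordinator
    // In a real app, check DataScannerViewController.isSupported
    // and .isAvailable before starting to scan.
    try? scanner.startScanning()
    return scanner
  }

  func updateUIViewController(
    _ uiViewController: DataScannerViewController,
    context: Context
  ) {}

  func makeCoordinator() -> Coordinator {
    Coordinator()
  }

  class Coordinator: NSObject, DataScannerViewControllerDelegate {
    func dataScanner(
      _ dataScanner: DataScannerViewController,
      didTapOn item: RecognizedItem
    ) {
      // Handle the tapped item here.
    }
  }
}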

A deeper treatment of UIViewControllerRepresentable is out of scope for this tutorial, but everything you’ve learned about DataScannerViewController will apply when you implement it.

You can download the completed project using the Download Materials link at the top or bottom of the tutorial.

In this tutorial, you learned how to:

  • Start and stop DataScannerViewController.
  • Work with the data structures that DataScannerViewController provides.
  • Integrate SwiftUI views in UIKit components.
  • Work with camera hardware availability and user permissions.
  • Use haptics and sound to provide user feedback.

DataScannerViewController opens up a world of interaction with physical objects and printed text, at minimal development cost.

Have some fun and create a game, or use it to hide information in plain sight. A QR code can hold up to 7,000 characters, depending on its size. If you encrypt the data, only people with the key can read it. That’s an information channel that works even when internet access is blocked, unavailable or insecure.

Please share what you develop in the forum for this tutorial using the link below. I look forward to seeing what you do!