Opaque Return Types and Type Erasure

Learn how to use opaque return types and type erasure to create intuitive and effective APIs, both as a library developer and as a consumer. By Tom Elliott.


In this tutorial, you’ll learn about using opaque return types and type erasure in your code. These are advanced concepts, more commonly used by library authors than by app developers. But even if you don’t plan on building a library anytime soon, understanding how and why these features are used will help you become a better app developer. :]

Getting Started

First, download the project materials by clicking the Download Materials button at the top or bottom of this tutorial.

Unlike most raywenderlich.com tutorials, the project materials for this tutorial contain two different projects. The first is a library called MagicImage, which provides a simple API for applying filters to images. The second is a project to demonstrate the library’s functionality, imaginatively called MagicImageDemo.

You won’t be opening the MagicImage library Xcode project in this tutorial — it’s embedded in the demo project for you already. Instead, open MagicImageDemo.xcodeproj in Xcode.

The demo project replicates the filter screen of photo-sharing apps like Instagram. During this tutorial, you’ll add new filters to the demo app. You’ll add new functionality to the underlying library, MagicImage, as well as fix issues in its current implementation.

Build and run the app. You’ll see an image of a butterfly and a filter bar below the image, containing a single filter called Normal.

The Starter Project

In Xcode, open Assets.xcassets. Then, open the Butterfly image. Note how the image appears true to life, with lush green plants in the background and a white butterfly in the foreground. Yet in the app, the image displays with a sepia filter applied.

Next, open FiltergramView.swift. This is the main view for the demo. The view itself has two sections stacked vertically: the image and the filter bar.

When the view appears, it calls loadImage(). This method loads the butterfly image and then transforms it into a new image by applying a sepia filter. The sepia image is then set as the image displayed in the app.
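In outline, loadImage() does something like the sketch below. The exact code in FiltergramView.swift may differ; the asset name and the image state property here are assumptions based on the behavior just described:

func loadImage() {
  // Load the bundled butterfly photo
  guard let inputImage = UIImage(named: "Butterfly") else { return }

  // Transform it into a new image by applying the sepia filter
  let uiImage = inputImage.apply(Sepia())

  // Store the result so the view displays it
  image = Image(uiImage: uiImage)
}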

The sepia filter is an instance of a type conforming to a protocol called MIFilter, and apply(_:) on UIImage is part of the MagicImage API. Both of these are discussed below.

[Image: The MIFilter protocol]

Sharing Secrets

You’ll need to think about the code in this tutorial from two points of view: as the author of a library used by other developers and as the author of an app that uses the library. To make things clear, the text will refer to three different personas throughout the tutorial:

  • 🤓 Liam the Library Author. He’s in charge of writing the MagicImage library.
  • 🦸‍♀️ Abbie the App Author. She uses the MagicImage library to enhance her latest app, Filtergram.
  • 🤖 And not forgetting Corinne the Compiler. Liam and Abbie need to listen to what she has to say!

The concepts you’ll learn in this tutorial can be quite hard to grasp. You might find it useful to think of type information as a secret shared among Liam, Abbie and Corinne. For each concept, a different mix of people is in on the secret.

Interesting Images

The MagicImage library removes the pain from applying filters to images. Under the hood, it uses Core Image. Core Image then uses Quartz, through another framework called Core Graphics, to actually render the images. Core Image and Core Graphics are complex frameworks that would each take more than one tutorial to cover in their own right.

Core Image provides its own image type, CIImage, and Core Graphics also provides its own image type, CGImage. These are separate from the SwiftUI Image type and the UIKit UIImage type. Confusing? Yes! This is why you need a library to make it easier. :]

Open MIFilter.swift. The MagicImage library exposes a protocol called MIFilter, which defines a method called apply(to:). This method applies a filter to an existing image: specifically, an MIImage.
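Based on that description and the filters you’ll add later in this tutorial, the protocol’s shape is roughly as follows. This is a sketch for orientation only; the definitive version lives in MIFilter.swift, and the Identifiable conformance is an assumption drawn from the id property the filters implement:

// A sketch of MIFilter, not necessarily the exact definition
public protocol MIFilter: Identifiable {
  // A human-readable name, shown in the filter bar
  var name: String { get }
  // Transforms an input image into a filtered output image
  func apply(to miImage: MIImage) -> MIImage
}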

Open MagicImage+UIImage.swift. First, note that MIImage is simply a typealias for CIImage. Second, note that this file adds an extension on UIImage with a single method: apply(_:). This method applies an MIFilter to the image before returning a new UIImage. You don’t need to understand exactly what this is doing, but if you’re curious, you can learn about Core Image from the official Apple documentation.
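To give a feel for what that bridging involves, here’s a minimal sketch of how a method like apply(_:) could work. This is illustrative only, not the library’s actual implementation; in particular, creating a fresh CIContext on every call is wasteful, and the real code may also handle scale and orientation:

import CoreImage
import UIKit

extension UIImage {
  // Sketch: apply a MagicImage filter to a UIImage
  public func applySketch<Filter: MIFilter>(_ filter: Filter) -> UIImage {
    // Bridge from UIKit to Core Image (MIImage is a typealias for CIImage)
    guard let inputImage = CIImage(image: self) else { return self }

    // Run the MagicImage filter
    let outputImage = filter.apply(to: inputImage)

    // Render the result back to a CGImage, then wrap it in a UIImage
    let context = CIContext()
    guard let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else {
      return self
    }
    return UIImage(cgImage: cgImage)
  }
}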

Note: You can also learn about Core Image in our video course on the subject.

Finally, switch back to MIFilter.swift. After the protocol definition, this file defines three filters. The first, IdentityFilter, is pretty boring. It simply returns the original image without any modification. Sepia uses the sepia tone CIFilter to apply a sepia effect to images. Finally, Compose is a different kind of filter that takes two input filters and composes (combines) them into one.
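To see how a composing filter might be put together, here’s a sketch. It uses generic parameters instead of storing MIFilter values directly, because a protocol with an associated type (which MIFilter picks up if it conforms to Identifiable) can’t simply be stored as a plain existential in older Swift versions. The actual Compose in MIFilter.swift may be structured differently:

// Sketch of a composing filter; generics preserve the concrete filter types
public struct ComposeSketch<First: MIFilter, Second: MIFilter>: MIFilter {
  public var name: String { "\(first.name) + \(second.name)" }
  public var id: String { self.name }

  let first: First
  let second: Second

  public init(_ first: First, _ second: Second) {
    self.first = first
    self.second = second
  }

  public func apply(to miImage: MIImage) -> MIImage {
    // Feed the output of the first filter into the second
    second.apply(to: first.apply(to: miImage))
  }
}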

Protocols Primer

As a quick recap: Protocols define a contract other entities must fulfill when adopting the protocol. For example, the Equatable protocol from the Swift standard library requires any conforming type to implement the type method ==:

static func == (lhs: Self, rhs: Self) -> Bool

The standard library then provides != for free as the negation of ==, so you rarely need to write it yourself.
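For instance, a simple struct can fulfill that contract like so. (For structs whose stored properties are all Equatable, Corinne can synthesize == automatically, but writing it out shows what the contract demands.)

struct Pixel: Equatable {
  let red, green, blue: Int

  // Two pixels are equal when all their components match
  static func == (lhs: Pixel, rhs: Pixel) -> Bool {
    lhs.red == rhs.red && lhs.green == rhs.green && lhs.blue == rhs.blue
  }
}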

Protocols are useful because they let developers hide a type’s specific implementation from others. If a function in MagicImage returns a protocol, you can model this as Liam knowing the secret of the returned concrete type while Abbie and Corinne don’t (see the example after the list below).

Who knows the secret?

  • 🤓 Liam the library author ✅
  • 🤖 Corinne the Compiler ❌
  • 🦸‍♀️ Abbie the App Author ❌
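A standalone example makes this concrete. It uses hypothetical Shape types, not anything from MagicImage: the function’s author knows the concrete return type, while callers and the compiler see only the protocol:

protocol Shape {
  func area() -> Double
}

struct Circle: Shape {
  let radius: Double
  func area() -> Double { .pi * radius * radius }
}

// The author's secret: the concrete type is Circle
func makeShape() -> Shape {
  Circle(radius: 1)
}

// Callers only know they have a Shape
let shape = makeShape()
print(shape.area())  // 3.141592653589793
// shape.radius      // Error: value of type 'Shape' has no member 'radius'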
Note: To learn more about protocols, you can read our tutorial on protocol-oriented programming.

Fun With Filters

Currently, MagicImage provides only three filters, and two of them are pretty boring. Time to fix that! :] Open MIFilter.swift.

Under the definition for the Sepia filter, add the following code:

// 1
public struct Olde: MIFilter {
  // 2
  public var name = "Olde"
  public var id: String { self.name }

  public init() {}

  // 3
  public func apply(to miImage: MIImage) -> MIImage {
    let ciFilter = CIFilter.sepiaTone()
    ciFilter.inputImage = miImage
    ciFilter.intensity = 0.75

    // Create a CIImage from our filter
    guard let outputImage = ciFilter.outputImage else {
      return miImage
    }

    return outputImage
  }
}

// 4
public struct Posterize: MIFilter {
  public var name = "Posterize"
  public var id: String { self.name }

  public init() {}

  public func apply(to miImage: MIImage) -> MIImage {
    let ciFilter = CIFilter.colorPosterize()
    ciFilter.inputImage = miImage
    ciFilter.levels = 10

    guard let outputImage = ciFilter.outputImage else {
      return miImage
    }

    return outputImage
  }
}

public struct Crystallize: MIFilter {
  public var name = "Crystallize"
  public var id: String { self.name }

  public init() {}

  public func apply(to miImage: MIImage) -> MIImage {
    let ciFilter = CIFilter.crystallize()
    ciFilter.inputImage = miImage
    ciFilter.radius = 50

    guard let outputImage = ciFilter.outputImage else {
      return miImage
    }

    return outputImage
  }
}

// 5
public struct HorizontalFlip: MIFilter {
  public var name = "Flip Horizontal"
  public var id: String { self.name }

  public init() {}

  public func apply(to miImage: MIImage) -> MIImage {
    return miImage.oriented(.upMirrored)
  }
}

This looks like a lot of code, but much of it is quite similar. Here’s a breakdown:

  1. Define a new filter called Olde, which conforms to the MIFilter protocol.
  2. Give the filter a name and an ID. In this case, you use the name as the ID.
  3. Write the apply method. This method creates a Core Image sepiaTone filter and applies the filter to the provided MIImage before returning the output from the filter.
  4. The Posterize and Crystallize filters follow the same pattern, albeit with different Core Image filters.
  5. Finally, the HorizontalFlip filter uses the built-in oriented(_:) method of CIImage (remember, MIImage is just a typealias for CIImage) to create a filter that flips the input image horizontally.
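As an aside, this one-liner pattern extends to other orientations. A hypothetical vertical flip, shown purely as an illustration and not part of the tutorial’s filter set, would differ only in the orientation value:

// Hypothetical filter, for illustration only
public struct VerticalFlip: MIFilter {
  public var name = "Flip Vertical"
  public var id: String { self.name }

  public init() {}

  public func apply(to miImage: MIImage) -> MIImage {
    // .downMirrored flips the image top to bottom
    miImage.oriented(.downMirrored)
  }
}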

Now that you’ve added some filters, it’s time to give them a whirl in the demo app.

Open FiltergramView.swift. In loadImage(), replace the use of Sepia() with your new filters, one at a time. Build and run the app each time and marvel at the beauty of your new filters! :]

// One at a time!
let uiImage = inputImage.apply(Olde())
let uiImage = inputImage.apply(Crystallize())
let uiImage = inputImage.apply(Posterize())
let uiImage = inputImage.apply(HorizontalFlip())

[Image: Butterfly with background blurred]

Finish this section by updating the filter in loadImage() with selectedFilter, like so:

let uiImage = inputImage.apply(selectedFilter)

The app now applies the filter stored in FiltergramView’s state to the input image, rather than a hard-coded filter.