Building your first Camera Kit App
Camera Kit brings Snap's cutting-edge augmented reality technology to your mobile and web applications.
In this tutorial, you'll learn how to build an iOS Camera Kit application that applies a Lens to the user's camera feed.
Prerequisites
- A Snapchat Developer Account and a Camera Kit application created on the Developer Portal
- Basic knowledge of Swift and UIKit (UIView, UIViewController)
- Familiarity with AVFoundation for camera management
- The latest version of Xcode installed
New to iOS development? Check out this community-curated list of resources or Apple's App Dev Tutorials.
Getting Started
Let's prepare our local development environment and add the Camera Kit SDK as a dependency using Swift Package Manager.
Download the project files above and open Starting Project/CameraKitBasicSample.xcodeproj
Select "File" → "Add Package Dependencies…".
Paste the following link into the search box: https://github.com/Snapchat/camera-kit-ios-sdk.
Press "Add Package" to begin adding Camera Kit to your project.
After Xcode finishes downloading the package, select SCSDKCameraKit, set everything else to "None", and press "Add Package".
Camera Kit is now added as a dependency to your project.
Initialize Camera Kit
We'll start by defining our credentials: API token, Lens ID, and Lens group ID. Then, we'll create and configure a Camera Kit session.
Open CameraViewController.swift and import the SCSDKCameraKit library. This library provides access to core Camera Kit functionality.
Configure your credentials: API token, Lens group ID and Lens ID
You can find these credentials by navigating to My Lenses and selecting your application. You can use the Staging API Token, the Demo Lens Group ID, and the ID of any Lens from that group.
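As a sketch, the credentials can live as constants on the view controller (the values below are placeholders; substitute the ones from your Developer Portal):

```swift
import SCSDKCameraKit
import UIKit

class CameraViewController: UIViewController {
    // Placeholder credentials: replace with the values from "My Lenses"
    // in the Snap Developer Portal.
    private let apiToken = "REPLACE_WITH_STAGING_API_TOKEN"
    private let lensGroupID = "REPLACE_WITH_LENS_GROUP_ID"
    private let lensID = "REPLACE_WITH_LENS_ID"
}
```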
Set up PreviewView
PreviewView is a UIView subclass that renders Camera Kit output and will be used to display the camera feed.
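A minimal setup might look like the following; the Session and SessionConfig initializers are taken from the Camera Kit iOS SDK's sample code, so verify the exact signatures against your SDK version:

```swift
// Create the Camera Kit session using the API token, plus a PreviewView
// that will render its output on screen.
private lazy var cameraKit: CameraKitProtocol = Session(
    sessionConfig: SessionConfig(apiToken: apiToken),
    lensesConfig: LensesConfig(),
    errorHandler: nil
)
private let previewView = PreviewView()

override func viewDidLoad() {
    super.viewDidLoad()
    previewView.frame = view.bounds
    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    view.addSubview(previewView)
}
```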
Create AVCaptureSession
AVCaptureSession manages the camera input, feeding the live video stream into Camera Kit.
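In this sample the capture session itself needs no manual configuration, since Camera Kit's AVSessionInput wrapper (used in the next step) attaches the camera device itself; this is an assumption based on the SDK's sample code:

```swift
import AVFoundation

// Raw capture session; Camera Kit's AVSessionInput will wrap it and
// attach the camera device, so no inputs are added by hand here.
private let captureSession = AVCaptureSession()
```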
Start Camera Kit Session
We'll connect all the components into a single system by implementing a start session helper method.
Begin implementing the startSession function by adding the PreviewView as output.
Configure the previously created PreviewView to respond to touch inputs and add it as the output, allowing the camera feed to be displayed with AR Lenses.
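Sketched with the API names used in the SDK's sample code (verify them against your SDK version):

```swift
private func startSession() {
    // Let Camera Kit handle touches so Lenses can respond to taps,
    // then register the view as a rendering output.
    previewView.automaticallyConfiguresTouchHandler = true
    cameraKit.add(output: previewView)
}
```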
Start Camera Kit session
AVSessionInput wraps the device's camera input, while ARSessionInput enables AR functionality. Passing both into Camera Kit starts the session, enabling the live camera feed and preparing it for AR experiences.
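Continuing the startSession helper, a hedged sketch (newer SDK versions may accept additional configuration parameters on start):

```swift
// Wrap the AVCaptureSession for Camera Kit and enable AR features,
// then start the session with both inputs.
let input = AVSessionInput(session: captureSession)
let arInput = ARSessionInput()
cameraKit.start(input: input, arInput: arInput)
```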
Start camera input
Ensure the camera input starts running on a background thread to avoid blocking the main thread.
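For example, assuming input is the AVSessionInput created in the previous step:

```swift
// startRunning() blocks while the capture pipeline spins up,
// so keep it off the main thread.
DispatchQueue.global(qos: .background).async {
    input.startRunning()
}
```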
Open Info.plist and add the privacy keys for camera and microphone permissions. You can also edit the Info.plist source directly and add the following:
<key>NSMicrophoneUsageDescription</key>
<string>Camera Kit Sample app uses your mic for lenses</string>
<key>NSCameraUsageDescription</key>
<string>Camera Kit Sample app uses your camera for lenses</string>
Applying a Lens
Now that you have the camera feed running, this section will guide you through applying the Lens to your live video feed.
Fetch the Lens
Add a Lens observer that will monitor updates for your specific Lens and group. This observer enables you to apply the Lens once it's ready for use.
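Registration might look like this, using the lens repository API from the SDK's sample code (verify the signature against your version):

```swift
// Observe a single Lens in a group; the repository calls back once the
// Lens becomes available (or fails to load).
cameraKit.lenses.repository.addObserver(
    self,
    specificLensID: lensID,
    inGroupID: lensGroupID
)
```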
Implement the Lens Observer
Create an observer that monitors the Lens repository for updates and handles any potential errors during the update process.
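A sketch of the conformance; the delegate method names follow the SDK's sample code, and applyLens(_:) is a hypothetical helper for the "Apply the Lens" step below:

```swift
extension CameraViewController: LensRepositorySpecificObserver {
    func repository(_ repository: LensRepository,
                    didUpdate lens: Lens,
                    forGroupID groupID: String) {
        // The Lens is ready: hand it off to be applied to the feed.
        applyLens(lens)
    }

    func repository(_ repository: LensRepository,
                    didFailToUpdateLensID lensID: String,
                    forGroupID groupID: String,
                    error: Error?) {
        print("Failed to load lens \(lensID): \(String(describing: error))")
    }
}
```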
Apply the Lens
When the Lens is ready, apply it to the live camera feed and handle any potential errors.
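One way to write this, assuming applyLens(_:) is a helper invoked by the observer and using the lens processor API from the SDK's sample code:

```swift
private func applyLens(_ lens: Lens) {
    // Apply the Lens to the live camera feed; the completion flag
    // reports whether it succeeded.
    cameraKit.lenses.processor?.apply(lens: lens, launchData: nil) { success in
        if success {
            print("Applied lens: \(lens.id)")
        } else {
            print("Failed to apply lens: \(lens.id)")
        }
    }
}
```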
Summary
You've now successfully applied a Lens to a live camera feed in your iOS app. This forms a strong foundation for your Camera Kit integration, allowing you to expand with features like a Lens carousel, video recording, switching between cameras, and more in the future.