So, what's a QR code? I believe most of you know what a QR code is. In case you haven't heard of it, just take a look at the image above: that's a QR code.
QR (short for Quick Response) code is a kind of two-dimensional barcode developed by Denso. Originally designed for tracking parts in manufacturing, the QR code has gained popularity in the consumer space in recent years as a way to encode the URL of a landing page or marketing information. Unlike the basic barcode you're familiar with, a QR code contains information in both the horizontal and vertical directions, which lets it store a much larger amount of data in both numeric and alphabetic form. I don't want to go into the technical details of the QR code here. If you're interested in learning more, you can check out the official website of QR code.
As an iOS developer, you may wonder how you can empower your app to read QR codes. Previously, I wrote a tutorial on building a QR code reader using UIKit and AVFoundation. With the release of SwiftUI, let's see how the same QR code scanner app can be implemented using this new UI framework.
Take a Quick Look at the QR Code Scanner App
The demo app that we're going to build is fairly simple and straightforward. Before we proceed to build it, however, it's important to understand that all kinds of barcode scanning in iOS, including QR code scanning, are entirely based on video capture. Keep this point in mind, as it will help you understand this tutorial.
So, how does the demo app work?
Take a look at the screenshot below to see how the app UI looks. The app works pretty much like a video capturing app, but without the recording feature. When the app is launched, it uses the iPhone's rear camera to spot a QR code and decodes it automatically. The decoded information (e.g. a URL) is displayed right at the bottom of the screen.

Now that you understand how the demo app works, let's get started and develop the QR code reader app in SwiftUI.
Building the QRScannerController Class
The SwiftUI framework doesn't come with a built-in API for launching the camera. To use the device's camera, we need to use UIKit to build a view controller for capturing video, and then make use of `UIViewControllerRepresentable` to add the view controller to the SwiftUI project.
Assuming you've created a new SwiftUI project in Xcode, let's first create a new Swift file named `QRScanner.swift`. In the file, import both the SwiftUI and AVFoundation frameworks:
```swift
import SwiftUI
import AVFoundation
```
Next, implement a new class called `QRScannerController` like this:
```swift
class QRScannerController: UIViewController {
    var captureSession = AVCaptureSession()
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?
    var qrCodeFrameView: UIView?

    var delegate: AVCaptureMetadataOutputObjectsDelegate?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Get the back-facing camera for capturing videos
        guard let captureDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
            print("Failed to get the camera device")
            return
        }

        let videoInput: AVCaptureDeviceInput

        do {
            // Get an instance of the AVCaptureDeviceInput class using the previous device object.
            videoInput = try AVCaptureDeviceInput(device: captureDevice)
        } catch {
            // If any error occurs, simply print it out and don't continue any more.
            print(error)
            return
        }

        // Set the input device on the capture session.
        captureSession.addInput(videoInput)

        // Initialize an AVCaptureMetadataOutput object and set it as the output device of the capture session.
        let captureMetadataOutput = AVCaptureMetadataOutput()
        captureSession.addOutput(captureMetadataOutput)

        // Set the delegate and use the default dispatch queue to execute the callback
        captureMetadataOutput.setMetadataObjectsDelegate(delegate, queue: DispatchQueue.main)
        captureMetadataOutput.metadataObjectTypes = [ .qr ]

        // Initialize the video preview layer and add it as a sublayer to the view's layer.
        videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoPreviewLayer?.frame = view.layer.bounds
        view.layer.addSublayer(videoPreviewLayer!)

        // Start video capture.
        DispatchQueue.global(qos: .background).async {
            self.captureSession.startRunning()
        }
    }
}
```
If you've read the previous tutorial, you should understand how the code works. Anyway, let me quickly walk you through the code again. As mentioned in the previous section, QR code scanning is based on video capture. To perform a real-time capture, all we need to do is:

- Look up the back camera device.
- Set the input of the `AVCaptureSession` object to the appropriate `AVCaptureDevice` for video capturing.
So, in the `viewDidLoad` method, we initialize the back camera using `AVCaptureDevice`. Next, we create an instance of `AVCaptureDeviceInput` using the camera device. The input device is then added to the `captureSession` object. An instance of `AVCaptureMetadataOutput` is created and added to the same session object as the output of the capture session.
We also set the delegate object (`AVCaptureMetadataOutputObjectsDelegate`) for processing the QR code. When QR codes are captured from the receiver's connection, they are vended to the delegate object. We haven't implemented this delegate object yet; we will do that later.
The `metadataObjectTypes` property is used to specify what kinds of metadata we are interested in. The value `.qr` clearly indicates that we just want to do QR code scanning.
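As a side note, `AVCaptureMetadataOutput` can recognize other barcode formats too. If you wanted to extend the scanner beyond QR codes, you could pass additional object types; this is just an optional tweak, not part of the demo app:

```swift
// Optional: scan common 1D barcodes in addition to QR codes.
captureMetadataOutput.metadataObjectTypes = [ .qr, .ean8, .ean13, .code128 ]
```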
The last few lines of the code above create the video preview layer and add it as a sublayer to the view's layer. This displays the video captured by the device's camera on screen.
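One thing to note: the preview layer's frame is set only once in `viewDidLoad`. If you find that the preview doesn't fill the screen after rotation or other layout changes, one possible tweak (not part of the original code) is to keep the frame in sync in `viewDidLayoutSubviews`:

```swift
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()

    // Keep the preview layer sized to the view as the layout changes.
    videoPreviewLayer?.frame = view.layer.bounds
}
```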
Integrating QRScannerController with SwiftUI
Now that we've prepared the view controller for capturing video and scanning QR codes, how can we integrate it with our SwiftUI project? SwiftUI provides a protocol called `UIViewControllerRepresentable` for creating and managing a `UIViewController` object.
In the same file, let's create a struct named `QRScanner` that adopts the protocol:
```swift
struct QRScanner: UIViewControllerRepresentable {

    func makeUIViewController(context: Context) -> QRScannerController {
        let controller = QRScannerController()
        return controller
    }

    func updateUIViewController(_ uiViewController: QRScannerController, context: Context) {
    }
}
```
We implement the two required methods of the `UIViewControllerRepresentable` protocol. In the `makeUIViewController` method, we return an instance of `QRScannerController`. Since we don't need to update the state of the view controller, the `updateUIViewController` method is left empty.
This is how you use a `UIViewController` object in a SwiftUI project.
Using QRScanner
Now let's switch over to `ContentView.swift` and use the `QRScanner` struct we just created. All you need to do is initialize it in the `body` part of `ContentView`:
```swift
struct ContentView: View {
    @State var scanResult = "No QR code detected"

    var body: some View {
        ZStack(alignment: .bottom) {
            QRScanner()

            Text(scanResult)
                .padding()
                .background(.black)
                .foregroundColor(.white)
                .padding(.bottom)
        }
    }
}
```
I also added a text label for displaying the result of the QR scan. In the simulator, the app only displays the text label. Later, if you run the app on a real device (iPhone/iPad), it should start the built-in camera.

Before you can successfully launch the app, you have to add a key named `NSCameraUsageDescription` to the Info.plist file. In the project navigator, select the project file and go to the Info section. Add a new row, set the key to Privacy - Camera Usage Description, and set its value to something like We need to access your camera for scanning QR codes.
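With the usage description in place, iOS prompts the user for camera access the first time the capture session starts. If you would rather check the authorization status explicitly before starting the session, a minimal sketch (an optional addition, not required by this tutorial) could look like this:

```swift
import AVFoundation

// Optional: request camera access explicitly before starting the capture session.
func requestCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video) { granted in
            DispatchQueue.main.async {
                completion(granted)
            }
        }
    default:
        // .denied or .restricted: the camera isn't available to the app.
        completion(false)
    }
}
```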

If you run the app now, it should automatically access the built-in camera and start capturing video. However, the QR code scanning doesn't work yet.
Handling Scan Results
In `ContentView`, we have a state variable to store the scan result. The question is: how can `QRScanner` (or `QRScannerController`) pass the decoded information of the QR code back to `ContentView`?
If you recall, we haven't implemented the delegate (i.e. the instance of `AVCaptureMetadataOutputObjectsDelegate`) for processing the QR code. The following delegate method of `AVCaptureMetadataOutputObjectsDelegate` needs to be implemented:
```swift
optional func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection)
```
The job of this delegate is to retrieve the decoded information and pass it back to the SwiftUI app. To exchange data between the view controller object and the SwiftUI interface, we need to provide a `Coordinator` instance, which also adopts the `AVCaptureMetadataOutputObjectsDelegate` protocol, to handle these interactions.
First, declare a binding in `QRScanner`:
```swift
@Binding var result: String
```
Next, insert the following code in `QRScanner` to set up the `Coordinator` class:
```swift
class Coordinator: NSObject, AVCaptureMetadataOutputObjectsDelegate {

    @Binding var scanResult: String

    init(_ scanResult: Binding<String>) {
        self._scanResult = scanResult
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {

        // Check that the metadataObjects array contains at least one object.
        if metadataObjects.count == 0 {
            scanResult = "No QR code detected"
            return
        }

        // Get the metadata object.
        let metadataObj = metadataObjects[0] as! AVMetadataMachineReadableCodeObject

        if metadataObj.type == AVMetadataObject.ObjectType.qr,
           let result = metadataObj.stringValue {
            scanResult = result
            print(scanResult)
        }
    }
}
```
The class has a binding for updating the scan result. This is how we pass the scan result back to the SwiftUI views.
To process the scan result of QR codes, we also implement the `metadataOutput` method. The second parameter (i.e. `metadataObjects`) of the method is an array containing all the metadata objects that have been read. The very first thing we need to do is make sure this array contains at least one object. Otherwise, we set the value of `scanResult` to No QR code detected.
If a metadata object is found, we check whether it is a QR code and decode the embedded data. The decoded information can be accessed through the `stringValue` property of an `AVMetadataMachineReadableCodeObject`.
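If you prefer to avoid the force cast in the code above, a slightly safer variant (a minor stylistic change, not part of the original code) is to bail out when the cast fails:

```swift
// A safer alternative to the force cast:
guard let metadataObj = metadataObjects.first as? AVMetadataMachineReadableCodeObject else {
    return
}
```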
Once the `Coordinator` class is ready, insert the following method in `QRScanner` to create the `Coordinator` instance:
```swift
func makeCoordinator() -> Coordinator {
    Coordinator($result)
}
```
Also, update the `makeUIViewController` method as shown below. We have to assign the coordinator object to the controller's `delegate`:
```swift
func makeUIViewController(context: Context) -> QRScannerController {
    let controller = QRScannerController()
    controller.delegate = context.coordinator

    return controller
}
```
The project is almost complete. Now switch back to `ContentView.swift` and update `QRScanner()` as shown below to pass the scan result:
```swift
QRScanner(result: $scanResult)
```
That's it! You're ready to go! Hit the Run button to compile and test the app on a real device.
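For reference, after this last change the `ContentView` should look something like the sketch below, assembled from the pieces covered above:

```swift
struct ContentView: View {
    @State var scanResult = "No QR code detected"

    var body: some View {
        ZStack(alignment: .bottom) {
            // The camera view that scans for QR codes and writes the result into scanResult.
            QRScanner(result: $scanResult)

            // The decoded information is shown at the bottom of the screen.
            Text(scanResult)
                .padding()
                .background(.black)
                .foregroundColor(.white)
                .padding(.bottom)
        }
    }
}
```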