The goal of this project was to prototype an app that controls an iPhone X with face gestures. It is written in Swift and built in Xcode. The app will only run on an iPhone X because Apple's TrueDepth camera is the hardware that makes it possible. The code uses Apple's ARKit library to access the 3D face model generated by the TrueDepth camera, which is how face gestures are recognized under the hood. Specifically, this app recognizes face gestures through the FaceTrigger class written by GitHub user barnaclejive. The FaceTrigger repository can be found here: https://github.com/barnaclejive/FaceTrigger
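Under the hood, ARKit exposes the TrueDepth face model as an `ARFaceAnchor` whose `blendShapes` dictionary maps facial features to coefficients between 0.0 and 1.0; gesture recognition amounts to thresholding those coefficients. Here is a minimal sketch of that idea (the class name, threshold value, and smile example are illustrative, not taken from FaceTrigger's source):

```swift
import ARKit

// Sketch: detect a smile by thresholding ARKit blend-shape coefficients.
// The 0.5 threshold is an illustrative choice, not FaceTrigger's actual value.
class FaceGestureDetector: NSObject, ARSCNViewDelegate {

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }

        // Blend-shape coefficients range from 0.0 (neutral) to 1.0 (maximum).
        let smileLeft = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
        let smileRight = faceAnchor.blendShapes[.mouthSmileRight]?.floatValue ?? 0

        if smileLeft > 0.5 && smileRight > 0.5 {
            // A smile was recognized; hand off to the UI on the main thread.
            DispatchQueue.main.async {
                // ... trigger whatever element is assigned to "smile" ...
            }
        }
    }
}
```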
The novelty in this app is the interface that the face gestures control. It's a simple app with only two tabs: one is a "busy box" of UI elements controlled by face gestures, and the other lets users pick which gestures control which elements.
All source code files for the app are contained in the `FaceGesturesPrototype` folder. Within that folder are an MIT license (this code is open source), a podspec file for FaceTrigger, and three other folders: `FaceGesturesPrototype.xcworkspace`, `FaceTrigger`, and `FaceTriggerExample`.
`FaceGesturesPrototype.xcworkspace` is the Xcode workspace of this project. To run this code on an iPhone X, all one really needs to do is clone this repo onto a local machine and then double-click `FaceGesturesPrototype.xcworkspace`. That opens Xcode, from which the app can be built and run.
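The steps above can be done from a terminal as well; the clone URL below is a placeholder for wherever this repo is hosted:

```shell
# Clone the repo (replace <repo-url> with this repository's actual URL)
git clone <repo-url>
cd FaceGesturesPrototype

# Open the workspace in Xcode (equivalent to double-clicking it in Finder)
open FaceGesturesPrototype.xcworkspace
```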
The `FaceTrigger` folder contains the source code files for FaceTrigger. The `FaceTriggerExample` folder contains all the source code written specifically for this project:
- `FaceGesturesPrototype/FaceTriggerExample/FaceTriggerExample/ViewController.swift` contains all the logic for gesture recognition and UI manipulation.
- `FaceGesturesPrototype/FaceTriggerExample/FaceTriggerExample/AssignmentViewController.swift` contains all the logic for selecting which gestures get hooked up to which elements.
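A natural way to wire user-assignable gestures to UI elements is a lookup table from gesture names to actions, populated by the assignment screen and consulted by the gesture callbacks. The sketch below assumes FaceTrigger delivers one delegate callback per gesture; the real delegate protocol and method names (here `onSmile`, `onBlink`) should be checked against the FaceTrigger README:

```swift
import UIKit

// Sketch of routing FaceTrigger-style callbacks to user-assigned UI actions.
// The callback names below are assumptions, not FaceTrigger's actual API.
class GestureViewController: UIViewController {

    // Maps a gesture name to the UI action the user assigned to it
    // (the assignments themselves would come from AssignmentViewController).
    var gestureActions: [String: () -> Void] = [:]

    func handleGesture(named name: String) {
        // Look up and run the action assigned to this gesture, if any.
        gestureActions[name]?()
    }
}

// Assumed per-gesture callbacks forwarding into the lookup table.
extension GestureViewController /* : FaceTriggerDelegate */ {
    func onSmile() { handleGesture(named: "smile") }
    func onBlink() { handleGesture(named: "blink") }
}
```

This keeps the gesture-recognition side decoupled from the UI side: reassigning a gesture only mutates the `gestureActions` dictionary.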
Pretty much everything else in this repo is boilerplate generated by Xcode.