Apple WWDC20 Swift Student Challenge
Potato Mouth AR Audio Visualizer
🏆 Selected by Apple
Overview
Potato Mouth is an AR project featuring a live voice-animated potato character. After a subtitled spoken introduction, the user may plant the potato in a patch of soil, watch it quickly grow, and sprout into the air. The potato's mouth is then animated by the fundamental frequency of the microphone input, so that vocal inflection drives the animation heuristic. Controls for input sensitivity and for freezing the potato's position are overlaid at the bottom of the AR view.
Features
AR Environment
ARKit, SceneKit, CGMutablePath, SwiftUI, DispatchQueue
- A user may tap on any detected horizontal plane, displayed as "soil," to plant the potato in a desired location.
- The potato's face is drawn with CGMutablePath and rendered in SceneKit, so its geometry can be modified at runtime for animation.
- The potato can be toggled to face the camera, or freeze in its current position.
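The tap-to-plant interaction can be sketched with ARKit's raycasting API. This is a minimal illustration, not the project's actual code; `sceneView` and `potatoNode` are assumed names for the AR view and the loaded potato model.

```swift
import ARKit
import SceneKit

// Sketch: plant the potato on a detected horizontal plane at the tapped point.
// Assumes `sceneView` (ARSCNView) and `potatoNode` (SCNNode) exist elsewhere.
final class PlantingHandler {
    let sceneView: ARSCNView
    let potatoNode: SCNNode

    init(sceneView: ARSCNView, potatoNode: SCNNode) {
        self.sceneView = sceneView
        self.potatoNode = potatoNode
    }

    @objc func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: sceneView)
        // Raycast from the tap against detected horizontal planes (the "soil").
        guard let query = sceneView.raycastQuery(from: point,
                                                 allowing: .existingPlaneGeometry,
                                                 alignment: .horizontal),
              let result = sceneView.session.raycast(query).first else { return }
        // Move the potato to the hit location and add it to the scene.
        potatoNode.simdTransform = result.worldTransform
        sceneView.scene.rootNode.addChildNode(potatoNode)
    }
}
```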
Mouth Animation
AVCaptureSession, vDSP FFT, Combine, DispatchQueue
- A custom audio engine detects the fundamental frequency of the microphone input.
- The fundamental frequency is converted to the nearest note in the 12-tone musical scale, and its accuracy relative to the note's frequency is calculated.
- When the microphone input is above the input threshold, the potato's mouth opens wider the closer the detected note is to being in tune.
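The note-matching step above can be sketched as a pitch-to-nearest-note conversion in 12-tone equal temperament, with the deviation in cents driving the mouth opening. The function names and the 50-cent falloff are illustrative assumptions, not the project's actual heuristic.

```swift
import Foundation

// Sketch: map a detected fundamental frequency to the nearest 12-TET note
// and a deviation in cents (100 cents = 1 semitone), relative to A4 = 440 Hz.
func nearestNote(for frequency: Double, a4: Double = 440.0) -> (midiNote: Int, cents: Double) {
    // Semitones above or below A4 (MIDI note 69), possibly fractional.
    let semitones = 12.0 * log2(frequency / a4)
    let nearest = semitones.rounded()
    let midiNote = 69 + Int(nearest)
    let cents = (semitones - nearest) * 100.0
    return (midiNote, cents)
}

// Illustrative mapping: mouth fully open when dead in tune (0 cents),
// closed at a quarter tone (50 cents) or more off.
func mouthOpening(cents: Double) -> Double {
    max(0.0, 1.0 - abs(cents) / 50.0)
}

// Example: 440 Hz is exactly A4, so the deviation is 0 cents.
let (note, cents) = nearestNote(for: 440.0)  // (69, 0.0)
let opening = mouthOpening(cents: cents)     // 1.0
```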
Subtitled Speech Synthesizer
AVSpeechSynthesizer, Combine, SwiftUI
- AVSpeechSynthesizer speaks the most recent update to an observed string.
- A SwiftUI view also observes the spoken string to display subtitles during speech.
- The next string can be queued without interrupting the current utterance; it is spoken automatically once the current utterance finishes.
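The non-interrupting queue behavior can be sketched with an `AVSpeechSynthesizerDelegate`: a pending string is held while speech is in progress and spoken when the current utterance finishes. The class and property names here are assumptions for illustration, not the project's actual types.

```swift
import AVFoundation

// Sketch: speak strings without interrupting in-progress speech.
// The most recently queued string replaces any earlier pending one.
final class SubtitledSpeaker: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()
    private var pending: String?

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String) {
        if synthesizer.isSpeaking {
            pending = text  // queue; do not interrupt the current utterance
        } else {
            synthesizer.speak(AVSpeechUtterance(string: text))
        }
    }

    // Delegate callback: the current utterance finished, so speak the
    // queued string, if any.
    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer,
                           didFinish utterance: AVSpeechUtterance) {
        if let next = pending {
            pending = nil
            self.synthesizer.speak(AVSpeechUtterance(string: next))
        }
    }
}
```

A SwiftUI subtitle view could observe the same spoken string (for example via a `@Published` property on an `ObservableObject` wrapping this class) to render subtitles while the utterance plays.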
Demo Video