LipsSync is an iOS project for collecting audio recordings together with corresponding lip aperture data, tracked in real time using ARKit and the iPhone's Face ID (TrueDepth) camera. The app captures both the sound and the lip movement (aperture) throughout a recording session.
- Audio Recording: Records high-quality audio during the session.
- Lip Aperture Tracking: Tracks the mouth's aperture in real time using Face ID and ARKit.
- Real-Time Data Collection: Provides synchronized data for both audio and lip aperture over time.
- Face ID Integration: Uses the iPhone's Face ID (TrueDepth) hardware for precise lip-movement tracking (see the sketch after this list).
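As a reference point, here is a minimal sketch of how a lip-aperture value can be read from ARKit face tracking. The use of the `jawOpen` blend shape is an assumption for illustration; the project may derive aperture from other blend shapes or from the face mesh geometry.

```swift
import ARKit

// Minimal sketch, not the project's actual implementation: approximates
// lip aperture from ARKit's `jawOpen` blend shape. The real app may use
// other blend shapes (e.g. `mouthClose`) or the raw face mesh instead.
final class LipApertureTracker: NSObject, ARSessionDelegate {
    let session = ARSession()
    private(set) var samples: [(time: TimeInterval, aperture: Float)] = []

    func start() {
        // Face tracking requires the TrueDepth (Face ID) camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let timestamp = session.currentFrame?.timestamp else { return }
        for case let face as ARFaceAnchor in anchors {
            // Blend-shape coefficients are normalized by ARKit to 0...1.
            let aperture = face.blendShapes[.jawOpen]?.floatValue ?? 0
            samples.append((time: timestamp, aperture: aperture))
        }
    }
}
```

Each sample pairs a session timestamp with a normalized aperture value, which is what makes later alignment with the recorded audio possible.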
- iOS 16.7 or later
- Xcode 15.0 or later
- An iPhone with Face ID support (for lip aperture tracking)
- CocoaPods for dependency management
- Clone this repository:

  ```bash
  git clone https://github.com/KCKT0112/LipsSync.git
  cd LipsSync
  ```

- Install dependencies using CocoaPods:

  ```bash
  pod install
  ```

- Open the `.xcworkspace` file in Xcode:

  ```bash
  open LipsSync.xcworkspace
  ```

- Build and run the project on a compatible iOS device.
- Open the app on a Face ID-enabled iPhone.
- Start a recording session; the app tracks your lip aperture via Face ID while recording the corresponding audio (a sketch of the audio setup follows this list).
- After the session, you can analyze the lip aperture data alongside the recorded audio.
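The audio side of a session might look like the sketch below. This is an assumption-laden example, not the app's confirmed configuration: the `makeRecorder(writingTo:)` helper is hypothetical, and the linear PCM settings are chosen for ease of analysis.

```swift
import AVFoundation

// Hypothetical recording setup (not confirmed from the project source):
// mono, uncompressed PCM so the waveform can be analyzed alongside the
// aperture samples. Call `record()` when the ARKit session starts so the
// two streams share a common start time.
func makeRecorder(writingTo url: URL) throws -> AVAudioRecorder {
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord, mode: .measurement)
    try audioSession.setActive(true)

    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,  // lossless, easy to analyze
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
    ]
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    _ = recorder.prepareToRecord()  // pre-creates the file to cut start latency
    return recorder
}
```

Keeping the audio lossless (e.g. a `.wav` or `.caf` file) matters if the aperture data will later be compared against acoustic features extracted from the recording.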
To visualize the collected data, see 👉 https://github.com/yqzhishen/lips-sync-visualizer
This project is licensed under the MIT License - see the LICENSE file for details.