
LipsSync

LipsSync is an iOS project for collecting audio recordings together with synchronized lip aperture data, tracked in real time with ARKit on the iPhone's TrueDepth (Face ID) camera. It lets you capture both the sound and the lip movement (aperture) during a recording session.

Features

  • Audio Recording: Records high-quality audio during the session.
  • Lip Aperture Tracking: Tracks the mouth's aperture in real time using ARKit face tracking.
  • Real-Time Data Collection: Provides synchronized audio and lip aperture data over time.
  • Face ID Integration: Uses the iPhone's TrueDepth (Face ID) camera for precise tracking of lip movement.
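As a rough illustration of how ARKit exposes lip aperture, the sketch below reads the `jawOpen` blend shape from `ARFaceAnchor` updates. The `LipApertureSample` type and the choice of `jawOpen` as the aperture proxy are assumptions for this example; the app's actual data model and blend shape selection may differ.

```swift
import ARKit

// Hypothetical sample type for illustration; not part of the LipsSync codebase.
struct LipApertureSample {
    let timestamp: TimeInterval
    let aperture: Float   // 0.0 (closed) … 1.0 (fully open)
}

final class LipTracker: NSObject, ARSessionDelegate {
    private(set) var samples: [LipApertureSample] = []
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth (Face ID) camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // jawOpen is one common proxy for mouth aperture; the app may
            // combine several blend shapes instead.
            if let jawOpen = face.blendShapes[.jawOpen]?.floatValue {
                samples.append(LipApertureSample(timestamp: CACurrentMediaTime(),
                                                 aperture: jawOpen))
            }
        }
    }
}
```

Because face tracking runs on the device's AR session, this code only executes on a physical iPhone with Face ID support, not in the simulator.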

Requirements

  • iOS 16.7 or later
  • Xcode 15.0 or later
  • An iPhone with Face ID support (for lip aperture tracking)
  • CocoaPods for dependency management

Installation

  1. Clone this repository:

    git clone https://github.com/KCKT0112/LipsSync.git
    cd LipsSync
  2. Install dependencies using CocoaPods:

    pod install
  3. Open the .xcworkspace file in Xcode:

    open LipsSync.xcworkspace
  4. Build and run the project on a compatible iOS device.

Usage

  1. Open the app on a Face ID-enabled iPhone.
  2. Begin a recording session, and the app will track your lip aperture using Face ID and record the corresponding audio.
  3. After the session, you can analyze the lip aperture data alongside the recorded audio.
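To analyze the lip aperture data alongside the audio, each audio timestamp must be matched to the nearest aperture sample. The helper below is a hypothetical sketch of that lookup (it is not part of the app); it assumes samples are sorted by timestamp and uses a binary search to keep the lookup fast.

```swift
import Foundation

// Hypothetical sample type for illustration.
struct Sample {
    let t: TimeInterval
    let aperture: Float
}

/// Returns the aperture of the sample closest in time to `t`.
/// Assumes `samples` is sorted by timestamp (binary search, O(log n)).
func nearestAperture(to t: TimeInterval, in samples: [Sample]) -> Float? {
    guard !samples.isEmpty else { return nil }
    var lo = 0, hi = samples.count - 1
    while lo < hi {
        let mid = (lo + hi) / 2
        if samples[mid].t < t { lo = mid + 1 } else { hi = mid }
    }
    // `lo` is the first sample at or after `t`; compare with its predecessor.
    if lo > 0, abs(samples[lo - 1].t - t) <= abs(samples[lo].t - t) {
        return samples[lo - 1].aperture
    }
    return samples[lo].aperture
}

let data = [Sample(t: 0.0, aperture: 0.1),
            Sample(t: 0.1, aperture: 0.5),
            Sample(t: 0.2, aperture: 0.9)]
let a = nearestAperture(to: 0.12, in: data)   // closest sample is at t = 0.1
```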

Data Collection Visualization Tool

A companion tool for visualizing the collected data is available at 👉 https://github.com/yqzhishen/lips-sync-visualizer

License

This project is licensed under the MIT License - see the LICENSE file for details.

About

Facial Data Collection Tool for Acoustics
