
Quick Start (Swift)

Supported Environment

  • Minimum supported iOS version is 11.0

  • Minimum supported device is iPhone 6s

  • Swift and Objective-C are supported.

  • Camera permission and a network connection are required.

  • Set the build target to a REAL DEVICE during development.

info

The SeeSo SDK uses the front camera of the device.

Sample Project

If you just want to use the SDK rather than implement the integration yourself, please go to the sample project page.

Environment Set-ups

Append SDK to the project

  1. Drag & drop the downloaded SeeSo.framework file into the project.

    ios-quick-start-2

  2. Check the 'Copy items if needed' option, select [Added folders > Create groups], and click the Finish button.

    ios-quick-start-3

  3. Check that SeeSo.framework appears in the navigator list on the left side.

    ios-quick-start-4

  4. Change the 'Do Not Embed' option under [Frameworks, Libraries, and Embedded Content > SeeSo.framework] to 'Embed & Sign'.

Camera permission

  1. Go to [General > Info].

    ios-quick-start-6

  2. Click the '+' button in the [Custom iOS Target Properties] table. The button appears when you hover the mouse over a row.

    ios-quick-start-7

  3. Add the [Privacy - Camera Usage Description] key (raw key name: NSCameraUsageDescription) and provide a usage description.

Use SDK

  1. Import SeeSo in your code. An error will appear while the build target is not a real device (see the snippet after this list).

    ios-quick-start-9

  2. Change the target device to a real device, as shown above.

    ios-quick-start-10

  3. The error will disappear and the SDK will work.
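
In code, step 1 is just the import at the top of the file. A minimal sketch (the surrounding view controller is whatever class you use):

    import UIKit
    import SeeSo  // fails to resolve when the build target is a simulator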

Sample App Implementation

What we are making

This document shows how to create a sample Swift application that visualizes the user's gaze point coordinates with a simple UI. (Portrait orientation only.)

Technical Overview

You will get sample code for the following:

  • Object construction and destruction

  • startTracking, stopTracking, startCalibration, stopCalibration

  • Filter toggle

  • Implementation:

    1. GazeTracker Construction

      import UIKit
      import SeeSo
      import AVFoundation

      class ViewController: UIViewController {

          var tracker: GazeTracker? = nil

          override func viewDidLoad() {
              super.viewDidLoad()
              // Do any additional setup after loading the view.
          }
      }

      First, import SeeSo for the GazeTracker construction, and define a variable to hold the created GazeTracker object. Also import AVFoundation so you can check the camera permission status.

      extension ViewController: InitializationDelegate {
          func onInitialized(tracker: GazeTracker?, error: InitializationError) {
              if tracker != nil {
                  self.tracker = tracker
                  print("initialized GazeTracker")
              } else {
                  print("init failed : \(error.description)")
              }
          }
      }

      Define the InitializationDelegate implementation on ViewController. Because the GazeTracker object is delivered through InitializationDelegate.onInitialized, it should be defined as above. When the GazeTracker construction finishes, a log is printed to the console.

      // `.authorized` means the camera permission has already been granted.
      if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
          // Run initGazeTracker when the permission exists.
      } else {
          // Request camera permission from the user.
          AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
              if response {
                  // Permission allowed.
                  // Run initGazeTracker here.
              }
          })
      }

      Because GazeTracker uses data from the camera, it requires camera permission. Check whether the permission has already been granted, and request it if it hasn't.

      override func viewDidLoad() {
          super.viewDidLoad()
          if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
              GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
          } else {
              AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
                  if response {
                      GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
                  }
              })
          }
      }

      Implement the GazeTracker construction code. You MUST replace the license key with the key you generated from the SeeSo Console.

  2. Start Tracking

    extension ViewController: StatusDelegate {
        func onStarted() {
            print("tracker starts tracking.")
        }

        func onStopped(error: StatusError) {
            print("stop error : \(error.description)")
        }
    }

    Implement StatusDelegate to check whether gaze tracking actually starts when the GazeTracker.startTracking function is called.

    extension ViewController: InitializationDelegate {
        func onInitialized(tracker: GazeTracker?, error: InitializationError) {
            if tracker != nil {
                self.tracker = tracker
                print("initialized GazeTracker")
                self.tracker?.statusDelegate = self
                self.tracker?.startTracking()
            } else {
                print("init failed : \(error.description)")
            }
        }
    }

    Connect the StatusDelegate implementation to the GazeTracker object, then call startTracking.
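
    The overview above also lists stopTracking; this guide never calls it, but a minimal sketch, assuming you stop tracking when the screen goes away, could look like this:

    // Sketch: stop gaze tracking when the view disappears.
    // The StatusDelegate's onStopped(error:) will then be called.
    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        tracker?.stopTracking()
    }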

  3. Log the gaze data

    extension ViewController: GazeDelegate {
        func onGaze(gazeInfo: GazeInfo) {
            print("timestamp : \(gazeInfo.timestamp), (x , y) : (\(gazeInfo.x), \(gazeInfo.y)) , state : \(gazeInfo.trackingState.description)")
        }
    }

    Implement the GazeDelegate; this delegate delivers the gaze data and the eye movement status from the GazeTracker.

    extension ViewController: InitializationDelegate {
        func onInitialized(tracker: GazeTracker?, error: InitializationError) {
            if tracker != nil {
                ...
                self.tracker?.gazeDelegate = self
                self.tracker?.startTracking()
            } else {
                print("init failed : \(error.description)")
            }
        }
    }

    Connect the implemented GazeDelegate to the GazeTracker object.
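
    To actually visualize the gaze point, as this sample sets out to do, you can replace the logging body of onGaze with a UI update. A minimal sketch, assuming a small pointView (a plain UIView you added in viewDidLoad) and the TRACKING state described in the Run section below:

    func onGaze(gazeInfo: GazeInfo) {
        // Move the pointer only while a face is detected.
        guard gazeInfo.trackingState.description == "TRACKING" else { return }
        // The callback may arrive off the main thread; hop to it before touching UI.
        DispatchQueue.main.async {
            self.pointView.center = CGPoint(x: gazeInfo.x, y: gazeInfo.y)
        }
    }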

  4. Overall code

    import SeeSo
    import UIKit
    import AVFoundation

    class ViewController: UIViewController {
        var tracker: GazeTracker? = nil

        override func viewDidLoad() {
            super.viewDidLoad()
            if AVCaptureDevice.authorizationStatus(for: .video) == .authorized {
                GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
            } else {
                AVCaptureDevice.requestAccess(for: .video, completionHandler: { response in
                    if response {
                        GazeTracker.initGazeTracker(license: "YOUR_DEVELOPMENT_LICENSE_KEY", delegate: self)
                    }
                })
            }
        }
    }

    extension ViewController: InitializationDelegate {
        func onInitialized(tracker: GazeTracker?, error: InitializationError) {
            if tracker != nil {
                self.tracker = tracker
                print("initialized GazeTracker")
                self.tracker?.statusDelegate = self
                self.tracker?.gazeDelegate = self
                self.tracker?.startTracking()
            } else {
                print("init failed : \(error.description)")
            }
        }
    }

    extension ViewController: StatusDelegate {
        func onStarted() {
            print("tracker starts tracking.")
        }

        func onStopped(error: StatusError) {
            print("stop error : \(error.description)")
        }
    }

    extension ViewController: GazeDelegate {
        func onGaze(gazeInfo: GazeInfo) {
            print("timestamp : \(gazeInfo.timestamp), (x , y) : (\(gazeInfo.x), \(gazeInfo.y)) , state : \(gazeInfo.trackingState.description)")
        }
    }
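
    The technical overview also lists object destruction, which this guide does not show. A minimal sketch, assuming the SDK exposes GazeTracker.deinitGazeTracker(tracker:) as the counterpart of initGazeTracker (verify the exact name against the SeeSo API reference):

    // Sketch: release the tracker when it is no longer needed.
    func destroyTracker() {
        tracker?.stopTracking()
        if let tracker = tracker {
            GazeTracker.deinitGazeTracker(tracker: tracker)  // assumed API name
        }
        tracker = nil
    }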

Run

  1. Build on device

    Click the build button at the top left.

    ios-quick-start-11

  2. Features

    • You should allow the camera permission first.
    • When setup succeeds and the camera permission is granted, data from the GazeTracker will be shown on the console.

      ios-quick-start-12

    • By default, the timestamp, gaze point coordinates, and tracking state are logged. The state will be TRACKING when the SDK can detect a face, and FACE_MISSING when it cannot.

      ios-quick-start-13