Version: Release

API Docs

Overview

| Name | Type | Description |
| --- | --- | --- |
| GazeTracker | class | Class that generates gaze tracking data. |
| GazeInfo | class | A class composed of information about eye tracking. |
| UserStatusOption | class | Class that contains User Status option information for GazeTracker. |
| InitializationDelegate | protocol | Delegate that notifies the result when creating a gaze tracker. |
| GazeTrackerDelegate | protocol | All delegates used in a GazeTracker instance inherit from GazeTrackerDelegate. |
| StatusDelegate | protocol | Protocol that informs whether the GazeTracker is operating. |
| GazeDelegate | protocol | Protocol to receive frame-by-frame GazeInfo. |
| CalibrationDelegate | protocol | Protocol used in the calibration process. |
| UserStatusDelegate | protocol | Protocol used for the User Status function. |
| ImageDelegate | protocol | Protocol that receives the image buffer from the front camera. |
| InitializationError | enum | Enum that contains error types of InitializationDelegate. |
| StatusError | enum | Enum that contains error types of StatusDelegate. |
| TrackingState | enum | Enum that contains tracking state types of GazeInfo. |
| EyeMovementState | enum | Enum that contains eye movement state types used in GazeInfo. |
| ScreenState | enum | Enum that contains screen state types used in GazeInfo. |
| CalibrationMode | enum | Enum that contains calibration mode types of startCalibration. |
| AccuracyCriteria | enum | Enum that contains accuracy criteria of startCalibration. |

GazeTracker

  @objc public class GazeTracker: NSObject

GazeTracker is the class that generates gaze tracking data from the device's front camera video.
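As a quick orientation, a minimal usage flow might look like the sketch below (a sketch, not a definitive implementation; it assumes a view controller conforming to InitializationDelegate, StatusDelegate, and GazeDelegate, and the license key is a placeholder):

```swift
import SeeSo
import UIKit

class ViewController: UIViewController, InitializationDelegate, StatusDelegate, GazeDelegate {
    var tracker: GazeTracker? = nil

    override func viewDidLoad() {
        super.viewDidLoad()
        // Construction is asynchronous; the result arrives in onInitialized.
        GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self)
    }

    func onInitialized(tracker: GazeTracker?, error: InitializationError) {
        guard let tracker = tracker else { return }
        self.tracker = tracker
        tracker.setDelegates(statusDelegate: self, gazeDelegate: self,
                             calibrationDelegate: nil, imageDelegate: nil)
        tracker.startTracking()
    }

    func onStarted() { /* Tracking is running; calibration is now possible. */ }
    func onStopped(error: StatusError) { /* Handle camera errors here. */ }

    func onGaze(gazeInfo: GazeInfo) {
        if gazeInfo.trackingState == .SUCCESS {
            print("gaze at (\(gazeInfo.x), \(gazeInfo.y))")
        }
    }
}
```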

Summary

Constructor and Destructor
initGazeTracker
deinitGazeTracker
| Variable | Type |
| --- | --- |
| statusDelegate | StatusDelegate? |
| gazeDelegate | GazeDelegate? |
| calibrationDelegate | CalibrationDelegate? |
| imageDelegate | ImageDelegate? |
| userStatusDelegate | UserStatusDelegate? |
Functions
getFrameworkVersion
startTracking
stopTracking
isTracking
setTrackingFPS
setCameraPreview
removeCameraPreview
startCalibration
stopCalibration
isCalibrating
startCollectSamples
setCalibrationData
setAttentionInterval
getAttentionScore
setAttentionRegion
getAttentionRegion
removeAttentionRegion
setDelegates

getFrameworkVersion

  @objc public static func getFrameworkVersion() -> String

Returns the SeeSo framework version.

| Returns | Type | Description |
| --- | --- | --- |
| version | String, NSString * | SeeSo version |

  let version = GazeTracker.getFrameworkVersion()
  print("SeeSo version: \(version)")

initGazeTracker

  @objc public static func initGazeTracker(license: String, delegate: InitializationDelegate, option: UserStatusOption? = nil)

The GazeTracker construction process includes authentication.

The function is asynchronous.

InitializationDelegate will relay the constructed object.

| Parameter | Type | Description |
| --- | --- | --- |
| license | String, NSString * | Key generated from manage.seeso.io |
| delegate | InitializationDelegate, id<InitializationDelegate> | Delegate that receives the initialization result |
| option | UserStatusOption | A class containing User Status option information (optional) |

  GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self)

  let userStatusOption = UserStatusOption()
  userStatusOption.useAll()

  GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self, option: userStatusOption)

deinitGazeTracker

  @objc public static func deinitGazeTracker(tracker: GazeTracker?)

After destruction, every API call is disabled.

For memory optimization, assigning nil to the destructed object is recommended.

| Parameter | Type | Description |
| --- | --- | --- |
| tracker | GazeTracker?, GazeTracker * | Target GazeTracker instance to destroy. |

  GazeTracker.deinitGazeTracker(tracker: tracker)
  tracker = nil

startTracking

  @objc public func startTracking()

This function should be called to start gaze tracking.

It calls onStarted() on the StatusDelegate object on success.

  tracker.startTracking()

stopTracking

  @objc public func stopTracking()

This function is called to stop gaze tracking.

It calls StatusDelegate.onStopped() on success.

  tracker.stopTracking()

isTracking

  @objc public func isTracking() -> Bool

This function reports the status of gaze tracking.

It returns true when gaze tracking is running, false when it is stopped.

| Return Type | Description |
| --- | --- |
| Bool, BOOL | Whether gaze tracking is currently running. |

  let isTracking = tracker.isTracking()
  print("Tracking status: \(isTracking)")

setTrackingFPS

  @objc public func setTrackingFPS(fps: Int) -> Bool

Sets the FPS of the gaze tracking camera source.

The value must be greater than 0 and no more than 30.

The actual FPS can drop depending on the device spec. The default value is 30.

If the state is TrackingState.FACE_MISSING, the FPS is set to 30 automatically.

| Parameter | Type | Description |
| --- | --- | --- |
| fps | Int, NSInteger | Custom FPS (frames per second) for gaze tracking. |

| Return Type | Description |
| --- | --- |
| Bool, BOOL | Whether the FPS was set. |

  tracker.setTrackingFPS(fps: 20)

setCameraPreview

  @objc public func setCameraPreview(cameraPreview: UIView)

Sets a preview of the camera that GazeTracker is using.

You should rotate the preview when the screen is in landscape mode.

| Parameter | Type | Description |
| --- | --- | --- |
| cameraPreview | UIView, UIView * | UIView for the camera preview. |

  tracker.setCameraPreview(cameraPreview: preview)

removeCameraPreview

  @objc public func removeCameraPreview()

This removes the camera preview that was set on GazeTracker.

  tracker.removeCameraPreview()

startCalibration

  @objc public func startCalibration(mode: CalibrationMode?, criteria: AccuracyCriteria?, region: CGRect?) -> Bool

The return value covers four cases. It returns true when all parameters are valid.

If startCalibration() is called while the tracker is not tracking, it returns false.

If the value of mode is not a defined CalibrationMode, the function returns false.

It also returns false when the calibration region is set outside of the device screen.

A false return blocks the calibration process.

| Parameter | Type | Description |
| --- | --- | --- |
| mode | CalibrationMode? | Selects the calibration option. DEFAULT(0), ONE_POINT(1), FIVE_POINT(5), and SIX_POINT(6) are available. Default is FIVE_POINT. |
| criteria | AccuracyCriteria? | Option that manages the calibration process and accuracy. DEFAULT(0), LOW(1), and HIGH(2) are available. Default is DEFAULT. |
| region | CGRect? | Region that needs calibration. The unit is point (pt). |

| Return Type | Description |
| --- | --- |
| Bool, BOOL | Whether calibration was started. |

  let startedResult = tracker.startCalibration(mode: .ONE_POINT, region: CGRect(x: 100, y: 100, width: 100, height: 100))
  let startedResult2 = tracker.startCalibration(mode: .ONE_POINT)
  let startedResult3 = tracker.startCalibration(region: CGRect(x: 100, y: 200, width: 300, height: 400))
  let startedResult4 = tracker.startCalibration()
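A call that also specifies an accuracy criteria can be sketched as follows (a sketch; it assumes tracking has already started, and a false return should be handled by the caller):

```swift
// Five-point calibration with the HIGH accuracy criteria,
// restricted to the top half of the screen.
let region = CGRect(x: 0, y: 0,
                    width: UIScreen.main.bounds.width,
                    height: UIScreen.main.bounds.height / 2)
let started = tracker.startCalibration(mode: .FIVE_POINT,
                                       criteria: .HIGH,
                                       region: region)
if !started {
    // The tracker is not tracking, or the region is off-screen.
    print("Calibration could not be started.")
}
```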

stopCalibration

  @objc public func stopCalibration()

Stops the calibration process in progress.

  tracker.stopCalibration()

isCalibrating

  @objc public func isCalibrating() -> Bool

It returns true when calibrating, false when not calibrating.

| Return Type | Description |
| --- | --- |
| Bool, BOOL | Returns true while calibrating. |

  let isCalibrating = tracker.isCalibrating()
  print("Calibrating status: \(isCalibrating)")

startCollectSamples

  @objc public func startCollectSamples() -> Bool

Starts collecting gaze samples for the calibration point relayed by CalibrationDelegate.onCalibrationNextPoint. You should display the point's coordinates on screen and then call startCollectSamples to proceed with calibration.

tip

If startCollectSamples returns false when using SwiftUI, it may be caused by a timing issue between SwiftUI and the core logic. Adding approximately 0.5 to 1 second of delay may fix the issue.

| Return Type | Description |
| --- | --- |
| Bool, BOOL | Returns false if startCollectSamples was called at the wrong time. |

  tracker.startCollectSamples()

setCalibrationData

  @objc public func setCalibrationData(calibrationData: [Double]) -> Bool

Sets existing calibration data on GazeTracker.

| Parameter | Type | Description |
| --- | --- | --- |
| calibrationData | [Double], NSArray<NSNumber *> * | Calibration data |

| Return Type | Description |
| --- | --- |
| Bool, BOOL | Returns false if setCalibrationData() was called while the tracker is not tracking. |

  if tracker.setCalibrationData(calibrationData: data) {
      print("Loading of calibration data was successful.")
  } else {
      print("Failed to load calibration data.")
  }
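One common pattern is to persist the data received in onCalibrationFinished and restore it on the next launch so the user does not have to recalibrate. A sketch using UserDefaults (the "seeso.calibration" key is a placeholder of our choosing):

```swift
let calibrationKey = "seeso.calibration"  // hypothetical storage key

// Save: call this from CalibrationDelegate.onCalibrationFinished.
func saveCalibrationData(_ calibrationData: [Double]) {
    UserDefaults.standard.set(calibrationData, forKey: calibrationKey)
}

// Restore: call this after tracking has started, e.g. in onStarted().
func restoreCalibrationData(tracker: GazeTracker) {
    if let data = UserDefaults.standard.array(forKey: calibrationKey) as? [Double] {
        if !tracker.setCalibrationData(calibrationData: data) {
            print("Restoring calibration data failed; the tracker may not be tracking.")
        }
    }
}
```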

setAttentionInterval

  @objc public func setAttentionInterval(interval: Int)

Sets the time interval for the UserStatus Attention callback.

The UserStatus Attention score is calculated over the given time interval.

The beginning and ending timestamps are passed through the onAttention callback as timestampBegin and timestampEnd.

The interval range is 10 to 60 (seconds), and the default value is 30 seconds.

| Parameter | Type | Description |
| --- | --- | --- |
| interval | Int, NSInteger | Time interval for the UserStatus Attention score. |

  tracker.setAttentionInterval(interval: 30)

getAttentionScore

  @objc public func getAttentionScore() -> Double

Gets the current Attention score from GazeTracker.

This API does not provide a timestamp for the Attention score data.

Use the onAttention callback unless the Attention score is required at a specific time or location.

| Return Type | Description |
| --- | --- |
| Double, double | Attention score |

  let score = tracker.getAttentionScore()

setAttentionRegion

  @objc public func setAttentionRegion(region: CGRect)

Sets the Attention Region.

| Parameter | Type | Description |
| --- | --- | --- |
| region | CGRect | Attention region. Default is the UIScreen bounds. |

  ...
  // Set an Attention region at position (100, 100) with size (100, 100).
  tracker?.setAttentionRegion(region: CGRect(x: 100, y: 100, width: 100, height: 100))
  ...

  // The Attention score for the set region is output.
  print("Attention score: \(tracker?.getAttentionScore())")

getAttentionRegion

  @objc public func getAttentionRegion() -> CGRect

Gets the currently set Attention Region.

| Return Type | Description |
| --- | --- |
| CGRect | Default is the UIScreen bounds. |

  ...
  // Set an Attention region at position (100, 100) with size (100, 100).
  tracker?.setAttentionRegion(region: CGRect(x: 100, y: 100, width: 100, height: 100))
  ...

  // The set Attention region is output.
  print("Attention region: \(tracker?.getAttentionRegion())")

removeAttentionRegion

  @objc public func removeAttentionRegion()

Removes the currently set Attention Region.

  tracker?.removeAttentionRegion()
  if let rect = tracker?.getAttentionRegion() {
      // The region is no longer the one you set; unexpected values may be printed.
      print("Attention region: \(rect.minX), \(rect.minY), \(rect.width), \(rect.height)")
  }

setDelegates

  @objc public func setDelegates(statusDelegate: StatusDelegate?, gazeDelegate: GazeDelegate?, calibrationDelegate: CalibrationDelegate?, imageDelegate: ImageDelegate?, userStatusDelegate: UserStatusDelegate? = nil)

Registers all delegates that inherit GazeTrackerDelegate with GazeTracker at once.

| Parameter | Type | Description |
| --- | --- | --- |
| statusDelegate | StatusDelegate?, id<StatusDelegate> | StatusDelegate |
| gazeDelegate | GazeDelegate?, id<GazeDelegate> | GazeDelegate |
| calibrationDelegate | CalibrationDelegate?, id<CalibrationDelegate> | CalibrationDelegate |
| imageDelegate | ImageDelegate?, id<ImageDelegate> | ImageDelegate |
| userStatusDelegate (optional) | UserStatusDelegate?, id<UserStatusDelegate> | UserStatusDelegate |

  // Parameters are nullable: pass nil for delegates you do not need.
  tracker.setDelegates(statusDelegate: statusDelegate, gazeDelegate: self, calibrationDelegate: nil, imageDelegate: nil)

GazeInfo

  @objc public class GazeInfo : NSObject

A class composed of information about eye tracking.

Summary

| Variable | Type | Description |
| --- | --- | --- |
| timestamp | Double | Timestamp of the gaze point. The unit is milliseconds. The time format is UTC. |
| x | Double | x coordinate value of the gaze point. Origin is the device screen. The unit is point (pt). |
| y | Double | y coordinate value of the gaze point. Origin is the device screen. The unit is point (pt). |
| trackingState | TrackingState | SUCCESS, LOW_CONFIDENCE, UNSUPPORTED, FACE_MISSING |
| eyeMovementState | EyeMovementState | FIXATION, SACCADE, UNKNOWN |
| screenState | ScreenState | INSIDE_OF_SCREEN, OUTSIDE_OF_SCREEN, UNKNOWN |

timestamp

@objc public let timestamp : Double

Timestamp of the gaze point. The unit is milliseconds. The time format is UTC.

x

@objc public let x : Double

x coordinate value of the gaze point. Origin is the device screen. The unit is point (pt).

y

@objc public let y : Double

y coordinate value of the gaze point. Origin is the device screen. The unit is point (pt).

trackingState

@objc public let trackingState : TrackingState

See TrackingState.

eyemovementState

@objc public let eyemovementState : EyeMovementState

See EyeMovementState.

screenState

@objc public let screenState : ScreenState

See ScreenState.
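Putting these fields together, an onGaze handler can move a pointer view to the gaze location when the data is usable (a sketch; `gazePointView` is a hypothetical UIView, and delegate callbacks may arrive off the main queue):

```swift
public func onGaze(gazeInfo: GazeInfo) {
    // Only use coordinates when tracking succeeded and the gaze is on screen.
    guard gazeInfo.trackingState == .SUCCESS,
          gazeInfo.screenState == .INSIDE_OF_SCREEN else { return }

    DispatchQueue.main.async {
        // x and y are in points, so they can be used directly for UIKit layout.
        self.gazePointView.center = CGPoint(x: gazeInfo.x, y: gazeInfo.y)
    }
}
```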

UserStatusOption

  @objc public class UserStatusOption: NSObject

This class contains User Status option information for GazeTracker.

Summary

Functions
isUseAttention
isUseBlink
isUseDrowsiness
useAttention
useBlink
useDrowsiness
useAll

isUseAttention

  public func isUseAttention() -> Bool

::: caution isUseAttension is renamed to isUseAttention as of 2.5.0. :::

| Return Type | Description |
| --- | --- |
| Bool | Returns true if the Attention flag is on, otherwise false. |

isUseBlink

  public func isUseBlink() -> Bool

| Return Type | Description |
| --- | --- |
| Bool | Returns true if the Blink flag is on, otherwise false. |

isUseDrowsiness

  public func isUseDrowsiness() -> Bool

| Return Type | Description |
| --- | --- |
| Bool | Returns true if the Drowsiness flag is on, otherwise false. |

useAttention

  public func useAttention()

Sets the Attention flag.

useBlink

  public func useBlink()

Sets the Blink flag.

useDrowsiness

  public func useDrowsiness()

Sets the Drowsiness flag.

useAll

  public func useAll()

Sets all User Status flags.
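For example, to enable only the Blink and Drowsiness features (a sketch; the option object is passed at initialization):

```swift
let option = UserStatusOption()
option.useBlink()
option.useDrowsiness()

// Pass the option when constructing the tracker; without the corresponding
// flags, the matching UserStatusDelegate callbacks will not be delivered.
GazeTracker.initGazeTracker(license: "YOUR_LICENSE_KEY", delegate: self, option: option)
```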

InitializationDelegate

  @objc public protocol InitializationDelegate: GazeTrackerDelegate

Summary

Functions
onInitialized

onInitialized

  @objc public func onInitialized(tracker: GazeTracker?, error: InitializationError)

The callback invoked when GazeTracker.initGazeTracker is called.

It relays the constructed object on success, and nil on failure.

The error is set to the corresponding type when construction fails.

| Parameter | Type | Description |
| --- | --- | --- |
| tracker | GazeTracker?, GazeTracker * | Relays the GazeTracker when construction succeeds; nil on failure. |
| error | InitializationError | The enum that contains error types of initGazeTracker. |

  public func onInitialized(tracker: GazeTracker?, error: InitializationError) {
      if tracker != nil {
          // Keep the object as a class property when initialization succeeds.
          self.tracker = tracker
      } else {
          if error == InitializationError.ERROR_INIT {
              print("Initialization failed")
          } else if error == InitializationError.ERROR_CAMERA_PERMISSION {
              ...
          }
      }
  }

GazeTrackerDelegate

  @objc public protocol GazeTrackerDelegate

The base protocol used by GazeTracker. All delegates in GazeTracker inherit from GazeTrackerDelegate.

StatusDelegate

 @objc public protocol StatusDelegate: GazeTrackerDelegate

onStarted

  public func onStarted()

This function is called automatically after GazeTracker.startTracking succeeds. Actions like calibration and preview are available from this point.

  public func onStarted() {
      tracker.startCalibration()
  }

onStopped

  public func onStopped(error: StatusError)

The error value is StatusError.ERROR_NONE when gaze tracking stopped after GazeTracker.stopTracking was called, and a different value for other causes.

ERROR_NONE is reported only when the stop was requested explicitly via stopTracking.

| Parameter | Type | Description |
| --- | --- | --- |
| error | StatusError | StatusError.ERROR_NONE, StatusError.ERROR_CAMERA_START, and StatusError.ERROR_CAMERA_INTERRUPT are available. |

  public func onStopped(error: StatusError) {
      if error != .ERROR_NONE {
          // Tracking stopped without stopTracking() being called.
      }
  }

GazeDelegate

  @objc public protocol GazeDelegate: GazeTrackerDelegate

onGaze

  @objc public func onGaze(gazeInfo: GazeInfo)

| Parameter | Type | Description |
| --- | --- | --- |
| gazeInfo | GazeInfo | See GazeInfo. |

  public func onGaze(gazeInfo: GazeInfo) {
      if gazeInfo.trackingState == TrackingState.SUCCESS {
          // Gaze tracking succeeded (after calibration).
      } else {
          // Face missing.
      }
  }

CalibrationDelegate

  @objc public protocol CalibrationDelegate: GazeTrackerDelegate

::: caution CalibrationDelegate is never called on the main queue. Please use DispatchQueue.main.async when updating UI. :::

onCalibrationNextPoint

  @objc public func onCalibrationNextPoint(x: Double, y: Double)

Relays the x, y coordinates of the point that should be focused on during the calibration process.

You should call startCollectSamples to continue the calibration process.

| Parameter | Type | Description |
| --- | --- | --- |
| x | Double, double | The x coordinate of the point to focus on during calibration. Origin is the device screen. The unit is point (pt). |
| y | Double, double | The y coordinate of the point to focus on during calibration. Origin is the device screen. The unit is point (pt). |

  public func onCalibrationNextPoint(x: Double, y: Double) {
      // Display UI at the given point, then call
      // startCollectSamples() to start collecting for this point.
      tracker.startCollectSamples()
  }

onCalibrationProgress

  @objc public func onCalibrationProgress(progress: Double)

Progress ranges from 0.0 to 1.0.

The next point is guided when the value reaches 1.0.

| Parameter | Type | Description |
| --- | --- | --- |
| progress | Double, double | Calibration progress for the current point. |

  public func onCalibrationProgress(progress: Double) {
      // progressView is a UIProgressView showing calibration progress.
      progressView.setProgress(Float(progress), animated: true)
  }

onCalibrationFinished

  @objc public func onCalibrationFinished(calibrationData: [Double])

Callback that notifies that calibration has finished. When this function is called, the calibration UI can be removed. After this callback, data from GazeDelegate is calibrated gaze data.

The calibrationData passed as a parameter has already been applied to GazeTracker. You can save it and load it directly into GazeTracker without a new calibration by calling setCalibrationData(calibrationData), for example when restarting the app.

| Parameter | Type | Description |
| --- | --- | --- |
| calibrationData | [Double], NSArray<NSNumber *> * | Calibration data |

  public func onCalibrationFinished(calibrationData: [Double]) {
      // Remove the calibration UI.
      removeCalibrationUI()
      // Save the calibration data.
      self.calibrationData = calibrationData
  }

UserStatusDelegate

  @objc public protocol UserStatusDelegate: GazeTrackerDelegate

  1. Attention: How focused the user's attention is on the screen content over the interval (0.0 ~ 1.0).

  2. Drowsiness: Whether the user feels drowsy (true/false).

  3. Blink: Whether the user blinks (left eye, right eye, general (both eyes)).

Summary

Functions
onAttention
onBlink
onDrowsiness

onAttention

  @objc public func onAttention(timestampBegin: Int, timestampEnd: Int, score: Double)

  1. The timestamp range of the data is passed as timestampBegin and timestampEnd in the onAttention callback.
  2. The default time interval is 30 seconds.
  3. If the user's attention level is low, score will be close to 0.0.
  4. If the user's attention level is high, score will be close to 1.0.

| Parameter | Type | Description |
| --- | --- | --- |
| timestampBegin | Int, NSInteger | Beginning timestamp of the data. |
| timestampEnd | Int, NSInteger | Ending timestamp of the data. |
| score | Double, double | User attention rate score between the timestamps. |

  func onAttention(timestampBegin: Int, timestampEnd: Int, score: Double) {
      attentionView.text = "Attention: " + String(round(score * 10000) / 10000)
  }
onBlink

  @objc public func onBlink(timestamp: Int, isBlinkLeft: Bool, isBlinkRight: Bool, isBlink: Bool, eyeOpenness: Double)

  1. The timestamp of the data is passed as timestamp in the onBlink callback.
  2. If the user blinks the left eye, isBlinkLeft will be true.
  3. If the user blinks the right eye, isBlinkRight will be true.
  4. If the user blinks both eyes, isBlink will be true (this is the general blink condition).
  5. If the user's eyes are wide open, eyeOpenness will be close to 1.0 (not available yet).
  6. If the user's eyes are narrow, eyeOpenness will be close to 0.0 (not available yet).

| Parameter | Type | Description |
| --- | --- | --- |
| timestamp | Int, NSInteger | Timestamp of the data. |
| isBlinkLeft | Bool, BOOL | User left-eye blink flag. |
| isBlinkRight | Bool, BOOL | User right-eye blink flag. |
| isBlink | Bool, BOOL | User blink flag. |
| eyeOpenness | Double, double | User eye-openness rate (not available yet). |

  func onBlink(timestamp: Int, isBlinkLeft: Bool, isBlinkRight: Bool, isBlink: Bool, eyeOpenness: Double) {
      blinkView.text = "Blink: " + String(isBlink)
      blinkLeftView.text = "Blink Left: " + String(isBlinkLeft)
      blinkRightView.text = "Blink Right: " + String(isBlinkRight)
  }

onDrowsiness

  @objc public func onDrowsiness(timestamp: Int, isDrowsiness: Bool)

  1. The timestamp of the data is passed as timestamp in the onDrowsiness callback.
  2. If the user feels drowsy, isDrowsiness will be true; otherwise, it will be false.

| Parameter | Type | Description |
| --- | --- | --- |
| timestamp | Int, NSInteger | Timestamp of the data. |
| isDrowsiness | Bool, BOOL | User drowsiness flag. |

  func onDrowsiness(timestamp: Int, isDrowsiness: Bool) {
      drowsinessView.text = "Drowsiness: " + String(isDrowsiness)
  }

ImageDelegate

  @objc public protocol ImageDelegate: GazeTrackerDelegate

onImage

  @objc public func onImage(timestamp: Double, image: CMSampleBuffer)

The function that provides the camera image as a CMSampleBuffer.

| Parameter | Type | Description |
| --- | --- | --- |
| timestamp | Double, double | The timestamp of camera image creation. The unit is milliseconds. The time format is UTC. |
| image | CMSampleBuffer, CMSampleBufferRef | CMSampleBuffer image from the camera (kCVPixelFormatType_32BGRA, AVCaptureSession.Preset.vga640x480). The image is rotated 90 degrees CCW from the portrait direction. |

  public func onImage(timestamp: Double, image: CMSampleBuffer) {
      // SAMPLE: save the image as a JPEG with an appropriate function.
      if imageSave {
          writeJPEGFile(image: image)
      }
  }

InitializationError

  @objc public enum InitializationError: Int

The enum that contains error types of InitializationDelegate.

Please read Authentication for more details.

StatusError

  @objc public enum StatusError: Int

The enum that contains error types of StatusDelegate.

| Constant | Value | Description |
| --- | --- | --- |
| ERROR_NONE | 0 | GazeTracker.stopTracking succeeded without error. |
| ERROR_CAMERA_START | 1 | Error that occurs when GazeTracker.startTracking is called but the device's front camera is not available. |
| ERROR_CAMERA_INTERRUPT | 2 | Error that occurs when the camera becomes unavailable. |

TrackingState

  @objc public enum TrackingState: Int

The enum that contains tracking state types used in GazeInfo.

| Constant | Value | Description |
| --- | --- | --- |
| SUCCESS | 0 | Face alignment is in the best position (gaze tracking succeeded, with valid x and y). |
| LOW_CONFIDENCE | 1 | Face alignment is not in the best position and should not be used for precise gaze tracking (gaze tracking succeeded, with less accurate x and y). |
| UNSUPPORTED | 2 | Face alignment is not suitable for tracking (gaze tracking failed, with invalid x and y). |
| FACE_MISSING | 3 | Face is missing (gaze tracking failed). |
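The states above can be handled with a switch in the onGaze callback, for example (a sketch; `showGazePoint` and `showFaceMissingHint` are hypothetical handlers):

```swift
public func onGaze(gazeInfo: GazeInfo) {
    switch gazeInfo.trackingState {
    case .SUCCESS:
        // x and y are valid; safe to use for interaction.
        showGazePoint(x: gazeInfo.x, y: gazeInfo.y)
    case .LOW_CONFIDENCE:
        // Coordinates exist but are less accurate; avoid precise interaction.
        showGazePoint(x: gazeInfo.x, y: gazeInfo.y)
    case .UNSUPPORTED:
        // Coordinates are invalid; ignore this frame.
        break
    case .FACE_MISSING:
        // Prompt the user to bring their face back into view.
        showFaceMissingHint()
    }
}
```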

EyeMovementState

  @objc public enum EyeMovementState: Int

The enum that contains eye movement state types used in GazeInfo.

| Constant | Value | Description |
| --- | --- | --- |
| FIXATION | 0 | The gazes from the past up to the present form a fixation. |
| SACCADE | 2 | The gazes from the past up to the present form a saccade. |
| UNKNOWN | 3 | Neither fixation nor saccade. |

ScreenState

  @objc public enum ScreenState: Int

The enum that contains screen state types used in GazeInfo.

| Constant | Value | Description |
| --- | --- | --- |
| INSIDE_OF_SCREEN | 0 | Gaze tracking succeeded and the gaze point is inside the device screen. |
| OUTSIDE_OF_SCREEN | 1 | Gaze tracking succeeded and the gaze point is outside the device screen. |
| UNKNOWN | 2 | Gaze tracking failed. |

CalibrationMode

  @objc public enum CalibrationMode: Int

The enum that contains mode types of GazeTracker.startCalibration.

| Constant | Value | Description |
| --- | --- | --- |
| DEFAULT | 0 | Default. Represents FIVE_POINT. |
| ONE_POINT | 1 | One-point calibration mode. |
| FIVE_POINT | 5 | Five-point calibration mode. |
| SIX_POINT | 6 | Six-point calibration mode. |

AccuracyCriteria

  @objc public enum AccuracyCriteria: Int

The enum that contains accuracy criteria of GazeTracker.startCalibration.

| Constant | Value | Description |
| --- | --- | --- |
| DEFAULT | 0 | Default calibration accuracy criteria. |
| LOW | 1 | Low calibration accuracy criteria. |
| HIGH | 2 | High calibration accuracy criteria. |