FaceAI Web SDK is the fastest, lightest and most complete Facial Expression Analysis and Emotion Recognition AI that works on the client endpoint/device within any HTML5 web browser on mobile or desktop. The FaceAI SDK works right on the client end; none of your personal data goes to a server.
Released: September 25, 2020
Table of Contents
- How to use FaceAI Web SDK?
- Initialize FaceAI
- Face Detector
- Face Pose
- Face Age
- Face Gender
- Face Emotion
- Face Features
- Face Arousal Valence
- Face Attention
- Face Wish
How to use FaceAI Web SDK?
- Download the SDK and extract the archive. You will find the following .js file:
EnxFaceAI.js
– This is a standard JavaScript library.
- Now, use the EnxFaceAI.js file in your HTML file to make use of the SDK.
<html>
  <head>
    <script src="path/EnxFaceAI.js"></script>
  </head>
  <body></body>
</html>
Note: You must enable FaceAI while defining a Room for Facial Expression Analysis to work. Use { "settings": { "facex": true } } to define a room.
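For reference, a minimal room-definition body with FaceAI enabled might look as follows. Only settings.facex is taken from this document; the name field is an illustrative placeholder:

```javascript
// Hypothetical room-definition body; only settings.facex is documented
// here, "name" is an illustrative placeholder.
const roomDefinition = {
  name: "facex-demo-room", // illustrative placeholder
  settings: {
    facex: true // required for FaceAI Facial Expression Analysis
  }
};
```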
Initialize FaceAI
FaceAI analyses Video Streams within an ongoing EnableX Video Session. To get started with analysis, you must bind the FaceAI Object to the Room in which the Video Session is running.
Method: EnxFaceAI.init(connectedRoomInfo, stream, callback)
– To start analysis on a given Stream
Parameters:
connectedRoomInfo
– JSON Object. Pass the Response-JSON returned as Callback of EnxRtc.joinRoom() or EnxRoom.connect() method.
stream
– The Stream Object which will be analyzed. You may analyze a Local Stream Object or a Remote Stream Object (the Stream Reference may be found in the Active Talkers List).
callback
: Callback to know if the room is enabled for FaceAI Analysis and that the client endpoint is connected to an active session.
Example:
localStream = EnxRtc.joinRoom(token, config, (response, error) => {
  if (error && error != null) {
    // Handle error
  }
  if (response && response != null) {
    const FaceAI = new EnxFaceAI(); // Construct the Object
    FaceAI.init(response, localStream, function (event) {
      // event.result == 0 - All Ok to process
    });
  }
});
Face Detector
This is to detect how many faces are present in a Video Stream. The Event Listener keeps receiving JSON data as the detector tries to detect faces in the changing video frames.
Method: EnxFaceAI.startFaceDetector(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Detector would analyze.
maxInputFrameSize
– Number. Default 160 (pixels). Input Frame Size in pixels for Face Detection.
fullFrameDetection
– Boolean. When true, detection runs on the full frame and multiple faces can be returned; false otherwise.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-detector
– This event fires repeatedly with the Face Detection analysis report as a JSON Object. JSON Object Reference appended below:
{
  faces: Array(n),
  rects: Array(n),
  status: string
}
JSON Object Explanation:
faces
: Array. The detected faces in the form of ImageData objects (zero or one; or multiple faces, if fullFrameDetection is true).
rects
: Array of objects. Describes the bounding boxes (zero or one; or multiple rects, if fullFrameDetection is true):
x
: Upper-left point x coordinate
y
: Upper-left point y coordinate
width
: Width of the bounding box
height
: Height of the bounding box
status
: String. The status of the face tracker:
INIT
: Detector initializing; zero or many faces could be returned
TRACK_OK
: Detector is correctly tracking one face; one face is returned
RECOVERING
: Detector lost a face and is attempting to recover and continue tracking; zero faces are returned
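To make the payload above concrete, here is a small helper sketch that counts the detected faces and picks the largest bounding box. The sample payload is fabricated, but its field names follow the reference above:

```javascript
// Summarize a face-detector event payload: face count and the largest
// bounding box by area. Returns null for the box when no faces are tracked.
function summarizeDetection(detail) {
  const rects = detail.rects || [];
  let largest = null;
  for (const r of rects) {
    if (!largest || r.width * r.height > largest.width * largest.height) {
      largest = r;
    }
  }
  return { count: rects.length, status: detail.status, largest };
}

// Sample payload shaped like the reference above (values are made up)
const sample = {
  faces: [],
  rects: [
    { x: 10, y: 20, width: 80, height: 100 },
    { x: 200, y: 40, width: 120, height: 140 }
  ],
  status: "TRACK_OK"
};
```

In an application this would be called from the listener, e.g. summarizeDetection(evt.detail) inside the face-detector handler.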
Method: EnxFaceAI.stopFaceDetector(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  maxInputFrameSize: 200,
  fullFrameDetection: true
};

// Start Face Detector
faceAI.startFaceDetector(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-detector", (evt) => {
      console.log(evt.detail, "face-detector");
    });
  }
});

// Stop Face Detector
faceAI.stopFaceDetector((res) => {
  if (res.result === 0) {
    // Face Detector stopped
  }
});
Face Pose
This is to analyze face rotation and position in a Video Stream. The Event Listener keeps getting data in JSON as FaceAI keeps detecting face rotation in the video stream. Face Rotation angle data is expressed in radians as Pitch, Roll and Yaw.
Method: EnxFaceAI.startFacePose(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Pose would be analyzed.
smoothness
: Number. Default 0.65. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time. Set it to 0 (zero) if you need the raw signal.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-pose
– This event fires repeatedly with the Face Rotation & Position analysis report as a JSON Object. JSON Object Reference appended below:
{
  output: {
    pose: {
      pitch: Number,
      roll: Number,
      yaw: Number
    }
  }
}
JSON Object Explanation:
output
: Face Rotation & Position Report
pose
: Filtered (smoothened) pose rotation angles expressed in radians as pitch, roll and yaw.
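Since the pose angles arrive in radians, a display layer often wants degrees. A minimal sketch, assuming the event payload shape shown above:

```javascript
// Convert the pose angles (radians) from a face-pose event payload into
// degrees, which are often easier to present in a UI.
function poseToDegrees(detail) {
  const toDeg = (rad) => rad * (180 / Math.PI);
  const { pitch, roll, yaw } = detail.output.pose;
  return { pitch: toDeg(pitch), roll: toDeg(roll), yaw: toDeg(yaw) };
}
```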

Method: EnxFaceAI.stopFacePose(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.65
};

// Start Face Pose
faceAI.startFacePose(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-pose", (evt) => {
      console.log(evt.detail, "face-pose");
    });
  }
});

// Stop Face Pose
faceAI.stopFacePose((res) => {
  if (res.result === 0) {
    // Face Pose stopped
  }
});
Face Age
This is to analyze and predict face age in a Video Stream. Face Age predicts within an age range. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face age. If the prediction quality is poor, the event is not fired.
Method: EnxFaceAI.startFaceAge(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Age would be analyzed. Currently EnableX doesn’t have any parameter; therefore, keep it empty while calling this method.
callback
: Callback to know that processing request has been accepted.
Event Listener:
face-age
– This event fires repeatedly with the Face Age analysis report as a JSON Object. JSON Object Reference appended below:
{
  output: {
    age: {
      -18: Number,
      18-35: Number,
      35-51: Number,
      51-: Number
    },
    numericAge: Number
  }
}
JSON Object Explanation:
output
: Face Age Analysis Report
age
: Filtered (smoothened) age prediction:
-18
: Probability Weightage suggesting less than 18 years old.
18-35
: Probability Weightage suggesting between 18 to 35 years old.
35-51
: Probability Weightage suggesting between 35 to 51 years old.
51-
: Probability Weightage suggesting equal to or greater than 51 years old.
numericAge
: Numeric. Estimated Age
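As a sketch of consuming the report above, the helper below picks the age bracket with the highest probability weightage (the sample payload is made up):

```javascript
// Return the age-bracket key with the highest probability weightage
// from a face-age event payload.
function likelyAgeBracket(detail) {
  const age = detail.output.age;
  return Object.keys(age).reduce((best, k) => (age[k] > age[best] ? k : best));
}

// Sample payload shaped like the reference above (values are made up)
const sampleAgeEvent = {
  output: {
    age: { "-18": 0.05, "18-35": 0.7, "35-51": 0.2, "51-": 0.05 },
    numericAge: 27
  }
};
```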
Method: EnxFaceAI.stopFaceAge(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {};

// Start Face Age
faceAI.startFaceAge(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-age", (evt) => {
      console.log(evt.detail, "face-age");
    });
  }
});

// Stop Face Age
faceAI.stopFaceAge((res) => {
  if (res.result === 0) {
    // Face Age stopped
  }
});
Face Gender
This is to analyze face gender in a Video Stream. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face gender.
Method: EnxFaceAI.startFaceGender(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Gender would be analyzed.
smoothness
: Number. Default 0.95. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time. Set it to 0 (zero) if you need the raw signal.
threshold
: Number. Default 0.70. Range 0.5-1. It controls the minimum value of confidence for which the mostConfident output returns the predicted gender name instead of undefined.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-gender
– This event fires repeatedly with the Face Gender analysis report as a JSON Object. JSON Object Reference appended below:
{
  output: {
    gender: {
      Female: Number,
      Male: Number
    },
    mostConfident: String
  }
}
JSON Object Explanation:
output
: Face Gender Report
gender
: Filtered (smoothened) probabilities of the gender prediction:
Female
: Probability weightage that the gender is female
Male
: Probability weightage that the gender is male
mostConfident
: Gender name of the most likely result if its smoothened probability is above the threshold; otherwise it is undefined.
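The mostConfident semantics above can be mirrored client-side, for example to apply a stricter threshold than the one configured. A sketch; the default mirrors the documented 0.70:

```javascript
// Recompute a mostConfident-style value from the smoothened gender
// probabilities: the most likely name, or undefined below the threshold.
function mostConfidentGender(gender, threshold = 0.70) {
  const [name, prob] = Object.entries(gender)
    .reduce((a, b) => (b[1] > a[1] ? b : a));
  return prob > threshold ? name : undefined;
}
```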
Method: EnxFaceAI.stopFaceGender(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.95,
  threshold: 0.70
};

// Start Face Gender
faceAI.startFaceGender(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-gender", (evt) => {
      console.log(evt.detail, "face-gender");
    });
  }
});

// Stop Face Gender
faceAI.stopFaceGender((res) => {
  if (res.result === 0) {
    // Face Gender stopped
  }
});
Face Emotion
This is to analyze face emotions in a Video Stream. It analyses 7 basic emotions in a human face, viz. Angry, Disgust, Fear, Happy, Sad, Surprise and Neutral. It also returns the most dominant emotion on the face. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face emotion.
Method: EnxFaceAI.startFaceEmotion(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Emotion would be analyzed.
smoothness
: Number. Default 0.95. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time. Set it to 0 (zero) if you need the raw signal.
threshold
: Number. Default 0.70. Range 0.5-1. It controls the minimum value of confidence for which the dominantEmotion output returns the predicted emotion name instead of undefined.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-emotion
– This event fires repeatedly with the Face Emotion analysis report as a JSON Object. JSON Object Reference appended below:
{
  output: {
    dominantEmotion: String,
    emotion: {
      Angry: Number,
      Disgust: Number,
      Fear: Number,
      Happy: Number,
      Neutral: Number,
      Sad: Number,
      Surprise: Number
    }
  }
}
JSON Object Explanation:
output
: Face Emotion Report
dominantEmotion
: Name of the Dominant Emotion if present; otherwise it is undefined.
emotion
: Filtered (smoothened) values of the probability distribution of emotions. The sum of all the probabilities is always 1; each probability in the distribution has a value between 0 and 1.
Angry
: Probability for Angry.
Disgust
: Probability for Disgust.
Fear
: Probability for Fear.
Happy
: Probability for Happy.
Neutral
: Probability for Neutral.
Sad
: Probability for Sad.
Surprise
: Probability for Surprise.
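As a sketch of consuming the distribution above, the helper below ranks the emotions by smoothened probability, highest first (the sample distribution is made up and sums to 1):

```javascript
// Rank emotion names from a face-emotion distribution, highest
// probability first.
function rankEmotions(emotion) {
  return Object.entries(emotion)
    .sort((a, b) => b[1] - a[1])
    .map(([name]) => name);
}

// Sample distribution shaped like the reference above (values made up)
const sampleDistribution = {
  Angry: 0.05, Disgust: 0.02, Fear: 0.03, Happy: 0.6,
  Neutral: 0.2, Sad: 0.05, Surprise: 0.05
};
```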
Method: EnxFaceAI.stopFaceEmotion(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.95,
  threshold: 0.70
};

// Start Face Emotion
faceAI.startFaceEmotion(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-emotion", (evt) => {
      console.log(evt.detail, "face-emotion");
    });
  }
});

// Stop Face Emotion
faceAI.stopFaceEmotion((res) => {
  if (res.result === 0) {
    // Face Emotion stopped
  }
});
Face Features
This is to analyze face features in a Video Stream. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face features.
Method: EnxFaceAI.startFaceFeatures(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Features would be analyzed.
smoothness
: Number. Default 0.90. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time. Set it to 0 (zero) if you need the raw signal.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-features
– This event fires repeatedly with the Face Features analysis report as a JSON Object. JSON Object Reference appended below:
{
  output: {
    features: {
      ArchedEyebrows: Number,
      Attractive: Number,
      ....
    }
  }
}
JSON Object Explanation:
output
: Face Features Report
features
: Filtered (smoothened) probabilities of each independent face feature, in range 0.0 – 1.0. The following features are evaluated:
- Arched Eyebrows
- Attractive
- Bags Under Eyes
- Bald
- Bangs
- Beard 5 O’Clock Shadow
- Big Lips
- Big Nose
- Black Hair
- Blond Hair
- Brown Hair
- Chubby
- Double Chin
- Earrings
- Eyebrows Bushy
- Eyeglasses
- Goatee
- Gray Hair
- Hat
- Heavy Makeup
- High Cheekbones
- Lipstick
- Mouth Slightly Open
- Mustache
- Narrow Eyes
- Necklace
- Necktie
- No Beard
- Oval Face
- Pale Skin
- Pointy Nose
- Receding Hairline
- Rosy Cheeks
- Sideburns
- Straight Hair
- Wavy Hair
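To turn the report above into something displayable, one might keep only the features whose probability clears a cutoff. A minimal sketch; the 0.5 cutoff is an illustrative choice, not an SDK value:

```javascript
// List feature names whose smoothened probability exceeds a cutoff,
// e.g. to show only confidently detected features in a UI.
function prominentFeatures(features, cutoff = 0.5) {
  return Object.keys(features).filter((name) => features[name] > cutoff);
}
```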
Method: EnxFaceAI.stopFaceFeatures(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.90
};

// Start Face Features
faceAI.startFaceFeatures(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-features", (evt) => {
      console.log(evt.detail, "face-features");
    });
  }
});

// Stop Face Features
faceAI.stopFaceFeatures((res) => {
  if (res.result === 0) {
    // Face Features stopped
  }
});
Face Arousal Valence
This is to analyze face arousal valence in a Video Stream. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face arousal valence.
Method: EnxFaceAI.startFaceArousalValence(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Arousal Valence would be analyzed.
smoothness
: Number. Default 0.70. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time. Set it to 0 (zero) if you need the raw signal.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-arousal-valence
– This event fires repeatedly with the Face Arousal Valence analysis report as a JSON Object. JSON Object Reference appended below:
{
  output: {
    arousalvalence: {
      arousal: Number,
      valence: Number
    }
  }
}
JSON Object Explanation:
output
: Face Arousal Valence Report
arousalvalence
: Filtered (smoothened) values:
arousal
: Range -1.0 to 1.0. It represents the degree of engagement (positive arousal) or disengagement (negative arousal).
valence
: Range -1.0 to 1.0. It represents the degree of pleasantness (positive valence) or unpleasantness (negative valence).
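A common way to read arousal/valence pairs is as quadrants of the circumplex model. The labels below are an illustrative interpretation aid, not part of the SDK output:

```javascript
// Map an arousal/valence pair to one of four illustrative quadrant
// labels (not an SDK value): engagement sign x pleasantness sign.
function quadrant({ arousal, valence }) {
  if (arousal >= 0) return valence >= 0 ? "engaged-pleasant" : "engaged-unpleasant";
  return valence >= 0 ? "disengaged-pleasant" : "disengaged-unpleasant";
}
```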
Method: EnxFaceAI.stopFaceArousalValence(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.70
};

// Start Face Arousal Valence
faceAI.startFaceArousalValence(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-arousal-valence", (evt) => {
      console.log(evt.detail, "face-arousal-valence");
    });
  }
});

// Stop Face Arousal Valence
faceAI.stopFaceArousalValence((res) => {
  if (res.result === 0) {
    // Face Arousal Valence stopped
  }
});
Face Attention
This is to analyze face attention in a Video Stream. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face attention.
Method: EnxFaceAI.startFaceAttention(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Attention would be analyzed.
smoothness
: Number. Default 0.83. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time. Set it to 0 (zero) if you need the raw signal.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-attention
– This event fires repeatedly with the Face Attention analysis report as a JSON Object. JSON Object Reference appended below:
{ output: { attention: Number } }
JSON Object Explanation:
output
: Face Attention Report
attention
: Filtered (smoothened) value in range [0.0, 1.0]. A value close to 1.0 represents attention; a value close to 0.0 represents distraction.
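Because single attention readings are noisy even after smoothing, an application might average the last few values before flagging distraction. A sketch; the window size and cutoff are illustrative, not SDK values:

```javascript
// Create a monitor that keeps a short rolling average of attention
// values and flags distraction when the average falls below a cutoff.
function makeAttentionMonitor(windowSize = 5, cutoff = 0.4) {
  const values = [];
  return function update(attention) {
    values.push(attention);
    if (values.length > windowSize) values.shift();
    const avg = values.reduce((sum, v) => sum + v, 0) / values.length;
    return { average: avg, distracted: avg < cutoff };
  };
}
```

Each face-attention event would feed the monitor, e.g. monitor(evt.detail.output.attention) inside the listener.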
Method: EnxFaceAI.stopFaceAttention(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.85
};

// Start Face Attention
faceAI.startFaceAttention(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-attention", (evt) => {
      console.log(evt.detail, "face-attention");
    });
  }
});

// Stop Face Attention
faceAI.stopFaceAttention((res) => {
  if (res.result === 0) {
    // Face Attention stopped
  }
});
Face Wish
This is to analyze face wish in a Video Stream. The Event Listener keeps getting data in JSON as FaceAI keeps analyzing face wish.
Method: EnxFaceAI.startFaceWish(config, callback)
– To start analysis
Parameters:
config
– JSON Object. This is to configure or customize the parameters using which the Face Wish would be analyzed.
smoothness
: Number. Default 0.80. Range 0-1. A value closer to 1 provides greater smoothing and slower response time. Lower values provide less smoothing but faster response time.
callback
: Callback to know that the processing request has been accepted.
Event Listener:
face-wish
– This event fires repeatedly with the Face Wish analysis report as a JSON Object. JSON Object Reference appended below:
{ output: { wish: Number } }
JSON Object Explanation:
output
: Face Wish Report
wish
: Filtered (smoothened) value in range [0, 1.0]. A value closer to 0 represents a lower wish; a value closer to 1.0 represents a higher wish.
Method: EnxFaceAI.stopFaceWish(callback)
– To stop analysis
Parameters:
callback
: Callback to know that request has been accepted.
Example:
config = {
  smoothness: 0.80
};

// Start Face Wish
faceAI.startFaceWish(config, (res) => {
  if (res.result === 0) {
    window.addEventListener("face-wish", (evt) => {
      console.log(evt.detail, "face-wish");
    });
  }
});

// Stop Face Wish
faceAI.stopFaceWish((res) => {
  if (res.result === 0) {
    // Face Wish stopped
  }
});