
SDK Methods

create

Description:

You need to call this method before using the SDK. The method returns an instance of the SDK.

Parameters:

Name | Type | Description
apiKey | string | API key
apiURL | string | API URL
lang | string | Language (en or es)
encrypt | boolean | Indicates the use of encryption (default: false)
translations | object | The object with the translations. (Ask support) (optional)
opencvURL | string | The URL of OpenCV (optional)
facefinderURL | string | The URL of facefinder (optional)
darkMode | boolean | Show dark variants of all tutorials (default: false)

Returns

  • Object:

    The SDK instance

Example:

// Get onBoarding instance
const onBoarding = OnBoarding.create({
  apiKey: "myApiKey",
  apiURL: "myApiURL",
  lang: "en",
});

// You can save it in a global variable
var onBoarding = OnBoarding.create({ apiKey: "myApiKey", apiURL: "myApiURL" });

// use the instance
onBoarding.createSession(countryCode).then((token) => {
  // save token
});

createSession

Description:

Initializes a new onboarding session.

Parameters:

Name | Type | Description
countryCode | string | It can be 'US' or 'MX'.
externalId | string (optional) | Any ID
options | object (optional) | Check the Options section

Options:

Name | Type | Description
interviewId | string | The interviewId of another session. If you send this parameter, a new session is not created; the one referenced by interviewId is used instead.
configurationId | string | The flow (configuration) ID. If you send this parameter, the thresholds, checks, auto-approve, etc. of that flow will be used.
customFields | object | The object containing the custom fields

Returns:

  • Object:

    JSON object containing the access token

Example:

// Get token
onBoarding.createSession(countryCode).then(token => {
// save token
...
});

Or with an async function

// Get session
const session = await onBoarding.createSession(countryCode);
const token = session.token;
const interviewId = session.interviewId;

With an interviewId

const session = await onBoarding.createSession("MX", null, {
  interviewId: "5ec49fd49999999999999f1e",
});
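
To resume an existing session instead of creating a new one, the interviewId option can be wrapped in a small helper. A minimal sketch, assuming you persist the interviewId yourself (getOrCreateSession is not an SDK method, just an illustration):

```javascript
// Hypothetical helper: resumes a previous session when an interviewId was
// saved (e.g. in localStorage), otherwise creates a fresh one.
async function getOrCreateSession(sdk, countryCode, savedInterviewId) {
  if (savedInterviewId) {
    // Reuse the existing interview instead of creating a new session
    return sdk.createSession(countryCode, null, { interviewId: savedInterviewId });
  }
  return sdk.createSession(countryCode);
}
```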

isDesktop

Description:

This function tells you whether the user is on a desktop computer (which can be a laptop).

Example:

if (onBoarding.isDesktop()) {
  onBoarding.renderRedirectToMobile(containerRef.current, {
    onSuccess: () => {
      renderFinishScreen();
    },
    session: session,
    flowId: flowId,
  });
} else {
  // show mobile normal flow
  renderFrontId();
}

renderRedirectToMobile

Description:

This method renders the Redirect Component

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object

Options:

Name | Type | Description
session | object* | The session object you get from createSession (mandatory)
flowId | string | ID of the flow that will be used on mobile
onSuccess | function* | Callback when the onboarding on mobile completed successfully
url | string | URL to be redirected to
showSms | boolean | Boolean flag indicating if the SMS component should be shown
allowReEnrollment | boolean | Boolean flag indicating if user re-enrollment is allowed
externalId | string | External ID to concatenate to the URL if needed
assets | object | Object containing assets from the dashboard
expired | boolean | Boolean flag indicating if a session is expired
smsText | string | Custom text to be sent in the SMS. The URL is concatenated after this text

Example:

const flowId = "someflowid";

const session = await onBoarding.createSession("ALL", null, {
  configurationId: flowId,
});

if (onBoarding.isDesktop()) {
  onBoarding.renderRedirectToMobile(containerRef.current, {
    onSuccess: () => {
      renderFinishScreen();
    },
    session: session,
    flowId: flowId,
  });
} else {
  // show mobile normal flow
  renderFrontId();
}

sendGeolocation

Description:

This is almost the same as addGeolocation, with the difference that this method automatically asks the user for the coordinates.

Parameters:

Name | Type | Description
token | string | access token to be used in all other methods.

Returns:

  • Object:

    JSON object

  • JSON example:

    {
      "location": "Belgrade, Serbia"
    }

Example:

onBoarding.sendGeolocation({ token }).then((res) => res);

renderCamera

Description:

This method is going to render the camera with the autodetect and notifications in the browser.

Parameters:

Name | Type | Description
type | string | Type of captured image: 'front', 'back', 'selfie', 'document', 'passport', 'secondId', 'thirdId', 'insuranceCard'
container | HTMLElement | The container element
options | object | Options object

Options:

Name | Type | Description
onSuccess | function | Callback when the detection was successful, with a param object
onError | function | Callback when the detection was unsuccessful
token | object | The session object (optional; if not provided it is read from state)
numberOfTries | number | The number of capturing attempts
onLog | function | A function with a logger
permissionMessage | string | The message on the permission screen. Default: "Press \"Allow\" to continue."
permissionBackgroundColor | string | The background of the permission screen. Default: "#696969"
sendBase64 | boolean | If you need to send the image in base64, set this to true (only if type = 'selfie')
timeout | number | The timeout in milliseconds for looking for a face (only if type = 'selfie')
showPreview | boolean | Set to true to show a capture preview (only if type = 'selfie')
stream | MediaStream | If you send a stream, it won't ask the user again. This is an advanced option.
nativeCamera | boolean | Only applicable to documents. If set to true, the native camera is used for the capture (default: false)
showTutorial | boolean | Option to show the tutorial (default: false)
isRecordingEnabled | boolean | Option to record the capture camera (default: false)
onRestartOnboarding | function | Callback for the restart flow after camera access is denied (default: () => window.location.reload())
showDoublePermissionsRequest | boolean | Option to show the custom permission prompt (default: false)
onlyAllowCapture | boolean | Option to disable the uploading of files and images. Only works when the type is document and nativeCamera is set to true (default: false)
disableFullScreen | boolean | Option to disable the SDK's full-screen behaviour. This will use the container width and height. Use this only for desktop apps. (default: false)
showCustomCameraPermissionScreen | boolean | Option to open a modal with instructions to allow camera permissions when the user denies them. (default: false)

onSuccess parameters:

Name | Type | Description
classification | boolean | Flag indicating whether the server classified the image as an ID
correctGlare | boolean | Flag indicating whether the glare is in the good range
correctSharpness | boolean | Flag indicating whether the sharpness is in the good range
countryCode | string | Valid ISO alpha-2 or alpha-3 code of the ID issuing country
glare | number | Glare level
horizontalResolution | number | Horizontal resolution in pixels
issueName | string | Name of the provided ID
issueYear | number | The year when the ID was issued
readability | boolean | Flag indicating whether the data is readable from the ID
sessionStatus | string | Current status of the session
sharpness | number | Sharpness level
skipBackIdCapture | boolean | Flag for skipping back ID capture
typeOfId | string | Type of ID (ID, Passport, ...)
failedReason | string | Classification failure reason.

Returns:

  • Object:

    JSON object

    Name | Type | Description
    close | function | This will close the camera and unmount the component

Example:

const { close } = incode.renderCamera("front", container, {
  onSuccess: onSuccess,
  onError: onError,
});

// then you can close the SDK from your code
document.querySelector("#close-modal").addEventListener("click", () => {
  // this closes the component prematurely, so use it only if you want to
  // abort the process and close the camera
  close();
});
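
The onSuccess result fields documented above can drive retry logic in your UI. A minimal sketch (the shouldRetryCapture helper is hypothetical, not part of the SDK):

```javascript
// Hypothetical helper: decide whether to ask the user for another capture,
// based on the onSuccess result fields documented above.
function shouldRetryCapture(result) {
  // Retry if the server did not classify the image as an ID,
  // if glare/sharpness are out of range, or if the data is unreadable.
  if (!result.classification) return true;
  if (!result.correctGlare || !result.correctSharpness) return true;
  if (!result.readability) return true;
  return false;
}
```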

processId

Description:

This method should be called once the uploads of the front and back sides of the ID are complete. Incode validations, tests and OCR parsing are processed during this call. To fetch OCR data, use the ocrData method. The queueName parameter is needed so it can be determined which queue the interview should go to. Once this is called, the user is no longer able to upload the ID (front, back side, passport or insurance card)!

Parameters:

Name | Type | Description
token | string | access token.

Returns:

  • Object:

    JSON object

  • JSON example:

    {
      "success": true
    }

Example:

const response = await onBoarding.processId({
  token,
});

processFace

Description:

This method should be called after both the selfie and the front of the ID are uploaded (/omni/add/face and /omni/add/front-id). The response contains information on whether this user already exists in the system and the recognition confidence between the selfie image and the face image from the front of the ID. To get the face images, call the /omni/get/images method.

Parameters:

Name | Type | Description
token | string | access token.

Returns:

  • Object:

    JSON object

  • Parameters:

    Name | Type | Description
    confidence | number | Float number 0 - 1 indicating recognition confidence
    existingUser | boolean | Indicates if the user has already been enrolled
  • JSON example:

    {
      "confidence": 0.794,
      "existingUser": false
    }

Example:

const response = await onBoarding.processFace({
  token,
});
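
The confidence and existingUser fields in the response can decide the next step, e.g. whether to treat this as a returning user. A small sketch (the faceDecision helper and the 0.5 threshold are assumptions for illustration, not SDK values):

```javascript
// Hypothetical helper: decides how to proceed based on the processFace
// response. The 0.5 confidence threshold is illustrative only.
function faceDecision({ confidence, existingUser }, threshold = 0.5) {
  if (existingUser) return "existing-user-flow";
  return confidence >= threshold ? "continue-onboarding" : "retry-selfie";
}
```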

renderFaceMatch

Description:

This method renders the component that shows the face match animation.

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object

Options:

Name | Type | Description
token | string | The session object
onSuccess | function | Callback when the face match animation finishes
liveness | boolean | Whether liveness passed. You get this in the onSuccess callback from renderCamera('selfie', ...)
existingUser | boolean | Whether the user is already in our database. You get this in the onSuccess callback from renderCamera('selfie', ...)

Example:

incode.renderFaceMatch(document.getElementById("app"), {
  token: session,
  onSuccess: console.log,
  onError: console.log,
  liveness: liveness,
  existingUser: existingUser,
});

renderMlConsent

Description:

This method will render the ML Consent component.

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object

Options:

Name | Type | Description
token | object | The token object (mandatory).
type | string | ML type (Worldwide)
onSuccess | function | Callback when the consent was submitted successfully

Example:

onBoarding.renderMlConsent(document.getElementById("app"), {
  token: session,
  onSuccess() {
    console.log("ML accepted");
  },
});

renderQrScanner

Description:

This method will render the QR Scanner component.

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object

Options:

Name | Type | Description
session | object | The session object (mandatory).
onSuccess | function | Callback when the QR scan was successful

Example:

onBoarding.renderQrScanner(document.getElementById("app"), {
  session: session,
  onSuccess() {
    console.log("QR scanned");
  },
});

renderEnterCurp

Description:

This method renders the component to enter and validate CURP.

Parameters:

Name | Type | Description
token | string | access token to be used in all other methods.

Example:

incode.renderEnterCurp(document.getElementById("app"), {
  token: session.token,
  onSuccess: console.log,
  onError: console.log,
});

processGovernmentValidation

Description:

This method should be called after both /omni/process/id (or /omni/add/document-id) and /omni/add/face have completed. It compares the user's selfie image against the image in the INE database.

Parameters:

Name | Type | Description
token | string | access token.

Returns:

  • Object:

    JSON object

  • JSON example:

    {
      "success": true
    }

Example:

const response = await onBoarding.processGovernmentValidation({
  token,
});

renderSignature

Description:

This method will render the Signature Component.

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object

Options:

Name | Type | Description
token | string | The token string (mandatory)
onSuccess | function | Callback when the upload of the signature was successful
onError | function | Callback when the upload of the signature was unsuccessful
type | string | Type of signature. This can be used for signing contracts
initials | boolean | If true, this will send initials instead of a signature
title | jsx | The title element in JSX syntax
subtitle | jsx | The subtitle element in JSX syntax
canvasBackgroundColor | string | The background color of the canvas (default: '#fff')
canvasBorderColor | string | The border color of the canvas (default: '#20263d')
penColor | string | The pen color of the canvas (default: '#20263d')

Example:

onBoarding.renderSignature(document.getElementById("app"), {
  token: session.token,
  onSuccess: console.log,
  onError: console.log,
});

addDocument

Description:

Upload document image and store data.

Parameters:

Name | Type | Description
token | string | The session object.
type | string | Possible values: signature, document, medicalDoc, thirdId, contract
image | string | Image of a document represented in base64.

Returns:

  • Object:

    JSON object

  • Parameters:

    Name | Type | Description
    success | boolean | If the call completed correctly.
    additionalInformation | object | Info about the signed contract

  • additionalInformation:

    Name | Type | Description
    contractId | string | Used later when adding a signature to the contract.
  • JSON example:

    {
      "success": true,
      "additionalInformation": {
        "contractId": "cf2ec1fd-bf20-4885-b748-d7e761592d19"
      }
    }

Example:

incode.addDocument({
  token,
  type: "contract",
  image: contractBase64,
});

attachSignatureOnAnnotation

Description:

Attaches the signature from the interview to previously uploaded PDF contracts at all places annotated with the text from the request, with the desired height on the PDF. Annotations must contain text with either preselected values, or an alternative can be supplied. Preselected values are: INITIALS, PATIENTS_SIGNATURE, NON_MEDICAL_SIGNATURE_1, NON_MEDICAL_SIGNATURE_2, MEDICAL_SIGNATURE_1, MEDICAL_SIGNATURE_2. The endpoint is intended to be used as a fluent method: one call responds with the ID of the signed document, which can then be used in a subsequent call to add another signature on a different annotation.

Parameters:

Name | Type | Description
token | string | The session object.
contractId | string (optional) | ID of a previously added contract (check addDocument)
signatureType | string (optional) | One of the preselected values for the annotation text
signedContractId | string (optional) | ID of a previously signed contract (the ID is returned in the response of the previous attachSignatureOnAnnotation call)

Returns:

  • Object:

    JSON object

  • Parameters:

    Name | Type | Description
    success | boolean | If the call completed correctly.
    additionalInformation | object | Info about the signed contract

  • additionalInformation:

    Name | Type | Description
    signedDocumentId | string | New signed document ID
    signedDocumentURL | string | Temporary URL to the new signed document
  • JSON example:

    {
      "success": true,
      "additionalInformation": {
        "signedDocumentId": "new signed document ID",
        "signedDocumentURL": "temporary URL to new signed document"
      }
    }

Example:

incode.attachSignatureOnAnnotation({
  contractId: "somecontractid",
  signatureType: "PATIENTS_SIGNATURE",
  token: session.token,
  signedContractId: "someSignedContractId",
});
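
Because each call returns the signed document ID to feed into the next call, multiple annotations can be signed in sequence. A minimal sketch of that fluent pattern (the signAnnotations helper is hypothetical, not part of the SDK):

```javascript
// Hypothetical helper: signs each annotation type in order, threading the
// signedDocumentId returned by one call into the next call's signedContractId.
async function signAnnotations(sdk, token, contractId, signatureTypes) {
  let signedContractId;
  for (const signatureType of signatureTypes) {
    const res = await sdk.attachSignatureOnAnnotation({
      token,
      contractId,
      signatureType,
      signedContractId,
    });
    signedContractId = res.additionalInformation.signedDocumentId;
  }
  return signedContractId; // ID of the fully signed document
}
```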

renderVideoSelfie

Description:

This method renders the VideoSelfie Component

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object
callback | object | Callbacks object

Options:

Name | Type | Description
token | string | The token string (mandatory)
showTutorial | boolean | Option to show the tutorial (default: false)
modules | Array[String] | The modules you want to perform in the video selfie. Check the table below for all available modules (the default is all modules)
speechToTextCheck | boolean | Option to perform the speech-to-text check. If this is false, the video selfie will always call onSuccess (default: true)
performLiveness | boolean | Turns the liveness check on or off (default: false)
videoSelfieAsSelfie | boolean | Chooses which image the face check is made from, the selfie or the video selfie. If true, it will be the selfie
compareOCREnabled | boolean | Turns the front OCR check on or off (default: false)
compareIDEnabled | boolean | Turns the front ID check on or off (default: true)
compareBackOCREnabled | boolean | Turns the back OCR check on or off (default: false)
compareBackIDEnabled | boolean | Turns the back ID check on or off (default: false)
questionsCount | number | Option to set the number of questions in the questions module (default: 3)

Modules

Name | Description
selfie | Selfie step
front | Front ID step
back | Back ID step
poa | Proof of address step
questions | Random questions step
speech | The user will perform an authorization. Important: this step is mandatory and needs to be the last one

Callback

Name | Type | Description
onSuccess | function | Callback when the video selfie was successful. If this is called, the speech detection was successful.
onError | function | Callback when the detection was unsuccessful. If this is called, the speech detection was unsuccessful.
numberOfTries | number | The number of tries for speech-to-text

Example:

onBoarding.renderVideoSelfie(
  container,
  {
    token: session,
    showTutorial: true,
    modules: ["selfie", "front", "back", "speech"], // you can add 'poa' and 'questions'
    speechToTextCheck: true, // this is the check for the speech
  },
  {
    onSuccess: () => alert("speech detected"),
    onError: () => alert("speech not detected"),
    numberOfTries: 3,
  }
);

renderConference

Description:

This method will render the Conference component

Parameters:

Name | Type | Description
container | HTMLElement | The container element
options | object | Options object
callback | object | Callbacks object

Options:

Name | Type | Description
token | object | The session object (mandatory)
showOTP | boolean | If true, it will render the OTP screen before the conference
numberOfTries | number | The number of tries that the user has for the OTP. Set to -1 for infinite tries. Default: 3
queue | string | The queue for the video conference. Default: ''

Callbacks:

Name | Type | Description
onSuccess | function | Passes the status as a parameter. Possible statuses are: close, deny, approve
onError | function | Callback when something went wrong in the conference
onConnect | function | Callback when the conference is connected
onLog | function | Callback for logging

Example:

onBoarding.renderConference(
  container,
  {
    token: token,
    showOTP: false,
  },
  {
    onSuccess: (status) => {
      log(status);

      switch (status) {
        case "close":
          console.log("The conference was closed");
          goToStep("closeCase");
          break;
        case "deny":
          console.log("The conference was denied");
          goToStep("denyCase");
          break;
        case "approve":
          console.log("The conference was approved");
          goToStep("approveCase");
          break;
      }
    },
  }
);

getFinishStatus

Description:

This method is called after you finish the onboarding process, but only if you are using custom flows. It is required for flow configurations, such as auto-approve, to function properly.

Parameters:

Name | Type | Description
flowId | string | The ID of the flow you put in createSession (optional, can be null)
token | string | The token obtained from createSession.

Returns:

  • Object:

    JSON object

  • Properties:

    Name | Type | Description
    redirectionUrl | string | URL to redirect to when onboarding is done
    action | string | Possible values: none | approved | manualReview | rejected. Refers to the action performed based on the flow score configurations, if applicable.
  • JSON example:

    {
      "action": "none",
      "redirectionUrl": ""
    }

Example:

// First, call createSession with the flow ID you want to use in the onboarding
// (it's called configurationId in the createSession method)
const flowId = "someFlowId";
const { token } = await incode.createSession(code, undefined, {
  configurationId: flowId,
});
...
// Call SDK methods to finish the onboarding process
...
// Then call getFinishStatus with the token and the flowId (flowId is optional here)
incode.getFinishStatus(flowId, { token });
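
The returned action can then be used to branch the final UI. A small sketch, assuming the four documented action values (the helper name and step names are made up for illustration):

```javascript
// Hypothetical helper mapping a getFinishStatus result to a next UI step.
function nextStepForFinishStatus({ action, redirectionUrl }) {
  // Prefer the configured redirect when one is set
  if (redirectionUrl) return { step: "redirect", url: redirectionUrl };
  switch (action) {
    case "approved":
      return { step: "showApprovedScreen" };
    case "manualReview":
      return { step: "showPendingScreen" };
    case "rejected":
      return { step: "showRejectedScreen" };
    default: // "none": no flow score configuration applied
      return { step: "showFinishScreen" };
  }
}
```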

getScore

Description:

Gets the ID validation, liveness and face recognition results for the given interviewId.

Parameters:

Name | Type | Description
token | string | access token.

Returns:

  • Object: JSON object
  • JSON example:

    {
      "idValidation": {
        "photoSecurityAndQuality": [
          { "value": "PASSED", "status": "OK", "key": "incodeTamperCheck" },
          { "value": "PASSED", "status": "OK", "key": "incodeFakeCheck" },
          { "value": "PASSED", "status": "OK", "key": "incodeIdLiveness" },
          { "value": "PASSED", "status": "OK", "key": "qrCheck" },
          { "value": "PASSED", "status": "OK", "key": "issueDateCheck" },
          { "value": "100", "status": "OK", "key": "balancedLightFront" },
          { "value": "100", "status": "OK", "key": "balancedLightBack" },
          { "value": "59", "status": "OK", "key": "sharpnessFront" },
          { "value": "67", "status": "OK", "key": "sharpnessBack" },
          { "value": "100.0", "status": "OK", "key": "readabilityCheck" },
          { "value": "79", "status": "WARN", "key": "incodeCrossCheck" }
        ],
        "idSpecific": [
          { "value": "50", "status": "WARN", "key": "2DBarcodeRead" },
          { "value": "100", "status": "OK", "key": "birthDateValid" },
          { "value": "100", "status": "OK", "key": "compositeCheckDigit" },
          { "value": "100", "status": "OK", "key": "documentClassification" },
          { "value": "100", "status": "OK", "key": "documentExpired" },
          { "value": "100", "status": "OK", "key": "documentNumberCheckDigit" },
          { "value": "100", "status": "OK", "key": "expirationDateCrosscheck" },
          { "value": "100", "status": "OK", "key": "expirationDateValid" },
          { "value": "50", "status": "WARN", "key": "fullNameCrosscheck" },
          { "value": "100", "status": "OK", "key": "imageTamperingCheck" },
          { "value": "100", "status": "OK", "key": "issueDateValid" },
          { "value": "100", "status": "OK", "key": "issuingStateValid" },
          { "value": "100", "status": "OK", "key": "visiblePattern" },
          { "value": "100", "status": "OK", "key": "visiblePhotoCharacteristics" }
        ],
        "overall": { "value": "76.0", "status": "OK" }
      },
      "liveness": {
        "livenessScore": { "value": "98.9%", "status": "OK" },
        "photoQuality": { "value": "116.9" },
        "overall": { "value": "98.9", "status": "OK" }
      },
      "faceRecognition": {
        "existingUser": true,
        "overall": { "value": "88.4", "status": "OK" }
      },
      "governmentValidation": {
        "recognitionConfidence": { "value": "81.3", "status": "OK" },
        "validationStatus": { "value": "0", "status": "OK", "key": "ok" },
        "ocrValidation": [
          { "value": "true", "status": "OK", "key": "issueDate" },
          { "value": "true", "status": "OK", "key": "firstName" },
          { "value": "true", "status": "OK", "key": "maternalLastName" },
          { "value": "true", "status": "OK", "key": "paternalLastName" },
          { "value": "true", "status": "OK", "key": "ocr" },
          { "value": "true", "status": "OK", "key": "personalId" },
          { "value": "true", "status": "OK", "key": "electorsKey" },
          { "value": "true", "status": "OK", "key": "emissionNumber" },
          { "value": "true", "status": "OK", "key": "registrationDate" }
        ],
        "overall": { "value": "81.3", "status": "OK" }
      },
      "overall": { "value": "86.2", "status": "OK" }
    }

Example:

// Get score
const score = await onBoarding.getScore({ token });
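
A response with the shape above can be scanned for failed or warning checks before deciding how to proceed. A minimal sketch over that shape (the collectIdValidationWarnings helper is made up, not an SDK method):

```javascript
// Hypothetical helper: collects the key of every check in the idValidation
// lists whose status is not "OK" (e.g. WARN entries).
function collectIdValidationWarnings(score) {
  const checks = [
    ...(score.idValidation?.photoSecurityAndQuality ?? []),
    ...(score.idValidation?.idSpecific ?? []),
  ];
  return checks.filter((check) => check.status !== "OK").map((check) => check.key);
}
```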

ocrData

Description:

Fetch OCR data

Parameters:

NameTypeDescription
tokenstringaccess token.

Returns:

  • Object:

    JSON object

  • Parameters:

    Name | Type | Description
    birthDate | string | Timestamp of the birthday.
    name | string | User's full name.
    address | string | Address.
    gender | string | User's gender.
    typeOfId | string | Type of ID
    ... | ... | ...
  • JSON example (actual data returned depends on id type):

    {
      "_id": "5e0b133d8c00860016e7e8db",
      "_createdAt": 1577784125675,
      "_updatedAt": 1577784125675,
      "apiKey": "f35844c8bcb417be1a64fd6e8d622cc93b0fcfaa",
      "name": "DIEGO MONDRAGON OCARANZA",
      "phone": "+381605211222",
      "birthDate": "632966400000",
      "address": "COL PEDREGAL DE SAN ANGEL 01900\nALVARO OBREGON ,D.F.",
      "gender": "M",
      "typeOfId": "INE",
      "countryCode": "MEX",
      "cic": "114988030",
      "numeroEmisionCredencial": "01",
      "tamperedConfidence": 0.17613846,
      "idValidationScore": "Attention",
      "fakeIdConfidence": 0.9999198,
      "paperFrontConfidence": 0.0,
      "screenFrontConfidence": 1.7501973e-6,
      "paperBackConfidence": 0.0,
      "screenBackConfidence": 3.5412256e-7,
      "retouchFrontConfidence": 0.004690816,
      "classification": 14,
      "frontSharpness": 59,
      "frontGlare": 100,
      "frontShadowConfidence": 0.8312062,
      "backSharpness": 67,
      "backGlare": 100,
      "backShadowConfidence": 0.6933477,
      "issueDate": 2014,
      "expirationDate": 2024,
      "croppedIDFaceRef": "5e0b26528c00860016e7e8fa",
      "fullFrameFrontIDRef": "5e0b26528c00860016e7e8fb",
      "fullFrameBackIDRef": "5e0b26dc8cdb170011c6e705",
      "frontIdProcessed": true,
      "backIdProcessed": true,
      "idValidationFinished": true,
      "paternalLastNameValid": true,
      "maternalLastNameValid": false,
      "nameValid": true,
      "curpValid": true,
      "ocrValid": true,
      "claveDeElectorValid": true,
      "numeroEmisionCredencialValid": true,
      "registrationDateValid": true,
      "issueDateValid": true
    }
  • Insurance card JSON example (actual data returned depends on insurance card type):

    {
      "_id": "5e0b133d8c00860016e7e8db",
      "_createdAt": 1577784125675,
      "_updatedAt": 1577784125675,
      "apiKey": "f35844c8bcb417be1a64fd6e8d622cc93b0fcfaa",
      "name": "John Smith",
      "cardId": "114988030",
      "phone": "+14535634564",
      "email": "john.smith@outlook.com",
      "birthDate": "632966400000",
      "groupNumber": "154",
      "groupName": "Group A",
      "address": "725 5th Ave, New York, NY 10022",
      "gender": "M",
      "typeOfId": "AdultCare",
      "employer": "Walmart",
      "payerID": "8739",
      "countryCode": "US",
      "issueNumber": "01",
      "tamperedConfidence": 0.17613846,
      "idValidationScore": "Attention",
      "fakeIdConfidence": 0.9999198,
      "paperFrontConfidence": 0.0,
      "screenFrontConfidence": 1.7501973e-6,
      "paperBackConfidence": 0.0,
      "screenBackConfidence": 3.5412256e-7,
      "retouchFrontConfidence": 0.004690816,
      "classification": 14,
      "frontSharpness": 59,
      "frontGlare": 100,
      "frontShadowConfidence": 0.8312062,
      "backSharpness": 67,
      "backGlare": 100,
      "backShadowConfidence": 0.6933477,
      "issueDate": 2014,
      "expirationDate": 2024,
      "croppedIDFaceRef": "5e0b26528c00860016e7e8fa",
      "fullFrameFrontIDRef": "5e0b26528c00860016e7e8fb",
      "fullFrameBackIDRef": "5e0b26dc8cdb170011c6e705",
      "frontIdProcessed": true,
      "backIdProcessed": true,
      "idValidationFinished": true,
      "paternalLastNameValid": true,
      "maternalLastNameValid": false,
      "nameValid": true,
      "cardValid": true,
      "ocrValid": true,
      "issueDateValid": true,
      "expirationDateValid": true
    }

Example:

// get ocr data from the ID
const ocrData = await onBoarding.ocrData({ token })
const { birthDate, name, address, gender, ... } = ocrData
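
Note that birthDate is a millisecond timestamp serialized as a string, so it needs a conversion before display. A small sketch (parseBirthDate is not an SDK helper, just an illustration):

```javascript
// Hypothetical helper: converts the OCR birthDate string (milliseconds
// since the Unix epoch) into a JavaScript Date.
function parseBirthDate(birthDate) {
  return new Date(Number(birthDate));
}
```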

getImages

Description:

Returns images for the user. The image values that can be returned are: selfie, croppedFace, croppedIDFace, document, signature, fullFrameFrontID, fullFrameBackID, croppedFrontID, croppedBackID.

Parameters:

Name | Type | Description
token | string | access token.
body | object | Check the body object table and example

Body:

Name | Type | Description
images | Array of Strings | Needs to be an array with one or more of the following: selfie, croppedFace, croppedIDFace, document, signature, fullFrameFrontID, fullFrameBackID, croppedFrontID, croppedBackID

Returns:

  • Object: JSON object
  • JSON example:

    {
      "croppedFace": "iVBORw0KGgoAAAANSUhEUgAAAaQAAAGkCAIAAADxLsZ....",
      "croppedIDFace": "iVBORw0KGgoAAAANSUhEUgAAAaQAAAGkCAIAAADxLsZ..."
    }

Example:

// Get images
const images = await onBoarding.getImages({
  token: sessionToken,
  body: { images: ["croppedIDFace", "selfie"] },
});
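
The returned images are raw base64 strings, so displaying one in the browser means wrapping it in a data URL. A small sketch (the toDataUrl helper and the image/png default are my assumptions, not part of the SDK):

```javascript
// Hypothetical helper: wraps a raw base64 image string in a data URL
// so it can be assigned to an <img> element's src attribute.
function toDataUrl(base64, mimeType = "image/png") {
  return `data:${mimeType};base64,${base64}`;
}

// Usage with a getImages response:
// document.querySelector("#selfie").src = toDataUrl(images.selfie);
```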

approve

Description:

Approve a customer.

Parameters:

Name | Type | Description
token | string | access token to be used in all other methods.

Returns:

  • Object:

    JSON object

  • JSON example:

    {
      "success": true,
      "customerId": "incode",
      "token": "token"
    }

Example:


onBoarding.approve({ token }).then(({ customerId }) => {
  console.log(customerId, ":)");
  ...
});

renderLogin

Description:

This method renders the Face Login Component

Parameters:

Name | Type | Description
container | HTMLElement | The container element
onSuccess | function | Callback when the detection was successful
onError | function | Callback when an error happened
stopAtAccessDenied | boolean (optional) | Flag to trigger the onError callback when an access-denied event happens
stopAtError | boolean (optional) | Flag to trigger the onError callback when any error event happens
token | object | The session object

Example:

incode.renderLogin(container, {
  onSuccess: onSuccess,
  onError: onError,
});

publishKeys

Description:

Sends encryption information that will be included in every intercepted call.

Parameters:

Name | Type | Description
token | string | Citi side generated token.

Returns:

  • Object:

    JSON object

  • JSON example:

    {
      "success": true
    }

Example:

onBoarding.publishKeys(token);

sendFingerprint

Description:

Sends relevant information about the user's device and Incode's Web SDK:

  • Browser version
  • Device model
  • Application type (Web application)
  • OS and OS version
  • Incode's Web SDK version
  • User's IP address

Parameters:

NameTypeDescription
tokenstringaccess token to be used in all other methods.

Returns:

  • Object:

    {
      "success": boolean,
      "sessionStatus": string
    }

Example:

onBoarding.sendFingerprint({ token: session.token }).then((response) => {
  console.log(response.success);
});

addPhone

Description:

Add phone number to an interview. If a customer with that phone number already exists, an error will be thrown.

Parameters:

Name | Type | Description
token | string | access token to be used in all other methods.
phone | string | customer's phone number

Returns:

  • Object:

    JSON object

  • JSON example:

    {
      "success": true
    }

Example:

onBoarding.addPhone({ token, phone }).then((res) => res);

getCustomFields

Description:

This method is used for fetching custom fields for the current session.

Parameters:

Name | Type | Description
token | string | The session object.

Returns:

  • Object:

    JSON object

  • Parameters:

    Name | Type | Description
    customFields | object | Object of custom fields, with the type that was inserted.
  • JSON example:

    { "customFields": { "watchlistName": "John Smith" } }

Example:

incode.getCustomFields({
  token: session.token,
});

addCustomFields

Description:

This endpoint is used for adding custom fields to the current onboarding session.

Parameters:

Name | Type | Description
token | string | access token to be used in all other methods.
fields | object | Object of custom fields that will be inserted.

Returns:

  • Object:

    JSON object

  • Parameters:

    Name | Type | Description
    success | boolean | If the call completed correctly.

  • JSON example:

    {
      "success": true
    }

Example:

incode.addCustomFields({
  token: session.token,
  fields: { watchlistName: name },
});