
Inference

Overview

This set of RESTful APIs is designed to run inference jobs on specific models. It also provides functionality to check the status of inference jobs and to download the results once a job is completed. All APIs are secured and require an API key for access.

Please refer to the gIQ-Authentication guide for detailed instructions on how to generate the API key.


Endpoints

1. Get Inference Models

Overview

This API retrieves a list of active and registered models available for running inference jobs. It returns detailed information about each model, including its name, version, and associated labels.


Endpoint

  • URL: https://giq.ae/public/inference/models
  • Method: GET
  • Produces: application/json

Description

Retrieves a list of active and registered models that can be used for inference jobs. The response includes details such as model name, version, type, and associated labels.

Request Headers

| Name | Type | Required | Description |
|-----------|--------|----------|---------------------------------------------|
| X-API-KEY | String | Yes | API key used to authenticate the request. |

Request Example (cURL)

curl -X GET "https://giq.ae/public/inference/models" \
--header 'X-API-KEY: apiKey'
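
For comparison, the same call can be made from Python with the requests library. This is a minimal sketch: the GIQ_API_KEY environment variable is a placeholder, and the assumption that the endpoint returns a JSON array of model objects should be confirmed against a live response.

import os
import requests

# Placeholder: store your API key (see gIQ-Authentication) in an environment variable.
API_KEY = os.environ["GIQ_API_KEY"]

# List the active, registered models available for inference.
response = requests.get(
    "https://giq.ae/public/inference/models",
    headers={"X-API-KEY": API_KEY},
)
response.raise_for_status()

# Assumption: the body is a JSON array of model objects (name, version, type, labels).
for model in response.json():
    print(model)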

Error Handling

| Status Code | Error Message | Description |
|---------------------------|----------------------|----------------------------------------------------------------|
| 401 Unauthorized | Unauthorized Request | Occurs if the provided API key is invalid or missing. |
| 500 Internal Server Error | Internal Server Error | Indicates a server-side error while processing the request. |

Notes

  • Ensure that a valid API key is provided in the X-API-KEY header.
  • This API returns only active and registered models available for inference jobs.

2. Run Inference Job

Overview

This API runs an inference job on a specific model using a given file. Optionally, a geometry can be provided for spatial inference.


Endpoint

  • URL: https://giq.ae/public/inference/run
  • Method: POST
  • Produces: application/json

Description

Initiates an inference job on the specified model using the provided file. The job can optionally be restricted to a specific area by providing a geometry.

Request Parameters

| Name | Type | Required | Description |
|----------|--------|----------|-------------------------------------------------------------------------------------------------|
| modelId | String | Yes | The unique identifier of the model to use for inference, as returned by the Get Inference Models API. |
| fileId | String | Yes | The unique identifier of the file to be analyzed, as returned by the File Upload API. |
| geometry | String | No | An optional geometry in Well-Known Text (WKT) format to focus the inference on a specific area. |

Request Example (cURL)

curl -X POST "https://giq.ae/public/inference/run" \
-H "accept: application/json" \
-H "X-API-KEY: apiKey" \
-F "modelId=model123" \
-F "fileId=file456" \
-F "geometry=POLYGON((30 10, 40 40, 20 40, 10 20, 30 10))"

Response

The response returns the status and details of the inference job that was initiated.

Response Fields

| Field | Type | Description |
|-----------|---------------------|--------------------------------------------------------------------------|
| id | String | Unique identifier for the inference job. |
| status | InferenceJobStatus | The current status of the inference job (SUBMITTED, RUNNING, DONE, etc.). |
| type | InferenceJobType | The type of inference job being run. |
| startedAt | Date | The date and time when the inference job started. |

Error Handling

| Status Code | Error Message | Description |
|---------------------------|------------------------------------------|--------------------------------------------------------------------------|
| 404 Not Found | File not found | Indicates that the file associated with the given fileId was not found. |
| 500 Internal Server Error | Error occurred while running inference. | Indicates a server-side error occurred while initiating the inference job. |

Notes

  • Ensure that the fileId and modelId provided are valid.
  • The optional geometry parameter allows the inference job to be focused on a specific area of interest.

3. Get Inference Job Status

Overview

This API retrieves the current status of a previously initiated inference job by its unique identifier.


Endpoint

  • URL: https://giq.ae/public/inference/status
  • Method: GET
  • Produces: application/json

Description

Fetches the status and details of an inference job using the inferenceId provided. This allows the user to track the progress of an ongoing or completed inference job.

Request Parameters

| Name | Type | Required | Description |
|-------------|--------|----------|------------------------------------------------|
| inferenceId | String | Yes | The unique identifier of the inference job. |

Request Example (cURL)

curl -X GET "https://giq.ae/public/inference/status?inferenceId=inferenceJob123" \
-H "accept: application/json" \
-H "X-API-KEY: apiKey"

Response

The response will contain details of the inference job, including its current status, type, and when it started.

Response Fields

| Field | Type | Description |
|-----------|---------------------|------------------------------------------------------------------------------|
| id | String | Unique identifier for the inference job. |
| status | InferenceJobStatus | The current status of the inference job (PENDING, RUNNING, COMPLETED, etc.). |
| type | InferenceJobType | The type of inference job being run. |
| startedAt | Date | The date and time when the inference job started. |

Error Handling

| Status Code | Error Message | Description |
|---------------------------|--------------------------|---------------------------------------------------------------------------------------|
| 404 Not Found | Inference Job not found | Indicates that the inference job associated with the given inferenceId was not found. |
| 500 Internal Server Error | Internal Server Error | Indicates a server-side error while processing the request. |

Notes

  • Ensure that the inferenceId provided is correct to avoid a "Job not found" error.
  • This API helps track the status of the job, which could be in various stages such as PENDING, RUNNING, or COMPLETED.

4. Download Inference Job Result

Overview

This API allows users to download the result of an inference job by streaming the file to the client. The file is streamed in chunks (via StreamingResponseBody), which allows large results to be downloaded efficiently.


Endpoint

  • URL: https://giq.ae/public/inference/download
  • Method: GET
  • Produces: application/octet-stream

Description

This API streams the output file of an inference job to the client. It is designed to handle large files by sending them in chunks, preventing memory issues during the download process.

Request Parameters

| Name | Type | Required | Description |
|----------------|--------|----------|------------------------------------------------------------------------------|
| inferenceJobId | String | Yes | The unique identifier of the inference job whose result file you want to download. |

Request Example (cURL)

curl -X GET "https://giq.ae/public/inference/download?inferenceJobId=inferenceJob123" \
-H "accept: application/octet-stream" \
-H "X-API-KEY: apiKey"

Response

The API streams the file associated with the given inference job. The file is downloaded in chunks to handle large file sizes efficiently.

Success Response

  • Content-Disposition: attachment; filename=<filename>
  • Content-Type: application/octet-stream

Error Handling

| Status Code | Error Message | Description |
|---------------------------|-------------------------------------------|-----------------------------------------------------------------------------------|
| 400 Bad Request | InferenceJob id cannot be null or empty | Indicates that the inferenceJobId parameter was missing or empty. |
| 500 Internal Server Error | Error occurred while streaming the file | Indicates a server-side error occurred while processing the file download request. |

Notes

  • The StreamingResponseBody ensures large files are streamed efficiently, reducing memory usage on both the server and client sides.
  • Make sure that the inferenceJobId is valid, as invalid IDs will return a 400 or 404 error.
  • If the file download is incomplete, an error will be returned; contact support if the issue persists.
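
Putting it together, the sketch below chains the three calls described in this document (run an inference job, poll its status, then download the result). It is illustrative only: the API key, identifiers, output filename, and terminal status values are assumptions, and error handling is omitted for brevity.

import os
import time
import requests

API_KEY = os.environ["GIQ_API_KEY"]  # placeholder; see gIQ-Authentication
BASE_URL = "https://giq.ae/public/inference"
HEADERS = {"X-API-KEY": API_KEY}

# 1. Start the inference job (modelId and fileId are example values).
job = requests.post(
    f"{BASE_URL}/run",
    headers=HEADERS,
    files={"modelId": (None, "model123"), "fileId": (None, "file456")},
).json()

# 2. Poll the status endpoint until the job reaches an assumed terminal state.
while job["status"] not in {"DONE", "COMPLETED", "FAILED"}:
    time.sleep(10)
    job = requests.get(
        f"{BASE_URL}/status",
        params={"inferenceId": job["id"]},
        headers=HEADERS,
    ).json()

# 3. Stream the finished job's result file to disk in chunks.
with requests.get(
    f"{BASE_URL}/download",
    params={"inferenceJobId": job["id"]},
    headers=HEADERS,
    stream=True,
) as response:
    response.raise_for_status()
    with open("inference_result.bin", "wb") as output_file:
        for chunk in response.iter_content(chunk_size=1024 * 1024):
            output_file.write(chunk)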