projectIdentifier*
string
Identifier of the project that contains the source files: either the project's id or its unique name.
modelIdentifier*
string
Identifier of the model to run: either the model's id or its unique name.
sourceIds*
array
Array of IDs of the sources on which the model will be run.
outputLayerIdentifier*
string|integer
Layer in which predictions will be generated.
groupName*
string
Name of the group the user belongs to.
jobId*
integer
Integer identifying the inference job that was spawned.
Authorization*
string
Where you put your API key. Requesting inferences requires the "Model Run" permission on the project where the sources exist.
{"Authorization": "Api-Key XXXXXXX-XXXXXXX-XXXXXXX"}
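The fields above can be combined into a single request. Below is a minimal Python sketch, assuming the four identifier fields form a JSON body and the `Api-Key` authorization scheme matches the header example; the helper name `build_inference_request` is hypothetical, and the actual endpoint URL must come from your platform's documentation.

```python
import json

def build_inference_request(api_key, project, model, source_ids, output_layer):
    """Assemble the headers and JSON body for a model-run request.

    `project` and `model` may each be an id or a unique name;
    `output_layer` may be a layer name (string) or a layer id (integer).
    """
    headers = {"Authorization": f"Api-Key {api_key}"}
    body = {
        "projectIdentifier": project,
        "modelIdentifier": model,
        "sourceIds": source_ids,
        "outputLayerIdentifier": output_layer,
    }
    return headers, json.dumps(body)

# Example values are placeholders, not real identifiers.
headers, payload = build_inference_request(
    "XXXXXXX-XXXXXXX-XXXXXXX",
    "my-project",
    "my-model",
    [101, 102],
    "predictions",
)
print(headers["Authorization"])  # Api-Key XXXXXXX-XXXXXXX-XXXXXXX
print(payload)
```

On success the server responds with a `jobId`, the integer described above, which identifies the spawned inference job.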