Run Request
The endpoint expects the image to be sent as binary data. Below are cURL and Python examples.
cURL
- Get a test image:

```bash
wget https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg -O test.jpg
```
- Send the cURL request (note the MIME type for JPEG is `image/jpeg`):

```bash
curl --request POST \
  --url https://{ENDPOINT}/ \
  --header 'Content-Type: image/jpeg' \
  --header 'Authorization: Bearer {HF_TOKEN}' \
  --data-binary '@test.jpg'
```
- Expected output:

```json
{"text": "INDLUS THE"}
```
Python
- Get a test image:

```bash
wget https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg -O test.jpg
```
- Run the request:

```python
import requests as r

ENDPOINT_URL = ""  # url of your inference endpoint
HF_TOKEN = ""  # your Hugging Face access token

def predict(path_to_image: str):
    # read the image as raw bytes
    with open(path_to_image, "rb") as i:
        b = i.read()
    headers = {
        "Authorization": f"Bearer {HF_TOKEN}",
        "Content-Type": "image/jpeg",  # content type of the image
    }
    # send the raw bytes as the request body
    response = r.post(ENDPOINT_URL, headers=headers, data=b)
    return response.json()

prediction = predict(path_to_image="test.jpg")
print(prediction)
```
- Expected output:

```json
{"text": "INDLUS THE"}
```
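If the endpoint is still scaling up or the token is wrong, `response.json()` can fail with a confusing decode error. A small variant of the function above with basic error handling might look like this (a sketch, not part of the endpoint's documented API; `predict_safe` and the `timeout` value are illustrative choices):

```python
import requests as r

ENDPOINT_URL = ""  # url of your inference endpoint
HF_TOKEN = ""  # your Hugging Face access token

def predict_safe(path_to_image: str, timeout: float = 30.0):
    """Send the image as binary and surface HTTP errors instead of a JSON decode error."""
    with open(path_to_image, "rb") as f:
        data = f.read()
    headers = {
        "Authorization": f"Bearer {HF_TOKEN}",
        "Content-Type": "image/jpeg",
    }
    response = r.post(ENDPOINT_URL, headers=headers, data=data, timeout=timeout)
    # raises requests.HTTPError on 4xx/5xx, e.g. 401 (bad token) or 503 (cold start)
    response.raise_for_status()
    return response.json()
```

`raise_for_status()` and the `timeout` argument are standard `requests` features; the error cases they catch (401, 503) are the usual suspects for a freshly deployed endpoint.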
Inference API (serverless) does not yet support generic models for this pipeline type.