Not able to get the expected result: inference endpoint not receiving any "parameters"

#12
by shezibabu

Following is the code I use to send requests to my inference endpoint. The problem is that when I send "parameters" along with "inputs", as described in https://huggingface.co/docs/inference-endpoints/supported_tasks#additional-parameters, the parameters are not accepted by the model.

import requests

API_URL = API_URL
headers = {
    "Authorization": "Bearer XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
    "Content-Type": "application/json"
}

def query(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    return response.json()

output = query({
    "inputs": "The answer to the universe is",
})
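For reference, this is the payload shape I understand the linked docs to describe, with "parameters" sent alongside "inputs" in the same JSON object (using the same query helper as above; the parameter values here are just the ones from my payload further down):

output = query({
    "inputs": "The answer to the universe is",
    "parameters": {
        # illustrative values taken from my payload below
        "temperature": 0.7,
        "max_length": 2048,
    },
})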

Is there any solution?

Here is how I am trying to send the payload:

single_payload = [{
    "inputs": sent,
    "parameters": {
        "num_beams": 5,
        "num_beam_groups": 5,
        "num_return_sequences": 10,
        "repetition_penalty": 10.0,
        "diversity_penalty": 3.0,
        "no_repeat_ngram_size": 2,
        "temperature": 0.7,
        "max_length": 2048
    }
}]

and I am getting results like the following:

[{'generated_text': "A man's life is marked by fluctuations, ranging from sudden changes to death."}]
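For what it's worth, here is a small variant of the query helper I can use to inspect the raw status code and response body, in case the endpoint is rejecting the "parameters" field outright (just a sketch; the actual error format depends on the endpoint):

def query_debug(payload):
    response = requests.post(API_URL, headers=headers, json=payload)
    # Print status code and raw body so any validation error is visible
    print(response.status_code, response.text)
    return response.json()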
