bad request/validation errors and default value overwrite #371

Closed
waleedqk opened this issue Aug 1, 2024 · 0 comments · Fixed by #370
Labels
bug Something isn't working

Comments

waleedqk commented Aug 1, 2024

Describe the bug

When the user provides only generated_tokens as false without specifying other parameters, they can end up with an obscure 400/422 error. This likely happens because other fields default to true without the user knowing.

Additionally, the default value for include_stop_sequence should be null instead of true. Some models default include_stop_sequence to false, but caikit always overrides it to true.
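A minimal sketch of the requested behaviour (the class and function names below are hypothetical, not caikit's actual code): with a nullable default, the caller's value is only forwarded when it is explicitly set, so the model's own default is not overridden with true.

from dataclasses import dataclass
from typing import Optional

@dataclass
class TextGenParameters:
    # Hypothetical parameter container, not caikit's actual data model
    max_new_tokens: Optional[int] = None
    min_new_tokens: Optional[int] = None
    generated_tokens: Optional[bool] = None
    # A hard-coded default of True always clobbers the model's own default:
    #   include_stop_sequence: bool = True
    # A nullable default lets the model-side configuration win when the
    # caller does not set the field explicitly:
    include_stop_sequence: Optional[bool] = None

def resolve_include_stop_sequence(params: TextGenParameters, model_default: bool) -> bool:
    """Use the caller's value when given, otherwise fall back to the model default."""
    if params.include_stop_sequence is None:
        return model_default
    return params.include_stop_sequence

# Caller sets only generated_tokens=False; include_stop_sequence stays None,
# so a model whose default is False keeps False instead of being forced to True.
params = TextGenParameters(generated_tokens=False)
print(resolve_include_stop_sequence(params, model_default=False))  # -> False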

To Reproduce

curl --request POST -k \
  --url https://<hostname>/api/v1/task/classification-with-text-generation \
  --header 'Content-Type: application/json' \
  --data '{
  "inputs": "Why is Jason such a jerk?",
  "model_id": "llm-model",
  "guardrail_config": {
    "input": {
      "models": {}
    },
    "output": {
      "models": {
        "detector-1": {"threshold": 0.0},
        "detector-2": {"threshold": 0.2}
      }
    }
  },
  "text_gen_parameters": {
    "max_new_tokens": 30,
    "min_new_tokens": 30,
    "generated_tokens": false
  }
}'