
Anthropic error : API failed prompt must start with "\n\nHuman:" turn #386

Open
kaminosekai54 opened this issue Nov 29, 2023 · 3 comments
Labels: bug (Something isn't working), feat/model (Feature: models)


@kaminosekai54

Hello,
I'm trying to use spacy-llm for an NER task with different models.

When I use Claude-2-v1 or Claude-1-v1, I get the same error:
Traceback (most recent call last):
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\model.py", line 73, in _request
r.raise_for_status()
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.anthropic.com/v1/complete

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy-llm-test.py", line 23, in <module>
nlp = assemble("fewshot.cfg")
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\util.py", line 48, in assemble
return assemble_from_config(config)
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\util.py", line 28, in assemble_from_config
nlp = load_model_from_config(config, auto_fill=True)
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\util.py", line 587, in load_model_from_config
nlp = lang_cls.from_config(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\language.py", line 1864, in from_config
nlp.add_pipe(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\language.py", line 821, in add_pipe
pipe_component = self.create_pipe(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\language.py", line 709, in create_pipe
resolved = registry.resolve(cfg, validate=validate)
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 756, in resolve
resolved, _ = cls._make(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 805, in _make
filled, _, resolved = cls._fill(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 860, in _fill
filled[key], validation[v_key], final[key] = cls._fill(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 877, in _fill
getter_result = getter(*args, **kwargs)
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\registry.py", line 33, in anthropic_claude_2
return Anthropic(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\base.py", line 64, in __init__
self._verify_auth()
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\model.py", line 51, in _verify_auth
raise err
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\model.py", line 43, in _verify_auth
self(["test"])
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\model.py", line 97, in __call__
responses = [
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\model.py", line 98, in <listcomp>
_request({"prompt": f"{SystemPrompt.HUMAN} {prompt}{SystemPrompt.ASST}"})
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy_llm\models\rest\anthropic\model.py", line 81, in _request
raise ValueError(error_msg) from ex
ValueError: Request to Anthropic API failed: {'type': 'invalid_request_error', 'message': 'prompt must start with "\n\nHuman:" turn'}

I am using spacy-llm version 0.6.4 and Python 3.11.5.

My code is:

from spacy_llm.util import assemble

# load the model settings from the config file
nlp = assemble("fewshot.cfg")
doc = nlp("In    most patients with isolated unilateral retinoblastoma , tumor development is initiated by somatic inactivation of both alleles of the RB1 gene .")

# print the llm output
print([(ent.text, ent.label_) for ent in doc.ents])

My config file is:

[paths]
examples = null

[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.NER.v3"
labels = ["DISEASE", "GENE", "CHEMICAL"]
description = Entities related to biomedical text, 
    specifically focusing on diseases, genes, and chemicals.
    Adjectives, verbs, adverbs are not entities.
    Pronouns are not entities.
    Entities should represent concrete entities within the biomedical domain.

[components.llm.task.label_definitions]
DISEASE = "Known diseases or medical conditions, e.g., diabetes, cancer. Entities should refer to specific diseases rather than general health conditions."
GENE = "Genes or genetic elements mentioned in the text. Entities should represent specific genes or genetic elements, and not general terms related to genetics. For example, 'BRCA1' is an entity, while 'genetic' is not."
CHEMICAL = "Chemicals or substances, e.g., drugs, molecules. Entities should represent specific chemical compounds or substances. General terms related to chemistry, such as 'chemical' or 'substance,' are not entities."

[components.llm.task.examples]
@misc = "spacy.FewShotReader.v1"
path = "examples.json"

[components.llm.model]
@llm_models = "spacy.Claude-2.v1"
config = {"max_tokens_to_sample": 1024}

I tried with the NER v2 task, with and without examples, and I still get the same error.

My API key is also set as an environment variable.

I don't know if I'm missing something or if it's a bug.

Thanks in advance for the help.

@rmitsch rmitsch added bug Something isn't working feat/model Feature: models labels Nov 29, 2023
@rmitsch (Collaborator) commented Nov 29, 2023

Hi @kaminosekai54, thanks for reporting this. This seems to be due to a recent change in Anthropic's API. We'll release a fix for this soon.

@rmitsch (Collaborator) commented Nov 30, 2023

Hm, I can't reproduce that. Our prompt already includes "\n\nHuman:", so, contrary to what I initially thought, there's no change in Anthropic's API in this regard. Could you modify the spacy-llm code to output the raw prompt when the exception is hit?

@kaminosekai54 (Author) commented:

Hi,
Sorry for the delayed reply.

Sure, I added a print in the _request method in the Anthropic model.py. I print the json_data when the HTTP error is raised:

        def _request(json_data: Dict[str, Any]) -> Dict[str, Any]:
            r = self.retry(
                call_method=requests.post,
                url=self._endpoint,
                headers=headers,
                json={**json_data, **self._config, "model": self._name},
                timeout=self._max_request_time,
            )
            try:
                r.raise_for_status()
            except HTTPError as ex:
                print(json_data)
                print(json_data["prompt"])
                print(str(json_data["prompt"]))
                res_content = srsly.json_loads(r.content.decode("utf-8"))
                # ... function still unchanged after

I don't know if it's what you wanted, but here is the output:
{'prompt': 'SystemPrompt.HUMAN testSystemPrompt.ASST'}
SystemPrompt.HUMAN testSystemPrompt.ASST
SystemPrompt.HUMAN testSystemPrompt.ASST

It's followed by the same error message:

{'prompt': 'SystemPrompt.HUMAN testSystemPrompt.ASST'}
SystemPrompt.HUMAN testSystemPrompt.ASST
SystemPrompt.HUMAN testSystemPrompt.ASST
Traceback (most recent call last):
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\model.py", line 74, in _request
r.raise_for_status()
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\requests\models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: https://api.anthropic.com/v1/complete

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy-llm-test.py", line 25, in <module>
nlp = assemble(f'spacy-configs/{modelName}fewshot.cfg')
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\util.py", line 48, in assemble
return assemble_from_config(config)
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\util.py", line 28, in assemble_from_config
nlp = load_model_from_config(config, auto_fill=True)
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\util.py", line 587, in load_model_from_config
nlp = lang_cls.from_config(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\language.py", line 1864, in from_config
nlp.add_pipe(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\language.py", line 821, in add_pipe
pipe_component = self.create_pipe(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\spacy\language.py", line 709, in create_pipe
resolved = registry.resolve(cfg, validate=validate)
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 756, in resolve
resolved, _ = cls._make(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 805, in _make
filled, _, resolved = cls._fill(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 860, in _fill
filled[key], validation[v_key], final[key] = cls._fill(
File "C:\Users\alexis\miniconda3\envs\benchmark\Lib\site-packages\confection\__init__.py", line 877, in _fill
getter_result = getter(*args, **kwargs)
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\registry.py", line 69, in anthropic_claude_2
return Anthropic(
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\base.py", line 64, in __init__
self._verify_auth()
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\model.py", line 51, in _verify_auth
raise err
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\model.py", line 43, in _verify_auth
self(["test"])
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\model.py", line 101, in __call__
responses = [
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\model.py", line 102, in <listcomp>
_request({"prompt": f"{SystemPrompt.HUMAN} {prompt}{SystemPrompt.ASST}"})
File "C:\Users\alexis\Desktop\git-repo\ner-benchmark\spacy_llm\models\rest\anthropic\model.py", line 85, in _request
raise ValueError(error_msg) from ex
ValueError: Request to Anthropic API failed: {'type': 'invalid_request_error', 'message': 'prompt must start with "\n\nHuman:" turn'}

I don't know if I'm missing a package or something; I just installed spacy-llm with the pip command provided in the docs.

It seems the SystemPrompt members are not being interpolated, and I don't understand why.
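For what it's worth, this behavior matches the Enum formatting change in Python 3.11: Enum.__format__ now always produces the same result as Enum.__str__, i.e. the "ClassName.MEMBER" form, for any enum that doesn't inherit from ReprEnum, so f-strings stop interpolating the member's string value. A minimal sketch, assuming SystemPrompt is defined as a str/Enum mixin along these lines (the class body here is an illustrative stand-in, not spacy-llm's exact source):

```python
import sys
from enum import Enum


# Hypothetical stand-in for spacy_llm's SystemPrompt enum.
class SystemPrompt(str, Enum):
    HUMAN = "\n\nHuman:"
    ASST = "\n\nAssistant:"


prompt = "test"

# On Python <= 3.10 this interpolates the member's str value; on >= 3.11,
# Enum.__format__ falls back to Enum.__str__ for non-ReprEnum classes and
# produces the literal text "SystemPrompt.HUMAN" / "SystemPrompt.ASST".
formatted = f"{SystemPrompt.HUMAN} {prompt}{SystemPrompt.ASST}"

# Accessing .value is version-independent and always yields the raw string.
safe = f"{SystemPrompt.HUMAN.value} {prompt}{SystemPrompt.ASST.value}"

print(repr(formatted))
print(repr(safe))
```

On 3.11 the first f-string gives exactly the "SystemPrompt.HUMAN testSystemPrompt.ASST" output shown above, which would also explain why the bug doesn't reproduce on older interpreters; using .value (or a 3.11+ StrEnum) sidesteps the change.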
