Running with Ollama throws Pydantic error #3

Open
opened 2025-07-06 15:38:48 +00:00 by glenux · 1 comment
Owner

I was trying to run this with a local Ollama setup (with llm-ollama installed), but I get the following error message:

```
WARNING: Diff is large; truncating to 4000 characters.
INFO: HTTP Request: GET http://127.0.0.1:11434/api/tags "HTTP/1.1 200 OK"
INFO: HTTP Request: POST http://127.0.0.1:11434/api/show "HTTP/1.1 200 OK"
INFO: HTTP Request: POST http://127.0.0.1:11434/api/show "HTTP/1.1 200 OK"
Traceback (most recent call last):
  File "/usr/bin/llm", line 33, in <module>
    sys.exit(load_entry_point('llm==0.23', 'console_scripts', 'llm')())
             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^
  File "/usr/lib/python3/dist-packages/click/core.py", line 1161, in __call__
    return self.main(*args, **kwargs)
           ~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/click/core.py", line 1082, in main
    rv = self.invoke(ctx)
  File "/usr/lib/python3/dist-packages/click/core.py", line 1697, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^
  File "/usr/lib/python3/dist-packages/click/core.py", line 1443, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/click/core.py", line 788, in invoke
    return __callback(*args, **kwargs)
  File "/usr/local/lib/python3.13/dist-packages/llm_commit.py", line 253, in commit_cmd
    message = generate_commit_message(diff, commit_style, model=model, max_tokens=max_tokens, temperature=temperature, hint=hint)
  File "/usr/local/lib/python3.13/dist-packages/llm_commit.py", line 178, in generate_commit_message
    response = model_obj.prompt(
        prompt,
    ...<10 lines>...
        temperature=temperature
    )
  File "/usr/lib/python3/dist-packages/llm/models.py", line 746, in prompt
    options=self.Options(**options),
            ~~~~~~~~~~~~^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/pydantic/main.py", line 214, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for Options
max_tokens
  Extra inputs are not permitted [type=extra_forbidden, input_value=100, input_type=int]
    For further information visit https://errors.pydantic.dev/2.10/v/extra_forbidden
```

Any ideas? My understanding was that Ollama is API-compatible with OpenAI?

Notes:

* Copy of issue from upstream project: https://github.com/GNtousakis/llm-commit/issues/17
* Original message from [@saemideluxe](https://github.com/saemideluxe), quoted above.
Author
Owner

It seems that:

* Ollama does support the max_tokens parameter (see the discussion here: [Ollama max tokens parameter langchain-ai/langchain#14714](https://github.com/langchain-ai/langchain/discussions/14714)).
* Ollama uses a different parameter name for it (see [Max token question ollama/ollama-python#101](https://github.com/ollama/ollama-python/issues/101)).

IMHO, that should be handled by the Datasette LLM project (with some abstraction layer), not by llm-commit.
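One shape such an abstraction layer could take is a small translation table that renames generic option keys to each backend's native names before they reach the backend's strict `Options` class. Purely illustrative, not code from either project; the backend names and aliases are assumptions:

```python
# Illustrative sketch: map generic option names to backend-specific ones
# so a strict per-backend Options model never sees an unknown key.
OPTION_ALIASES = {
    "ollama": {"max_tokens": "num_predict"},  # Ollama's name for the output limit
    "openai": {},                             # OpenAI already uses max_tokens
}

def translate_options(backend: str, options: dict) -> dict:
    """Rename generic option keys to the names a given backend expects."""
    aliases = OPTION_ALIASES.get(backend, {})
    return {aliases.get(key, key): value for key, value in options.items()}

print(translate_options("ollama", {"max_tokens": 100, "temperature": 0.2}))
# {'num_predict': 100, 'temperature': 0.2}
```

With something like this in the shared layer, callers such as llm-commit could keep passing `max_tokens` and each plugin would receive the spelling its API actually accepts.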
glenux changed title from Running with Ollama throws Pydantic error to [bug] Running with Ollama throws Pydantic error 2025-07-06 15:41:49 +00:00
glenux added this to the Default project 2025-07-06 15:52:48 +00:00
glenux changed title from [bug] Running with Ollama throws Pydantic error to Running with Ollama throws Pydantic error 2025-07-06 16:16:26 +00:00
Reference: glenux/llm-commit-gen#3