[BUG] OpenAIModel ignores vLLM reasoning field (delta.reasoning / message.reasoning) #2182

@austinmw

Description

Checks

  • I have updated to the latest minor and patch version of Strands
  • I have checked the documentation and this is not expected behavior
  • I have searched ./issues and there are no duplicates of my issue

Strands Version

latest

Python Version

latest

Operating System

macos

Installation Method

pip

Steps to Reproduce

Strands 1.36.0 still reads only the `reasoning_content` field, while current vLLM 0.19.1 emits `delta.reasoning` (streaming) and `message.reasoning` (non-streaming), so reasoning output is silently dropped for OpenAI-compatible vLLM backends.

Expected Behavior

Support both the old (`reasoning_content`) and new (`reasoning`) field names on both `delta` and `message`.

Actual Behavior

OpenAIModel only reads `delta.reasoning_content`, so reasoning emitted under the newer `reasoning` field is ignored.

Additional Context

No response

Possible Solution

No response

Related Issues

No response
