
fix: handle BaseModel result in convert_to_model to prevent TypeError #5459

Open
aayushbaluni wants to merge 1 commit into crewAIInc:main from aayushbaluni:fix/5458-pydantic-json-loads-type-error

Conversation

@aayushbaluni

Summary

Fixes #5458

When combining ConditionalTask + output_pydantic + guardrails + context, a guardrail retry can cause the agent to return a Pydantic BaseModel instance instead of a raw JSON string. That instance is then passed to convert_to_model(), which calls json.loads(result) on it, but json.loads only accepts str | bytes | bytearray, not a model object:

TypeError: the JSON object must be str, bytes or bytearray, not DrugClassificationOutput
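The failure is reproducible with nothing but the standard library. The sketch below uses a plain class named after the model in the traceback (the field name is a hypothetical placeholder); passing any non-string object to json.loads raises exactly this TypeError:

```python
import json

class DrugClassificationOutput:
    """Stand-in for the Pydantic model from the traceback; the field is hypothetical."""
    def __init__(self, drug_class: str):
        self.drug_class = drug_class

result = DrugClassificationOutput(drug_class="analgesic")
try:
    json.loads(result)  # json.loads requires str | bytes | bytearray
except TypeError as exc:
    print(exc)  # → the JSON object must be str, bytes or bytearray, not DrugClassificationOutput
```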

Root Cause

In lib/crewai/src/crewai/utilities/converter.py, convert_to_model() assumes result is always a string (line 188: json.loads(result, strict=False)). When _invoke_guardrail_function in task.py retries via agent.execute_task() and the agent returns structured output, result is a BaseModel instance.
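To make the assumption concrete, here is a heavily simplified, hypothetical view of that parsing step (the real convert_to_model in converter.py does much more, e.g. schema handling and fallbacks). It works for string input and fails for an already-parsed model:

```python
import json

class Output:
    """Dependency-free stand-in for a Pydantic model (hypothetical)."""
    def __init__(self, **data):
        self.__dict__.update(data)

def convert_to_model_sketch(result, model_cls):
    # Mirrors the assumption at converter.py line 188: result is a JSON string.
    data = json.loads(result, strict=False)
    return model_cls(**data)

ok = convert_to_model_sketch('{"label": "x"}', Output)  # str input: fine
# convert_to_model_sketch(ok, Output)  # model input: raises TypeError
```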

Fix

Added an early check in convert_to_model(): if result is already a BaseModel, serialize it to JSON with model_dump_json() before proceeding. This ensures json.loads always receives a string, regardless of whether the result came from a first-pass or retry execution.

The fix is minimal (6 lines) and does not change any behavior for string results.
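The shape of the guard can be sketched as follows, again with a dependency-free stand-in class (the real check is `isinstance(result, BaseModel)` against pydantic's BaseModel, and the surrounding function is far larger):

```python
import json

class FakeModel:
    """Dependency-free stand-in for pydantic.BaseModel (hypothetical)."""
    def __init__(self, **data):
        self.__dict__.update(data)
    def model_dump_json(self) -> str:
        # Mimics pydantic's BaseModel.model_dump_json()
        return json.dumps(self.__dict__)

def convert_to_model_fixed(result, model_cls):
    # Early check added by the fix: if the agent already returned a model
    # instance (e.g. after a guardrail retry), serialize it back to JSON
    # so json.loads always receives a string.
    if isinstance(result, FakeModel):
        result = result.model_dump_json()
    data = json.loads(result, strict=False)
    return model_cls(**data)

from_str = convert_to_model_fixed('{"label": "x"}', FakeModel)    # first pass
from_model = convert_to_model_fixed(from_str, FakeModel)          # retry path
```

String results take the exact same path as before the fix; only model instances hit the new branch.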

Made with Cursor

When a guardrail retry causes the agent to return a Pydantic BaseModel
instance instead of a JSON string, json.loads() raises TypeError
because it expects str/bytes, not a model object. Serialize the model
to JSON before proceeding with validation.

Fixes crewAIInc#5458

Made-with: Cursor
