[Bug]: Unable to use json on chat response #14580

Comments
To serialize a chat response containing `Choice` and `Usage` objects, you can define a custom `json.JSONEncoder`:

```python
import json
from typing import Any

# Response types from the OpenAI SDK; `CompletionUsage` is aliased to
# match the `Usage` name used below.
from openai.types.chat.chat_completion import Choice
from openai.types.completion_usage import CompletionUsage as Usage


class CustomJSONEncoder(json.JSONEncoder):
    def default(self, obj: Any) -> Any:
        if isinstance(obj, (Choice, Usage)):
            return obj.__dict__
        return super().default(obj)


# Assuming `response` is the chat response object containing `Choice` and `Usage` objects
response_json = json.dumps(response, cls=CustomJSONEncoder)
print(response_json)
```

In this example, …
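Equivalently, `json.dumps` accepts a `default=` callable, which avoids subclassing the encoder. A minimal, self-contained sketch, using an illustrative stand-in class in place of the SDK's `Usage` type:

```python
import json


class Usage:  # stand-in for the SDK's Usage type (illustrative)
    def __init__(self, prompt_tokens: int, completion_tokens: int) -> None:
        self.prompt_tokens = prompt_tokens
        self.completion_tokens = completion_tokens


def encode(obj):
    # json.dumps calls this only for values it cannot serialize natively;
    # fall back to the object's attribute dict.
    if isinstance(obj, Usage):
        return obj.__dict__
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")


print(json.dumps({"usage": Usage(10, 5)}, default=encode))
# → {"usage": {"prompt_tokens": 10, "completion_tokens": 5}}
```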
There's a similar fix applied for …

(But also, an easy workaround is just taking the fields you want.)
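The field-extraction workaround might look like the following minimal sketch. The `response` object here is a `SimpleNamespace` stand-in; a real chat response exposes similar attributes, but the exact field names below are illustrative, not taken from the library:

```python
import json
from types import SimpleNamespace

# Stand-in for a chat response object (illustrative shape).
response = SimpleNamespace(
    message=SimpleNamespace(role="assistant", content="Hello!"),
    raw=SimpleNamespace(model="gpt-4", total_tokens=42),
)

# Instead of serializing the whole object, pick out only the fields you need.
payload = {
    "role": response.message.role,
    "content": response.message.content,
    "model": response.raw.model,
    "total_tokens": response.raw.total_tokens,
}

response_json = json.dumps(payload)
print(response_json)
```

Because `payload` is a plain dict of JSON-native values, no custom encoder is needed at all.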
First, I don't recommend directly calling … Secondly, if it's necessary, you can implement it yourself with a custom encoder, like this:

```python
import json
from typing import Any

from llama_index.core.base.llms.types import ChatMessage
from llama_index.llms.openai import OpenAI
from openai.types.chat.chat_completion import Choice
from openai.types.completion_usage import CompletionUsage


def openai_json_encoder(obj: Any) -> Any:
    # Pydantic models from the OpenAI SDK serialize cleanly via model_dump().
    if isinstance(obj, (Choice, CompletionUsage)):
        return obj.model_dump()
    return json.JSONEncoder().default(obj)


llm = OpenAI()
response = llm.chat(
    messages=[
        ChatMessage(role="system", content="You are a helpful assistant."),
        ChatMessage(role="user", content="Hi, how are you?"),
    ]
)

response.json(encoder=openai_json_encoder)
```

Lastly, if you have any other questions, we can continue to discuss.
Of course, we can extend `ChatResponse`, implement `OpenAIChatResponse`, and then use a pydantic config like this:

```python
class OpenAIChatResponse(ChatResponse):
    class Config:
        json_encoders = {
            Choice: lambda x: x.dict(),
            CompletionUsage: lambda x: x.dict(),
        }
```

However, I don't think this is necessary.
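For reference, here is a self-contained sketch of how pydantic v1-style `json_encoders` behaves, using an illustrative stand-in class instead of the OpenAI types (the `ChatResponseLike` model and `Usage` class below are hypothetical, not from llama-index):

```python
import json

try:
    from pydantic.v1 import BaseModel  # pydantic v2 installs
except ImportError:
    from pydantic import BaseModel  # pydantic v1


class Usage:  # stand-in for an unserializable payload type (illustrative)
    def __init__(self, total_tokens: int) -> None:
        self.total_tokens = total_tokens

    def dict(self) -> dict:
        return {"total_tokens": self.total_tokens}


class ChatResponseLike(BaseModel):
    content: str
    usage: Usage

    class Config:
        arbitrary_types_allowed = True
        # Map otherwise-unserializable types to plain dicts at .json() time.
        json_encoders = {Usage: lambda x: x.dict()}


resp = ChatResponseLike(content="Hi!", usage=Usage(total_tokens=12))
print(resp.json())
```

The encoder mapping is applied only when `.json()` is called, so the model can hold arbitrary objects at runtime while still producing clean JSON on export.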
Okay, I understand. Thanks.

The method …
Bug Description

The chat response contains `Choice` and `Usage` objects that cannot be JSON serialized.

Version

any version

Steps to Reproduce

Relevant Logs/Tracebacks

No response