
[Bug]: Add claude 3.5 sonnet support to Bedrock InvokeAPI #14624

Closed
nfrnunes opened this issue Jul 8, 2024 · 2 comments
Labels
bug Something isn't working triage Issue needs to be triaged/prioritized

Comments

@nfrnunes

nfrnunes commented Jul 8, 2024

Bug Description

There is a missing comma that prevents streaming from being used with the haiku and sonnet 3.5 models:
from llama_index.llms.bedrock.utils import STREAMING_MODELS
STREAMING_MODELS

{'amazon.titan-text-express-v1',
'amazon.titan-tg1-large',
'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0',
'anthropic.claude-3-opus-20240229-v1:0',
'anthropic.claude-3-sonnet-20240229-v1:0',
'anthropic.claude-instant-v1',
'anthropic.claude-v1',
'anthropic.claude-v2',
'anthropic.claude-v2:1',
'meta.llama2-13b-chat-v1',
'mistral.mistral-7b-instruct-v0:2',
'mistral.mistral-large-2402-v1:0',
'mistral.mixtral-8x7b-instruct-v0:1'}

ValueError: Model anthropic.claude-3-5-sonnet-20240620-v1:0 does not support streaming
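The failure mode can be reproduced in isolation. The sketch below is a hypothetical reconstruction (the guard function and its name are assumptions, not the actual llama-index source): a missing comma between two adjacent string literals fuses them into a single set element, so the membership check for the sonnet 3.5 model id fails and raises the reported ValueError.

```python
# Hypothetical sketch of the bug: Python implicitly concatenates
# adjacent string literals, so the missing comma produces ONE fused
# element instead of two separate model ids.
STREAMING_MODELS = {
    "anthropic.claude-3-5-sonnet-20240620-v1:0"  # <- missing comma here
    "anthropic.claude-3-haiku-20240307-v1:0",
}


def check_streaming(model: str) -> None:
    # Assumed guard of the kind that produces the error above.
    if model not in STREAMING_MODELS:
        raise ValueError(f"Model {model} does not support streaming")


try:
    check_streaming("anthropic.claude-3-5-sonnet-20240620-v1:0")
except ValueError as err:
    print(err)  # Model anthropic.claude-3-5-sonnet-20240620-v1:0 does not support streaming
```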

Version

v0.10.0

Steps to Reproduce

from llama_index.llms.bedrock.utils import STREAMING_MODELS
STREAMING_MODELS

{'amazon.titan-text-express-v1',
'amazon.titan-tg1-large',
'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0',
'anthropic.claude-3-opus-20240229-v1:0',
'anthropic.claude-3-sonnet-20240229-v1:0',
'anthropic.claude-instant-v1',
'anthropic.claude-v1',
'anthropic.claude-v2',
'anthropic.claude-v2:1',
'meta.llama2-13b-chat-v1',
'mistral.mistral-7b-instruct-v0:2',
'mistral.mistral-large-2402-v1:0',
'mistral.mixtral-8x7b-instruct-v0:1'}

Relevant Logs/Tracebacks

from llama_index.llms.bedrock.utils import STREAMING_MODELS
STREAMING_MODELS

{'amazon.titan-text-express-v1',
'amazon.titan-tg1-large',
'anthropic.claude-3-5-sonnet-20240620-v1:0anthropic.claude-3-haiku-20240307-v1:0',
'anthropic.claude-3-opus-20240229-v1:0',
'anthropic.claude-3-sonnet-20240229-v1:0',
'anthropic.claude-instant-v1',
'anthropic.claude-v1',
'anthropic.claude-v2',
'anthropic.claude-v2:1',
'meta.llama2-13b-chat-v1',
'mistral.mistral-7b-instruct-v0:2',
'mistral.mistral-large-2402-v1:0',
'mistral.mixtral-8x7b-instruct-v0:1'}
@nfrnunes nfrnunes added bug Something isn't working triage Issue needs to be triaged/prioritized labels Jul 8, 2024

dosubot bot commented Jul 8, 2024

Hello @nfrnunes! I'm here to assist you with any bugs, questions, or contributions. Let's work together to resolve any issues you encounter.

To fix the missing comma in the STREAMING_MODELS set, you need to add a comma between "anthropic.claude-3-5-sonnet-20240620-v1:0" and "anthropic.claude-3-haiku-20240307-v1:0".

Here is the corrected code:

STREAMING_MODELS = {
    "amazon.titan-tg1-large",
    "amazon.titan-text-express-v1",
    "anthropic.claude-instant-v1",
    "anthropic.claude-v1",
    "anthropic.claude-v2",
    "anthropic.claude-v2:1",
    "anthropic.claude-3-5-sonnet-20240620-v1:0",
    "anthropic.claude-3-haiku-20240307-v1:0",  # Added comma here
    "anthropic.claude-3-opus-20240229-v1:0",
    "anthropic.claude-3-sonnet-20240229-v1:0",
    "meta.llama2-13b-chat-v1",
    "mistral.mistral-7b-instruct-v0:2",
    "mistral.mixtral-8x7b-instruct-v0:1",
    "mistral.mistral-large-2402-v1:0",
}

This will separate the haiku and sonnet 3.5 model ids correctly and prevent the ValueError you encountered.
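The root cause is worth calling out explicitly. A minimal demonstration (the model names here are placeholders, not real Bedrock ids): Python implicitly concatenates adjacent string literals, so a missing comma inside a set literal silently merges two intended entries into one, with no syntax error.

```python
# Implicit string-literal concatenation: the missing comma below fuses
# "model-b" and "model-c" into the single element "model-bmodel-c".
broken = {
    "model-a",
    "model-b"   # missing comma: fuses with the next literal
    "model-c",
}
fixed = {
    "model-a",
    "model-b",
    "model-c",
}
print(len(broken))                  # 2
print("model-bmodel-c" in broken)   # True
print(len(fixed))                   # 3
```

Because the fused literal is still valid Python, linters (e.g. flake8's "implicit string concatenation" checks) are typically the only way to catch this class of bug automatically.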


@logan-markewich
Collaborator

This has since been added. Update to the latest package:

pip install -U llama-index-llms-bedrock
