
how to solve this error #34

Open
its-athira opened this issue Jul 29, 2023 · 6 comments
its-athira commented Jul 29, 2023

[screenshot attachment: error2]

j-min (Owner) commented Jul 29, 2023

Hi @its-athira, thanks for your interest. There isn't enough information in the screenshot alone; I don't know your environment, package versions, etc. Could you please share more details about your environment, any edits you made, and where the error comes from?

its-athira (Author) commented Jul 30, 2023 via email

j-min (Owner) commented Jul 30, 2023

I don't have access to the linked Colab notebook. Could you please make it public?

its-athira (Author) commented Jul 30, 2023 via email

j-min (Owner) commented Jul 31, 2023

You're using the latest transformers version (4.31.0), but the current codebase is based on 4.2.1, as noted in requirements.txt.

Here (https://github.com/j-min/VL-T5/blob/HF-4.31.0/VL-T5/src/modeling_t5.py), I'm working on a version compatible with 4.31.0, which you might want to check. For tokenization, I suggest using VLT5Tokenizer instead of VLT5TokenizerFast for now (see #21).
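A version mismatch like this can be caught early with a quick check before running the code. Below is a minimal, stdlib-only sketch (the pinned version `4.2.1` comes from the thread; the helper names are illustrative, not part of the VL-T5 codebase):

```python
# Guard against running VL-T5 with an incompatible transformers release.
# REQUIRED is the version pinned in the repo's requirements.txt (per this thread).
REQUIRED = "4.2.1"

def version_tuple(v: str) -> tuple:
    """Parse a dotted version string like '4.31.0' into a tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def is_compatible(installed: str, required: str = REQUIRED) -> bool:
    """Treat only the same major.minor series as compatible (a conservative assumption)."""
    return version_tuple(installed)[:2] == version_tuple(required)[:2]
```

In practice you would pass `transformers.__version__` to `is_compatible` at startup and fail fast with a message pointing at requirements.txt; here, `is_compatible("4.31.0")` returns `False` while `is_compatible("4.2.1")` returns `True`.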

its-athira (Author) commented Jul 31, 2023 via email
