
[Question]: Imports no longer working #14610

Open
RonRastorguev opened this issue Jul 7, 2024 · 2 comments
Labels
question Further information is requested

Comments

@RonRastorguev

RonRastorguev commented Jul 7, 2024

Question Validation

  • I have searched both the documentation and discord for an answer.

Question

I am on llama-index version 0.10.30. I have a lot of import statements:

```python
from llama_index.llms.openai import OpenAI
from llama_index.core.indices.vector_store.base import VectorStoreIndex
from llama_index.core.storage import StorageContext
from llama_index.core.indices import load_index_from_storage
from llama_index.core.service_context import ServiceContext
from llama_index.core.readers import SimpleDirectoryReader, download_loader
from llama_index.core import ChromaVectorStore
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.core.retrievers import VectorIndexRetriever
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.core.chat_engine import CondensePlusContextChatEngine
from llama_index.core.node_parser import SentenceSplitter
from llama_index.core.types import ChatMessage
from llama_index.core.vector_stores.types import MetadataFilters
from llama_index.core.vector_stores.types import MetadataFilter
from llama_index.postprocessor.colbert_rerank import ColbertRerank
from llama_index.core import base_query_engine
```

These imports differ from the ones in the examples because, when I follow the examples (for instance, importing VectorStoreIndex with `from llama_index.core import VectorStoreIndex`), I get the error: `ImportError: cannot import name 'VectorStoreIndex' from 'llama_index.core' (unknown location)`.

Using these import statements was the only way I could stop the IDE from graying out and flagging the imports. When I run the code now it says:

```
from llama_index.core.image_retriever import BaseImageRetriever
ModuleNotFoundError: No module named 'llama_index.core.image_retriever'
```

I never use BaseImageRetriever, though.

How do I go about fixing this?

RonRastorguev added the question label on Jul 7, 2024
@logan-markewich
Collaborator

Seems like a botched update from v0.9.x.

Just install into a fresh venv and it should be fine.
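For reference, a minimal sketch of that fresh-install route, assuming the usual pip package names for the integrations imported above (adjust the list to your own setup):

```bash
# Create and activate a clean virtual environment
python -m venv fresh-venv
source fresh-venv/bin/activate   # on Windows: fresh-venv\Scripts\activate

# Reinstall llama-index plus the integration packages used above
pip install -U llama-index \
    llama-index-embeddings-huggingface \
    llama-index-vector-stores-chroma \
    llama-index-postprocessor-colbert-rerank

# Sanity check: the documented top-level import should now resolve
python -c "from llama_index.core import VectorStoreIndex; print('ok')"
```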

@KanrongYu

I had the same issue. I was able to fix some of these by changing the imports, e.g. to `from llama_index.legacy import VectorStoreIndex`, but that approach isn't sustainable. In the end I did what @logan-markewich suggests, and there were no issues at all when running the sample code in the new venv.
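For anyone hitting the same errors, a sketch of the v0.10-style imports that should resolve once the environment is clean (assuming the fresh install above; note that `ChromaVectorStore` comes from its own integration package rather than from `llama_index.core`):

```python
# Documented v0.10-style imports, which resolve in a clean install
from llama_index.core import (
    VectorStoreIndex,
    StorageContext,
    SimpleDirectoryReader,
    load_index_from_storage,
)
from llama_index.core.node_parser import SentenceSplitter
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# The Chroma vector store ships as a separate integration package:
# pip install llama-index-vector-stores-chroma
from llama_index.vector_stores.chroma import ChromaVectorStore
```

If any of these still fail, the environment most likely still has leftover v0.9 packages installed alongside v0.10, which is exactly the botched-update state described above.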
