Adding vector store for Azure Cosmos DB NoSql #14158
base: main
Conversation
@logan-markewich, can you please review this PR?
Great job Gaby! Just need to clean up a bit and I had one pending question.
# print("This is node")
# print(node)
leftover print statements - these are not the only ones, let's make sure to remove those altogether from the entire file
"Requirement already satisfied: httpx in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-embeddings-openai) (0.27.0)\n",
"Requirement already satisfied: llamaindex-py-client<0.2.0,>=0.1.18 in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-embeddings-openai) (0.1.19)\n",
"Requirement already satisfied: nest-asyncio<2.0.0,>=1.5.8 in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index-core<0.11.0,>=0.10.1->llama-index-embeddings-openai) (1.6.0)\n",
all of these should be removed
"Requirement already satisfied: llama-index-llms-openai<0.2.0,>=0.1.13 in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index) (0.1.22)\n",
"Requirement already satisfied: llama-index-multi-modal-llms-openai<0.2.0,>=0.1.3 in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index) (0.1.6)\n",
"Requirement already satisfied: llama-index-program-openai<0.2.0,>=0.1.3 in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index) (0.1.6)\n",
"Requirement already satisfied: llama-index-question-gen-openai<0.2.0,>=0.1.2 in c:\\users\\t-garagundi\\appdata\\local\\programs\\python\\python312\\lib\\site-packages (from llama-index) (0.1.3)\n",
same here
for item in self._container.query_items(
    query='SELECT TOP @k c.id, c.embedding, c.text, c.metadata, VectorDistance(c.embedding,@embedding) AS SimilarityScore FROM c ORDER BY VectorDistance(c.embedding,@embedding)',
    parameters=[{"name": "@k", "value": params['k']},
    # {"name": "@embedding_key", "value": params["path"]},
is this comment needed?
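For context on the snippet above: the query binds @k and @embedding as parameters rather than interpolating values into the SQL string. A minimal, self-contained sketch of that construction (the helper name and the sample vector here are hypothetical, not part of the PR) might look like:

```python
def build_vector_query(k: int, embedding: list[float]):
    """Build the parameterized Cosmos DB NoSQL vector-search query.

    @k and @embedding are bound server-side, so no user values are
    interpolated into the SQL text itself.
    """
    query = (
        "SELECT TOP @k c.id, c.embedding, c.text, c.metadata, "
        "VectorDistance(c.embedding, @embedding) AS SimilarityScore "
        "FROM c ORDER BY VectorDistance(c.embedding, @embedding)"
    )
    parameters = [
        {"name": "@k", "value": k},
        {"name": "@embedding", "value": embedding},
    ]
    return query, parameters


query, params = build_vector_query(3, [0.1, 0.2, 0.3])
print(params[0])  # {'name': '@k', 'value': 3}
```

The same (query, parameters) pair can then be passed to container.query_items as in the reviewed code.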
Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Fixes #14159
Added vector store capabilities for Azure Cosmos DB NoSql by creating the AzureCosmosDBNoSqlVectorSearch class, along with tests covering adding documents to the store and querying it.
Read more about this: https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/vector-search
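To illustrate what the VectorDistance ordering used by the new store computes, here is a small in-memory sketch of top-k retrieval by cosine distance (one of the distance functions Cosmos DB supports; the documents and vectors below are made up for illustration):

```python
import math


def cosine_distance(a: list[float], b: list[float]) -> float:
    # 1 - cosine similarity: 0 means identical direction, 2 means opposite.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)


def top_k(docs: list[dict], query_embedding: list[float], k: int) -> list[dict]:
    # Mirrors: SELECT TOP @k ... ORDER BY VectorDistance(c.embedding, @embedding)
    scored = [(cosine_distance(d["embedding"], query_embedding), d) for d in docs]
    scored.sort(key=lambda pair: pair[0])
    return [d for _, d in scored[:k]]


docs = [
    {"id": "a", "embedding": [1.0, 0.0], "text": "apple"},
    {"id": "b", "embedding": [0.0, 1.0], "text": "banana"},
    {"id": "c", "embedding": [0.9, 0.1], "text": "cherry"},
]
print([d["id"] for d in top_k(docs, [1.0, 0.0], 2)])  # ['a', 'c']
```

The real store pushes this ranking down to Cosmos DB via the parameterized query, so only the top k documents cross the wire.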
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.
Suggested Checklist:
Ran make format; make lint to appease the lint gods