
How to serialize a VectorStoreIndex? #8077

Closed Answered by logan-markewich
BrianP8701 asked this question in Q&A
@BrianP8701 the bot was a little off

You can use the following to save and load the index to disk. You can also swap the storage backend via fsspec to use a remote store such as S3:

from llama_index import StorageContext, load_index_from_storage

# Save the index to disk
index.storage_context.persist(persist_dir="./storage")

# Rebuild the storage context from the persisted directory
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# service_context is optional; pass yours if you built the index with a custom one
loaded_index = load_index_from_storage(storage_context, service_context=service_context)

https://docs.llamaindex.ai/en/stable/core_modules/data_modules/storage/save_load.html#using-a-remote-backend
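To illustrate the fsspec backend swap mentioned above, here is a minimal sketch. The in-memory filesystem is used only to demonstrate the fsspec API without needing cloud credentials; the S3 portion is commented out, and the key/secret values and the `my-bucket/storage` path are hypothetical placeholders, not real settings.

```python
# Sketch of swapping the storage backend via fsspec (assumes fsspec is installed).
import fsspec

# Any fsspec-compatible filesystem can back the index. An in-memory
# filesystem shows the interface llama_index writes and reads through:
fs = fsspec.filesystem("memory")
with fs.open("storage/demo.txt", "w") as f:
    f.write("hello")
with fs.open("storage/demo.txt", "r") as f:
    print(f.read())  # -> hello

# With llama_index, pass the filesystem to persist/load. The bucket path
# and credentials below are hypothetical:
# import s3fs
# s3 = s3fs.S3FileSystem(key="...", secret="...")
# index.storage_context.persist(persist_dir="my-bucket/storage", fs=s3)
# storage_context = StorageContext.from_defaults(
#     persist_dir="my-bucket/storage", fs=s3
# )
# loaded_index = load_index_from_storage(storage_context)
```

Because every backend implements the same fsspec interface, only the `fs` argument changes between local disk, memory, and S3.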

Answer selected by logan-markewich