Spanner for LangChain
Quick Start
In order to use this library, you first need a Google Cloud project with billing enabled, the Cloud Spanner API enabled, and authentication set up. Then install the package as described below.
Installation
Install this library in a virtualenv using pip. virtualenv is a tool to create isolated Python environments. The basic problem it addresses is one of dependencies and versions, and indirectly permissions.
With virtualenv, it’s possible to install this library without needing system install permissions, and without clashing with the installed system dependencies.
Supported Python Versions
Python >= 3.9
Mac/Linux
pip install virtualenv
virtualenv <your-env>
source <your-env>/bin/activate
<your-env>/bin/pip install langchain-google-spanner
Windows
pip install virtualenv
virtualenv <your-env>
<your-env>\Scripts\activate
<your-env>\Scripts\pip.exe install langchain-google-spanner
Vector Store Usage
Use a vector store to store embedded data and perform vector search.
from langchain_google_spanner import SpannerVectorStore
from langchain_google_vertexai import VertexAIEmbeddings
embeddings_service = VertexAIEmbeddings(model_name="textembedding-gecko@003")
vectorstore = SpannerVectorStore(
    instance_id="my-instance",
    database_id="my-database",
    table_name="my-table",
    embedding_service=embeddings_service,
)
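Once the store is initialized, the standard LangChain VectorStore methods are available. For example (the texts and query below are placeholders):

vectorstore.add_texts(["Apples are red.", "Bananas are yellow."])
docs = vectorstore.similarity_search("What color are apples?", k=2)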
See the full Vector Store tutorial.
Document Loader Usage
Use a document loader to load data as LangChain Documents.
from langchain_google_spanner import SpannerLoader
loader = SpannerLoader(
    instance_id="my-instance",
    database_id="my-database",
    query="SELECT * FROM my_table_name",
)
docs = loader.lazy_load()
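Because lazy_load() returns an iterator of Documents (load() returns them all at once as a list), results can be consumed one row at a time, for example:

for doc in docs:
    print(doc.page_content, doc.metadata)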
See the full Document Loader tutorial.
Chat Message History Usage
Use ChatMessageHistory to store messages and provide conversation history to LLMs.
from langchain_google_spanner import SpannerChatMessageHistory
history = SpannerChatMessageHistory(
    instance_id="my-instance",
    database_id="my-database",
    table_name="my_table_name",
    session_id="my-session-id",
)
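SpannerChatMessageHistory follows the standard LangChain chat message history interface, so messages can be appended and read back (the messages below are placeholders):

history.add_user_message("Hi!")
history.add_ai_message("Hello! How can I help you today?")
print(history.messages)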
See the full Chat Message History tutorial.
Spanner Graph Store Usage
Use SpannerGraphStore to store nodes and edges extracted from documents.
from langchain_google_spanner import SpannerGraphStore
graph = SpannerGraphStore(
    instance_id="my-instance",
    database_id="my-database",
    graph_name="my_graph",
)
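Graph documents are typically extracted from text with an LLM-based transformer and then written to the store. A minimal sketch, assuming SpannerGraphStore accepts graph documents via the standard add_graph_documents method and using the experimental LLMGraphTransformer (the sample text is a placeholder):

from langchain_core.documents import Document
from langchain_experimental.graph_transformers import LLMGraphTransformer
from langchain_google_vertexai import ChatVertexAI

llm = ChatVertexAI()
transformer = LLMGraphTransformer(llm=llm)
docs = [Document(page_content="Sarah lives in Seattle. Her brother Tom lives in Boston.")]
graph_documents = transformer.convert_to_graph_documents(docs)
graph.add_graph_documents(graph_documents)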
See the full Spanner Graph Store tutorial.
Spanner Graph QA Chain Usage
Use SpannerGraphQAChain for question answering over a graph stored in Spanner Graph.
from langchain_google_spanner import SpannerGraphStore, SpannerGraphQAChain
from langchain_google_vertexai import ChatVertexAI
graph = SpannerGraphStore(
    instance_id="my-instance",
    database_id="my-database",
    graph_name="my_graph",
)
llm = ChatVertexAI()
chain = SpannerGraphQAChain.from_llm(
    llm,
    graph=graph,
    allow_dangerous_requests=True,
)
response = chain.invoke({"query": "Where does Sarah's sibling live?"})
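Note that allow_dangerous_requests=True is an explicit opt-in acknowledging that the chain runs LLM-generated graph queries against your database, so it should only be enabled for trusted inputs.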
See the full Spanner Graph QA Chain tutorial.
Contributions
Contributions to this library are always welcome and highly encouraged.
See CONTRIBUTING for more information on how to get started.
Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms. See Code of Conduct for more information.
License
Apache 2.0 - See LICENSE for more information.
Disclaimer
This is not an officially supported Google product.