Snowflake Loader for LLM
Recently my second contribution to LlamaIndex, "SnowflakeReader", was merged into the LlamaHub repository. This loader connects to Snowflake (using SQLAlchemy under the hood), runs a user-specified query, and returns the results as Document objects. It is designed as a way to load data into LlamaIndex, and the loaded data can subsequently be used as a Tool in a LangChain Agent.
Usage
Option 1: Pass your own SQLAlchemy Engine object for the database connection
Here's an example usage of the SnowflakeReader.
from llama_index import download_loader

SnowflakeReader = download_loader('SnowflakeReader')

reader = SnowflakeReader(
    engine=your_sqlalchemy_engine,
)

query = "SELECT * FROM your_table"
documents = reader.load_data(query=query)
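If you don't already have an engine handy, here is a minimal sketch of how your_sqlalchemy_engine could be built with the snowflake-sqlalchemy dialect. The credentials are placeholders, and it assumes the snowflake-sqlalchemy package is installed so that SQLAlchemy recognizes the snowflake:// URL scheme.

from sqlalchemy import create_engine

# Placeholder credentials; requires the snowflake-sqlalchemy package,
# which registers the "snowflake://" dialect with SQLAlchemy.
your_sqlalchemy_engine = create_engine(
    "snowflake://your_user:your_password@your_account/"
    "your_database/your_schema?warehouse=your_warehouse&role=your_role"
)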
Option 2: Pass the required parameters to establish a Snowflake connection
Here's an example usage of the SnowflakeReader.
from llama_index import download_loader

SnowflakeReader = download_loader('SnowflakeReader')

reader = SnowflakeReader(
    account='your_account',
    user='your_user',
    password='your_password',
    database='your_database',
    schema='your_schema',
    warehouse='your_warehouse',
    role='your_role',  # Optional role setting
    proxy='http://proxyusername:proxypassword@myproxy:port'  # Optional proxy setting
)

query = "SELECT * FROM your_table"
documents = reader.load_data(query=query)
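From here the loaded documents can be indexed in LlamaIndex in the usual way. A minimal sketch, assuming an OpenAI API key is configured for the default embeddings and LLM (on older llama_index versions the class is named GPTVectorStoreIndex instead):

from llama_index import VectorStoreIndex

# Build a vector index over the rows returned from Snowflake and query it.
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("Summarize the contents of your_table.")
print(response)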
LangChain compatibility
documents = [doc.to_langchain_format() for doc in documents]
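Once converted, these behave like any other LangChain documents. As an illustrative sketch (not part of the loader itself), they could be embedded into a LangChain vector store, assuming the faiss package is installed and an OpenAI API key is set:

from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS

# Embed the converted documents and expose them as a retriever
# for use in a LangChain chain or agent.
vectorstore = FAISS.from_documents(documents, OpenAIEmbeddings())
retriever = vectorstore.as_retriever()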
Semantic Kernel compatibility
Convert documents to MemoryRecord objects:
documents = [doc.to_semantic_kernel_format() for doc in documents]