Let Your LLM
Fetch Data from Databases
LLM Framework
db-ally is a framework for building robust structured data retrieval for your RAG pipeline or standalone LLM app. Built by engineers for engineers with production in mind, it easily integrates with your existing code base.
Practical Use Cases
With our experience developing AI agents, AI copilots, multimodal systems, recommendation engines, and AI assistants, we deliver tailored Large Language Model (LLM) solutions.
Key Features
If traditional approaches to querying data with LLMs have failed you, db-ally is a solution worth trying. db-ally is powered by IQL (Intermediate Query Language), which makes it easy for LLMs to query structured data accurately.
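To make the idea concrete, here is a minimal, self-contained sketch of how an IQL-style layer can work. The filter names (`at_least_experience`, `from_region`) and the `translate` helper are hypothetical illustrations, not db-ally's actual API: instead of generating raw SQL, the LLM emits a short expression over developer-defined filters, and the framework maps that expression to SQL.

```python
import re

# Whitelisted, developer-defined filters (hypothetical examples).
# Each maps a filter call in the IQL expression to a SQL condition.
FILTERS = {
    "at_least_experience": lambda years: f"years_of_experience >= {int(years)}",
    "from_region": lambda region: f"region = '{region}'",
}

def translate(iql: str) -> str:
    """Translate a conjunction of whitelisted filter calls into a SQL query."""
    clauses = []
    for call in iql.split(" AND "):
        match = re.fullmatch(r"(\w+)\((.*)\)", call.strip())
        # Anything that is not a known filter call is rejected outright.
        if match is None or match.group(1) not in FILTERS:
            raise ValueError(f"unknown or malformed filter: {call!r}")
        name, raw_args = match.groups()
        args = [a.strip().strip("'\"") for a in raw_args.split(",")] if raw_args else []
        clauses.append(FILTERS[name](*args))
    return "SELECT * FROM candidates WHERE " + " AND ".join(clauses)

print(translate("at_least_experience(5) AND from_region('Europe')"))
# → SELECT * FROM candidates WHERE years_of_experience >= 5 AND region = 'Europe'
```

Because only registered filters can appear in the expression, an arbitrary SQL payload produced by the model simply fails validation instead of reaching the database.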
Seamless Integrations
We currently integrate with OpenAI and Meta’s LLaMA LLMs. We support multiple data sources, with PostgreSQL as a key integration. Additionally, we integrate with LangChain and leverage Meta FAISS for vector search.
We are actively working on integrating Gemini, expanding data source support to include Snowflake and GraphQL, and incorporating OpenTelemetry. We are also developing integrations with other vector databases, such as Chroma and Weaviate.
Explore How It Works
Watch the video in which one of the creators of db-ally discusses the framework and the value it delivers.
Frequently Asked Questions
How can I contact the db-ally team for support or inquiries?
Please reach out to us at db-ally
Why does db-ally use Intermediate Query Language (IQL) instead of directly generating SQL?
In contrast to most available solutions, db-ally generates Intermediate Query Language (IQL) rather than producing SQL queries directly. This approach offers several significant benefits:
- Faster & Cheaper LLM Responses: IQL requires fewer tokens overall, resulting in more efficient and cost-effective responses from large language models (LLMs).
- Technology Agnostic: IQL can be translated into queries for a wide selection of data sources, providing flexibility and compatibility with various database systems.
- Reliability & Security: End users retain full control over the executed queries, ensuring greater security and reducing the risk of executing malicious SQL payloads.
- Easier for LLMs to Work With: Complex domain knowledge can be embedded in IQL methods, making it simpler for LLMs to comprehend and generate accurate responses.
By leveraging IQL, db-ally delivers a robust, versatile, and secure solution for querying structured data using natural language.
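The token-count point can be illustrated with a toy comparison. Both queries below are hypothetical (not real db-ally output): the IQL expression the LLM has to generate is much shorter than the equivalent SQL, which means fewer output tokens per request.

```python
# Hypothetical IQL expression vs. the SQL it would stand in for.
iql = "at_least_experience(5) AND from_region('Europe')"
sql = (
    "SELECT c.* FROM candidates c "
    "JOIN regions r ON c.region_id = r.id "
    "WHERE c.years_of_experience >= 5 AND r.name = 'Europe'"
)

# Whitespace-delimited words as a rough proxy for LLM tokens.
print(len(iql.split()), "vs", len(sql.split()))  # → 3 vs 20
```

The gap widens further on real schemas, where the SQL carries joins, aliases, and column names that the IQL expression never has to spell out.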
What technologies does db-ally support, and how flexible is it for integration?
At its core, db-ally is designed to be agnostic to any specific technology, offering a wide selection of integrations to meet diverse needs. Our flexible framework ensures that integrating additional technologies is straightforward for any third party. Here is the initial scope of supported technologies:
- Data source: Our initial integration supports PostgreSQL as the primary data source, ensuring robust and reliable data management.
- Large Language Models (LLMs): db-ally supports OpenAI and Anyscale Endpoints, enabling powerful natural language processing capabilities.
- Vector search: For efficient and scalable vector search, db-ally integrates with Meta FAISS.
- Observability: To ensure comprehensive observability and monitoring, we utilize LangSmith from LangChain.
This modular and technology-agnostic design allows db-ally to easily incorporate additional technologies, making it a versatile and future-proof solution for your data querying needs.
How does db-ally handle domain-specific knowledge compared to Text2SQL solutions?
db-ally excels at embedding complex domain-specific knowledge into its Intermediate Query Language (IQL), which offers several benefits over traditional Text2SQL solutions:
- Simplified Representation: IQL encapsulates complex logic in methods that are easier for LLMs to understand and work with, reducing the complexity of natural language queries.
- Custom Methods: Users can define custom methods within IQL to handle specific domain requirements, ensuring the generated queries are accurate and contextually relevant.
- Enhanced LLM Understanding: IQL provides a structured and simplified representation of domain knowledge, allowing LLMs to generate more precise and relevant responses and improving overall query accuracy.
These features make db-ally particularly effective in scenarios requiring detailed domain-specific knowledge, where traditional Text2SQL solutions might struggle.
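As a hypothetical sketch of the custom-methods point (the method name and conditions below are invented for illustration, not part of db-ally): a single domain-level IQL method can encapsulate rules that a Text2SQL model would otherwise have to reproduce correctly in raw SQL on every query.

```python
def eligible_for_senior_role() -> str:
    """Expand one domain concept into its full SQL condition."""
    return (
        "years_of_experience >= 5 "
        "AND has_led_team = TRUE "
        "AND last_review_score >= 4"
    )

# The LLM only needs to emit `eligible_for_senior_role()`; the framework
# expands it, so the domain rules live in reviewed application code
# rather than in the model's prompt or its generated SQL.
query = "SELECT * FROM candidates WHERE " + eligible_for_senior_role()
print(query)
```

When the definition of "eligible for a senior role" changes, only this one method is updated; no prompt or model behavior needs to change.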
Can db-ally integrate with other data sources and technologies beyond the initial scope?
Yes, db-ally is designed with flexibility in mind, allowing easy integration with various data sources and technologies. While our initial scope includes PostgreSQL, OpenAI, Anyscale Endpoints, Meta FAISS, and LangChain’s LangSmith, our technology-agnostic approach ensures that third parties can add further data sources and technologies as needed. This makes db-ally a versatile solution capable of adapting to various applications and environments.
How does db-ally compare to Text2SQL solutions?
While Text2SQL solutions generate SQL queries directly from natural language inputs, db-ally uses an Intermediate Query Language (IQL) that provides several advantages:
- Security: IQL acts as a protective layer, reducing the risk of SQL injection attacks and other vulnerabilities common in direct Text2SQL approaches.
- Reliability: db-ally’s IQL ensures more consistent and reliable query generation, avoiding the unpredictability often seen with Text2SQL methods.
- Performance: By reducing the token count needed for LLM responses, db-ally offers faster and more cost-effective performance than traditional Text2SQL systems.
- Complex Query Handling: IQL simplifies the representation of complex domain-specific logic, making it easier for LLMs to generate accurate responses without getting bogged down in intricate SQL syntax.