chatdatabase.github.io - ChatDB: Augmenting LLMs with Databases as Their Symbolic Memory


Large language models (LLMs) with memory are computationally universal. However, mainstream LLMs do not take full advantage of memory, and their designs are heavily influenced by biological brains. Due to their approximate nature and proneness to the accumulation of errors, conventional neural memory mechanisms cannot support LLMs in simulating complex reasoning. In this paper, we seek inspiration from modern computer architectures to augment LLMs with symbolic memory for complex multi-hop reasoning. Such a symbolic memory framework is instantiated as an LLM equipped with a set of SQL databases.

While large language models are becoming more capable at language organization and knowledge reasoning, one of their main limitations is handling long contexts: for example, GPT-4 supports a 32K sequence length and Claude supports 100K. This is a practical issue when integrating LLMs into software for daily and industrial applications. As a personal chatbot, an LLM forgets your preferences, since every day is a new day for it; as a business analytics tool, it can only process data captured within a small time window.

Here, we introduce ChatDB, a novel framework integrating symbolic memory with LLMs. ChatDB explores ways of augmenting LLMs with symbolic memory to handle contexts of arbitrary lengths. Such a symbolic memory framework is instantiated as an LLM with a set of SQL databases. The LLM autonomously generates SQL instructions to manipulate the SQL databases (including insertion, selection, update, and deletion), aiming to complete complex tasks that require multi-hop reasoning and long-term symbolic memory.
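The loop described above can be sketched in a few lines of Python. This is a minimal illustration, not ChatDB's actual implementation: `generate_sql` is a hypothetical stand-in for the LLM (in ChatDB the model itself emits the SQL), and an in-memory SQLite database plays the role of the symbolic memory.

```python
import sqlite3

def generate_sql(task: str) -> list[str]:
    # Hypothetical stand-in for the LLM. In ChatDB, the model would emit
    # these statements itself; here they are hard-coded for illustration.
    return [
        "CREATE TABLE IF NOT EXISTS orders (item TEXT, qty INTEGER)",
        "INSERT INTO orders VALUES ('widget', 3)",          # insertion
        "UPDATE orders SET qty = qty + 2 WHERE item = 'widget'",  # update
        "SELECT item, qty FROM orders",                     # selection
    ]

def run_with_symbolic_memory(task: str, conn: sqlite3.Connection):
    """Execute LLM-generated SQL against the database (the symbolic memory)
    and collect any query results to feed back into the reasoning loop."""
    results = []
    for stmt in generate_sql(task):
        cur = conn.execute(stmt)  # exact, lossless state, unlike neural memory
        if stmt.lstrip().upper().startswith("SELECT"):
            results.extend(cur.fetchall())
    conn.commit()
    return results

conn = sqlite3.connect(":memory:")
print(run_with_symbolic_memory("track widget orders", conn))  # [('widget', 5)]
```

Because the intermediate state lives in the database rather than in the model's context window, it is exact, persistent, and unbounded in size, which is what lets this scheme handle arbitrarily long interactions.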
