A chatbot for your documentation that lets you chat with your data. It can be deployed privately, enables AI-powered knowledge sharing, and integrates that knowledge into your AI workflow.
Open-Source Documentation Assistant
DocsGPT is a cutting-edge open-source solution that streamlines the process of finding information in project documentation. By integrating powerful GPT models, it lets developers easily ask questions about a project and receive accurate answers.
Say goodbye to time-consuming manual searches, and let DocsGPT help you quickly find the information you need. Try it out and see how it revolutionizes your project documentation experience. Contribute to its development and be a part of the future of AI-powered assistance.
We’re eager to provide personalized assistance when deploying your DocsGPT to a live environment.
You can find our roadmap here. Please don’t hesitate to contribute or create issues; it helps us improve DocsGPT!
| Name | Base Model | Requirements (or similar) |
|------|------------|---------------------------|
| Docsgpt-7b-mistral | Mistral-7b | 1x A10G GPU |
| Docsgpt-14b | llama-2-14b | 2x A10 GPUs |
| Docsgpt-40b-falcon | falcon-40b | 8x A10G GPUs |
If you don’t have enough resources to run them, you can use bitsandbytes to quantize the models.
🔍 🔥 Cloud Version
💬 🎉 Join our Discord
📚 😎 Guides
🏠 🔐 How to host it locally (so all data will stay on-premises)
Application - Flask app (main application).
Extensions - Chrome extension.
Scripts - Script that creates similarity search index for other libraries.
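To illustrate what a similarity search index does conceptually (this is not DocsGPT’s actual implementation, which uses sentence embeddings such as mpnet-base-v2 and a vector store), here is a minimal sketch: each documentation chunk is mapped to a vector, and a query retrieves the closest chunk by cosine similarity. The toy bag-of-words "embedding" below stands in for a real embedding model.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a term-frequency vector over lowercase words.
    A real index would use a sentence-embedding model instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def build_index(chunks):
    """Precompute one vector per documentation chunk."""
    return [(chunk, embed(chunk)) for chunk in chunks]

def search(index, query, k=1):
    """Return the k chunks most similar to the query."""
    qv = embed(query)
    ranked = sorted(index, key=lambda item: cosine(qv, item[1]), reverse=True)
    return [chunk for chunk, _ in ranked[:k]]

index = build_index([
    "How to install DocsGPT with Docker",
    "Configuring environment variables for the API",
    "Running the React frontend in development mode",
])
print(search(index, "docker install")[0])  # -> How to install DocsGPT with Docker
```

Swapping the toy `embed` function for a real embedding model gives you the essence of the indexing script: vectorize once at build time, then answer queries with nearest-neighbor lookup.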
> [!NOTE]
> Make sure you have Docker installed.
On macOS or Linux, run:
./setup.sh
It will install all the dependencies and allow you to download the local model, use OpenAI, or use our LLM API.
Otherwise, refer to this Guide for Windows:
Download and open this repository with `git clone https://github.com/arc53/DocsGPT.git`
Create a .env file in your root directory and set the environment variables, with VITE_API_STREAMING set to true or false, depending on whether you want streaming answers or not.
It should look like this inside:
LLM_NAME=[docsgpt or openai or others]
VITE_API_STREAMING=true
API_KEY=[if LLM_NAME is openai]
See optional environment variables in the /.env-template and /application/.env_sample files.
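For illustration, the .env format is plain KEY=VALUE lines, with blank lines and `#` comments ignored. Here is a minimal sketch of a parser for that format (illustrative only; DocsGPT itself reads these files through its settings module and standard dotenv tooling, not this code):

```python
def parse_env(text: str) -> dict:
    """Parse .env-style KEY=VALUE lines into a dict.
    Skips blank lines and '#' comments; does not handle quoting."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """LLM_NAME=docsgpt
VITE_API_STREAMING=true
"""
config = parse_env(sample)
print(config["VITE_API_STREAMING"])  # -> true
```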
Navigate to http://localhost:5173/.
To stop, just run Ctrl + C.
For development, only two containers from docker-compose.yaml are used (all services are removed except Redis and Mongo). See the file docker-compose-dev.yaml.
Run:
docker compose -f docker-compose-dev.yaml build
docker compose -f docker-compose-dev.yaml up -d
> [!NOTE]
> Make sure you have Python 3.12 installed.
Create a .env file in the project folder. (Check out application/core/settings.py if you want to see more config options.)
a) On macOS and Linux
python -m venv venv
. venv/bin/activate
b) On Windows
python -m venv venv
venv/Scripts/activate
Download the embedding model and save it in the model/ folder:
wget https://d3dg1063dc54p9.cloudfront.net/models/embeddings/mpnet-base-v2.zip
unzip mpnet-base-v2.zip -d model
rm mpnet-base-v2.zip
Install dependencies:
pip install -r application/requirements.txt
Run the app:
flask --app application/app.py run --host=0.0.0.0 --port=7091
Start the worker with:
celery -A application.app.celery worker -l INFO
> [!NOTE]
> Make sure you have Node version 16 or higher.
Install husky and vite (ignore if already installed):
npm install husky -g
npm install vite -g
Install dependencies:
npm install --include=dev
Run the app:
npm run dev
Please refer to the CONTRIBUTING.md file for information about how to get involved. We welcome issues, questions, and pull requests.
We as members, contributors, and leaders, pledge to make participation in our community a harassment-free experience for everyone, regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, level of experience, education, socio-economic status, nationality, personal appearance, race, religion, or sexual identity and orientation. Please refer to the CODE_OF_CONDUCT.md file for more information about contributing.
The source code license is MIT, as described in the LICENSE file.
This project is supported by: