
Chat on PDF

Python Implementation

Since a PDF loader ships as one of GenAI Stack's default data types, we can use it directly from Python:

from genai_stack.model import OpenAIGpt35Model

# Initialize the model with your OpenAI API key
model = OpenAIGpt35Model.from_kwargs(
    fields={"openai_api_key": "<your OpenAI API key>"}
)

# Ingest the PDF from a local file path or a URL
model.add_source("pdf", "valid_pdf_path_or_url")

# Ask a question grounded in the PDF's contents
model.predict("<Any question on top of the pdf>")
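
predict() returns the model's answer, though its exact return type is not shown on this page. A minimal sketch of inspecting the result, assuming it is the generated text (or a dict wrapping it):

# Minimal sketch: the return type of predict() is an assumption here.
# If it returns a dict, extract the answer field instead of printing it raw.
answer = model.predict("<Any question on top of the pdf>")
print(answer)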

CLI Implementation

First, describe the ETL pipeline in etl.json:

{
    "etl": "langchain",
    "source": {
        "name": "PyPDFLoader",
        "fields": {
            "file_path": "/your/pdf/path"
        }
    },
    "vectordb": {
        "name": "chromadb",
        "class_name": "genai_stack"
    }
}

Run the ETL command to extract the PDF's contents and index them into the vector database:

genai-stack etl --config_file etl.json

Next, configure the model server in model.json:

{
    "model": {
        "name": "gpt4all"
    },
    "retriever": {
        "name": "langchain"
    },
    "vectordb": {
        "name": "chromadb",
        "class_name": "genai_stack"
    }
}

Then run the model server:

genai-stack start --config_file model.json

You can now make predictions against this model server:

import requests

# Query the model server started above
url = "http://127.0.0.1:8082/predict"
res = requests.post(url, data={"query": "<Any question on top of the pdf>"})
print(res.content)
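
For repeated questions, it can be convenient to wrap the request in a small helper. This is a minimal sketch: the /predict route and the form-encoded query field come from the example above, while treating the response body as the answer text is an assumption.

import requests

PREDICT_URL = "http://127.0.0.1:8082/predict"

def ask(question: str) -> str:
    # POST the question to the model server and return the response body.
    # Assumption: the body contains the answer text (or JSON you can parse).
    res = requests.post(PREDICT_URL, data={"query": question})
    res.raise_for_status()
    return res.text

print(ask("<Any question on top of the pdf>"))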