Installation

Setup environment

Create a virtual environment

python3 -m venv env

Activate the environment

For macOS & Linux

source env/bin/activate

For Windows (PowerShell)

env\Scripts\Activate.ps1

Note: For more information about Python virtual environments, please refer to the official Python venv documentation.
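
Optional sanity check (not part of the original setup steps): with the environment activated, the active interpreter should live inside the env directory you just created.

# Run inside the activated environment.
# sys.prefix points to the active environment's directory;
# for the setup above it should end with "env".
import sys
print(sys.prefix)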

Installation

  • Install from PyPI

    Install the latest version

    pip install genai_stack

    Install a particular version

    pip install genai_stack==0.2.5
  • Install from GitHub

    pip install git+https://github.com/aiplanethub/genai-stack.git

That's it, your local setup is ready. Let's go ahead and test it.
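
Optionally, you can first confirm that the package is importable (a minimal smoke test, not from the original docs):

# If this import succeeds without an ImportError, the installation worked.
import genai_stack
print("genai_stack imported successfully")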

How to run an LLM?

Once the installation is complete, you're good to go.

Note: Here we will run just an LLM without any vector stores. Vector stores are covered in the Vector Database section.

Run in a local environment

Currently, we support the following models:

  • GPT4All
  • GPT3

Import the required model (here we will use the GPT4All model), initialize it, and run a prediction:

# Import the GPT4All model wrapper from GenAI Stack
from genai_stack.model import Gpt4AllModel

# Initialize the model with default settings
llm = Gpt4AllModel.from_kwargs()

# Ask a question and print the generated answer
model_response = llm.predict("How many countries are there in the world?")
print(model_response["result"])
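
The predict() call returns a dictionary-like object; the snippet above prints only its "result" field. To inspect everything the model returns, a minimal sketch:

# Print the full response object; the "result" key holds the generated text, as used above.
print(model_response)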

If you run this directly in a Python shell, you will see the output immediately. If you put the code in a file, execute it with:

python3 <file_name.py>
# Response from the above command
There are currently 195 recognized independent states in the world.

Now you know how to use the GenAI Stack locally.
