
Hugging Face

How to configure & use it?

Supported parameters

  • model (Optional[str]): The name or identifier of the Hugging Face model to use. Defaults to "nomic-ai/gpt4all-j".

  • model_kwargs (Optional[Dict]): Keyword arguments passed to the Hugging Face model (optional).

  • pipeline_kwargs (Optional[dict]): Keyword arguments passed to the Hugging Face pipeline (optional).

  • task (str): The task associated with the model. Valid options include 'text2text-generation', 'text-generation', and 'summarization'.

  • pipeline (pipeline): A Hugging Face pipeline object passed directly to the component. If a pipeline is passed, all other configuration options are ignored.

Running in a Colab/Kaggle notebook or a Python script

```python

from genai_stack.model import HuggingFaceModel
from genai_stack.stack.stack import Stack

# Instantiate the model with default parameters
llm = HuggingFaceModel.from_kwargs()
Stack(model=llm)  # Initialize the stack with the model
model_response = llm.predict("How many countries are there in the world?")
print(model_response["output"])
```
  • Import the model from genai_stack.model

  • Instantiate the class with the parameters you want to customize
