Comments (6)

fcakyon commented on May 13, 2024

Yes @igoralvarezz, it is currently supported. You can create an instance of HuggingFaceLLM and use it in the autollm pipeline as follows:

import torch

from llama_index.llms import HuggingFaceLLM
from llama_index.prompts import PromptTemplate
from autollm import AutoServiceContext, AutoVectorStoreIndex, AutoQueryEngine

# Wrap each query in StableLM's expected chat format
query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")

llm = HuggingFaceLLM(
    context_window=4096,
    max_new_tokens=256,
    query_wrapper_prompt=query_wrapper_prompt,
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    device_map="auto",
    stopping_ids=[50278, 50279, 50277, 1, 0],  # StableLM end-of-turn token ids
)

service_context = AutoServiceContext.from_defaults(llm=llm)
vector_store_index = AutoVectorStoreIndex.from_defaults()
query_engine = AutoQueryEngine.from_instances(vector_store_index, service_context)
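You can then query the engine as usual. A minimal usage sketch, assuming you pointed AutoVectorStoreIndex.from_defaults at your own documents (the question below is just a placeholder):

# Hypothetical query; the index must contain your documents for a useful answer
response = query_engine.query("What is this document about?")
print(response)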

fcakyon commented on May 13, 2024

> I was not aware OpenAI and Claude2 could be hosted locally. I stand corrected.

They don't work locally, but they do have a larger context window.

You should use HuggingFace and Ollama models as local alternatives with autollm.
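For example, an Ollama-backed pipeline could mirror the HuggingFaceLLM example above. This is a sketch, not autollm's documented API beyond from_instances; it assumes llama_index's Ollama wrapper, a locally running Ollama server, and that the llama2 model has already been pulled:

from llama_index.llms import Ollama
from autollm import AutoServiceContext, AutoVectorStoreIndex, AutoQueryEngine

# Assumes `ollama serve` is running locally and `ollama pull llama2` was done
llm = Ollama(model="llama2", request_timeout=120.0)

service_context = AutoServiceContext.from_defaults(llm=llm)
vector_store_index = AutoVectorStoreIndex.from_defaults()
query_engine = AutoQueryEngine.from_instances(vector_store_index, service_context)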

fcakyon commented on May 13, 2024

Hello @jonny7737, OpenAI provides models with a 16K context window, and claude-2 has a 100K context window.

Moreover, you can use any open-source HuggingFace or Ollama model with autollm (since we are using LiteLLM under the hood). Feel free to ask more questions!
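To illustrate, LiteLLM routes provider-prefixed model strings to the right backend. A minimal standalone sketch; the exact call autollm makes internally may differ, and the prompt is just a placeholder:

from litellm import completion

# The prefix selects the backend: "ollama/..." targets a local Ollama server,
# plain "gpt-3.5-turbo" targets OpenAI, "claude-2" targets Anthropic, etc.
response = completion(
    model="ollama/llama2",
    messages=[{"role": "user", "content": "Summarize autollm in one sentence."}],
)
print(response.choices[0].message.content)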

jonny7737 commented on May 13, 2024

I was not aware OpenAI and Claude2 could be hosted locally. I stand corrected.

Meathelix1 commented on May 13, 2024

> I was not aware OpenAI and Claude2 could be hosted locally. I stand corrected.

They can't. The only thing you are hosting locally would be the database that these models speak to, or the model from a local LLM.

igoralvarezz commented on May 13, 2024

> I was not aware OpenAI and Claude2 could be hosted locally. I stand corrected.
>
> They don't work locally, but they do have a larger context window.
>
> You should use HuggingFace and Ollama models as local alternatives with autollm.

Is there a way to use local (offline) HuggingFace models without Ollama (which is not yet available for Windows)?
