GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp.
Here is an incomplete list of clients and libraries that are known to support GGUF:
* llama.cpp. The source project for GGUF. Offers a CLI and a server option.
* llama-cpp-python, a Python library with GPU accel, LangChain support, and an OpenAI-compatible API server (a minimal usage sketch follows this list).
* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.
* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.
* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for storytelling.
* GPT4All, a free and open-source local GUI, supporting Windows, Linux and macOS with full GPU accel.
* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.
* Faraday.dev, an attractive and easy-to-use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.
* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.
* ctransformers, a Python library with GPU accel, LangChain support, and an OpenAI-compatible API server. Note: as of the time of writing (November 27th, 2023), ctransformers has not been updated in a long time and does not support many recent models.
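To show how these clients consume GGUF files in practice, here is a minimal llama-cpp-python sketch. It assumes you have already downloaded one of the GGUF files locally; the model path below is a placeholder, not an actual filename from this repository.

```python
# Minimal sketch: load a local GGUF file with llama-cpp-python and run one completion.
# The model path is a placeholder; point it at whichever quantisation you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./Multi_verse_modelInex12-7B.Q4_K_M.gguf",  # placeholder path
    n_ctx=2048,        # context window
    n_gpu_layers=-1,   # offload all layers to the GPU if one is available
)

output = llm("Q: What is the GGUF format? A:", max_tokens=128, stop=["Q:"])
print(output["choices"][0]["text"])
```

Other GUI clients in the list above (LM Studio, KoboldCpp, GPT4All, etc.) take the same .gguf file and expose it through their own interfaces, so the download step is the only part that differs.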
MaziyarPanahi/Multi_verse_modelInex12-7B-GGUF is hosted on huggingface.co. The model page offers a free online trial, paid API access (callable from Node.js, Python, or plain HTTP), and downloadable GGUF files for local use and debugging. A minimal download sketch follows below.
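If you prefer to fetch a GGUF file programmatically rather than through the web page, here is a minimal sketch using the huggingface_hub library. The repo_id is inferred from the model name above, and the filename is a placeholder; check the repository's file listing for the actual quantisation filenames.

```python
# Minimal sketch: download a single GGUF file from the Hugging Face Hub.
# Assumptions: repo_id matches the model name above; the filename below is a
# placeholder -- replace it with a real file from the repository's file list.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="MaziyarPanahi/Multi_verse_modelInex12-7B-GGUF",
    filename="Multi_verse_modelInex12-7B.Q4_K_M.gguf",  # placeholder filename
)
print(model_path)  # local path to the downloaded .gguf file
```

The returned path can then be passed directly to any of the clients listed above, for example as the model_path argument in the llama-cpp-python sketch.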