LiteLLM is an open-source library that simplifies LLM completion and embedding calls. It provides a convenient, user-friendly interface for calling many different LLM models.
To use LiteLLM, you import the 'litellm' library and set the required environment variables for your LLM API keys (for example OPENAI_API_KEY and COHERE_API_KEY). Once the environment variables are set, you can write a Python function and make LLM completion calls through LiteLLM. LiteLLM also makes it easy to compare different LLM models by providing a demo playground where you can write Python code and view the outputs.
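As an illustration, here is a minimal sketch of such a completion call using LiteLLM's unified `completion()` interface. The model names, prompt, and key placeholders are examples only and are not taken from this page.

```python
# Minimal sketch: calling two different providers through LiteLLM's
# unified completion() interface. Model names and prompt are illustrative.
import os
from litellm import completion

# API keys are read from environment variables.
os.environ["OPENAI_API_KEY"] = "sk-..."   # placeholder OpenAI key
os.environ["COHERE_API_KEY"] = "..."      # placeholder Cohere key

messages = [{"role": "user", "content": "Hey, how's it going?"}]

# Same call signature, different providers.
openai_response = completion(model="gpt-3.5-turbo", messages=messages)
cohere_response = completion(model="command-nightly", messages=messages)

print(openai_response["choices"][0]["message"]["content"])
print(cohere_response["choices"][0]["message"]["content"])
```

Because the call signature stays the same across providers, comparing models side by side (as in the playground) comes down to swapping the `model` string.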
Here is the LiteLLM Discord: https://discord.com/invite/wuPM9dRgDw
LiteLLM GitHub link: https://github.com/BerriAI/litellm
By Genevieve on May 22, 2024
Social Media Listening
Power Each AI Agent With A Different LOCAL LLM (AutoGen + Ollama Tutorial)
In this video, I show you how to power AutoGen AI agents using an individual open-source model per agent; this is going to be the future AI tech stack for running AI agents locally. Models are powered by Ollama and the API is exposed using LiteLLM. Enjoy :)

Join My Newsletter for Regular AI Updates 👇🏼 https://forwardfuture.ai/

My Links 🔗
👉🏻 Subscribe: https://www.youtube.com/@matthew_berman
👉🏻 Twitter: https://twitter.com/matthewberman
👉🏻 Discord: https://discord.gg/xxysSXBxFW
👉🏻 Patreon: https://patreon.com/MatthewBerman

Media/Sponsorship Inquiries 📈 https://bit.ly/44TC45V

Links:
Instructions - https://gist.github.com/mberman84/ea207e7d9e5f8c5f6a3252883ef16df3
Ollama - https://ollama.ai
LiteLLM - https://litellm.ai/
AutoGen - https://github.com/microsoft/autogen

https://www.youtube.com/watch?v=VJ6bK81meu8
https://www.youtube.com/watch?v=PUPO2tTyPOo
https://www.youtube.com/watch?v=FHXmiAvloUg
https://www.youtube.com/watch?v=10FCv-gCKug
https://www.youtube.com/watch?v=V2qZ_lgxTzg
https://www.youtube.com/watch?v=vU2S6dVf79M
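The setup described in that video (local models served by Ollama, exposed as an OpenAI-compatible API through the LiteLLM proxy, consumed by AutoGen) might look roughly like the sketch below. The port, config field names, and model choice are assumptions and may differ from the instructions gist linked above.

```python
# Sketch of the pattern described above (exact flags and field names may
# vary between versions):
#   ollama pull mistral
#   litellm --model ollama/mistral     # starts a local OpenAI-compatible proxy
#                                      # (port 8000 assumed here)
import autogen

config_list = [
    {
        "model": "ollama/mistral",             # model served behind the proxy
        "base_url": "http://localhost:8000",   # LiteLLM proxy address (assumed)
        "api_key": "not-needed-locally",       # placeholder; a local proxy ignores it
    }
]

assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

user_proxy.initiate_chat(assistant, message="Tell me a joke about local LLMs.")
```

In the video, each agent is pointed at its own proxy endpoint, which is how every AutoGen agent ends up backed by a different local model.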
AutoGen Studio: Build Self-Improving AI Agents With No-Code
I've been working on this video for a whole month in order to make the topic of AutoGen Studio as simple as possible. Non-programmers who are curious about AI agents were always on my mind when I was making it. I hope you find it useful!

🤖 Join my Discord community: https://discord.gg/GGhr7pyTHD
📰 My tutorials on Medium: https://medium.com/@mayaakim
🐦 My twitter profile: https://twitter.com/Maya_Akim

To rent a GPU from Massed Compute (autogen preinstalled) follow the link ⤵️ https://bit.ly/maya-akim
Code for 50% discount: MayaAkim

Prompts, model performance and skills for this project are here: https://github.com/majacinka/autogen-experiments

Links:
Chris Amato lesson: https://www.youtube.com/watch?v=Yd6HNZnqjis
Satya Nadella: https://www.youtube.com/watch?v=0pLBvgYtv6U
AutoGen Studio: https://microsoft.github.io/autogen/blog/2023/12/01/AutoGenStudio
AutoGen research: https://arxiv.org/pdf/2308.08155.pdf
Lilian's blog: https://lilianweng.github.io/posts/2023-06-23-agent/
CrewAI: https://github.com/joaomdmoura/crewAI
Wired article about dating AI agents: https://www.wired.com/story/volar-dating-app-chatbot-screen-matches/
Andrej Karpathy's tweet: https://x.com/karpathy/status/1748043513156272416?s=20
AlphaCodium: https://github.com/Codium-ai/AlphaCodium/tree/main?tab=readme-ov-file
Prompting techniques: https://www.promptingguide.ai/
Lessons about multi agents: https://www.youtube.com/watch?v=mGmhOHUoNMY&list=PL86282B88B486B92C
Agent description selects next speaker: https://microsoft.github.io/autogen/blog/2023/12/29/AgentDescriptions/
People talking over each other: https://www.youtube.com/watch?v=3EsVPQwcd7U
Node editor: https://ide.x-force.ai/
LiteLLM: https://docs.litellm.ai/
LiteLLM how to call any model: https://docs.litellm.ai/docs/simple_proxy
LiteLLM Mistral: https://docs.litellm.ai/docs/providers/mistral
LiteLLM Gemini: https://docs.litellm.ai/docs/providers/gemini
Mistral available models: https://docs.mistral.ai/platform/endpoints/
LM Studio: https://lmstudio.ai/
Open LLM leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
Open-source models that can do function calling: https://huggingface.co/models?other=function+calling&sort=likes
Function calling: https://cookbook.openai.com/examples/how_to_call_functions_with_chat_models
https://platform.openai.com/docs/guides/function-calling
Function calling datasets: https://huggingface.co/datasets?sort=trending&search=function+calling
3 open-source models I tested (with 7B parameters):
https://huggingface.co/Trelis/Llama-2-7b-chat-hf-function-calling-v2
https://huggingface.co/TheBloke/airoboros-mistral2.2-7B-GPTQ
https://huggingface.co/rizerphe/CodeLlama-function-calling-6320-7b-Instruct-hf
Multi Agent System Research: https://arxiv.org/pdf/2309.07864.pdf

Time codes:
0:00 - 1:09 intro
1:10 - 1:53 3 factor formula
1:54 - 6:00 factor 1 & installation
6:00 - 10:02 factor 2
10:03 - 12:24 run models through api
12:25 - 14:05 run models locally
14:06 - 16:51 factor 3
16:52 - 19:39 ARO system
19:40 - 20:39 smart models comparison
20:40 - 26:07 open source models performance
26:08 - 27:05 agent classifications

#autogen #autogenstudio #aiagents
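As a small illustration of the LiteLLM provider docs linked above (Mistral and Gemini), the sketch below routes the same prompt to two hosted providers through LiteLLM's `completion()`. The specific model names are assumptions and are not necessarily the ones used in the video.

```python
# Sketch: routing one prompt to two hosted providers via LiteLLM.
# Model names are examples; check the provider docs for current ones.
import os
from litellm import completion

os.environ["MISTRAL_API_KEY"] = "..."   # placeholder
os.environ["GEMINI_API_KEY"] = "..."    # placeholder

messages = [{"role": "user", "content": "Summarize what an AI agent is in one sentence."}]

mistral_reply = completion(model="mistral/mistral-medium", messages=messages)
gemini_reply = completion(model="gemini/gemini-pro", messages=messages)

print(mistral_reply["choices"][0]["message"]["content"])
print(gemini_reply["choices"][0]["message"]["content"])
```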
Langroid: Chat to a CSV file using Mixtral (via Ollama)
In this video, we'll learn about Langroid, an interesting LLM library that, among other things, lets us query tabular data, including CSV files! It delegates part of the work to an LLM of your choice under the hood, and I decided to take it for a spin using Kaggle's world population dataset. We give it three different questions to answer, and then I write Pandas code to check the results. The results are sometimes good, sometimes downright hallucinations!

#pandas #llm #mixtral #litellm #ollama

Resources:
Langroid - https://github.com/langroid/langroid/tree/ace9a67076786eb9eb9292375fbc92f71a5633e1
litellm - https://litellm.ai/
Mixtral (via Ollama) - https://ollama.ai/
World population data - https://www.kaggle.com/datasets/tanishqdublish/world-data-population?resource=download
Code - https://github.com/mneedham/LearnDataWithMark/blob/main/langroid-sandbox/query_csv.py
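For context, the "check the answer with Pandas" step mentioned above might look like the rough sketch below. The column names are assumptions about the Kaggle CSV and are not taken from the video's code (see the Code link for the actual script).

```python
# Rough sketch of cross-checking an LLM's answer against the raw CSV.
# Column names ("Country", "2022 Population") are assumptions about the
# Kaggle world population dataset and may not match the actual file.
import pandas as pd

df = pd.read_csv("world_population.csv")

# e.g. "Which country has the largest population?"
most_populous = df.loc[df["2022 Population"].idxmax(), "Country"]
print(most_populous)

# e.g. "What is the total world population?"
print(df["2022 Population"].sum())
```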