Harness the Power of Large Language Models in Jupyter with Jupyter AI


Table of Contents:

  1. Introduction
  2. Installing the Jupyter AI Extension
  3. Using OpenAI in Jupyter
    • 3.1. Loading the Extension
    • 3.2. Defining the OpenAI API Key
    • 3.3. Using ChatGPT Magic Commands
  4. Using Hugging Face Hub in Jupyter
    • 4.1. Specifying the Environment Variable
    • 4.2. Using the Hugging Face Hub Magic Command
  5. Troubleshooting with Hugging Face Hub
  6. Exploring Available Models and Providers
  7. Conclusion

Introduction

Welcome back to another Python video! In this tutorial, we will explore how to use powerful large language models in Jupyter, specifically focusing on Jupyter AI. By following the steps outlined below, you will be able to set up and use OpenAI and Hugging Face Hub in Jupyter. So, let's get started!

Installing the Jupyter AI Extension

Before we can use large language models in Jupyter, we need to install the Jupyter AI extension. This can be done by running either pip install jupyter-ai or pip install jupyter-ai-magics. The former also installs the JupyterLab extension; if you only intend to use the magic commands in Jupyter notebooks, the latter is sufficient. Install one of the two packages to proceed with the setup.
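The two install options can be sketched as follows (a setup fragment, not meant to be executed by a test harness; package names as published on PyPI):

```shell
# Full package: magics plus the JupyterLab extension (chat UI)
pip install jupyter-ai

# Magics only: enough for %ai / %%ai in notebooks
pip install jupyter-ai-magics
```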

Using OpenAI in Jupyter

3.1. Loading the Extension

Once the Jupyter AI extension is installed, we need to load it before using it. To do this, simply call the following magic command at the beginning of your notebook: %load_ext jupyter_ai_magics. This makes the %ai and %%ai magic commands available in the current kernel.

3.2. Defining the OpenAI API Key

To use OpenAI in Jupyter, we need to specify an environment variable called OPENAI_API_KEY. This key is required for accessing OpenAI's language models. If you haven't registered yet, you can create an account on OpenAI's website, generate an API key there, and assign it to the environment variable.
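One way to set the variable is directly from a notebook cell before issuing any %%ai commands. A minimal sketch, using a placeholder value (substitute your real key, and avoid committing it to version control):

```python
import os

# Placeholder for illustration only -- replace with your actual OpenAI key.
# Setting it via os.environ affects only the current process (the kernel),
# which is usually what you want in a notebook session.
os.environ["OPENAI_API_KEY"] = "sk-your-key-here"

# Confirm the variable is visible to libraries that read it.
print(os.environ.get("OPENAI_API_KEY") is not None)
```

Alternatively, the variable can be exported in your shell before starting Jupyter, which keeps the key out of the notebook entirely.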

3.3. Using ChatGPT Magic Commands

With the extension loaded and the API key defined, we can now use the %%ai cell magic to interact with an OpenAI language model. Start a cell with %%ai chatgpt (an alias for OpenAI's chat model provider) and put your prompt on the lines below. For example, you can ask a question such as "Please generate a sample application with a file uploader for PDF and Word documents." ChatGPT will process your query and return a generated code snippet as output.
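A complete cell might look like this (the chatgpt alias and available model names can vary between Jupyter AI versions, so check %ai list for the names your installation accepts):

```
%%ai chatgpt
Please generate a sample application with a file uploader
for PDF and Word documents.
```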

One advantage of using your own API key is that you pay only for what you actually use; OpenAI has also historically offered trial credits to new accounts, so check OpenAI's pricing page for the current terms.

Using Hugging Face Hub in Jupyter

Apart from OpenAI, you can also use Hugging Face Hub in Jupyter to access different models. Setting up Hugging Face Hub requires a few additional steps.

4.1. Specifying the Environment Variable

To use Hugging Face Hub, we need to define an environment variable called HUGGINGFACEHUB_API_TOKEN. This token is required to access models on the Hub; you can generate one in your Hugging Face account settings and assign it to this variable.
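As with the OpenAI key, the token can be set from a notebook cell. A minimal sketch with a placeholder value (substitute the access token generated in your Hugging Face account settings):

```python
import os

# Placeholder for illustration only -- replace with your actual
# Hugging Face access token (Settings -> Access Tokens on the Hub).
os.environ["HUGGINGFACEHUB_API_TOKEN"] = "hf_your_token_here"
```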

4.2. Using the Hugging Face Hub Magic Command

Similar to the OpenAI magic command, we use %%ai huggingface_hub:&lt;repo-id&gt; to interact with models from the Hugging Face Hub. After specifying the model provider (in this case, huggingface_hub), include a colon followed by the repo ID of the model you wish to use, for example one of the Open Assistant LLaMA models. Put your question on the lines after the magic command, and the model will generate a response. However, keep in mind that due to potential timeout errors, the request might not always succeed.
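The general shape of such a cell is shown below; &lt;repo-id&gt; is a placeholder for a model repository ID on the Hub (pick a text-generation model you have access to):

```
%%ai huggingface_hub:<repo-id>
Write a haiku about Jupyter notebooks.
```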

Troubleshooting with Hugging Face Hub

If you encounter any errors when using Hugging Face Hub, such as the timeout error mentioned above, there are a few possible causes: the model itself, or the current load on the inference servers. In such cases, it is recommended to try a different repository or model.

Exploring Available Models and Providers

To find out which models and providers are available to you, use the line magic %ai list. This command prints the available providers and models, and indicates whether the necessary environment variables are set.
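For example (the optional provider argument is an assumption based on recent Jupyter AI versions; plain %ai list always works):

```
%ai list

%ai list openai
```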

Conclusion

In this tutorial, we have explored how to use large language models in Jupyter, specifically focusing on Jupyter AI. By following the steps outlined above, you can set up OpenAI and Hugging Face Hub in Jupyter, giving you access to a wide range of language models. Remember to register for API keys and define the necessary environment variables before using these models. Experiment with different questions and models to see what results you can achieve. Happy coding!


Highlights:

  • Learn how to use powerful large language models in Jupyter using Jupyter AI
  • Install the Jupyter AI extension and load the necessary magic commands
  • Use OpenAI in Jupyter by defining the OpenAI API key and using the ChatGPT magic commands
  • Incorporate Hugging Face Hub in Jupyter by specifying the Hugging Face Hub API token and using the respective magic command
  • Troubleshoot potential errors and timeouts when using Hugging Face Hub in Jupyter
  • Explore the available models and providers using the magic command
  • Gain a better understanding of how to harness the capabilities of large language models in Jupyter

FAQ:

Q: Can I use the OpenAI language models for free? A: OpenAI API usage is billed to your own API key, although new accounts have historically received free trial credits. Check OpenAI's pricing page for the current terms.

Q: How can I troubleshoot errors when using Hugging Face Hub in Jupyter? A: If you encounter errors, such as timeouts, when using Hugging Face Hub, it can be due to various factors. Try using a different repository or model to see if the issue persists.
