ApX AutoML automates the machine learning workflow, streamlining data preparation, model selection, and deployment for data scientists and ML engineers.
Simply sign up, upload your data, and let ApX automate the data preparation and model training processes.
Company name: ApX Machine Learning
ApX Machine Learning Login Link: https://apxml.com/auth/login
ApX Machine Learning Sign up Link: https://apxml.com/register
Social Listening
Run DeepSeek-r1 Locally (the easy way!)
Set up and run deepseek-r1 (or other LLM models) locally using Ollama. We'll cover some tips & tricks for working with Ollama from the Command Prompt, choosing an appropriate model for your computer hardware, and how to set up a web GUI using Docker & Open WebUI. In addition to the distilled deepseek-r1 models, I would recommend trying out other AI models like Mistral and Llama 3.2.

Hardware used in this video:
System 1 - Laptop with GTX 1660 Ti, 6 GB VRAM, 32 GB system RAM
System 2 - Desktop with RTX 3070, 8 GB VRAM, 64 GB system RAM

Links referenced in the video:
Ollama download and models: https://ollama.com/
SAT Math Word Example Problems: https://www.varsitytutors.com/sat_mathematics-help/solving-word-problems?page=4
GPU/VRAM Requirements for DeepSeek: https://apxml.com/posts/gpu-requirements-deepseek-r1
DeepSeek Distillation Benchmark: https://api-docs.deepseek.com/news/news250120
Docker Setup: https://docs.docker.com/desktop/setup/install/windows-install/
Open WebUI: https://github.com/open-webui/open-webui

Timestamps:
0:00 Intro
0:57 Downloading & Installing Ollama
1:58 Picking an LLM model
2:58 First prompts
5:50 Distilled vs Full DeepSeek
7:03 Hardware (VRAM / GPU) requirements
8:20 Running a model using shared memory
9:54 Other LLM models (Mistral and Llama 3.2)
10:40 Open WebUI Demo
11:37 Docker Setup Tips
13:32 Setting up Open WebUI Docker Container
13:43 Open WebUI Tips
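The workflow the video walks through can be sketched as a few terminal commands. This is a hedged sketch, not a transcript of the video: the `deepseek-r1:7b` model tag and the host port `3000` are assumptions (pick a model size that fits your VRAM, per the GPU requirements link above), and the `docker run` invocation follows the quick-start pattern from the Open WebUI repository.

```shell
# Pull and run a distilled deepseek-r1 model from the command prompt.
# The 7b tag is an assumption; smaller tags (e.g. 1.5b) suit less VRAM.
ollama pull deepseek-r1:7b
ollama run deepseek-r1:7b

# Other models mentioned in the video can be pulled the same way:
ollama pull mistral
ollama pull llama3.2

# Run Open WebUI as a Docker container that talks to the local Ollama
# instance. Host port 3000 is an assumption; change it if it's taken.
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Once the container is up, the web GUI is reachable in a browser at http://localhost:3000 (assuming the port mapping above). These commands require Ollama and Docker Desktop to already be installed via the links listed in the description.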