Science and Technology
Using Ollama to Install DeepSeek 14B on a Laptop
- LangChain used Ollama to install DeepSeek 14B on a laptop and used it as the model behind a local deep-research agent. The setup commands were:
$ ollama pull deepseek-r1:14b
$ export TAVILY_API_KEY=
$ uvx --refresh --from "langgraph-cli[inmem]" --with-editable . --python 3.11 langgraph dev
The NextBigFuture article discusses Ollama, a tool that simplifies installing and running large language models (LLMs) on local hardware, focusing on the DeepSeek 14B model. It explains that DeepSeek 14B, a 14-billion-parameter model, can run on consumer-grade hardware such as a laptop with 32GB of RAM thanks to Ollama's efficient resource management. The article walks through installing Ollama, downloading the DeepSeek 14B model, and running it, highlighting the ease of setup and the modest system requirements. It also notes that while the model is not as capable as larger ones, it still performs well for local use, making advanced AI accessible without cloud computing resources.
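A minimal sketch of that install-pull-run workflow, assuming a Linux or macOS laptop and Ollama's published install script; the prompt text and the final API check are illustrative additions, not commands quoted from the article:
$ curl -fsSL https://ollama.com/install.sh | sh   # install Ollama
$ ollama pull deepseek-r1:14b                     # download the 14B model weights
$ ollama run deepseek-r1:14b                      # open an interactive chat session
# Ollama also exposes a local HTTP API (port 11434 by default), which local tools can call:
$ curl http://localhost:11434/api/generate -d '{"model": "deepseek-r1:14b", "prompt": "Why run an LLM locally?", "stream": false}'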
Read the Full NextBigFuture Article at:
[ https://www.nextbigfuture.com/2025/01/using-ollama-to-install-deepseek-14b-on-a-laptop.html ]