DeepSeek

What a pleasant surprise this is: something you can run locally on your own computer, or use for a fraction of the cost that comes with OpenAI's ChatGPT or Anthropic's Claude!

DeepSeek-V3 is completely open-source and free. (https://github.com/deepseek-ai/DeepSeek-V3)

If you don't have the hardware resources for it, it is also available through a website much like ChatGPT's, as well as an incredibly affordable API.

How affordable?

DeepSeek: $0.14 per million input tokens and $0.28 per million output tokens.
Claude: $3.00 per million input tokens and $15.00 per million output tokens.
ChatGPT: $2.50 per million input tokens and $10.00 per million output tokens.

So, the bottom line is that on output tokens, DeepSeek is roughly fifty times cheaper than Claude, and around 35 times cheaper than OpenAI; that is, about two to three percent of the price. But what about quality?
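As a quick sanity check on those ratios, here is a small awk one-liner that divides the output-token prices from the list above:

```shell
# Sanity-check the price ratios (USD per 1M output tokens, from the list above)
awk 'BEGIN {
  deepseek = 0.28; claude = 15.00; chatgpt = 10.00
  printf "Claude  / DeepSeek: %.1fx\n", claude  / deepseek
  printf "ChatGPT / DeepSeek: %.1fx\n", chatgpt / deepseek
}'
# prints 53.6x and 35.7x
```

The input-token ratios are smaller (around 21x and 18x), which is why the "fifty times cheaper" figure refers to output tokens.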

In most scenarios it is comparable: in some cases DeepSeek wins, in other cases Claude or ChatGPT wins, but it is clearly up there!

Ollama

1- Installing

1.1 – Linux

On Debian Linux, installing Ollama is a one-liner; just enter the following in your terminal:

curl -fsSL https://ollama.com/install.sh | sh

Yup, that is it. Move on to using Ollama.

1.2 – Windows and macOS

Just go to https://ollama.com/, download the installer, and run it. You are done!

Using it !

Using Ollama is simple. Open your terminal window or command prompt, activate your conda environment (or venv), and run the following commands. For the sake of this example, I will run llama3.2:

conda activate projectName
ollama run llama3.2

llama3.3, with its 70 billion parameters, will require a minimum of 64GB of RAM, so don't try that unless you have the RAM for it! For comparison, llama3.2's default model has 3 billion parameters, around 4% of 3.3's size.

It should now download about 2GB of data (the default llama3.2 model), and you are done. Now you can ask it anything.

For example: "Create an article for me explaining this and that."
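If you prefer a one-off answer instead of the interactive prompt, Ollama also accepts the prompt directly as a command-line argument (the prompt text here is just an example):

```shell
# Run a single prompt non-interactively; Ollama prints the answer and exits
ollama run llama3.2 "Write a short article explaining what a large language model is."
```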

Once done, just enter "/bye" to exit the Ollama prompt and quit the session.

If you want to, for example, clear the context or do anything else, just use the command /? for a list of commands.

Now, you have used llama3.2, but on the Ollama models page (https://ollama.com/library), you will find that there are many others that you can use!

Others include models that help you with coding, or models that are targeted more towards chat-bot Q&A. Either way, you should take a close look at them, even if just for the fun of it.

Ollama API

So, above, your terminal allowed you to chat with the model, much like what you do when you open Claude or ChatGPT. If you want to access things via API instead, here is how.
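By default, Ollama serves a REST API on localhost port 11434. Here is a minimal curl sketch, assuming the llama3.2 model from earlier is already downloaded; setting "stream" to false returns one complete JSON response instead of a token stream:

```shell
# POST a prompt to the local Ollama server's /api/generate endpoint
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

The JSON response contains the model's answer in its "response" field; there is also an /api/chat endpoint for multi-turn, message-based conversations.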