Sunday, 19 May 2024

LLMs

A few days ago I read one of the latest blog entries from Miguel Grinberg about how LLMs work, and I was wondering if you can run an LLM on your own computer, even if it doesn't have a GPU. The answer is yes (as long as you have some patience with the output), and there are some tutorials out there on how to do it. I finally followed this one from Ryan Stewart, which works straight out of the box.
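The post doesn't say which tool the tutorial uses, but as a minimal sketch, this is roughly what getting a small model running on CPU looks like with Ollama (one popular option; the tutorial may well use something else, and the model name here is just an example):

```shell
# Sketch only: assumes Ollama as the local runner (an assumption,
# not necessarily what the linked tutorial uses).
curl -fsSL https://ollama.com/install.sh | sh   # install Ollama (Linux/macOS)
ollama pull llama3.2                            # download a small model
ollama run llama3.2 "Explain tokenization in one sentence."
```

On a machine without a GPU the model runs entirely on the CPU, so expect tokens to come out slowly, but it does work.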

I don't have a use case in mind, other than being curious about how things work. So many new tools out there lately!

Edit 2024-05-26

For future reference: https://yc.prosetech.com/running-your-very-own-local-llm-6d4db99c0611
