ollama AI explorations: more who wrote it best
Some further experiments running Ollama with local language models. After my previous experiments, I got another Strix Halo laptop (Ryzen AI 395). In particular, this one has 128GB of memory instead of 64GB. This is important because this chip … Continue reading →
