Well, 2000€ for a “Pro” model of the 14" MacBook with only 8GB RAM sounds a bit strange, tbf. And +230€ for +8GB is straight-up greedy.
They said “Actually, 8GB on an M3 MacBook Pro is probably analogous to 16GB on other systems”, and that’s definitely not the case for their upcoming AI use cases. Many people seem to overlook that their RAM is shared RAM (or, as they call it, “unified memory”), which means the GPU is limited by those same 8GB of (V)RAM: it can only use whatever is left over after system usage.
Which model, with how many parameters, do you use in ollama? With 8GB you should only be able to run the smallest models, which is faaaar from ideal.
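As a rough back-of-envelope sketch (the 0.5 bytes/parameter figure assumes ~4-bit quantization, which is what ollama defaults to for most models, and the 20% overhead for KV cache and runtime is my own assumption, not an official number):

```python
# Rough estimate of RAM needed to run a quantized LLM locally.
# Assumptions: ~0.5 bytes per parameter (4-bit quantization),
# plus ~20% overhead for KV cache and runtime buffers.

def approx_ram_gb(params_billion: float,
                  bytes_per_param: float = 0.5,
                  overhead: float = 1.2) -> float:
    """Approximate GB of memory a quantized model needs to run."""
    return params_billion * bytes_per_param * overhead

for size in (3, 7, 13, 34):
    print(f"{size}B params -> ~{approx_ram_gb(size):.1f} GB")
```

By this estimate a 7B model at 4-bit needs roughly 4 to 5 GB, so on an 8GB machine, once macOS itself has taken a few GB, there is barely any headroom for the GPU share of that unified memory, and 13B+ models are out of reach entirely.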