Ollama has been updated to run at its fastest on Apple silicon, powered by MLX, Apple's machine learning framework.
This change unlocks much faster performance for demanding work on macOS:
- Personal assistants like OpenClaw
- Coding agents like Claude Code, OpenCode,…

https://twitter.com/ollama/status/2038835449012351197/video/1