I built an app that lets you run large language models locally on your phone using MLX. The best models so far are Llama 3.2 1B and 3B, which are highly optimized for Apple Silicon. The app also includes a Shortcut so you can instantly access local intelligence from anywhere. Would love to hear your thoughts!
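
For anyone curious what the MLX side roughly looks like, here's a minimal sketch of the same flow using the mlx_lm Python package on a Mac. This is just an illustration of the idea, not the app's actual code (the app goes through MLX's Swift bindings), and the 4-bit model id below is an example community conversion, not necessarily what the app ships:

    # a minimal sketch, assuming the mlx_lm Python package on a Mac;
    # the app itself uses MLX's Swift bindings, and the model id below
    # is a community 4-bit conversion picked for illustration.
    from mlx_lm import load, generate

    # download (or load from the local cache) the quantized Llama 3.2 1B weights
    model, tokenizer = load("mlx-community/Llama-3.2-1B-Instruct-4bit")

    # format the request with the model's chat template
    messages = [{"role": "user", "content": "Give me three ideas for a weekend project."}]
    prompt = tokenizer.apply_chat_template(messages, add_generation_prompt=True)

    # run generation entirely locally, no network call
    print(generate(model, tokenizer, prompt=prompt, verbose=False))

The phone app does the equivalent of these steps with the Swift APIs, which is why the small 1B and 3B Llama 3.2 variants are the sweet spot for on-device use.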