If you're on an M1, it comes with a decent GPU, so it's actually possible to run some models locally. Here are some of my favorite apps to use:
Diffusers – Run Stable Diffusion locally. It integrates with Apple's Core ML for fast performance.
Aiko – Use Whisper to transcribe audio files. It's a huge app (~2GB), so if that's an issue, you can try MacWhisper instead, though it crashed about half the time I used it :(
LlamaChat – Chat with Llama, Alpaca, and GPT4All models, all running locally on your computer. Another one to look at is Lore, though that's more of a playground for accessing OpenAI models; local model support should be coming soon.