Hello everyone,
I’d like to share a new Third-Party Action Plugin I just put together for Keyboard Maestro that the community might find useful.
The plugin provides a native Keyboard Maestro action that interfaces directly with local LLMs via Ollama. It runs synchronously, pausing macro execution until the model finishes generating its response, and returns the output directly so it can be routed into other actions (for example, saved to a variable or the clipboard).
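For anyone curious what a synchronous call to a local Ollama instance looks like under the hood, here is a minimal sketch using Ollama's HTTP API. This is an illustration only, not the plugin's actual code; it assumes Ollama is running on its default port (11434) and that a model named "llama3" has already been pulled (the model name is just an example).

```shell
#!/bin/sh
# Hypothetical sketch of a blocking request to a local Ollama server.
# Setting "stream": false makes the API return one complete JSON object
# only after generation finishes -- the same pause-until-done behavior
# the plugin exposes to a macro.
payload='{"model":"llama3","prompt":"Summarize: Keyboard Maestro automates the Mac.","stream":false}'

# Only attempt the call if an Ollama server is actually reachable.
if curl -sf http://localhost:11434/api/version >/dev/null 2>&1; then
  # The "response" field of the returned JSON holds the generated text,
  # which a macro could then store in a variable or on the clipboard.
  curl -s http://localhost:11434/api/generate -d "$payload"
else
  echo "Ollama does not appear to be running on localhost:11434" >&2
fi
```

Because nothing leaves the machine, the prompt and the model's output stay entirely local, which is the main appeal of bridging Keyboard Maestro to Ollama rather than to a cloud API.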
The plugin is available immediately here: https://github.com/Itignos/keyboardmaestro_to_ollama
I’d of course be very happy to receive suggestions for improvements, adjustments, or bug reports.
I’m hoping this plugin can be my small contribution back to the community for anyone who wants to start seamlessly bridging local AI models into their everyday macros.
Best regards,
Matthias