Apple Intelligence integration into Keyboard Maestro

I'm sure many people here watched this year's Apple WWDC announcement of Apple Intelligence imagining how it could be integrated into Keyboard Maestro. Let's compile some ideas for actions or potential macros that utilise said actions as a fun brainstorming session (I would obviously love to see any integration into KM happen for real).

The low-hanging fruit would be an action that accepts a text input to be used as a prompt for the AI, and outputs the generated text as typing, pasting, or a variable result. This would have tons of useful applications.

You could pretty straightforwardly create your own personal PA to respond to emails for you:

Create a macro that triggers when you receive an email from a particular sender (or with a relevant subject, email body, etc.) via Mail > Rules > 'From contains' > run AppleScript. Then, with a pre-configured personal-assistant prompt stored in a variable, combine that prompt with the email's subject and content into one larger variable and supply it to the AI action. Return the result in a newly composed reply.
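The Mail-rule half of that idea can be sketched today with the Keyboard Maestro Engine's real AppleScript interface (`setvariable` and `do script`); the macro name "Compose AI Reply" and variable "AIPromptInput" are placeholders, and the AI action itself is still hypothetical:

```applescript
-- Hypothetical Mail rule script. "Compose AI Reply" and "AIPromptInput"
-- are made-up names; only the Mail and KM Engine commands are real.
using terms from application "Mail"
	on perform mail action with messages theMessages for rule theRule
		tell application "Mail"
			repeat with theMessage in theMessages
				set theSubject to subject of theMessage
				set theBody to content of theMessage
				-- Hand the email text to Keyboard Maestro via a variable,
				-- then fire the macro that would call the AI action.
				tell application "Keyboard Maestro Engine"
					setvariable "AIPromptInput" to theSubject & return & theBody
					do script "Compose AI Reply"
				end tell
			end repeat
		end tell
	end perform mail action with messages
end using terms from
```

Attach a script like this to the Mail rule, and the macro it triggers would do the prompting and compose the reply.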

I would like to do that. I don't know if there will be any such API.


I did promote this idea in a thread that I posted here while the WWDC keynote was underway.

In case you aren't aware, you can already do this to an extent. macOS has a feature I use called "Type to Siri", which anyone can enable. A KM macro can trigger Type to Siri and type any natural-English question into the Siri text box. Right now, Siri presents its answer in a large notification window. I don't believe there is any API to submit these questions to Siri or get the answer back from Siri, but if you are desperate you can use Find Image and OCR actions inside KM to fetch the answer from the notification window. It's ugly, but it can work. And at the very worst, this may also work in Sonoma using its AI features.

And more interestingly, Apple described this "Type to Siri" feature in one of its first statements about its new AI during the WWDC keynote. So Apple is acutely aware that users want this feature. Even so, that's no guarantee they will provide an API for it.

For example, right now you can write a KM macro to open the Type to Siri interface, then paste in "Who is the president of Kenya?" and you will instantly get both a verbal and a notification response from Siri. It's not too difficult to use a KM macro to perform OCR on the notification window. So if you are willing to go to this effort, the solution already exists.
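Once such a macro has OCR'd the answer into a KM variable, an outside script can drive it and read the result back through the Engine's real `do script` / `getvariable` commands; the macro name "Ask Siri" and variable "SiriAnswer" here are assumptions, as is the fixed delay:

```applescript
-- Assumes a macro named "Ask Siri" (hypothetical) that types the question
-- into Type to Siri and OCRs the notification into the KM variable
-- "SiriAnswer" (also hypothetical).
tell application "Keyboard Maestro Engine"
	do script "Ask Siri" with parameter "Who is the president of Kenya?"
	delay 5 -- crude wait for Siri to answer and the OCR to finish
	set theAnswer to getvariable "SiriAnswer"
end tell
```

A fixed `delay` is fragile; in practice the macro itself could set the variable only when OCR succeeds, and the caller could poll until it is non-empty.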

Off topic, but "Type to Siri" is extremely important in a world where everybody on a bus has an iPhone, AI / natural language is increasingly the main way you interact with your phone, and you can't have 20 people all talking to their phones out loud.

I suspect that's going to be a problem in the future: once the 'social barrier' to talking to your phone in public goes away, we will hear more and more people talking to, and hearing responses from, their phones in public places.

Don't worry, soon we'll all have neural implants and we'll just think to our phones :).



Whew!

You guys actually switch Siri on?

Though with institutions gradually failing, and the temperature generally rising a bit, it may also be prudent to allow for a period of frequent and surprisingly prolonged power outages.

( hang on to your supplies of paper and pencil )