How to get Apple Intelligence to answer your questions using KM

This thread shows how you can use KM to get an answer to a question using Apple Intelligence.

To get an answer, all you need to do is use a KM macro to execute a specific Shortcut, placing the text of your question in the Shortcut's input field, like this... (You can put any question you want in the input field, e.g., "Who is the president of Mozambique?")

image

And you also need to create a Shortcut by that name ("AI Solve", in this example) with the following content:

In my example I used the "On-Device" language model, but your current options are:

image
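For reference, the same shortcut can also be invoked outside of KM via macOS's built-in `shortcuts` command-line tool. Here's a minimal Python sketch; the shortcut name "AI Solve" comes from the setup above, and the assumption that the CLI reads the shortcut's input from stdin (and may return RTF on stdout) is mine, so treat this as a sketch rather than a tested recipe:

```python
import subprocess


def shortcut_command(name: str) -> list[str]:
    """Build the argument list for macOS's built-in `shortcuts` CLI."""
    return ["shortcuts", "run", name]


def ask_apple_intelligence(question: str, shortcut_name: str = "AI Solve") -> str:
    """Run the named shortcut, piping the question in as its input.

    Assumes macOS 12+ with the "AI Solve" shortcut already created as
    described above; the answer comes back on stdout (possibly as RTF).
    """
    result = subprocess.run(
        shortcut_command(shortcut_name),
        input=question.encode("utf-8"),
        capture_output=True,
        check=True,
    )
    return result.stdout.decode("utf-8")
```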

Apple's On-Device model is trained on data from 2023, with some facts from 2024, so answers may be out of date. (E.g., "Who is the Prime Minister of Canada?" will return an old name.) I presume this model allows an unlimited number of questions; I'm not sure whether that's true for the following models.

The Private Cloud Compute model seems to work, and possibly uses more recent training data than the On-Device model. But it still seems to be at least a year old, so it's not current on the news.

The ChatGPT model works (as long as you call it from KM, as shown above, rather than running it from the Shortcuts window, which has the odd problem of forcing you to type your question manually). I suspect there is a daily limit on how many times you can call it, but I haven't tested that yet.

Remember, all models seem to return RTF-formatted text, which means that to extract the answer into your own KM macro you may have to deal with some text-extraction issues. There may be a way to strip the RTF using KM, but it could damage the answer, so I'm not going to provide that part of the solution here today.
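If you do want to experiment with stripping the RTF yourself, here is a deliberately naive Python sketch. It handles only the simplest control words and group braces; a real answer may contain font tables, hex escapes, or Unicode control sequences that it mangles, which is exactly the kind of damage mentioned above:

```python
import re


def strip_rtf_naive(rtf: str) -> str:
    """Very rough RTF-to-plain-text conversion.

    Handles only the simplest documents: control words like \\rtf1 or
    \\ansi are dropped (along with one following delimiter space, if
    present), and group braces are removed. Font tables, hex escapes,
    and Unicode control sequences are NOT handled correctly.
    """
    # Drop control words: a backslash, letters, an optional signed
    # number, and the single space that delimits them, if present.
    text = re.sub(r"\\[a-zA-Z]+-?\d* ?", "", rtf)
    # Drop group braces.
    text = text.replace("{", "").replace("}", "")
    return text.strip()
```

For anything beyond a quick experiment, a dedicated converter (e.g. macOS's `textutil`) is a safer bet than a regex.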

Clearly more work is needed, but this is a good start. This is a wonderful new feature, and could potentially make KM macros far more powerful. I'm very excited about it. One of my plans is to create a chatbot with it that runs on macOS.

EDIT: There was a typo in my first image above ("yo" instead of "to"), but the AI correctly interpreted it anyway! So that's proof that it really works well!


I might have to turn Apple AI back on to try this out. When I first tried it, it mostly just got in my way while adding nothing of value to my workflow, and it required a fair number of mouse clicks, which was slow. At the time it also wasn't available in my device's language, so it just wasn't worth all the trade-offs to keep it on. But I assume it's available in more languages by now?

The one thing I did miss after turning it off, though, was the Type to Siri feature. There's an accessibility setting that does something similar, but it has more friction than the top-level feature included with the AI. I'd never even turned Siri on on any of my Macs before, because having to talk to my computer is so much slower than giving it commands directly from the keyboard.

Some people might prefer to talk rather than type, and talking may even be faster for some people. I have a chatbot working, and it's very fast, but I want to solve some issues before I upload it.

The very first question I asked the Private Cloud Compute model today was "How much is 2 divided by 3?" And the answer Apple Intelligence gave was, "2 divided by 3 is 0.6667. In fractional form it is one half." :sweat_smile: It's still giving wrong answers; the fraction is two thirds, not one half. Even so, the long-term future for this feature is bright.
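For the record, here's what the correct answer looks like using Python's standard library, just to show how easy a question the model fumbled (the decimal part of its answer was right, but the fraction is two thirds, not one half):

```python
from fractions import Fraction

# Exact rational arithmetic: 2 divided by 3 as a fraction.
answer = Fraction(2, 3)

# Decimal form, rounded to four places, matching the model's "0.6667".
decimal = round(2 / 3, 4)

print(answer)   # 2/3
print(decimal)  # 0.6667
```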

True enough! And indeed some people can only use their voice to operate a computer. I of course meant it's so much slower for me to talk to my computer but inadvertently used a plural pronoun instead of a singular first-person.

All AI seems to be so hilariously bad at calculations that I'm genuinely curious as to why, since computers were famously invented expressly for computing.