Waiting on the Spinning Wheel of Death?

I found a temporary fix. Just keep trying to click on the thing that I'm trying to click on until it finally goes away. :wink: But I will definitely try your ideas! This is not the first time I've bumped into the problem. (It would be nice if there were a "Wait until application is responsive" option in KM.) Thanks!

That might be more difficult to do than you think, and there's a way to manage it for most apps already:

Note -- these are actions and will install in the macro currently being edited.

Keyboard Maestro Actions.kmactions (1.8 KB)


You would want to have a reasonable time-out period. (See the gear menu or contextual menu of the pause-until action.)

Whups... This might not work, because you can't make it app-specific...

It could probably be done with AppleScript UI-Scripting though.

If I was dealing with a specific known application I might do something like this:

Keyboard Maestro Actions.kmactions (2.6 KB)
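
In script form, the same "pause until a menu item is enabled, with a time-out" idea looks roughly like this. It's only a sketch: the process name and the menu path are placeholders (I'm assuming an Export item that stays greyed out while the app is busy), so substitute whatever is reliably disabled in your target app.

```applescript
-- Sketch only: wait for a menu item that is greyed out while the app is busy
-- to become enabled again, and give up after a time-out.
-- "LiquidText" and "Export…" are placeholders for your own app and menu item.
set timeoutSeconds to 30
set elapsedSeconds to 0

tell application "System Events"
	tell process "LiquidText"
		repeat until (enabled of menu item "Export…" of menu 1 of menu bar item "File" of menu bar 1)
			delay 0.5
			set elapsedSeconds to elapsedSeconds + 0.5
			if elapsedSeconds >= timeoutSeconds then
				error "Application still busy after " & timeoutSeconds & " seconds."
			end if
		end repeat
	end tell
end tell
```

One caveat: if the app is so far gone that it has stopped servicing Accessibility requests, the System Events query itself may stall, which is part of why a general "wait until the application is responsive" condition is harder than it sounds.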

You aren't saying that there are limits to Peter's ability to do magic, are you???? :wink:

I'll add looking for an enabled menu item on the list of things to try. That is an elegant solution if it works!

Thanks again, by the way. This is far and away the most useful and responsive forum I've run across.

(As a quick caveat, this is all part of an effort to script an app that does all it can to be unscriptable. I badly need the automations for my work. But I also talk to the developer from time to time and have "gently" been suggesting that the app really needs a nice API. I'm thinking that if I keep posting videos of ugly automation workarounds on the app's forum some day he might see the need. He's a good guy, and the same basic approach got him to add a custom URL scheme, so fingers crossed...)

What's the app?

LiquidText. It is far and away the best solution that I've found for interacting with text (PDFs) in a serious way, which is why I find this worth the effort.

My workflows often involve digging into stuff in LiquidText and then extracting and linking to notes/images/extracts etc. from Obsidian. But out of the box everything is built around moving the mouse and clicking here and there. While there is (finally) a restricted set of keyboard shortcuts, it is not uncommon for a simple, repetitive task to take half a dozen clicks in different locations.

The good news is that they are capable of listening and their development turnaround cycle is fairly rapid. The less good news is that they have yet to fully embrace the idea that automations and integrations done out of house are a good thing.

I will add that not being able to get the location of the text cursor on a Mac makes the problem much harder. For example, when you tell LiquidText to open a comment box, it rearranges the screen to suit the new addition. That is all well and good, except that there are menus that can only be opened by clicking on the comment box. There is a text cursor in the box, but the mouse cursor is wherever it was in the first place. I've played with dropping things like :nerd_face::nerd_face::nerd_face::nerd_face: into the comment. That makes the comment possible to find again with ⌘F, but since the font size depends on the window zoom, it's useless for found images. If I could get the position of the text cursor I'd be all set, but for whatever reason Apple seems to keep that information well and truly hidden. I've tried UI scripting, but that turns out to be hopeless.
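
For anyone tempted to try the same breadcrumb trick, here's roughly what it looks like scripted, using a plain-text marker instead of emoji (on the assumption that keystroke only types characters available on the keyboard layout). The process name, the marker string, and the behaviour of the app's Find field are all assumptions:

```applescript
-- Rough sketch of the breadcrumb work-around: type a marker that will never
-- occur in real text, then later jump back to it with Find.
-- "LiquidText", the marker string, and the behaviour of its Find field are
-- assumptions; adjust to taste.
set markerText to "@@KM-BREADCRUMB@@"

tell application "System Events"
	tell process "LiquidText"
		set frontmost to true
		keystroke markerText -- drop the breadcrumb at the current insertion point
		-- ... the rest of the automation runs here ...
		keystroke "f" using {command down} -- open Find
		delay 0.2
		keystroke markerText & return -- search for the breadcrumb
	end tell
end tell
```

Not pretty, but unlike a found image it doesn't care what the window zoom is.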

Why does anybody write an app that is all about user interaction and NOT include an API?

Because they're ignorant and/or lazy, and because the vast majority of users have no idea of what an API is or what can be done with it.

It's horrifying how few users know what it's like to turn a multi-minute or even multi-hour job into a macro that runs in a few seconds or less...

If you are interested, here is a video of a macro to take selected PDFs in DEVONthink and import them into LiquidText.

Indeed. I've always seen a Mac as a really nice windowing system on top of a Unix box, so have always made heavy use of the command line to automate where possible. I never was a big fan of Automator, but when I discovered Keyboard Maestro my world got much, much better!

I'm curious. How many people sell their services writing for-the-purpose macros for people? I'm not interested in the job myself (other than for the occasional client), or in getting someone else to do the work. But there have to be uncounted thousands of people who would benefit if they knew what was possible, and if they didn't have to come up the learning curve themselves.

Turn off the 'Display' checkmark in the found-image action after you've gotten things working. The macro will run a bit faster.

I know, but seeing the green box popping up to make it clear what is happening furthers my ulterior motives. :wink:

Shameful that the dev is depending upon the spinning pizza cursor and doesn't provide an actual progress dialog.

To be fair, I'm pushing it hard. Those crop up when importing a half-gigabyte PDF. I bring books into LiquidText, and sometimes the only way to do the conversion is with screen grabs and OCR. The results can be huge. I suspect that most people never see such a thing.

I don't believe that's Apple -- it's the application's responsibility to reveal such things, either directly or via Accessibility. See BBEdit for how it could be done, but there's a difference between the insertion point in a document and a UI widget...

This seems to be a known Apple thing. Here is an old post from 2014 that talks about it: Get text cursor position.

It's possible that the situation has changed. @ccstone Do you have any insight here?

Again, not an "Apple thing". The OS has no "knowledge" of the position of the insertion point -- that's an application or document property. You can get the position if the app allows such via its own APIs, but that's unfortunately uncommon -- even in Apple's own apps (Grrr! And that is an "Apple thing".).

Hey Guys,

At some level macOS knows where the cursor is – that is to say, when established APIs are used and Accessibility is properly implemented.

I believe it was @CJK who posted some pretty esoteric AppleScript UI-Scripting code that got the cursor location (not mouse) in some quite unexpected places.

His code may or may not have needed the speech API to be turned on – I can't recall – and I'm not going to spend time fishing for it.

The upshot of this is that Apple could do much better IF they wanted to.

-Chris

My gut says that one could probably get there with ASObjC, developer.apple.com, a lot of time, and a case or two of Scotch. I'll pass on buying that ticket!

I play with scripting the UI when I have to. But there's something about writing code with 47 levels of item 1 of position of text area 4 of UI element 2 of row 7 of table 3 of splitter group 3 of splitter group 2 of group 1 of sheet 2 of…, knowing that with the next update of the app it's all going to change...
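
For what it's worth, the thing that sometimes softens the brittleness is to hunt for the element by its accessibility role and description instead of hard-coding the index path. A sketch with entirely made-up names (the app, the window, and the "Comment" description are hypothetical):

```applescript
-- Sketch: find a UI element by role and description instead of a brittle
-- index path. "SomeApp" and the "Comment" description are made up; the point
-- is the technique, not the names.
tell application "System Events"
	tell process "SomeApp"
		set targetElement to missing value
		repeat with anElement in (entire contents of window 1)
			try
				if (role of anElement is "AXTextArea") and ¬
					(value of attribute "AXDescription" of anElement is "Comment") then
					set targetElement to anElement
					exit repeat
				end if
			end try
		end repeat
		if targetElement is not missing value then set focused of targetElement to true
	end tell
end tell
```

It's still UI scripting, so it still breaks if the developer renames things, and entire contents can crawl on a busy window, but at least it survives an extra splitter group appearing.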

Anyway, yes, Apple could make this all much easier if they wanted to. It's actually kind of remarkable that this level of automation is possible at all. My understanding is that when OS X was released it came very close to shipping without a Terminal app, and that we only got one thanks to the screams from Linux users. The idea of automation had to fight strong headwinds. We owe people like Sal Soghoian a huge debt of gratitude.

Then along comes Keyboard Maestro, handing out copies of the keys to the kingdom. There will always be things to grumble about, but this world is very, very much better than it might have been.

My argument is that the OS does not know where the cursor is, but that when the app implements the proper APIs the OS can ask. So while Apple can make the tools available, it's up to the devs to make them usable (and so available to us).
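
To make the distinction concrete: in an app whose text views implement the standard Accessibility attributes (a stock Cocoa text view, as in TextEdit, does), System Events can at least report the insertion point as a character range. What it can't easily give you is the on-screen position; that lives behind the parameterized AXBoundsForRange attribute, which plain UI scripting has no way to query as far as I know. A sketch, assuming a TextEdit document is frontmost:

```applescript
-- Sketch: ask an app that implements the text Accessibility attributes where
-- its insertion point is. TextEdit is used because its text view is a stock
-- Cocoa control; an app that doesn't expose these attributes returns nothing useful.
tell application "System Events"
	tell process "TextEdit"
		set focusedElement to value of attribute "AXFocusedUIElement"
		set insertionRange to value of attribute "AXSelectedTextRange" of focusedElement
	end tell
end tell
-- With nothing selected, the range pins down the insertion point as a character
-- offset into the text; a position in the document, not a point on the screen.
return insertionRange
```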