Issue with shell script action - Task failed with status -1

I have this macro that goes through all files inside a list of folders and finds which ones were modified between the last time I ran the macro (that old date is saved in a global variable) and the current date and time.
Then it copies those files to a specific location, keeping the same directory structure (I think that's called "recursive"?)

I notice that I get the "Task failed with status -1" error in the Engine.log file quite a lot and it seems to be related to this action:

[screenshot]

I already changed the menu on the left to only include the variable I'm using:
[screenshot]

In this case $KMVAR_Local__indFolderPath refers to the folder being created, before the actual file is copied there.

Any tips on how to fix this?
@Nige_S could this be one of those cases where I should be creating a temporary file for this script and then executing the script from that file? This variable can indeed include a loooooong path, depending on where the folder is located (if it's inside multiple subfolders and each one has a long name, like system folders and all that).
If you think this could be the case, would you recommend that I

  1. create a temporary file, read it, delete it, then repeat this process for each path, or
  2. find a way to create a temporary folder and give each temporary file for each path a unique name, read from it, and then at the end of the script just delete the folder with all the temporary files?

I doubt it -- if your paths are >100k characters in length you should rethink your filing system, not your macro :wink:

Make sure it is that action by searching in the Editor for the action number that's logged:

More likely you are passing in an invalid path with your variable -- perhaps containing a | character or something similarly "illegal", or accented characters that are fine in KM or Terminal but can't be handled by the shell script action with its different "base" environment settings.
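One quick thing worth ruling out is unquoted variable expansion: if the path contains spaces or shell-special characters and the variable isn't quoted, the shell splits it into several arguments. A minimal sketch (the path here is made up, and in the real action the variable is set by Keyboard Maestro, not inside the script):

```shell
# Sketch only -- in a real shell script action, Keyboard Maestro sets
# $KMVAR_Local__indFolderPath; here we simulate it with a path that
# contains a space and a pipe character.
KMVAR_Local__indFolderPath="/tmp/km test/with|pipe"

# Quoting the expansion keeps the whole path as one argument, so
# spaces and "illegal"-looking characters no longer break mkdir.
mkdir -p -- "$KMVAR_Local__indFolderPath"
```

If the variable is expanded without quotes, the same path would be split at the space and `mkdir` would try to create two different (wrong) folders.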

I think I'd tackle this problem in a different manner, using one of two approaches.

Approach one is to use a backup app to handle your backups—that's what they're for; they're incredibly good at it and require almost no effort on your part to use well. Personally, I use Carbon Copy Cloner.

I created a task group that contains all of my "critical" backups, and that task group runs each task in the group once per hour, at the minutes past the hour shown in the name:

The cloud backup was running when I took that screenshot. It's normally not onscreen in this huge manner; just in the menu bar and I have it show a progress window in the corner while it's backing up. There's great control over the scheduling, too:

I could write a Keyboard Maestro macro that emulates much of this behavior, but it would be a huge task and I'd spend more time troubleshooting it than it's worth—and it would still fail under conditions that a backup app will handle.

If you don't want to invest in more software, though, then I'd recommend replacing 99% of your macro with one shell script command, rsync. Here's the top of its man page:

rsync is a fast and extraordinarily versatile file copying tool. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. It offers a large number of options that control every aspect of its behavior and permit very flexible specification of the set of files to be copied. It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination. Rsync is widely used for backups and mirroring and as an improved copy command for everyday use.

rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file's data does not need to be updated.

It's a very powerful program, and the man page includes extensive examples, including this one:

rsync -avz foo:src/bar /data/tmp

This would recursively transfer all files from the directory src/bar on the machine foo into the /data/tmp/bar directory on the local machine. The files are transferred in archive mode, which ensures that symbolic links, devices, attributes, permissions, ownerships, etc. are preserved in the transfer. Additionally, compression will be used to reduce the size of data portions of the transfer.

(The -a flag is the important one there, as it's what sets "archive mode," which keeps everything the same as it copies.)

But you don't have to transfer from another machine; leave out foo: and make it look like this:

rsync -avz /path/to/folder/to/back/up /path/to/backup/folder

And that's your entire macro. Put it in as a shell script action on a periodic trigger, and it will only copy files that have changed since the last time you ran rsync. I still use rsync as a backup tool myself, because I use it to copy files off our web server to local backups.

One very important note: The version of rsync bundled with macOS is very old:

/usr/bin/rsync --version
rsync  version 2.6.9  protocol version 29
Copyright (C) 1996-2006 by Andrew Tridgell, Wayne Davison, and others.

If you're going to take this approach, use Homebrew or MacPorts to install a much newer and better version:

$ rsync --version
rsync  version 3.3.0  protocol version 31
Copyright (C) 1996-2024 by Andrew Tridgell, Wayne Davison, and others.

-rob.


I use Carbon Copy Cloner as well.
My macro has a different purpose. To begin with, my backup disks aren't big enough to allow multiple (and big) snapshots, so I'm constantly having to delete snapshots, which ends up defeating the purpose of having them.
Also, when I make changes to my computer such as moving folders that contain GBs of data, a simple backup can take hours to complete (my backup disks are also old HDDs, so I can't take advantage of USB 3 yet).
My macro lets me keep just the modified files, and the folders with those backup files are usually quite small, like 2-3 GB or so, so the task usually takes 1 or 2 minutes.

I will check your option with rsync when I have the time and energy to dive into something completely new to me. :wink:

Here's some issues though:
1 - I can't install anything using Homebrew, because Homebrew doesn't work with Catalina. I tried it multiple times, with multiple different approaches, and it always failed. It seems to be installed (Homebrew 4.3.6), but whenever I tried to install anything with it, I got errors, so maybe it was installed, but not properly. They even have a warning that explicitly says that installing it on Catalina will make things not work as they should.
2 - The folders I want to copy are all in different locations, and not only that, inside certain folders there are subfolders I don't want to include. In my macro I have a shell script that uses something called prune.
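For anyone following along, the -prune pattern I'm referring to looks roughly like this (toy folders just to show the shape of it; "skip" stands in for whatever subfolder gets excluded):

```shell
# Rough sketch of the find -prune pattern (folder names are placeholders):
# list files under $psrc while skipping the "skip" subfolder entirely.
psrc=$(mktemp -d)
mkdir -p "$psrc/keep" "$psrc/skip"
touch "$psrc/keep/a.txt" "$psrc/skip/b.txt"

# -prune stops find from descending into matching directories;
# the -o (or) branch prints every other regular file.
find "$psrc" -name skip -prune -o -type f -print
```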

I mean the macro is working 50% of the time, I just need to find a way to make it work around this issue, which seems to be a matter of tweaking certain actions.

If you feel like diving deep into an adventure :wink:
Backup recently modified files (triggered by external AS) .kmmacros (64 KB)

Part of this macro, especially the script-building section was actually suggested by other users here.

I didn't realize your OS was that old—Homebrew probably won't work well there. MacPorts, though, offers installers for many older OSes, including Catalina:

You can do the same with rsync, telling it to exclude certain folders when it runs (via the --exclude option). And I wouldn't bother trying to string all the disparate folders into one massive command; I'd just run rsync once for each parent-level folder you want to back up.

I may try to take a look later, but pretty busy today.

-rob.

Any hyper-long paths come more from how deeply nested the files that need to be copied are, plus some system files that have longer names.

Yes, that's the action. I copied the number and I have a macro that finds actions by their UID and I noticed that it's always the same action, that "mkdir" script.

I never use |, so that wouldn't be the issue. Also, if that were the issue, the macro would never work, but I can run the macro now, get that error, then run it again right away and it completes without issues. So I guess the issue is probably what you said, that the For Each is too fast.

I also changed the macro to use a Create Folder action instead as suggested by @RogerB so I will see how that goes. Fingers crossed...

I will check that. I also have Anaconda, which is another package manager, right?
I'm pretty new to all that, so bear with me if I say something ridiculous haha

I will explore this option with some test folders once I have the time to focus on that 100%

So I would have multiple shell scripts, each with the rsync command, one per folder I want to scan? And for the folders I want to exclude, each script would have the exclude option?

And you were saying that this would allow me to only copy files since the last time rsync ran? What if I have another macro that uses rsync? Wouldn't that create a conflict? Or can I choose which macro used the rsync and use that as the "reference"? Hope I'm making sense...
And if not, my macro already has a "system" to check and save the last run, so maybe I can still use that to avoid any conflicts?

Sure. No worries and no rush. I will perform the test with the Create Folder action first and see how that goes.

"...or something similarly illegal". And since we have no overall picture of your macro or your file system, it would be quite possible for you to make things such that a badly-named folder could cause an error only occasionally.

Given the emergence in the other thread of the actual underlying error message, I'd say so too.