Filesystem database

Is anyone aware of any tools that could create and update a simple database of specified files on a few different filesystems, one that I could reference and modify from Keyboard Maestro?

I'm trying to plan and execute a workflow for my company that automates the moving/copying of our project folders based on several parameters, some triggered by users and others automated. I'm a mediocre coder at best, but I've built some complicated macro sets that have saved our company a TON of time and brought efficiency up quite a bit.

I've implemented several of the desired tasks using Keyboard Maestro and bash scripts, but one thing I've been struggling with is that I essentially need to either scan the file tree or manually maintain a dictionary (or a file with an ordered list) mirroring the file structure of our cloud storage as well as 2 on-premise servers, and doing so is quite cumbersome and slow.

I think I could probably figure out how to implement it just using Keyboard Maestro, or maybe Hazel? But I'm just curious if anyone knows of a more streamlined way to get at this, or any tools that might help.

You could take a look at something like tree to show all files in the hierarchy.
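
For example, something like this dumps just the directory paths, a few levels deep, into a text file (the volume path is a placeholder; -d limits the output to directories, -f prints full paths, -i drops the indentation, -L caps the depth):

tree -dfi -L 3 --noreport /Volumes/Projects > ~/project-index.txt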

Can you provide any more information about how you would act upon the data once you have it? That may guide how you should get/store it.

Thanks. I'll have a look at that. A full file tree would be useful, but not necessary, as I basically only need to store the paths of specific folders. All other data I can either parse from the path string, or get by running additional queries on the contents of the folder at the time I need it. The folders are all at predictable locations, at a predictable depth from the root of the volume.
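
For what it's worth, the parsing side is just bash string surgery once I have a path (the naming scheme here is made up for illustration):

path="/Volumes/Projects/Active/1234 - Sample Project"
folder="${path##*/}"       # "1234 - Sample Project"
number="${folder%% *}"     # "1234"
name="${folder#* - }"      # "Sample Project"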

Mostly I'm looking to be able to:

  • Search for a "project" folder by name or number (both are contained in the folder name and can be parsed from the path)
  • After a search, show some details about the contents of the folder. Then move or copy the folder into a different folder in another location, depending on where it resides. Then "mirror" that copy or move on another filesystem. Finally, "write" to the database or list where the project now lives (sketched below).
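
Roughly the shape I have in mind, as a shell sketch (the volume names, index location, and archive destination are all placeholders, and the "database" is just a text file of folder paths):

INDEX="$HOME/project-index.txt"           # assumed: one folder path per line
QUERY="1234"                              # project name or number to search for

match=$(grep -i -m1 "$QUERY" "$INDEX")    # find the project folder's path
ls -lh "$match"                           # show some details about its contents

dest="/Volumes/Projects/Archive/$(basename "$match")"
mv "$match" "$dest"                       # move it...
mv "/Volumes/Mirror${match#/Volumes/Projects}" "/Volumes/Mirror${dest#/Volumes/Projects}"    # ...and mirror the move

grep -vF "$match" "$INDEX" > "$INDEX.tmp" && mv "$INDEX.tmp" "$INDEX"    # drop the old path
echo "$dest" >> "$INDEX"                  # record where the project now lives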

As I'm working through this more, a simple set of "find" commands whose output is stored as a list may be sufficient for my needs; some tweaks I made to my macro today have made that scan quite a bit faster.
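
If I go that route, rebuilding the whole list is basically just this (the roots and depth here are placeholders for our actual layout):

for root in /Volumes/Cloud/Projects /Volumes/Server1/Projects /Volumes/Server2/Projects; do
  find "$root" -mindepth 2 -maxdepth 2 -type d
done > "$HOME/project-index.txt"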

At my work I have implemented a macro to open job-folders from a network share for my colleagues.

Our share is on a Windows server. The server indexes the share 3 directory levels down every 5 minutes.
The macro below then reads the index and uses Prompt With List to let the user type a customer name, customer number, job name, or job number to easily open a folder.

Like this:
[screenshot: CleanShot 2023-10-26 at 11.09.06]
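
The shell step behind that list boils down to something like this (Keyboard Maestro hands variables to shell scripts as KMVAR_-prefixed environment variables; the variable name here is simplified):

# filter the server-built index by whatever the user typed
INDEX="/Volumes/Kunder/000-Index-3-levels-deep.txt"
grep -i "$KMVAR_Search" "$INDEX"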

Thanks for sharing! How are you creating the index file?

I am running a bat-file every 5 minutes as a scheduled task on the Windows server.
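
(If it helps, a task like that can be registered with something along these lines; the task name is just an example:)

schtasks /Create /TN "Index Kunder share" /TR "C:\Scripts\Generate-3-depths-txt-of-Kunder.bat" /SC MINUTE /MO 5
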
Here are the contents of the two bat-files used.

Generate-3-depths-txt-of-Kunder.bat

@echo off
rem Build the index, then swap it into the share as a hidden file.
powershell -command "C:\Scripts\list-files-depth.bat S:\Kunder 3 > C:\Scripts\000-Index-3-levels-deep.txt"
rem Un-hide the existing index so move /Y can replace it, then hide the new one.
attrib -h S:\Kunder\000-Index-3-levels-deep.txt
move /Y C:\Scripts\000-Index-3-levels-deep.txt S:\Kunder\000-Index-3-levels-deep.txt
attrib +h S:\Kunder\000-Index-3-levels-deep.txt

list-files-depth.bat

@echo off
setlocal
rem Usage: list-files-depth.bat <root-folder> [max-depth]
rem Prints the full path of every directory up to max-depth levels below the root.
set currentLevel=0
set maxLevel=%2
if not defined maxLevel set maxLevel=1

:procFolder
rem Enter the folder (skip it quietly if that fails), then recurse into each subfolder.
pushd %1 2>nul || exit /b
if %currentLevel% lss %maxLevel% (
  for /d %%F in (*) do (
    echo %%~fF
    set /a currentLevel+=1
    call :procFolder "%%F"
    set /a currentLevel-=1
  )
)
popd

At the start I had a macro to create the index on each Mac. But since I have a server, it seemed wasteful to have every Mac create an index every 5 minutes.
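
(For anyone without a server to run this on, the rough Mac-side equivalent of the index script would be something like this, assuming the share mounts at /Volumes/Kunder:)

find /Volumes/Kunder -mindepth 1 -maxdepth 3 -type d > "$HOME/000-Index-3-levels-deep.txt"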