JXA Reusable Code "Libraries"?

For those of you writing JXA scripts, how are you handling reusable code "libraries"? Especially in code you might share with others.

For example, let's say I have something called "FileUtils", which I reuse in many places. Right now I just copy it into the JXA script I'm working on. It's a crude way of reusing code, but it is what it is.

For me, each set of reusable code is in its own class, generally with static methods. This acts like a namespace. Here's a trimmed-down example:

class FileUtils {

	static readTextFile(path) {
		// ...
	}

	static writeTextFile(text, path) {
		// ...
	}
}

const text = FileUtils.readTextFile("/Path/To/Text/File");

The downside of this is when you have a class with a lot of members - you don't want to copy the entire thing into each JXA script.

So I came up with a way to let me select which members I want in the class when I paste the code in. It's multi-select, and if you select something that requires one of the other members, they get included automatically:

[screenshot: a multi-select picker listing the FileUtils members]
...which would give me:

class FileUtils {

	static readTextFile(path) {
		// ...
	}

	static writeTextFile(text, path) {
		// ...
	}

	static fileExists(path) {
		// ...
	}

	static #fileOrFolderExists(path) {
		// ...
	}

	static #getNSErrorMessage(nsError, message) {
		// ...
	}
}

It's a fairly involved method to implement, but it's really simple to use.
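The dependency-resolution part can be sketched in a few lines. This is a hypothetical illustration, not the actual implementation: the dependency map below (which member calls which) is assumed from the example output above, where selecting readTextFile, writeTextFile and fileExists automatically pulled in the two private helpers.

```javascript
// Hypothetical map of each FileUtils member to the members it calls.
const memberDeps = {
    readTextFile: ["#getNSErrorMessage"],
    writeTextFile: ["#getNSErrorMessage"],
    fileExists: ["#fileOrFolderExists"],
    "#fileOrFolderExists": [],
    "#getNSErrorMessage": []
};

// closureOf :: [String] -> Set String
// The selected members plus the transitive closure
// of everything they depend on.
const closureOf = selectedNames => {
    const seen = new Set();
    const visit = name => {
        if (!seen.has(name)) {
            seen.add(name);
            (memberDeps[name] || []).forEach(visit);
        }
    };
    selectedNames.forEach(visit);
    return seen;
};
```

Selecting just fileExists, for example, would also include #fileOrFolderExists while leaving the read/write members (and their helper) out.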

So how do you guys handle this? I'm hoping someone will come along and show me a much better way.

Given a library file consisting simply of a series of function definitions – for a small example:
// Tuple (,) :: a -> b -> (a, b)
const Tuple = a =>
    // A pair of values, possibly of
    // different types.
    b => ({
        type: "Tuple",
        "0": a,
        "1": b,
        length: 2,
        *[Symbol.iterator]() {
            for (const k in this) {
                if (!isNaN(k)) {
                    yield this[k];
                }
            }
        }
    });


// enumFromTo :: Int -> Int -> [Int]
const enumFromTo = m =>
    // Enumeration of the integers from m to n.
    n => Array.from(
        { length: 1 + n - m },
        (_, i) => m + i
    );


// even :: Int -> Bool
const even = n =>
    // True if 2 is a factor of n.
    0 === n % 2;


// first :: (a -> b) -> ((a, c) -> (b, c))
const first = f =>
    // A simple function lifted to one which applies
    // to a tuple, transforming only its first item.
    ([x, y]) => Tuple(f(x))(y);


// mapAccumL :: (acc -> x -> (acc, y)) -> acc ->
// [x] -> (acc, [y])
const mapAccumL = f =>
    // A tuple of an accumulation and a list
    // obtained by a combined map and fold,
    // with accumulation from left to right.
    acc => xs => [...xs].reduce(
        ([a, bs], x) => second(
            v => [...bs, v]
        )(
            f(a)(x)
        ),
        [acc, []]
    );

// partition :: (a -> Bool) -> [a] -> ([a], [a])
const partition = p =>
    // A tuple of two lists - those elements in
    // xs which match p, and those which do not.
    xs => [...xs].reduce(
        (a, x) => (
            p(x)
                ? first
                : second
        )(ys => [...ys, x])(a),
        Tuple([])([])
    );


// second :: (a -> b) -> ((c, a) -> (c, b))
const second = f =>
    // A function over a simple value lifted
    // to a function over a tuple.
    // f (a, b) -> (a, f(b))
    xy => Tuple(
        xy[0]
    )(
        f(xy[1])
    );

then to import a subset of those functions, without having to explicitly mention any helper functions in there on which they may depend, I use a pattern like:

    // importedFrom :: CSV String -> FilePath -> IO Dict
    const importedFrom = fNames =>
        // eslint-disable-next-line no-new-func
        fp => Function(
            [
                readFile(fp),
                `return { ${fNames} };`
            ]
                .join("\n")
        )();

So that I can then write code like:

const main = () => {
    const
        { enumFromTo, even, partition } = importedFrom(
            "enumFromTo, even, partition"
        )(
            "~/Desktop/shortLib.jxa"
        ),

        evenOdd = partition(
            even
        )(
            enumFromTo(0)(10)
        );

    return JSON.stringify(
        evenOdd
    );
}

Full calling code:

(() => {
    "use strict";

    const main = () => {
        const
            { enumFromTo, even, partition } = importedFrom(
                "enumFromTo, even, partition"
            )(
                "~/Desktop/shortLib.jxa"
            ),

            evenOdd = partition(
                even
            )(
                enumFromTo(0)(10)
            );

        return JSON.stringify(
            evenOdd
        );
    }

    // --------------------- GENERIC ---------------------


    // importedFrom :: CSV String -> FilePath -> IO Dict
    const importedFrom = fNames =>
        // eslint-disable-next-line no-new-func
        fp => Function(
            [
                readFile(fp),
                `return { ${fNames} };`
            ]
                .join("\n")
        )();


    // readFile :: FilePath -> IO String
    const readFile = fp => {
        // The contents of a text file at the
        // given file path.
        const
            e = $(),
            ns = $.NSString
                .stringWithContentsOfFileEncodingError(
                    $(fp).stringByStandardizingPath,
                    $.NSUTF8StringEncoding,
                    e
                );

        return ObjC.unwrap(
            ns.isNil()
                ? e.localizedDescription
                : ns
        );
    };

    return main();
})();

Thanks for the post. Question (rhetorical): How many new concepts can one person learn in a day before their head explodes?

I'm not sure whether to blame this head-exploding on you, or ChatGPT.

 
Holy crap, those were some deep concepts right there. And although I understand them conceptually, it may take my 67-year-old brain a while to digest them.

Thanks! (I think...)

I wouldn't lean too heavily on LLMs for semantic insight – it's not what they produce – LLMs model syntax pretty well (getting up to the region of 97% - 99% now), but their semantics are still barely better than a coin toss.

Inevitable really – nothing in their training materials models what text actually is, or how it (or successful code) actually functions within the rest of human activity.

See, for example, the Wolfram benchmarks, which shed light on just how little help I got from Anthropic's Sonnet (Claude) this afternoon on using Mathematica hash and ampersand anonymous functions to express currying of functions applied to two or more values.

Claude.ai kept on expressing complete confidence in successive code revisions which all either failed to compile or failed to work, and eventually gave up :slight_smile:

A fun way to burn some time, but nothing to make a habit of.

(Unless, of course, the LLM offers to pay a subscription to make use of you, but that seems unlikely – they get a bit shifty on property rights to the sources of their approximate retrievals and restitchings)

Yeah, I understand that it all needs to be taken with a grain of salt, and a discerning eye. With that said, I sure wish I had this back in the day.

Definitely fun, but we're probably lucky that they haven't been around for 20 years –
the mountains of technical debt piling up right now (auto-stitched code that appears to work on light testing, but isn't actually understood) will turn, sooner than we expect, into roaring avalanches of total garbage.

( Especially as the process is inexorably autotrophic – the proportion of LLM input that is already LLM output can only rise :frowning: )


Incidentally, the trick of that approach to selective import from a library file of functions is really just that where you have a JS object in which each property name matches the variable it refers to, for example:

{
    alpha: alpha,
    beta: beta,
    gamma: gamma
}

JS shorthand property syntax allows you to rewrite it as the simpler, and equivalent

{
    alpha,
    beta,
    gamma
}

this means that if you end a JS library file with a return value referencing a subset of the functions in a pattern like:

return {
    enumFromTo,
    even,
    partition
}

then that whole source file can be evaluated to a record (JS Object), in which the value of each function name is the definition of that function.

In the caller, we are essentially getting access to the named subset of functions (and the whole context of their definition, including any helper functions or shared constants, etc.) by writing

const {
    enumFromTo,
    even,
    partition
} = resultOfEvaluatingLibrarySource;

The only additional trick of the importedFrom function is that it doesn't depend on the library ending with the return of an object like:

return {
    enumFromTo,
    even,
    partition
}

Instead, at run time, it reads the library source and appends a (custom) return expression, specifying which function-name bindings we want in the calling evaluation space:

[
    readFile(fp),
    `return { ${fNames} };`
].join("\n")
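The same mechanism can be seen in miniature without any file I/O. In this sketch a string stands in for the file contents that readFile would return, and the appended return line turns the whole evaluated source into a record of just the requested bindings:

```javascript
// A tiny stand-in for the library file's contents.
const librarySource = `
    const double = n => 2 * n;
    const quadruple = n => double(double(n));
`;

// importNames :: String -> CSV String -> Dict
// Evaluates the source with a custom return appended,
// yielding only the named bindings.
const importNames = src => names =>
    // eslint-disable-next-line no-new-func
    Function(
        [src, `return { ${names} };`].join("\n")
    )();

const { quadruple } = importNames(librarySource)("quadruple");

// `double` is not visible here, but quadruple
// still closes over it:
quadruple(5); // → 20
```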

Thanks. That's how I understood it.

A handful of potential issues:

  1. This has to load more slowly than if the code were already included in the script.
  2. If the KM macro that uses said code is to be shared, the user will need to put the "library" code somewhere the consumer code can get to it.
  3. This leads to some potential versioning issues, if the code is shared among multiple consumers. I suppose that a good versioning naming system could help with that, although you might end up with a lot of file versions.
  4. And as you mentioned, this doesn't work for functions that require other functions - at least, not as presented.

That part may be a misunderstanding – I thought I had said the opposite?

( In the example above, the functions which we explicitly import do depend on unmentioned helper functions like Tuple, first and second, and that technique does give the called functions access, at run-time, to the private functions in the library-parse object. )

We can't see Tuple, first and second, but partition can.

Yeah, after I posted that, I got to wondering if I might have been wrong, for exactly the reason you specified.

Good to know that even when I'm wrong, I can still be right too!


External library files are probably not the right form in which to distribute macro parts – I agree.

I find they work well, however, while building and testing individual modules of slightly bigger things.

Performance, even with rather large library files, turns out, on current machines, to be surprisingly good – subliminal for me – and it saves me the work of iteratively discovering, and manually patching, dependencies on helper functions.

(Simplest, of course, is just to import the lot)

A great point. I'll consider it, thanks!

FWIW, another approach to:

  1. importing a subset of files from a large library, while
  2. not having to name any helper functions on which they depend.

(Directly enriching the script's global this with the names of the imported functions)

The example below assumes an import from jsPrelude.js, which is at:

RobTrew/prelude-jxa: Generic functions for macOS and iOS scripting in Javascript

The traverse function, which has a particularly complex sub-tree of (type-specific) dependencies within the library, is used as the example.

(() => {
    "use strict";

    const main = () => {
        importFrom(
            "Left, Right, enumFromThenTo, even, traverse"
        )(
            "~/prelude-jxa/jsPrelude.js"
        );

        // Testing the imports
        return f();
    };

    // ---------------------- TEST -----------------------

    const f = () =>
        traverse(
            x => even(x)
                ? Right(3 ** x)
                : Left(`Expected even integer, saw: ${x}`)
        )(
            enumFromThenTo(0)(2)(10)
        );

    // --------------------- GENERIC ---------------------

    // importFrom :: CSV String -> IO ()
    const importFrom = fnNamesCSV =>
        // eslint-disable-next-line no-new-func
        fp => {
            const imports = Function(
                [
                    readFile(fp),
                    `return { ${fnNamesCSV} };`
                ]
                    .join("\n")
            )();

            return (
                Object.keys(imports).forEach(
                    k => this[k] = imports[k]
                ),
                imports
            );
        };


    // readFile :: FilePath -> IO String
    const readFile = fp => {
        // The contents of a text file at the
        // given file path.
        const
            e = $(),
            ns = $.NSString
                .stringWithContentsOfFileEncodingError(
                    $(fp).stringByStandardizingPath,
                    $.NSUTF8StringEncoding,
                    e
                );

        return ObjC.unwrap(
            ns.isNil()
                ? e.localizedDescription
                : ns
        );
    };

    // sj :: a -> String
    const sj = (...args) =>
        // Abbreviation of showJSON for quick testing.
        // Default indent size is two, which can be
        // overriden by any integer supplied as the
        // first argument of more than one.
        JSON.stringify.apply(
            null,
            1 < args.length && !isNaN(args[0])
                ? [args[1], null, args[0]]
                : [args[0], null, 2]
        );

    return sj(main());
})();
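The enrichment step that importFrom performs can be sketched outside JXA as well. In a JXA script, top-level this is the script's global object; in plain JavaScript, globalThis plays the same role. This is an illustration of the pattern only, with made-up function names:

```javascript
// A record of bindings, standing in for the result of
// evaluating a library file with importFrom.
const imports = {
    triple: n => 3 * n,
    halve: n => n / 2
};

// Copy each imported binding onto the global object,
// so that later code can use the names directly.
Object.keys(imports).forEach(
    k => globalThis[k] = imports[k]
);

// After enrichment the names are available as bare globals:
triple(4); // → 12
```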

Thanks. I actually like the method I've come up with. Yes, it does require copying some of the library code into the consumer script, but I would probably do that anyway, and it's really easy to use (now that I've written some supporting code). It takes almost no time at all.

With that said, I see a lot of benefit in how you're doing it. It's possible that if I hadn't built up my libraries the way I have, I might have wanted to do what you've done.

And let's face it - I'm a typical developer in one respect: I built what I'm using, and it gives me a feeling of pride and accomplishment to use it. It's like being a woodworker (which I am) and enjoying using the router table I built, instead of buying something.

That doesn't mean I'm completely averse to using other people's code - in fact, my libraries are strewn with code from other people, a lot of it from you. :stuck_out_tongue:

And thanks for that, by the way. :facepunch:
