Why does the `Math.floor()` result include a `.0` decimal (via the `Execute JavaScript` action)?


Is there a reason why the result of Math.floor() includes decimals?




This is a reduced test case. The actual macro would receive a value from the Front Browser's current page.

When I run Math.floor(101.01) directly in the browser's console*, it returns 101 as expected.

(*I tested in Safari and Chrome.)


Funnily enough, if I use Keyboard Maestro's native FLOOR(), I do receive 101.

FLOOR(101.01) results in: 101

Also, I'm aware of the funniness in JS re: floating point calculations. But I wouldn't think that has anything to do with this, since it would seem the browser is processing Math.floor() and Keyboard Maestro is adding the .0. Or maybe it's the browser adding that .0?


Because that is what is returned by do JavaScript.

Presumably, the do JavaScript code is returning a number (which is a float), and the browser is returning the floating point number, and then AppleScript is converting that to a string and including the “.0”.

The result is correct, and you have not specified any formatting so it is entirely reasonable.

You can work around this in a number of ways. You could force the string conversion to happen in JavaScript, by using:


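A minimal sketch of that approach (the exact expression isn't shown in the post, so this is an assumption about one obvious way to do it):

```javascript
// Coerce the result to a string inside JavaScript itself,
// so AppleScript receives text rather than a real number.
const result = String(Math.floor(101.01));
console.log(result); // "101" — no trailing ".0"
```

Since the value is already a string when it leaves JavaScript, AppleScript has nothing to convert and passes it through unchanged.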
Or you could use Keyboard Maestro’s Set Variable to Calculation action with the Formatting option to force the formatting to your desired precision.


I think I see what you mean, and I see the same when I run your example in Script Editor.app.

It was unexpected because of what I see when I run this directly in Safari.

(Screenshot: Safari console showing Math.floor(101.01) returning 101)

I guess browser consoles are outputting a float with .0 differently than AppleScript?

Also, in my mind, floor() = output an integer, not a float.

(I also didn't realize AS was in the middle of all of this, which contributed to my confusion. I guess I should have.)

This works well.

I didn't even know that Format Result feature existed. Cool!



Not necessarily.

In C for example, the floor function is defined as:

double floor (double x);

That is, it takes a double width floating point and returns a double width floating point.

Also, it depends on the allowable types of the language. For example, in Keyboard Maestro all variables are strings. In other languages, they might have only floats and strings, with no specific type for integer.

Yes, in the Console, it is Safari taking the floating point result and turning it into a string. Or maybe JavaScript. In AppleScript (and with Keyboard Maestro), it is AppleScript converting the resulting floating point to a string.

Yes. Really, if you want a specific formatted output this would be the most robust solution.

Or you could do the FLOOR function in a Keyboard Maestro calculation, and Keyboard Maestro takes pains to never return a calculation in a format like 123.0 (unless you specify the format explicitly of course!).


JS does now have a special, recently introduced, and rarely used BigInt type, but it has to be deliberately initialised with a special syntax, and there are no automatic coercions to it.

JS Math library operations are all Float -> Float
No fixed size integers are involved or defined.

The details of the JS Number type (a Float which uses a total of 64 bits) are here:

Number - JavaScript | MDN

JS does not define a fixed size integer type at all.
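To illustrate the two points above, a small sketch: an ordinary JS Number is always a 64-bit float (even when its value happens to be an integer), and BigInt only appears when you write it explicitly with the `n` suffix:

```javascript
// Math.floor returns a Number (a float), not a separate integer type.
const floored = Math.floor(101.01);
console.log(typeof floored);            // "number"
console.log(Number.isInteger(floored)); // true — an integer-valued float

// BigInt must be deliberately initialised; there is no automatic coercion.
const big = 101n;
console.log(typeof big);                // "bigint"
// Math.floor(big) would throw a TypeError: Math methods take Numbers only.
```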

That makes sense. I can definitely see, in principle, why floor() shouldn't automatically imply literally converting a float to an int.

I probably shouldn't have written what I did above, since that was more of a perception about the result than what is really going on under the hood. Especially considering all numbers are floats in JS (with the exception of BigInt, which @ComplexPoint mentioned).

AFAIK, that output from Safari is a number, not a string. But it can be coerced into a string, as you suggested.

(Screenshot: Safari console output)

Regardless of this detail, my fundamental confusion had to do with an action named Execute JavaScript in Front Browser, returning a value to Keyboard Maestro that was different from what I would see when executing the same JS directly in that front browser.

And that was because I didn't realize AppleScript was sitting in the middle and treats this case differently than Safari.

Yes, I used that at one point too. It also worked as expected.

Thanks for this.


I'm still just a little curious about why browser JS engines choose to return 101 while AppleScript running JS (for example) returns 101.0?

Here's what I found:

Although, unless I'm missing something (which I might be), this still doesn't exactly answer why browser JS engines choose not to show the decimal point and anything to its right, while other JS engines do.


AppleScript can handle real or integer types. When converting a real to a string, it adds at least one decimal digit, presumably because this ensures the reverse translation would convert it back to a real instead of an integer.

set v to 101.1 - 0.1 -- displays as 101.0 of class real
set v to 101.0 -- displays as 101.0 of class real
set v to 101 -- displays as 101 of class integer
class of v

do JavaScript returns the variable from JavaScript as a real (since, as we've discussed, all JavaScript numbers are essentially floating point values).

Then as a real it is converted to a string (for display or for placing in a Keyboard Maestro variable) by AppleScript, hence it gets “.0”.

It's a number in Safari's JavaScript engine, but in the Console it's a textual representation of that number -- the number itself isn't being coerced to a string but what you see in the Console is.

Number -> textual representation of number will have different rules in different implementations. You can see this for yourself in Script Editor by comparing AppleScript and JXA results:
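For the JavaScript side of that comparison, a quick sketch of the spec-defined Number-to-string rule, which drops a fractional part of zero (AppleScript's rule, which keeps the ".0", you can see in Script Editor itself):

```javascript
// ECMAScript's Number-to-string conversion omits a zero fraction,
// so an integer-valued float stringifies without ".0".
console.log(String(101.0));              // "101"
console.log((101.5).toString());         // "101.5"
console.log(String(Math.floor(101.01))); // "101"
```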

And now you know that AppleScript is the "conduit" between JavaScript and KM, everything returned to KM is text, and AS has its own rules as to how it coerces numbers to text -- you'll be ready for every idiosyncrasy you see!


Thank you for this and the examples.

I understand.

Yes. And this is what I'm a little curious about.

I guess I'm looking for something like a blog post, mailing list discussion, etc. about why browser consoles output what they do in contrast to something like AppleScript.

I bet it has something to do with JavaScript being loosely typed, and all of the judgement calls that have to be made as a result.


One thing to bear in mind is that console.log, while available in many (though not all) JavaScript implementations, is not part of the JavaScript language and is not defined by the ECMAScript standard.

(It's just a legacy library method which was commonly supplied in early browser embeddings of JS, and which is often implemented in some form or another by commercial suppliers of other types of JS embedding)

Script Editor, for example, provides its own (and distinct) stringification of script return values. Again, that's not part of JS, just a supplier-defined utility.

If you need a particular stringification, you can write your own. Any JS Number value is a fixed width Float, so there's not much complexity to deal with.
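For instance, a sketch of rolling your own formatting with the standard Number methods:

```javascript
// Pick the textual form you want, rather than relying on the host's default.
const x = Math.floor(101.01); // 101 — still a Number, i.e. a float

console.log(x.toFixed(1));    // "101.0" — AppleScript-style, one decimal place
console.log(x.toFixed(0));    // "101"   — no fraction shown
console.log(x.toString(2));   // "1100101" — even other radixes, if needed
```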



I'd just like to find a discussion that explains the backstory behind why each vendor implemented it the way they did.

Thanks everyone!