That's a fascinating question. I always like using the accessibility features of macOS to solve problems, and I've used them to work around issues with apps before. Your idea would open up new possibilities if it can be done. At the moment my gut says no, but I don't give up easily. I'm hoping someone with more AppleScript knowledge can weigh in, because there may simply not be an API for it.
One thing to keep in mind is that many apps don't provide any accessibility metadata at all.
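When an app does expose its UI elements, though, that metadata is readable from AppleScript via System Events. Here is a rough sketch of the kind of thing I mean (the process name "TextEdit" is just a placeholder, and whatever runs the script needs Accessibility permission under Privacy & Security):

```applescript
-- Rough sketch: list the role and accessibility description of each top-level
-- UI element in the front window of a target app. "TextEdit" is a placeholder.
-- The app running this script must be granted Accessibility permission.
tell application "System Events"
	tell process "TextEdit"
		set elementInfo to {}
		repeat with anElement in (every UI element of window 1)
			try
				-- Not every element has a description, hence the try block
				set end of elementInfo to {role of anElement, description of anElement}
			end try
		end repeat
	end tell
end tell
return elementInfo
```

If that comes back empty, or errors out on every element, you're almost certainly dealing with one of those apps that never got any accessibility metadata from its developer.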
Another thing to keep in mind is that macOS Monterey has superb OCR that can be run on any window, and I've used it to help solve problems with apps like the ones you're alluding to. I can't overstate how good the OCR is.
And almost all apps display text in some form, which OCR can read. My sails are set in the direction of using OCR (and away from Find Image) to deal with difficult apps. But I'm intrigued by your idea.
A hybrid solution, which may not help in your case, is to have OCR (or Find Image) read the accessibility cues from the screen. This is the kind of hacking I love to do.