An update on this project: we've not made a decision yet. The delay is because we want to find a structural way to connect the testing tool to Uniface, without any dependency on the testing tool vendor (which was the case with the 'old tools', where there was some kind of .dll containing the mapping from the testing product to Uniface).

The idea is to leverage the Microsoft Active Accessibility (MSAA) functionality, which is typically used as an assistive utility for end users with impairments. Think of it as a set of handles a tool can use to find out what a control is (a command button), where it is on the screen (X:Y coordinates), what its state is (it is not pressed), and what it can do (hovering shows a tool tip, pressing it does something, pressing and holding it for more than 5 seconds does something else, and so on).

We decided to start with a high-level POC based on our shortlist, and it looks like this is achievable. Putting the functionality into Uniface is estimated at about a man-year's worth of effort, so we need to see how we can do this effectively. For example, it doesn't have to be done for all widgets, and we could think about bringing in some contract help. But it needs to be done, so we need to figure that out. Watch this space.
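For anyone curious what those MSAA "handles" look like from the automation side, here is a minimal C++ sketch that asks a window for its IAccessible interface and reads a control's role (what it is), screen location (where it is), and state. This is only an illustration of the standard Windows MSAA API, not our POC code; the window title "MyUnifaceForm" is a placeholder.

// Minimal MSAA probe sketch (assumption: a visible window titled "MyUnifaceForm" exists).
// Build with: cl /EHsc msaa_probe.cpp oleacc.lib ole32.lib oleaut32.lib user32.lib
#include <windows.h>
#include <oleacc.h>
#include <iostream>

int main()
{
    CoInitialize(NULL);

    // Find the window to inspect; the title here is a placeholder.
    HWND hwnd = FindWindowA(NULL, "MyUnifaceForm");
    if (!hwnd) { std::cout << "Window not found\n"; return 1; }

    // Ask Windows for the IAccessible interface that MSAA exposes for this window.
    IAccessible* acc = NULL;
    HRESULT hr = AccessibleObjectFromWindow(hwnd, (DWORD)OBJID_CLIENT,
                                            IID_IAccessible, (void**)&acc);
    if (FAILED(hr) || !acc) { CoUninitialize(); return 1; }

    VARIANT self;
    self.vt = VT_I4;
    self.lVal = CHILDID_SELF;   // query the object itself, not a child

    // "What is it?" -- the role, e.g. push button, edit field.
    VARIANT role;
    VariantInit(&role);
    if (SUCCEEDED(acc->get_accRole(self, &role)) && role.vt == VT_I4) {
        WCHAR roleText[64];
        GetRoleTextW(role.lVal, roleText, 64);
        std::wcout << L"Role:  " << roleText << L"\n";
    }

    // "Where is it?" -- screen coordinates and size.
    long x, y, w, h;
    if (SUCCEEDED(acc->accLocation(&x, &y, &w, &h, self)))
        std::wcout << L"Rect:  " << x << L"," << y << L"  " << w << L"x" << h << L"\n";

    // "What is its state?" -- a bit mask (focused, pressed, invisible, ...).
    VARIANT state;
    VariantInit(&state);
    if (SUCCEEDED(acc->get_accState(self, &state)) && state.vt == VT_I4)
        std::wcout << L"State: 0x" << std::hex << state.lVal << L"\n";

    acc->Release();
    CoUninitialize();
    return 0;
}

A test tool walking the MSAA tree this way needs no vendor-specific .dll mapping; it only relies on Uniface exposing accurate roles, locations, and states for its widgets, which is where the estimated effort goes.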
Author: Adrian Gosbell (adrian.gosbell@synapse-i.jp)