I've decided to post a fortnightly update on this blog to establish a regular working rhythm for this effort. The schedule will be reflected on the public calendar displayed on this blog. In each update, I'll document projects of interest to me that have been listed by Leap, along with my own thoughts and progress on my own project(s).
This week, I found an interesting project implementing a Leap gesture training framework; I've noted it on my "Leap Motion" tab.
Meanwhile, nothing major has come of my own efforts to develop for the Leap device. In the notes that follow, I'll avoid exposing any parts of the Leap API I am not legally allowed to publicize. I have been approaching this from the development environment I am used to: a command-line-based, Linux-like C/C++ environment. At the moment, my only viable development platform is my MacBook Pro laptop, which has led to a number of difficulties:
1. Leap's C++ API library for the Mac uses Apple's dylib framework. I've had difficulty linking against it with the standard g++ (or clang++) tools on the command line, the way I usually work with Emacs + Make + standard .so library files. I've decided to switch to learning Apple's Xcode IDE instead.
2. Xcode has its own library-linking framework and runtime behavior that are unfamiliar to me, and I'm still struggling with some of it. Some interactions with the Leap library have also produced C++ parsing and linking errors I haven't seen before, so I'll need to spend more time polishing my C++-fu in this environment.
3. It is unclear to me whether the Leap development team intended the library to be used from command-line tools and applications. My goal for now is simply to track device events and print them to the screen or write them to a trace file, which requires no GUI. Yet I have been unable to make the library's Listeners work correctly: no events are delivered, and the corresponding handlers are never invoked. I've had no problems, however, with an explicit main event loop that polls the device at a fixed interval, though that is not my preferred implementation.
So my results have been mixed. Using an explicit loop, I can probe Leap data at a frame rate ranging from a low of about 20 frames per second to a high (and typical) rate of about 80 frames per second, which is pretty decent. I have not yet started acquiring detailed information on the tracked items; that will be work for the next two weeks. Meanwhile, I would still rather do this in an application model that (I think) is more natural to the Leap framework.
Finally, I have not yet decided what my own project is going to be. I suspect it will emerge organically. My own ideas tend toward tool frameworks, similar to the work I did for my PhD and in my subsequent academic career. In particular, I would like to build a tool to track (and probably visualize) various performance-related characteristics or metrics of finger-hand-tool motion and of Leap/Leap-application behavior. For example, given some gesture-detection tool, how often does the Leap device correctly report a gesture? What are the delays? How frequently does the device lose track of the things it was tracking? I believe a tool like this would be of practical use to application developers as well as to the Leap development team.
Edit: Since the Leap development team uses Cinder in their examples, I've decided to make the effort to try it out. This should be fun.