Wednesday, October 30, 2013

Own Project Update - Multiple Domain Measurements

My ideas for what I want to do with my Leap device have been shaping up over the past month (despite plenty of distractions), so here's a quick rough sketch of what I have done so far and what I plan to push for later.

I am trying to have my prior expertise with application performance measurement on supercomputers dovetail into the Leap. As such, I've been using the Leap API to measure various aspects of hand, finger, and tool motion tracked by the device. Of the various performance measurement tools I am used to, the SCORE-P project (http://www.vi-hps.org/projects/score-p/) is one that implements support for user-level modules that can feed it information (generally, SCORE-P expects this to be performance-related data). If I can cast Leap-related measurements and metrics in terms of performance, this gives me a very convenient mechanism for visualizing and analyzing how the two domains of information are correlated. So, on top of possibly developing analytical methods for gesture-tracking information (e.g., is a gesture hard for individual X to achieve? How hard?), we can also measure how well the underlying library and application respond computationally to an attempt at a gesture. From there, we can think about visualization and analysis approaches that differ from traditional computing performance analysis, particularly when multiple domains of information come into play.
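To make the "correlating two domains" idea concrete, here is a minimal sketch. It is in Python rather than my actual C++/Cinder code, and the numbers are made up for illustration: a hypothetical per-frame "gesture difficulty" proxy (fingertip jitter) paired with the application's per-frame processing time, with a plain Pearson coefficient measuring how the two series move together.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired samples, one per tracked frame:
# a gesture-difficulty proxy (fingertip jitter, mm) alongside the
# application's per-frame processing time (ms).
jitter_mm = [0.2, 0.5, 1.1, 0.9, 1.8, 2.0]
frame_ms  = [3.1, 3.4, 4.0, 3.8, 4.9, 5.2]

print(pearson(jitter_mm, frame_ms))
```

A real analysis would of course want more than one coefficient over a whole session, but the point is that once both domains are plain timestamped metric streams, any standard correlation or visualization machinery applies to the pair.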

Toward this goal, my actual accomplishments have been minimal. I've successfully used Xcode (I had loads of initial difficulty getting into the spirit of it, as I had been used to standard Linux approaches to shared-object libraries on the console) to construct my own Cinder (http://libcinder.org/) application that interfaces with the Leap library, and then used the API's listeners to start tracking useful information over time. I've been able to track various interesting metrics regarding my fingers (again, I am unsure if I am legally allowed to say what, since that kinda exposes part of the API) and generate a "performance" profile of an active motion-tracking session.
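The listener idea is simple enough to sketch without exposing anything from the actual API. Below is an illustrative Python mock (my real code is C++ against the Leap library; the `FingerTracker` class, its callback shape, and the synthetic frames here are all my own invention): a per-frame callback accumulates fingertip positions and reduces them to a small "performance profile" of the session.

```python
import math

class FingerTracker:
    """Listener-style accumulator: receives one callback per tracked frame
    and builds a simple 'performance profile' of fingertip motion."""
    def __init__(self):
        self.prev = None      # (timestamp_us, (x, y, z)) of the last frame
        self.speeds = []      # fingertip speed per frame, mm/s

    def on_frame(self, timestamp_us, tip_mm):
        if self.prev is not None:
            t0, p0 = self.prev
            dt = (timestamp_us - t0) / 1e6
            if dt > 0:
                self.speeds.append(math.dist(p0, tip_mm) / dt)
        self.prev = (timestamp_us, tip_mm)

    def profile(self):
        """Summary in the spirit of a performance profile: count/mean/max."""
        if not self.speeds:
            return {"frames": 0, "mean": 0.0, "max": 0.0}
        return {"frames": len(self.speeds),
                "mean": sum(self.speeds) / len(self.speeds),
                "max": max(self.speeds)}

# Synthetic session: fingertip moving 1 mm every 10 ms along x (100 mm/s).
tracker = FingerTracker()
for i in range(5):
    tracker.on_frame(i * 10_000, (float(i), 0.0, 0.0))
print(tracker.profile())
```

The same reduction (count, mean, max per metric per session) is exactly the shape a performance profile takes, which is what makes the dovetail into SCORE-P plausible.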

My next step is to attempt to integrate what I have so far with the SCORE-P instrumentation interface as a module. The subtle nuance here is that SCORE-P works on a data-pull model for collecting measurements. My plan is to have SCORE-P instrument key gesture-related events in the application, as well as important events in the Leap API. That way, whenever SCORE-P asks me for motion information, the answer will always be in line with some key gesture event. At the same time, I get the bonus of acquiring the computational performance information from SCORE-P for free. If successful, I will get output in the form of either a CUBE-4 performance profile (http://www.scalasca.org/software/cube-4.x/download.html) or an Open Trace Format (OTF) trace file (http://www.tu-dresden.de/die_tu_dresden/zentrale_einrichtungen/zih/forschung/projekte/otf). There are a multitude of visualizers and analysis engines for performance data based on those two formats. With the inclusion of Leap gesture data, perhaps there will be a need to develop new framework approaches to multi-domain visualization tools.

Interesting Projects Update

I really need fewer distractions! Heh, got my focus and attention dragged away to properly reviewing two research papers in HPC (ok, it was my fault I accepted the invitation to review journal articles, but hey, I consider it a duty as a member of the HPC scientific community - even if I no longer have any institutional affiliation!).

Anyway, as a quick (partial) update today, here are some of my picks of rather interesting projects out there in the Leap development community over the past few weeks along with some of my own comments:

1. Project PAALM: Phalangeal Angle Approximation through the Leap Motion Controller
http://projectpaalm.blogspot.sg/

This is really neat, and ties in to some of my own perception of the limitations of the device's API as I try to use it for my own work (I don't know how much I am allowed to say in public, so I will avoid saying anything).

2. LEAP Motion Head Attachment Experiment - YouTube
https://www.youtube.com/watch?v=aVopcoxizcY

Again, something I find interesting because as I use the device, I get the feeling it might be more effective as a device facing the user's fingers rather than sitting underneath them. That feeling is driven by my perception (I have not studied this in any appreciable detail yet) that the latter positioning tends to suffer more instances where tracked objects disappear, and then need to be re-tracked and re-identified as the same prior objects.
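That re-identification problem has a classic baseline worth sketching: greedy nearest-neighbour matching of reappearing detections against the last known positions of lost tracks. Everything here (the `reassociate` function, the 30 mm gate, the coordinates) is my own illustrative invention, not anything from the Leap library.

```python
import math

def reassociate(lost_tracks, detections, max_dist_mm=30.0):
    """Greedy nearest-neighbour matching: pair each reappearing detection
    with the closest previously-lost track within max_dist_mm.
    Returns {detection_index: track_id}; unmatched detections would
    become new tracks."""
    matches = {}
    free = dict(lost_tracks)   # track_id -> last known (x, y, z), mm
    for i, det in enumerate(detections):
        best_id, best_d = None, max_dist_mm
        for tid, pos in free.items():
            d = math.dist(det, pos)
            if d <= best_d:
                best_id, best_d = tid, d
        if best_id is not None:
            matches[i] = best_id
            del free[best_id]   # each track claims at most one detection
    return matches

# Two fingers lost during an occlusion, three detections on reappearance;
# the third detection is too far from either track to be a match.
lost = {1: (0.0, 0.0, 0.0), 2: (50.0, 0.0, 0.0)}
seen = [(52.0, 1.0, 0.0), (2.0, -1.0, 0.0), (200.0, 0.0, 0.0)]
print(reassociate(lost, seen))
```

My hunch is that a finger-facing mounting would simply hit this code path less often; the under-the-hand placement makes occlusions, and hence re-association errors, more likely.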

3. Makeblock Robotic Arm with LeapMotion Control - YouTube
https://www.youtube.com/watch?v=-NqkYsIXkxw&feature=youtu.be

What interests me about this project is the time delay between an action from one's hand and the resulting reaction from the robot arm. In that demo, it is significant. When I project myself into performing those tasks, I find myself consciously trying to take those millisecond-level feedback delays into account. This ties in with some of what I want to do - performance measurements associated with the Leap device. There is an expectation of a natural delay built into our brains for experiencing visual feedback when we send a signal to the hands to perform a task. How much of a change (ignoring our need for tactile feedback) can our brains tolerate?
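If one wanted to actually measure that action-to-reaction delay rather than eyeball it, a simple approach is to record the hand-motion signal and the robot-motion signal as two sampled series and find the lag that best aligns them. This sketch (my own toy, with synthetic signals) picks the lag maximizing the dot-product overlap, a bare-bones cross-correlation:

```python
def best_lag(cmd, resp, max_lag):
    """Estimate the delay (in samples) between a command signal and its
    delayed response by maximising their dot-product overlap."""
    best, best_score = 0, float("-inf")
    for lag in range(max_lag + 1):
        # Overlap cmd[t] with resp[t + lag].
        score = sum(c * r for c, r in zip(cmd, resp[lag:]))
        if score > best_score:
            best, best_score = lag, score
    return best

# Synthetic example: the "robot" reproduces the hand's motion burst
# three samples late.
hand  = [0, 0, 1, 3, 1, 0, 0, 0, 0, 0, 0]
robot = [0, 0, 0, 0, 0, 1, 3, 1, 0, 0, 0]
print(best_lag(hand, robot, max_lag=6))
```

Multiply the winning lag by the sampling period and you have the feedback delay in milliseconds, which is exactly the quantity the tolerance question above is about.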

4. 3D Printer Finger Painting with Leap Motion - YouTube
https://www.youtube.com/watch?v=8wLmOkNtuRM&feature=youtu.be&noredirect=1

This is just cool and fun. I don't know if we ought to be painting 3D objects on the fly like that in a real application :)

5. BetterTouchTool
https://airspace.leapmotion.com/apps/bettertouchtool/osx?utm_source=Leap+Motion+Newsletter&utm_campaign=79a109cbe0-Consumer_Newsletter_16&utm_medium=email&utm_term=0_f0a6fbd89e-79a109cbe0-60510061

Free app from Airspace. Seems pretty cool. I'll install it and give it a try once I get over my paranoia from the craziness I experienced after finally getting Mavericks onto my other (non-work) laptop. I'm too terrified of breaking this laptop, which is far more important to me.

6. TedCas & Leap Motion - YouTube
https://www.youtube.com/watch?v=6d_Kvl79v6E&feature=youtu.be

Seems like an excellent case for using the Leap device with technology in an operating-theater environment. I am typically queasy about using the Leap itself to perform a remote operation via robotics (see my concerns about feedback delay above), but accessing documents and information in that environment through a no-touch system makes perfect sense.

7. Leap Motion Labs - Rapid Prototyping with Vuo
http://labs.leapmotion.com/post/64816899700/rapid-prototyping-with-vuo?utm_source=Leap+Motion+Developer+Newsletter&utm_campaign=0e86281121-Dev_Newsletter_34&utm_medium=email&utm_term=0_d7eaf93515-0e86281121-60510065

I've not taken a very close look at this yet, but it is interesting to me from a software engineering perspective.