Sunday, April 10, 2016

Last Post on this Blog

It has been a long time since I've done anything with my Leap Motion device. I had originally thought this could be something I could pick up full time after leaving Texas, but my life did not turn out the way I had expected. At the time I got this started, I hadn't even realized I was probably already suffering from the effects of depression (see the related post on my personal blog: http://houganger.blogspot.sg/2016/03/early-thoughts-on-mindfulness.html). It hit my ability to focus hard enough that it just wasn't possible for me to start projects like these from scratch.

I may do something with the device again at some point in the future. In any event, my blogging workflow would naturally place content intended for this blog on my technical blog instead (see http://cwleehpc.blogspot.sg/), which makes it reasonable to simply close this blog with this final post.

What I'd like to do, however, is find a way to migrate the posts from this blog over to the technical one, and then simply delete this one. For now, this will suffice.

Sunday, August 17, 2014

Using Leap with the Free version of Unity 3d

When I started trying to play with Leap and Unity, I found that the asset package for Leap on Unity's online store required Unity Pro. When you follow the tutorial in Leap's Youtube video (embedded below) using free Unity and try to play with Leap's Hand Controllers, you get an error to the effect of "This plugin requires Unity Pro."


Now, Leap's website clearly states that a package for free Unity exists (which is what I downloaded). Unfortunately, there did not seem to be any asset bundle in that zip file, so there was no obvious way to import its contents into Unity. That left me rather confused, as I was new to Unity and still a relative Leap newbie.

After Googling for a solution, it appears the Leap devs had documented this but did not make it very accessible. The instructions are here:

https://developer.leapmotion.com/getting-started/unity/free

There are some minor sources of confusion in the instructions, however. They are written in the context of taking Leap's own Unity examples and modifying them to work with free Unity, whereas I was starting a Unity project from scratch using the Youtube tutorial above. Their from-scratch instructions also covered only the Windows folder structure; on the Mac, the files libLeap.dylib and libLeapCSharp.dylib are found in the LeapSDK/lib folder instead. That step was probably not strictly necessary, as the subsequent command-line instructions on the same page showed how to get those files from deeper in the Unity project's folder structure, which had not previously been documented.
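For anyone hitting the same wall, the manual step boils down to copying the two Leap dylibs into the Unity project's plugin folder. A rough sketch (MyUnityProject is a placeholder project name; the first two commands merely simulate the SDK and project layout so the sketch is self-contained, and would not be needed on a real setup):

```shell
# Simulate the folder layout described above (not needed in practice,
# where the Leap SDK and the Unity project already exist on disk).
mkdir -p LeapSDK/lib MyUnityProject/Assets/Plugins
touch LeapSDK/lib/libLeap.dylib LeapSDK/lib/libLeapCSharp.dylib

# The actual step: put the Leap dylibs where Unity can load them as plugins.
cp LeapSDK/lib/libLeap.dylib LeapSDK/lib/libLeapCSharp.dylib \
   MyUnityProject/Assets/Plugins/

ls MyUnityProject/Assets/Plugins
```

After this, the Leap scripts in the project can find the native libraries without needing the Pro-only plugin import path.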

In any event, I got my Leap device integrated with the free version of Unity rather quickly after that. Here is a screenshot of me playing with a bouncing ball, a bouncing cube, and a gravity-tied box I could pick up and throw inside Leap's prefab bounding box. You need to place the camera within the box, as well as the directional light source. The Youtube video above covers instructions for setting the properties of some of the objects.


Some initial experiences: the hand control seemed clunky, and the lack of stereo vision on the 2D screen meant I had some difficulty reaching out to the right depth. The fingers of the default hand avatars had a lot of difficulty gripping stuff, even though the small block was supposed to be made of "rubber". I'm not certain whether my Leap hardware is old or simply not as good as it should be. Of course, with regard to grip and all other aspects of the Unity interface, I am sure it comes down to getting the physics configured just right, or building a new hand model that works better than Leap's prefab hand. One thing the interface is pretty good at is bouncing a free-floating ball inside an enclosed box, especially if one makes a fist.

Saturday, August 16, 2014

install_name_tool on Mac OS X

Full notes here on my technical blog:
 
I wish I had known this earlier when working with Leap SDK 1.2. It also means I won't have to worry about getting development work on Leap SDK 2.1-beta done with CMake unless I really want to play around with both.
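For my own future reference, the gist of the trick is below. This is a hedged sketch: MyLeapApp and the dylib paths are placeholders that depend on your build, and the tools only exist on OS X, so the sketch guards for that:

```shell
# MyLeapApp is a hypothetical binary name; substitute your build output.
APP=MyLeapApp
if command -v install_name_tool >/dev/null 2>&1 && [ -f "$APP" ]; then
  # Show which dylib paths were baked into the binary at link time.
  otool -L "$APP"
  # Rewrite the recorded path of libLeap.dylib to where it actually lives.
  install_name_tool -change @loader_path/libLeap.dylib \
      /usr/local/lib/libLeap.dylib "$APP"
else
  echo "nothing to patch here (needs OS X and a built $APP)"
fi
```

The point is that the dylib search path is a property of the binary itself, so it can be patched after the fact instead of fighting the build system.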

A Quick Update

It has been a while since I've done anything with the Leap device I have. A whole host of personal issues kept me away. Mostly it stemmed from the frustration of having to deal with the Xcode development environment, which was not familiar to me, and which I was (and still am a little) hesitant to dive deeply into. There were also other frustrations with the Leap API - back then, you had to implement your own code for re-identifying pointing objects whenever they went out of the device's tracking range. Also, other than simply measuring my fingers, I really didn't have much of an idea of anything cool and fun I could work on. I still do not. Finally, I've been fighting a number of physical and emotional issues the past few months, including a debilitating skin eczema that still afflicts me now. So I am hardly focused on any one thing right now.

Meanwhile, the Leap SDK has grown significantly, and in pretty exciting ways. The language support has improved, and I'm particularly excited about the new ways they allow the Leap device to be used with JavaScript and the Unity 3D development engine.



So I'll continue to casually play around with the new SDK, and think about what I'd like to do with it. I had been wanting to mess around with Unity 3D for a while now, so this might be a good chance to do both.

Edit: I'd also like to add that the reason I felt interested in playing around with Leap development again was my discovery of Kitware's CMake build features, which could help me work around the idiosyncrasies of a development environment I am unfamiliar with (like Xcode's) and just focus on the code.

Wednesday, October 30, 2013

Own Project Update - Multiple Domain Measurements

My ideas about what I want to do with my own Leap device have been shaping up over the past month of getting distracted, so here's a quick rough sketch of what I have done so far and what I plan on pushing later.

I am trying to have my prior expertise with application performance measurement on supercomputers dovetail into the Leap. As such, I've been using the Leap API to measure various aspects of hand, finger, and tool motion that are tracked by the device. Of the various performance measurement tools I am used to, the SCORE-P project (http://www.vi-hps.org/projects/score-p/) is one that implements support for user-level modules that can feed information (generally, SCORE-P expects this to be performance-related data) to it. If I can cast Leap-related measurements and metrics in terms of performance, this gives me a very convenient mechanism to visualize and analyze how the two domains of information are correlated. So, on top of possibly developing analytical methods for gesture tracking information (e.g., is a gesture hard to achieve for individual X? How hard?), we can also measure how well the underlying library and application respond computationally to an attempt to achieve a gesture, performance-wise. From there, we can think about various visualization and analysis approaches that differ from traditional computing performance analysis, particularly when multiple domains of information come into play.

Toward this goal, my actual accomplishments have been minimal. I've successfully used Xcode (I had loads of initial difficulty getting into the spirit of it, as I had been used to standard Linux approaches to shared object libraries on the console) to construct my own Cinder (http://libcinder.org/) application that interfaces with the Leap library, and then used the API's listeners to start tracking useful information over time. I've been able to track various interesting metrics regarding my fingers (again, I am unsure if I am legally allowed to say what, since that kinda exposes part of the API) and generate a "performance" profile of an active session of motion tracking.

My next step is to attempt to integrate what I have so far with the SCORE-P instrumentation interface as a module. The subtle nuance here is that SCORE-P works on a data-pull model for measuring stuff. My plan is to have SCORE-P instrument key events in the application vis-a-vis gestures, as well as important events in the Leap API. That way, whenever SCORE-P asks me for motion information, it will always be in line with some key gesture event. At the same time, I get the bonus of acquiring the computational performance information from SCORE-P for free. When successful, I will acquire output in the form of either a CUBE-4 performance profile (http://www.scalasca.org/software/cube-4.x/download.html) or an Open Trace Format (OTF) trace file (http://www.tu-dresden.de/die_tu_dresden/zentrale_einrichtungen/zih/forschung/projekte/otf). There is a multitude of visualizers and analysis engines for performance data based on those two formats. With the inclusion of Leap gesture data, perhaps there will be a need to develop newer framework approaches for multi-domain visualization tools.

Interesting Projects Update

I really need fewer distractions! Heh, my focus and attention got dragged away to properly reviewing two research papers in HPC (ok, it was my fault I accepted the invitation to review journal articles, but hey, I consider it a duty as a member of the HPC scientific community - even if I no longer have any institutional affiliation!).

Anyway, as a quick (partial) update today, here are some of my picks of rather interesting projects out there in the Leap development community over the past few weeks along with some of my own comments:

1. Project PAALM: Phalangeal Angle Approximation through the Leap Motion Controller
http://projectpaalm.blogspot.sg/

This is really neat, and ties in to some of my own perception of the limitations of the device's API as I try to use it for my own work (I don't know how much I am allowed to say in public, so I will avoid saying anything).

2. LEAP Motion Head Attachment Experiment - YouTube
https://www.youtube.com/watch?v=aVopcoxizcY

Again, something I find interesting because as I use the device, I get the feeling it might be more effective facing the user's fingers rather than sitting underneath them. That feeling is driven by my perception (I have not studied this in any appreciable detail as yet) that the latter positioning will tend to suffer from more instances where tracked objects disappear and need to be re-tracked and re-identified as the prior objects.

3. Makeblock Robotic Arm with LeapMotion Control - YouTube
https://www.youtube.com/watch?v=-NqkYsIXkxw&feature=youtu.be

What interests me about this project is the time delay between an action from one's hand and the resulting reaction from the robot arm. In that demo, it is significant. When I project myself into performing those tasks, I find myself consciously trying to take those millisecond-level feedback delays into account. This ties in with some of what I want to do - performance measurements associated with the Leap device. There is an expectation of a natural delay built into our brains for experiencing visual feedback when we send a signal to the hands to perform a task. How much of a change (ignoring our need for tactile feedback) can our brains tolerate?

4. 3D Printer Finger Painting with Leap Motion - YouTube
https://www.youtube.com/watch?v=8wLmOkNtuRM&feature=youtu.be&noredirect=1

This is just cool and fun. I don't know if we ought to be painting 3D objects on the fly like that in a real application :)

5. BetterTouchTool
https://airspace.leapmotion.com/apps/bettertouchtool/osx?utm_source=Leap+Motion+Newsletter&utm_campaign=79a109cbe0-Consumer_Newsletter_16&utm_medium=email&utm_term=0_f0a6fbd89e-79a109cbe0-60510061

A free app from Airspace. Seems pretty cool. I'll install it and give it a try once I get over my paranoia after the craziness I experienced while finally getting Mavericks onto my other (non work-related) laptop. I'm too terrified of breaking this laptop, which is far more important to me.

6. TedCas & Leap Motion - YouTube
https://www.youtube.com/watch?v=6d_Kvl79v6E&feature=youtu.be

Seems like an excellent case for using the Leap device with technology in an operating theater environment. I am typically queasy about using the Leap itself to perform a remote operation via robotics (see my concerns about feedback delay above), but accessing documents and information in that environment through a no-touch system makes perfect sense.

7. Leap Motion Labs - Rapid Prototyping with Vuo
http://labs.leapmotion.com/post/64816899700/rapid-prototyping-with-vuo?utm_source=Leap+Motion+Developer+Newsletter&utm_campaign=0e86281121-Dev_Newsletter_34&utm_medium=email&utm_term=0_d7eaf93515-0e86281121-60510065

I've not taken a very close look at this yet, but it is interesting to me from a software engineering perspective.

Wednesday, September 18, 2013

Fortnightly Update

I've decided to set up a fortnightly update on this blog to establish some kind of a regular working regime on my effort. This will be reflected on my public calendar displayed on this blog. On each update, I'll document projects of interest to myself that have been listed by Leap. I will also document my own thoughts and progress on my own project(s).

This week, I've found an interesting project implementing a Leap gesture training framework. This has been noted on my "Leap Motion" tab.

Meanwhile, nothing major has been achieved on my own efforts to develop on the Leap device. In the following notes, I'll try to avoid exposing any parts of the Leap API I am legally not allowed to publicize. I have been approaching this from a development environment I have been used to - a commandline based Linux-like C/C++ environment. Currently, my only viable development environment has been on my MacBook Pro laptop. This has resulted in a number of difficulties:

1. Leap's C++ API library for the Mac uses Apple's dylib framework. I've had difficulty linking against it using the standard g++ (or clang++) tools on the command line the way I usually do with Emacs + Make + standard .so library files. I decided to switch to learning Apple's Xcode IDE environment instead.
2. Apple's Xcode has its own library linking framework and runtime behavior that is unfamiliar to me. So, I'm still struggling with some of that. Some of the interactions with the Leap library have also resulted in C++ parsing and linking errors I am unfamiliar with. So, I'll need to be spending more time polishing my C++-fu in this environment.
3. It is unclear to me if the Leap development team intended for the library to be used by command-line tools and applications. My goal right now is simply to track device events and have them printed to screen or written out to a trace file. This does not require a GUI. Yet, I have been unable to make the library's Listeners work correctly - no events get delivered, and the appropriate handlers are never invoked. I've had no problems so far with using an explicit main event loop that polls the device at some fixed timing interval, however. That is not my preferred implementation.

So, my results have been mixed. I am able to use an explicit loop to probe Leap data at a frame rate varying from a low of about 20 frames per second to a high (and typical) rate of about 80 frames per second. That's pretty decent. I have not yet started to acquire detailed information on the items tracked - that will be something for the next two weeks. Meanwhile, I would still rather do this in an application model that (I think) is more natural to the Leap framework.

Finally, I have not really decided what my own project is going to be as yet. I suspect this will come about organically. My own ideas tend toward tool frameworks - similar to the stuff I worked on during my PhD and my subsequent academic work. In particular, I would like to construct a tool to track (and probably visualize) the various performance-related characteristics or metrics of finger-hand-tool motion and Leap/Leap application behavior. For example, given some gesture detection tool, how often does the Leap device correctly report a gesture? What are the delays? How frequently does the device lose track of the things it was tracking? It is my belief that a tool like this would be of practical use to application developers, as well as to the Leap development team.

Edit: Thanks to the Leap development team's use of Cinder in their examples, I've decided to take the effort to try it out. This should be fun.