Sunday, July 8, 2012

RTK GPS for Outdoor Augmented Reality



What is an RTK GPS system?

Basically, GPS is inaccurate. I'm sure I've mentioned this before, but if I haven't: it is. Very, very inaccurate. It may not seem that way because of how we use GPS to locate ourselves and get directions, but that is often because averaging a position over a longer period of time leads to greater accuracy, and moving at a constant heading is more accurate than standing still. Portable devices may have an accuracy of a 5m-10m radius under the best circumstances, which means that while you are standing still the device thinks you are wandering around in various directions (a phenomenon known as "drift"). RTK GPS systems, however, use a signal called "correction data" from a base station at a known, surveyed point to locate roving units to within about 1 cm over several miles. This is the sort of accuracy required to do outdoor augmented reality. Instead of talking general theory I'm going to get down to the nitty-gritty technical nonsense one might have to endure to set up a system like this (if you are interested, I recommend http://www.tinmith.net/ or http://www.tinmith.net/wayne/thesis/piekarski-ch0-start.pdf for a little light reading on the most advanced self-contained units being used in university research). My hope is not so much that someone might try to replicate the system, but that the general technical tidbits may be otherwise useful.

During the summer of 2012, on top of my usual job running the augmented reality lab at York University, I've been working to help set up the Augmented Reality Research Lab in the Centre for Digital Humanities at Brock University in St. Catharines. In addition to the InterSense IS-900 hardware we have at York, John Bonnett, CRC at Brock, also has a couple of RTK GPS devices that he's been anxiously waiting to get into the field. Fortunately, most of the tedious stuff was taken care of for me. Thanks to the UART Augmented Reality Plugins for Unity from the AEL at Georgia Tech, I was able to unify the tracking technologies in the lab under one piece of software, just as we'd done at York. The only thing that hadn't been done yet in our lab was to get the GPS devices working. Another handy addition to the VRPN library from Alex Hill at Georgia Tech was the first step in getting one of these RTK GPS devices to report positional information to Unity.

 We are using the following configuration:

  • Head Tracking: InterSense InertiaCube3, connected via a Keyspan serial-to-USB adapter and powered over USB.
  • Camera: Point Grey Firefly2 over FireWire (to a PCI card with an external power source).
  • HMD: eMagin Z800 (modified with a dual-ratchet headband).
  • GPS: Magellan/Thales Z-Max RTK base/rover with a Magellan U-Link radio and an external GNSS-750 GPS receiving antenna.

VRPN
The first challenge was simply getting these GPS systems talking to the computer. After a little digging through the technical manuals, I realized that the line was dead because, by default, no NMEA data is being reported. This is where the $PASHS,NME commands come in. The Fischer plug on Port A of the unit connects, via serial cable (an extra component at $200), to a Keyspan adapter, making it USB compatible. Firing up Wincomm (9600, 8, N, 1) I could connect to the device. The manual is out there online, and it appeared that to get this talking like a normal GPS device I needed to input:

$PASHS,NME,RMC,A,ON,0.2
$PASHS,NME,GGA,A,ON,0.2
$PASHS,NME,GLL,A,ON,0.2

With positional information streaming from the rover, I at least had a sense this was possible to do. The most daunting hurdle was that for some reason VRPN wasn't talking to my GPS device. It was talking fine to my Bluetooth GPS device (sort of), but it took a clue from Russell Taylor (University of North Carolina) to spot that the RTS bit wasn't being set. With that added to the VRPN_Tracker_GPS code, the device finally started filling the queue (cbInQue). A couple more tweaks to the GPS code and VRPN was ready to use, with the GPS, in Unity.
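The fix inside VRPN itself is C++ (on Win32 that usually means raising the RTS line with EscapeCommFunction or the DCB's fRtsControl field), but since everything downstream here ends up in C#, here's a minimal, hypothetical C# sketch of the same two ingredients: open the port at 9600 8-N-1 with RTS asserted, then send the $PASHS commands to switch the NMEA messages on. The COM port name is a placeholder, and this is not the VRPN_Tracker_GPS code.

// Minimal sketch (not VRPN's code): open the Z-Max serial port at 9600 8-N-1
// with RTS asserted, enable the NMEA messages, and echo whatever the rover streams.
// "COM4" is a placeholder for whatever port the Keyspan adapter shows up as.
using System;
using System.IO.Ports;

class ZMaxNmeaSetup
{
    static void Main()
    {
        using (var port = new SerialPort("COM4", 9600, Parity.None, 8, StopBits.One))
        {
            port.RtsEnable = true;   // the bit that wasn't being set; without it the line stays dead
            port.DtrEnable = true;
            port.NewLine = "\r\n";   // NMEA sentences are CR/LF terminated
            port.ReadTimeout = 2000;
            port.Open();

            // Ask the receiver to start streaming NMEA on port A every 0.2 s.
            port.WriteLine("$PASHS,NME,RMC,A,ON,0.2");
            port.WriteLine("$PASHS,NME,GGA,A,ON,0.2");
            port.WriteLine("$PASHS,NME,GLL,A,ON,0.2");

            while (true)
            {
                try { Console.WriteLine(port.ReadLine()); }            // e.g. $GPGGA,...
                catch (TimeoutException) { Console.WriteLine("(no data yet)"); }
            }
        }
    }
}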

Unity
I found the examples in UART for using GPS systems in Unity quite good, but I was a bit concerned (needlessly?) about the conversion used to transform latitude and longitude into meters (units) for Unity's coordinate system. I couldn't pass up an opportunity to use Vincenty's formula, wrapped up with a bow in C#. More details on that to follow.
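Until that post materializes, here's the gist: a bare-bones C# sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, returning the geodesic distance in meters between two lat/long pairs. This is the standard textbook version, not necessarily the exact code that ends up in the Unity project.

// Sketch of Vincenty's inverse formula (WGS-84), distance in meters.
// Generic textbook implementation -- not necessarily the code used in the project.
using System;

public static class Vincenty
{
    const double a = 6378137.0;            // WGS-84 semi-major axis (m)
    const double f = 1.0 / 298.257223563;  // WGS-84 flattening
    const double b = a * (1.0 - f);        // semi-minor axis (m)

    public static double DistanceMeters(double lat1Deg, double lon1Deg,
                                         double lat2Deg, double lon2Deg)
    {
        double L  = (lon2Deg - lon1Deg) * Math.PI / 180.0;
        double U1 = Math.Atan((1 - f) * Math.Tan(lat1Deg * Math.PI / 180.0));
        double U2 = Math.Atan((1 - f) * Math.Tan(lat2Deg * Math.PI / 180.0));
        double sinU1 = Math.Sin(U1), cosU1 = Math.Cos(U1);
        double sinU2 = Math.Sin(U2), cosU2 = Math.Cos(U2);

        double lambda = L, lambdaPrev = 0;
        double sinSigma = 0, cosSigma = 0, sigma = 0, cos2Alpha = 0, cos2SigmaM = 0;
        int iterations = 0;
        do
        {
            double sinLambda = Math.Sin(lambda), cosLambda = Math.Cos(lambda);
            sinSigma = Math.Sqrt(Math.Pow(cosU2 * sinLambda, 2) +
                                 Math.Pow(cosU1 * sinU2 - sinU1 * cosU2 * cosLambda, 2));
            if (sinSigma == 0) return 0;   // coincident points
            cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLambda;
            sigma = Math.Atan2(sinSigma, cosSigma);
            double sinAlpha = cosU1 * cosU2 * sinLambda / sinSigma;
            cos2Alpha = 1 - sinAlpha * sinAlpha;
            cos2SigmaM = cos2Alpha != 0 ? cosSigma - 2 * sinU1 * sinU2 / cos2Alpha : 0;
            double C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha));
            lambdaPrev = lambda;
            lambda = L + (1 - C) * f * sinAlpha *
                     (sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM * cos2SigmaM)));
        } while (Math.Abs(lambda - lambdaPrev) > 1e-12 && ++iterations < 200);

        double u2 = cos2Alpha * (a * a - b * b) / (b * b);
        double A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)));
        double B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)));
        double deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 *
            (cosSigma * (-1 + 2 * cos2SigmaM * cos2SigmaM) -
             B / 6 * cos2SigmaM * (-3 + 4 * sinSigma * sinSigma) * (-3 + 4 * cos2SigmaM * cos2SigmaM)));

        return b * A * (sigma - deltaSigma);   // geodesic distance in meters
    }
}

To get Unity x/z offsets from a reference point, one approach is to call it twice: once varying only the longitude (east-west) and once varying only the latitude (north-south), then sign the results according to which side of the reference you are on.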

DragonFly 2 Cameras
The trickiest part was wiring a power solution for these cameras. Not being a big electronics person (remember my former post on soldering? I've gotten a *bit* better since then), I was caught in a seemingly endless online debate about whether or not an amperage rating matters. Rather than worry about heat issues on top of that (dropping the voltage, etc.), I thought these little 12VDC Ryobi packs would be perfect: http://cheesycam.com/cordless-tool-batteries-solid-dc-power-packs/ And indeed they have been. After finally tracking down the right size plug for the FireWire card (hint: try an external USB hub power supply), it was just a matter of adding a switch and a 12 Ohm resistor to the mix to keep the current through the circuit to about 1 Amp (by Ohm's law, 12 V across 12 Ohms works out to 1 A).

How does everything fit together? Where's the source code? More soon...

Tuesday, February 8, 2011

SnapDragonAR 1.0.8 Released

A lot of work went into this update, with lots of great new features, including the ability to green-screen images onto the markers to create transparent borders. Technologically it's not as advanced as creating a true alpha channel, but it's a lot less processor-intensive when running the large amounts of video we are running. The result is pretty cool. The only downside is that the images can't overlap; they create a blank space. Not the end of the world; after all, this software is intended to work with one movie per marker. It's only if you get totally carried away with scale and offset that it's at all noticeable.
Otherwise, it's becoming a seriously cool little app. It compresses your videos so you can instantly test different codecs for your project, offers a bunch of camera configurations, and works with any camera that works in QuickTime. You can even add content from http:// and rtsp:// streams. I've only tested it with static videos, but I would imagine you could stream a live webcam feed directly to a marker and then chroma-key it.
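For the curious, the green-screen trick is conceptually just a per-pixel test instead of a stored alpha channel. Something along these lines (a toy sketch of the idea, not SnapDragonAR's actual code, which does this per video frame):

// Toy sketch of the chroma-key idea: a pixel is treated as transparent when it is
// "green enough". Not SnapDragonAR's actual code -- just the concept, per pixel.
public struct Rgba { public byte R, G, B, A; }

public static class ChromaKey
{
    // threshold: how much greener than red and blue a pixel must be to be keyed out.
    public static Rgba Key(Rgba p, int threshold = 40)
    {
        bool isGreenScreen = p.G > p.R + threshold && p.G > p.B + threshold;
        if (isGreenScreen)
            p.A = 0;      // knocked out: the camera image shows through the border
        return p;         // everything else stays fully opaque
    }
}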

Tuesday, July 20, 2010

Installing opencv 2.0.0 through macports... Piece of cake or Pain and misery?

ARRGH! Okay, now that that's out of my system... I hope someone finds this post useful.

http://opencv.willowgarage.com/wiki/Mac_OS_X_OpenCV_Port

"If you encounter errors, try installing its dependents with +universal whenever possible."

Great... but what if that doesn't solve your problem?

Mac OS 10.5.8, Macports 1.9.1, OpenCV 2.0.0
Perhaps this is a naive approach; I've since learned about checking "port variants opencv", so perhaps "port variants lame" would have turned up a helpful clue... but this is what I did to solve my issues. Take it or leave it.

My first error:

:info:build xmm_quantize_sub.c:37:23: error: xmmintrin.h: No such file or directory

If this error looks familiar, it's because LAME (3.98.4) didn't build.

"If I run make immediately after configure, it fails. What I have to do
to make successfully is to first comment out the following line in
config.h:

#define HAVE_XMMINTRIN_H 1"

Taken From http://www.opensource-archive.org/archive/index.php/t-88245.html

I even tried setting the define to 0, but that didn't work (presumably the code checks whether HAVE_XMMINTRIN_H is defined at all, not its value), so commenting it out may be your only option. Let's try building OpenCV again...

AND... Ka-blamo... another error. This one's worse than the first. It seems to be with ORC (formerly liboil) 0.4.5.

It has no idea what the function get_cpuid is, and why should it? It's supposed to know better thanks to #ifdef __APPLE__, right?

Look at the bottom of orccpu-x86.c (in the macports work directory).
orc_mmx_get_cpu_flags(void)
{
//orc_cpu_detect_kernel_support ();

#ifdef USE_I386_CPUID
return orc_mmx_detect_cpuid ();

Oops. If I'm reading this right, then regardless of whether __APPLE__ is defined it still goes looking for the get_cpuid function, because we're on an Intel Mac. Naughty, naughty. Let's comment those out and cross our fingers.

orc_mmx_get_cpu_flags(void)
{
//orc_cpu_detect_kernel_support ();

#ifdef USE_I386_CPUID
//return orc_mmx_detect_cpuid ();

There's another one in the matching function orc_sse_get_cpu_flags(void) too. Make sure you get both and "orc" will compile.

This post likely explains why the error may not appear for some people (it looks like it depends on which assembly compiler you are using, but I'm guessing this is a bona fide bug):
http://sourceware.org/ml/crossgcc/2008-01/msg00017.html

Okay, so far so good, ffmpeg is the last to go... Totally expecting it to crash and burn.

Oh, okay, that worked. It's building and installing opencv now. Could I possibly be that lucky? Guess I'll have to build a project in Xcode and find out.

Thursday, July 8, 2010

Banff CAVE redux


I was fortunate enough to help restore the Banff CAVE to a functioning state. We used the Cosm library for Max/MSP to create our own series of cave.* objects for Max. Those can be downloaded in their rough form from www.futurecinema.ca/arlab. Several modifications have to be made to use them in your own CAVE-like environment: mainly, the windows have to be aligned to work with the projectors you are using, the dual windows used for stereo have to be enabled/disabled manually, and the cave.send and cave.jit.send patches have to be repurposed so they are hard-coded with the IPs of your destination machines (and new ports may have to be added depending on the number of machines you are using). We had two renderers and one server.
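For anyone trying to picture the topology rather than the patches: cave.send/cave.jit.send boil down to the server pushing the same state message to each render machine at a hard-coded address. A purely illustrative C# sketch of that fan-out (the real thing is Max/MSP; the IPs, port, and message format below are made up):

// Purely illustrative: one server, two renderers, each at a hard-coded IP/port.
using System.Net.Sockets;
using System.Text;

class CaveFanOut
{
    static readonly (string ip, int port)[] Renderers =
    {
        ("192.168.1.101", 7400),   // renderer 1 (placeholder address)
        ("192.168.1.102", 7400),   // renderer 2 (placeholder address)
    };

    static void Main()
    {
        using (var udp = new UdpClient())
        {
            byte[] msg = Encoding.ASCII.GetBytes("camera 0.0 1.6 0.0 0.0 0.0 0.0");
            foreach (var (ip, port) in Renderers)
                udp.Send(msg, msg.Length, ip, port);   // same message to every renderer
        }
    }
}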

Saturday, April 17, 2010

Gremlins are back - need proof?

Some strange things have been happening to the technology around me.
Of course, I blame gremlins...

Friday, April 2, 2010

"So close and yet..."

Hmmm... The good news or the bad news? Well here's both.

As you can see, the Ladybug2 can now talk directly to Max... but it's green and unhappy. There are two major problems: one I couldn't have foreseen, the other I'm an idiot for not considering.

First: we were lucky enough to have the Bayer filters included in libdc1394 v2 to try to decode the RGGB that the Ladybug is outputting, but as you can see it doesn't work as expected. Our theory is that the code expects the image to be upright, but our image straight out of memory is 90° from where it should be. So why don't we just rotate the image before processing? Well, as it is we are copying the matrix once, from memory, to the outlet. This image is huge: a whopping 4608x1024. That's 6 x 768x1024. Which of course brings me to the second problem. Under no circumstances have I ever been amazed with the frame rate of an incoming 1024x768 image in Max, let alone six of them simultaneously. So is this even a good idea? Perhaps there's just too much visual information to process for it to be useful at all. Not one to give up too easily, I may have another solution. If we output the raw image into a jit.slab and crunch it all on the GPU (which is what they do at Point Grey, if I'm not mistaken), then maybe we can get the performance up to something halfway decent. Definitely open to suggestions.
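For anyone who wants to poke at the Bayer problem themselves, the gist is that an RGGB mosaic only decodes correctly in the orientation it was captured in: rotate the buffer 90° and the decoder's assumption that pixel (0,0) is red (and every 2x2 cell after it) no longer holds, so the colors come out skewed. A toy C# sketch of nearest-neighbor RGGB decoding (nothing to do with libdc1394's actual implementation) makes that assumption explicit:

// Toy nearest-neighbor demosaic for an RGGB mosaic. Assumes even width/height.
// The decoder hard-codes where R, G and B live inside each 2x2 cell, which is
// exactly what breaks if the buffer arrives rotated 90 degrees.
public static class RggbDemosaic
{
    // raw: width*height bytes, one color sample per pixel, laid out as
    //   R G R G ...
    //   G B G B ...
    // returns packed RGB, 3 bytes per pixel.
    public static byte[] ToRgb(byte[] raw, int width, int height)
    {
        var rgb = new byte[width * height * 3];
        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                // Snap to the top-left corner of the 2x2 RGGB cell this pixel sits in.
                int ex = x & ~1, ey = y & ~1;
                byte r = raw[ey * width + ex];               // R at (even, even)
                byte g = raw[ey * width + (ex + 1)];         // G at (even, odd)
                byte b = raw[(ey + 1) * width + (ex + 1)];   // B at (odd, odd)
                int o = (y * width + x) * 3;
                rgb[o] = r; rgb[o + 1] = g; rgb[o + 2] = b;
            }
        }
        return rgb;
    }
}

That per-pixel lookup is also exactly the kind of thing that maps nicely onto a slab/shader, which is the GPU idea above.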