
The Linux Chronicles – Off the Reservation


… and paying for it.

I’m just now coming up for air after about 4 days of round-the-clock hacking, trying to get Hugin, an HDR (high dynamic range) and panorama stitching program, to run.  That includes about 4 hours of sleep.  I have a nasty habit of assaulting computer problems full force and working them until they’re fixed.  If that takes two days, so be it.  That tendency is why I quit writing software professionally.  Well, one of the reasons.

Before I get into the nitty-gritty, let’s step back a moment and look at how most of us are used to getting new software.  In Windows, the procedure is more or less this:

1. Download the program’s setup package (a setup.exe or installer).
2. Run it and click through a few screens.
3. The program is installed and ready to run.

In the process of installation, the setup program might overwrite system files that are needed by other programs and thus make your system unreliable, but that’s another topic and issue.  For the most part the above workflow works.  Everything that the program needs is included in the package.  Simple and user-friendly.  Plus, by the time the programmer gets to the point of constructing an installation program, he usually has most of the big bugs worked out, so the program almost always works.  It may have some quirks but seldom if ever do you get something that just won’t work.

The *nix world traditionally has been different.  In that world, programs are distributed as source code and every user has to compile the code using the C compiler that comes with *nix.  That is beyond tricky.  The programmer may have relied on libraries and other resources that did not come with the *nix distribution and that he had installed previously.  Say, a library to work with TIFF or JPEG graphic files.  The only way you, the user, know what is required is by trying to compile and having it fail.

Even worse is the Tower of Babel collection of slightly different *nixes: Berkeley BSD, old SysV, all the Linuxes, Apple’s version of BSD, and so on.  If the programmer wrote the program on a BSD system and you’re running Linux, the program will neither compile nor run unless the programmer made provisions for your system.  In short, installing a new program was a major project, sometimes taking days to accomplish.

The companies that sprang up to support Linux – Red Hat and others – decided to fix this situation.  They created installers called RPM and DEB packages.  Unfortunately the two formats are not compatible, so one won’t work in place of the other.  A package is like a Windows Setup program.  It contains the binary (ready-to-run) program plus any support libraries and other files.  The user downloads the package, runs the package manager against it, and the program installs.  Neat and clean, most of the time.
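For example, installing a single downloaded package from the command line looks like this (the file names here are illustrative, not actual Hugin packages):

```shell
# Debian/Ubuntu systems use dpkg for a downloaded .deb:
sudo dpkg -i hugin_0.7.0_i386.deb

# Red Hat-family systems use rpm for a downloaded .rpm:
sudo rpm -i hugin-0.7.0.i386.rpm
```

Either way, the package carries the compiled binaries plus a list of its dependencies, so the package manager can refuse to install until everything the program needs is present.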

There still remained the problem of unreliable software.  *nix programmers have a nasty habit of taking a snapshot of their unfinished work, slapping a 0.xx revision number on it and putting it out there.  Not as a beta but as the supposedly usable program.  John’s software rule #1 – Stay away from Rev 0.xx software.

This situation was not about to displace Microsoft from the desktop, something all *nix people badly want to do, so the guys behind Ubuntu decided to improve things.  One, they adopted a regular release schedule.  Two, they tested and made reliable the software that is stored in their repository.  Three, they created the repository itself so that users could go to one place and get reliable software.  Four, they incorporated a GUI package manager called Synaptic that makes installing new software pretty much a point’n’click operation.

This system works superbly but it has two major problems.  One, since by philosophy nothing but security fixes are released between major releases, the programs in the repository are by nature one or more years old.  Two, users are dependent on what the repository maintainers can test and include.

What that means is that a lot of useful software is not available through the easy and reliable repository channel.  One has the option of going outside the repository – going off the reservation, as it were – but when he does, he is sailing in uncharted waters.  It’s like dipping your toe in the Bermuda Triangle.  You’ll likely get swept away.

OK, so back to HDR photography.  This is a technique that seeks to capture high dynamic range lighting situations that could not otherwise be photographed.  Say, a shaded lake against a bright sky.  The technique is (or should be) simple.  Set up your camera on manual, preferably on a tripod.  Set the exposure to get a good shot of the foreground.  The bright background will be blown out to white but that’s OK.  Next, without changing the f/stop, change the shutter speed so that the bright background is properly exposed.  One can take several shots, bracketing the exposure to ensure getting the best of each.
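Since each stop (1 EV) of exposure doubles or halves the shutter time at a fixed f/stop, the bracket arithmetic is simple.  Here is a small sketch of it; the function name and the ±2-stop bracket are my own illustration, not anything from Hugin or Enfuse:

```python
# Hypothetical helper: compute bracketed shutter speeds from a base
# exposure, keeping the f/stop fixed.  One stop (1 EV) doubles or
# halves the shutter time.

def bracket_shutter_speeds(base_seconds, stops):
    """Return shutter times (in seconds) for the given EV offsets."""
    return [base_seconds * (2 ** ev) for ev in stops]

# Base exposure of 1/60 s, bracketing two stops under and two over:
times = bracket_shutter_speeds(1 / 60, [-2, 0, 2])
# -> 1/240 s (dark frame), 1/60 s (normal), 1/15 s (bright frame)
```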

Back at the computer, two or more of the best shots from each exposure class are submitted to the HDR program.  Through PFM (pure f…..g magic :-), they are combined to make one photo where everything is properly exposed.  The algorithms involved are extremely complex and some work better than others.
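To give a feel for the magic, here is a toy single-channel version of the kind of weighting such programs use: pixels near mid-gray are trusted, blown-out or underexposed pixels are discounted, and the frames are averaged with those weights.  This is my own minimal sketch of the well-exposedness idea, not the actual Enfuse algorithm, which also weights by contrast and saturation and blends across scales:

```python
import math

def well_exposedness(v, sigma=0.2):
    """Weight a pixel value in [0, 1] by its closeness to mid-gray (0.5)."""
    return math.exp(-((v - 0.5) ** 2) / (2 * sigma ** 2))

def fuse(exposures):
    """Blend aligned single-channel images (lists of pixel values in [0, 1])
    with a per-pixel average weighted toward the well-exposed values."""
    fused = []
    for pixels in zip(*exposures):    # same pixel from each exposure
        weights = [well_exposedness(p) for p in pixels]
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / sum(weights))
    return fused

# The same two pixels shot dark, normal, and blown out:
dark, normal, bright = [0.05, 0.10], [0.40, 0.50], [0.95, 1.00]
result = fuse([dark, normal, bright])   # lands close to the normal frame
```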

This is one place where the Linux world is far ahead of the PC world.  The HDR plugin for Photoshop, for example, tends to flatten the photo and make it look cartoonish.  Not terribly useful.

A couple of days ago, a friend of mine (Hi Norman) turned me on to this article and a program combo to do it.  The combo is Enfuse, a command line HDR program and Hugin, a GUI wrapper for it.  HDR was on my round tuit list so I was excited. (I should note that with most of the examples in that article and many HDR situations in general, a more satisfactory and much simpler solution is proper lighting.  Fill flash goes a long way.)

I fired up Synaptic and checked the repository.  Nothing useful.  So I went to the Hugin website.  No DEB package and no binaries.  Oh sh*t.  I decided to “step off the reservation” and install something outside the package management system (Synaptic).  I have done it before (and did it all the time back in the SysV Unix days), so how bad could it be?  Nearly 4 days of round-the-clock hacking is how bad.

I downloaded the tarball (like a Zip in the *nix world), unpacked it and started to compile.  It wouldn’t.  It needed libraries that I didn’t have.  So I fetched libraries.  It needed more.  Fetch more libraries.  Rinse and repeat.
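For reference, the classic source-install dance looks like this (the version number is illustrative, and some projects, Hugin among them, use cmake in place of configure):

```shell
tar xzf hugin-0.7.0.tar.gz    # unpack the tarball
cd hugin-0.7.0
./configure                   # probe for compilers and libraries
make                          # compile
sudo make install             # install into /usr/local
```

Each missing library only shows up when configure (or worse, make) trips over it, which is exactly that rinse-and-repeat loop.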

Somewhere in that process I found this page.  I won’t bore you with the details but I did everything on that page!  I ended up with a program that would start and run but would not do anything.  It was badly broken.

The usual practice in software development is to periodically take a known working version, package it up in a tarball or zip, and put it up somewhere for download.  This guy is lazy.  He has you just grab whatever he’s working on at the time.  That’s what the part about a quarter of the way down the page is about, where he talks about fetching from CVS, his source code control system.
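The difference matters.  The page’s instructions check out whatever happens to be at the head of the tree, but CVS can just as easily check out a fixed, dated snapshot, which is at least reproducible (the date below is only an example):

```shell
# What the page has you do: grab the current head of the tree.
cvs -z3 -d:pserver:[email protected]:/cvsroot/enblend co -P enblend

# A dated snapshot instead: same command plus -D.
cvs -z3 -d:pserver:[email protected]:/cvsroot/enblend co -D "2009-01-15" -P enblend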

Unfortunately I got a snapshot or “build” that was broken.  Norman was helping me with it because he has the program working on his system.  We looked up and it was 6 AM and the sun was rising.  He mentioned in passing that he got his version here.

He’s running Gentoo Linux, which is different from Ubuntu, so I didn’t pay much heed.  Half a day later, just before I was about to give up, I decided to download his version and try it.  It compiled and installed cleanly!  Progress.  Unfortunately it wouldn’t work.

That’s when Norman sent me his workflow.  That is, the step-by-step procedure that he uses to do the job.  I went with his workflow, which was different from what was intuitive, and it worked!  That is, the part of the program that his workflow uses works while other parts are broken.  *sigh*  Unfortunately a lot of Linux software is that way.  Hats off to the guys who take this crap, test and massage it to make it reliable, and put it in the repositories.

This is getting kinda long so I’ll stop now.  You should too unless you like reading about train wrecks.  I’m including my Linux Log below so that those with morbid curiosity can see what I went through.  Now to go out and take some HDR photos!



01/30/09 Friday

Started installing enblend and enfuse, two programs for doing panoramas and HDR blends.  Recommended by Norman.

Enblend was in the repository but enfuse was not for some reason.  Downloaded the sources from sourceforge.  Sources are in /usr/local/src/enblend-enfuse-3.2.

When I ran configure, it said that it needed libtiff.  Fetched that and stored it in /usr/local/src/tiff-3.8.2.  Made and installed that.  Now configure says it needs liblcms.  Synaptic says that it is installed but apparently configure can’t find it.  I’m downloading the liblcms-dev package now.  If that doesn’t do it I’ll make from the sources.

That worked.  Now it wants libxmi.  Getting that from the GNU download site.  Will make from sources.

OK, have all the dependencies.  Now make breaks on a malloc declaration error.  It just ain’t working.  Went to the net again and found a different package.
Same problem.  This program appears to be made for Ubuntu Rel 8.10.

Now it wants libjpeg-devel headers.  Those appear to be in the repository.  Downloading.

Now it wants libpng-devel.  In the repository.  *sigh*


01/31/09 Saturday

Try two.  At this site I found the following instructions.

First, get a lot of stuff:

sudo apt-get install pkg-config libtiff4-dev libboost-graph-dev libboost-thread-dev   liblcms1-dev libglew-dev libplot-dev libglut3-dev libopenexr-dev libxi-dev libxmu-dev    (all one line)

For my 8.04 system, also get:

sudo apt-get install libopenexr2ldbl

Then get the sources from

cvs -d:pserver:[email protected]:/cvsroot/enblend login
cvs -z3 -d:pserver:[email protected]:/cvsroot/enblend co -P enblend
cd enblend

(I may not have to do this)

to make

make -f Makefile.cvs

Currently waiting for the first part to finish.

02:15.  Download finished.  Going to try a make with the sources I have before I hit the CVS.

IT WORKED!  Longest compile of my life at almost an hour.  Wow.

Onward to hugin

1. make some more libraries

sudo apt-get install cmake libopenexr-dev libboost-dev boost-build libboost-thread-dev libboost-graph-dev gettext libwxgtk2.8-dev libexiv2-dev libimage-exiftool-perl libglew-dev

2.  Get the sources

svn co hugin
cd hugin

Had to download subversion.

3. Make it

cmake -DCMAKE_INSTALL_PREFIX=/usr/local .

This failed because it needed libpano13 or libpano12.  In the repository, so gotten using the package manager.  Cmake worked fine after that.  The repository installed libpano12.

4. Compile

sudo make install
sudo ldconfig


Compile failed on syntax errors (!) so I went back and got his example stable version using this command

svn co -r 2906 hugin

Some Perl stuff that is needed.

sudo apt-get install libimage-size-perl
sudo cpan Panotools::Script

06:30  that completed successfully.

06:45 make finished.  hugin would not run.  Failed with a shared library missing error.  Went back to the top of that wiki page and started from the beginning.

sudo apt-get install build-essential autoconf automake1.9 libtool flex bison gdb

sudo apt-get install libc6-dev libgcc1

Must make libpano13.  Apparently skipped that part before.  The older version in the repository didn’t work.

sudo apt-get install zlib1g zlib1g-dev libpng12-dev libjpeg62-dev libtiff4-dev

sudo svn co libpano13
cd libpano
sudo make
sudo make install
sudo ldconfig

cd ../hugin
sudo cmake -DCMAKE_INSTALL_PREFIX=/usr/local .
sudo make
sudo make install
sudo ldconfig

IT WORKED!!!  Hugin comes up

17:30.  It didn’t work.  Missing underlying applications.

Making autopano-sift-C

sudo apt-get install libxml2-dev
sudo svn co autopano-sift-C
cd autopano-sift-C
sudo cmake -DCMAKE_INSTALL_PREFIX=/usr/local .
sudo make
sudo make install

Updating Perl Math::Matrix

sudo cpan
install Math::Matrix Image::Size Storable


sudo apt-get install libimage-size-perl

this does all the above perl stuff
sudo cpan Panotools::Script

installing Pan-0-Matic

sudo apt-get install libboost-dev


Attempting to install CinePaint, another HDR program.  Using this script from the Cinepaint website.

echo – install from source on ubuntu
echo License BSD 10.18.2008 [email protected]

sudo apt-get install gcc automake g++ libfltk1.1-dev libgtk2.0-dev zlib1g-dev libjpeg62-dev libpng12-dev libtiff4-dev libopenexr-dev libxpm-dev libgutenprint-dev libgutenprintui2-dev liblcms1-dev pkg-config ftgl-dev libxmu-dev libxxf86vm-dev flex python-dev libtool
sudo make install
export LD_LIBRARY_PATH=/usr/lib/local

Program continues to crash.  I reset defaults on the preferences->autopano to use Panomatic after fetching it.  That made the auto align work but broke the end result.  Only one photo file is getting passed to enfuse.  About to give up.

Let’s try something else.

Per the article here:

I’m downloading the sources for CinePaint that has an HDR plugin.  This may be another adventure but at least the author supplies a make script.  Downloading from here:

While waiting on the download, I cleaned up all the mess from the last day.  Compressed and tarred all the sources, etc.


Download of the CinePaint finally finished.  Sourceforge knocked down the download about every 3-4 minutes.  Strange.

CinePaint would not compile, so that was the end of that experiment.


Finished downloading and installing the version that Norman uses.  It compiled cleanly and installed over the top of that other mess, preserving the desktop registration.

Once Norman sent me his workflow it worked OK.  It’s pissy about what you feed it but once it likes the input, it works.  *sigh*  A day and a half wasted on that mess.  Here is his workflow.

start hugin

go to the Images tab, load images, then click on “Create control points”

go to the Optimizer tab, click on “Optimize now”

maybe go to the Control points tab, and fix up control points

go to the Stitcher tab, and:

– change the idiotic default “Equirectangular” to the normal “Rectilinear”

– click on “Calculate Field of View”

– click on “Calculate Optimal Size”

– uncheck “Blended panorama”, and check “Blended
panorama (enfuse)”

– click on “Stitch now”.

Posted by neonjohn on February 1st, 2009 under Computing
