[Work Log] Dis Prop (ctd); new HPC

January 15, 2014

Continued writing the dissertation proposal. Worked on the SfM and MVS literature review. Spent quite some time trying to understand self-calibration. The issue is that with a general perspective camera, SfM can only perform reconstruction up to a projective ambiguity (most of the recent SfM literature is apparently quiet about this?). It turns out that if we fix at least one intrinsic parameter (how about skew = 0? or aspect ratio = 1?) then we can get a metric reconstruction (which maybe explains why it's no longer covered in SfM papers).
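To convince myself about the projective ambiguity, here's a tiny NumPy sketch (all names and the random setup are mine, not from any paper): any invertible 4x4 matrix H applied to the points, with its inverse folded into the camera, leaves every image measurement unchanged, so image data alone can't distinguish the two reconstructions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: one perspective camera P (3x4) and five 3D points X
# in homogeneous coordinates (4x5).
P = rng.standard_normal((3, 4))
X = np.vstack([rng.standard_normal((3, 5)), np.ones((1, 5))])

x = P @ X                    # homogeneous image points

# Any invertible 4x4 H gives an equally valid "reconstruction":
H = rng.standard_normal((4, 4))
P2 = P @ np.linalg.inv(H)    # transformed camera
X2 = H @ X                   # transformed points

x2 = P2 @ X2                 # identical image points (up to round-off)
print(np.allclose(x, x2))    # True
```

Self-calibration constraints (zero skew, unit aspect ratio, etc.) restrict H to a similarity transform, which is what "metric" buys you.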

This shows that bundle adjustment can give a metric reconstruction, but the question remains how to initialize it. Pollefeys et al. (1998) propose an analytical self-calibration solution. Snavely et al. (2007) apparently just use bundle adjustment anyway, presumably with a simplified pinhole camera (no skew or principal-point offset) and a reasonable default for f, or EXIF tags when possible. Brown and Lowe (2005) initialize each new camera with the previous camera's intrinsics; no word on how the first pair is initialized (maybe obvious but I missed it?).
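For the EXIF route, the conversion from a focal length in millimeters to pixel units is just the standard pinhole scaling by sensor size. A quick sketch (my own helper, not code from any of the papers above):

```python
def focal_mm_to_pixels(focal_mm, sensor_width_mm, image_width_px):
    """Convert an EXIF focal length (mm) to pixel units, given the
    physical sensor width and the image width in pixels."""
    return focal_mm * image_width_px / sensor_width_mm

# e.g. a 50 mm lens on a 36 mm-wide full-frame sensor, 3600 px wide image:
f_px = focal_mm_to_pixels(50.0, 36.0, 3600)
print(f_px)  # 5000.0
```

The catch in practice is that EXIF rarely records the sensor width, so it usually has to be looked up from the camera model.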


UA just installed their new 80-node HPC cluster and by a stroke of luck, I've gotten early access to it! It looks like all machines have 16 CPU cores and 256 GB RAM; 60 nodes have multiple GPU cards while the other 20 have Intel Phi general-purpose compute cards. It's amazing to query the cluster and see the entire thing at 0% utilization... that won't last long! Having fun poking around it and reading the IBM LSF manuals (LSF is the job queueing system).

I found the following resource to be very useful, even though it's from University of Miami's personal LSF installation:

The new official documents I could find were less useful:


Okay great! Got a general idea of how to use the system from the user guide.

On gpu* machines

On phi* machines

Also MPI is supported
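For my own reference, here's a skeleton LSF job script. The queue name and program path are placeholders I made up, not the cluster's actual configuration:

```shell
#!/bin/bash
# Hypothetical LSF job script -- queue name and paths are guesses.
#BSUB -J test_job        # job name
#BSUB -n 16              # request 16 cores
#BSUB -o out.%J          # stdout file (%J = job id)
#BSUB -e err.%J          # stderr file
#BSUB -q normal          # queue name (assumption)

./my_program             # placeholder executable
```

Submit with `bsub < job.sh`, monitor with `bjobs`, kill with `bkill <jobid>`. I believe MPI jobs go through the same mechanism (request N slots with `-n` and launch via LSF's MPI wrapper inside the script), but I haven't tried that yet.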

Discovered two large gsfs disks (192 TB and 43 TB, respectively). Apparently GSFS is a GPU-enabled encrypted disk. No write permissions ATM.


Some random thoughts while using the system

Posted by Kyle Simek