Sietch Nevada

This is amazing. It’s the Sietch Nevada project from an exhibit (Out of Water | innovative technologies in arid climates) at the University of Toronto earlier this year.

View of the urban life among the water bank canals

Sietch Nevada projects waterbanking as the fundamental factor in future urban infrastructure in the American Southwest. Sietch Nevada is an urban prototype that makes the storage, use, and collection of water essential to the form and performance of urban life… A network of storage canals is covered with undulating residential and commercial structures. These canals connect the city with vast aquifers deep underground and provide transportation as well as agricultural irrigation. The caverns brim with dense, urban life: an underground Venice.


Magnetic levitation of large water droplets… and mice!

From PhysOrg.com article,

Scientists have managed to levitate young mice in research carried out for NASA. Levitated mice may help research on bone density loss during long exposures to low gravity, such as in space travel and missions to other planets.

How it works:

The scientists built a variable gravity simulator consisting of a superconducting magnet that could generate a magnetic field strong enough to levitate the water inside every cell in the mouse’s body. Water is weakly diamagnetic, which means that in the presence of a strong magnetic field the electrons in water rearrange their orbits slightly, creating tiny currents in opposition to the external magnetic field. If the external magnet is strong enough, the diamagnetic repulsion of the water in the mouse tissue is enough to exactly balance the force of gravity on the body.
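As a sanity check on the quoted explanation (my own back-of-envelope calculation, not from the article): levitation requires the upward diamagnetic force per unit volume, (|χ|/μ0)·B·(dB/dz), to balance the weight per unit volume, ρg. Plugging in the standard susceptibility and density of water gives the field-gradient product the magnet has to supply:

```python
import math

MU0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
CHI_WATER = -9.05e-6       # volume magnetic susceptibility of water (SI, dimensionless)
RHO_WATER = 1000.0         # density of water, kg/m^3
G = 9.81                   # gravitational acceleration, m/s^2

def required_field_gradient_product():
    """B * dB/dz (in T^2/m) needed to levitate water: the magnetic force
    per unit volume, (|chi| / mu0) * B * dB/dz, must equal rho * g."""
    return MU0 * RHO_WATER * G / abs(CHI_WATER)

print(f"{required_field_gradient_product():.0f} T^2/m")  # ~1360 T^2/m
```

That works out to roughly 1360 T²/m, which is why a superconducting magnet is required; an ordinary lab electromagnet doesn’t come close.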


Counting the number of files/folders in a directory

I was looking for a fast way to get the number of folders in a directory. I didn’t need anything else, just the number of folders – this was for a folder tree, and to determine whether or not to display an expansion arrow that would pull and list the subfolders when clicked. The brute-force (read: slow) way to accomplish this is just to get an array of folders and get its length. I thought such information would exist within the directory metadata, but it turns out it doesn’t, and the brute-force way is the only way.
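Worth noting for the expansion-arrow case specifically: you don’t actually need the count, only whether at least one subfolder exists, so the enumeration can bail out at the first directory it finds. A sketch of the idea in Python (the original context was .NET/Win32, so treat these names as illustrative):

```python
import os

def has_subfolder(path):
    """Return True as soon as one subdirectory is found, without
    enumerating the whole directory -- analogous to breaking out of a
    FindFirstFile/FindNextFile loop on the first directory hit."""
    with os.scandir(path) as entries:
        return any(entry.is_dir() for entry in entries)

def count_subfolders(path):
    """The brute-force count, for comparison: enumerate everything."""
    with os.scandir(path) as entries:
        return sum(1 for entry in entries if entry.is_dir())
```

For a directory with hundreds of thousands of entries, the early exit can be dramatically faster when a subfolder appears early in the enumeration order, and no worse when it doesn’t.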

folder tree

Quicker (quickest?) way to get number of files in a directory with over 200,000 files

How do I find out how many files are in a directory?

The Stack Overflow questions refer to files, but files and folders are enumerated the same way by Windows (FindFirstFile, FindNextFile), so the same limitation applies.

This post on The Old New Thing explains the reason for the lack of such metadata.

The file system doesn’t care how many files there are in the directory. It also doesn’t care how many bytes of disk space are consumed by the files in the directory (and its subdirectories). Since it doesn’t care, it doesn’t bother maintaining that information, and consequently it avoids all the annoying problems that come with attempting to maintain the information.

Another issue most people ignored was security. … If a user does not have permission to see the files in a particular directory, you’d better not include the sizes of those files in the “recursive directory size” value when that user goes asking for it. That would be an information disclosure security vulnerability.

Yet another cost many people failed to take into account is just the amount of disk I/O, particularly writes, that would be required. Generating additional write I/O is a bad idea in general…

Completely understandable… but it still sucks on the application developer’s end. One solution I toyed around with is using the folder info to build a cache, instead of just getting the count and tossing the array to the garbage collector. This means that every time a folder is expanded, the subfolders within the subfolders of the directory being expanded are pulled and the necessary UI widgets are created. However, this isn’t always ideal, especially as you go deeper into a folder tree, as you can end up wasting time pulling and creating UI widgets for a lot of folders which will never be seen by the user.
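The caching idea reads something like this in sketch form (Python rather than C#; the class and method names are my own, purely illustrative): keep the listing from the first expansion instead of discarding it, and invalidate when the directory is known to have changed.

```python
import os

class FolderTreeCache:
    """Sketch of caching subfolder listings for a folder tree: the first
    expansion pays the enumeration cost, later expansions of the same
    folder are free. Stale until explicitly invalidated."""

    def __init__(self):
        self._cache = {}

    def subfolders(self, path):
        # Enumerate only on a cache miss; keep the result.
        if path not in self._cache:
            with os.scandir(path) as entries:
                self._cache[path] = sorted(
                    e.path for e in entries if e.is_dir())
        return self._cache[path]

    def invalidate(self, path):
        # Call when the directory's contents are known to have changed.
        self._cache.pop(path, None)
```

The staleness is the real cost of this approach: without file-system change notifications, the cache and the disk can silently disagree.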


Eye candy is a critical business requirement

Interesting presentation:

eye candy is a critical business requirement


Mono Winforms on OS X

Despite some issues, my experience porting .NET C#/Winforms 2.0 code to a Linux system running Mono was surprisingly pleasant. Unfortunately, the same cannot be said for porting to OS X. Put simply, Mono Winforms on OS X sucks… it sucks bad… to the point where I question why it was even released.

fragmentsync mono winforms issues

The biggest and most apparent issue seems to be visibility and z-order. The image above is a form containing a few panels; however, only the panel containing the green header and the “sync with” label should be displayed. The others should be invisible (Control.Visible = false). Furthermore, making one control visible and another invisible, or bringing a control to the front (Control.BringToFront()) such that it obscures others, is a crapshoot; sometimes it’ll work correctly, other times you’ll see the correct control briefly before it disappears and you’re staring at a control that should be invisible or obscured.

Performance is atrocious. Winforms itself, even under .NET on Windows, is not a terribly high-performance UI toolkit, but it’s unbearable under OS X, to the point where you can see very prominent visual delays as events are processed and/or controls are rendered.

Stability is awful. It’s not uncommon to reach a point where the application’s UI simply freezes up and no longer responds to events. Worse yet, disposal of controls doesn’t seem to occur properly. I’ve noticed this with dynamically created controls on a few occasions: clicking a control resulted in a crash because the control had already been disposed, but the control was still, incorrectly, being rendered in the UI and responding to events (hence the crash).

All of these problems seem to point to an issue with Mono’s per-windowing-system driver for OS X. The Mono Project Roadmap indicates an upcoming Winforms update for OS X this September with the Mono 2.6 release; hopefully it’ll address these (and I’m sure other) critical issues, so that Winforms can become truly cross-platform. As of right now, if you’re looking to use Mono Winforms as a solution for porting to OS X, you’ll likely only be able to pull off a port of the simplest of applications.

On a side note, Cocoa# looks like a pretty interesting project, but the website hasn’t been updated in months, so it’s possible the project is dead.


Extracting transparency

From time to time, I’ve run across the problem of trying to get rid of the white background of an image whose design is mostly the same shade throughout. The hard part is not getting rid of the white background, but getting all the “transition pixels” (i.e. those that allow the design’s edges to gradually fade into the background) to have a reasonably accurate alpha (i.e. transparency) value, so that the design can then be blended nicely atop an arbitrary background without the very common halo effect. This is shown below, trying to remove a white background with Photoshop’s magic wand.

halo problem with magic wand

There are ways to mitigate the issue shown above in Photoshop (see post), but none that are truly simple to the point where they can be done with a click of the mouse. This isn’t really a problem with Photoshop; after all, PS is made to be a general solution and what I’m presenting here is a very specific case.

Anyways, I finally stumbled across the idea of using a user-defined background color and foreground color, and using some bit of magic to interpolate between the values to generate a valid alpha channel. My first instinct was to compute the saturation value of a pixel and use it to find an alpha value for the pixel. However, after a bit of investigation, I realized this wasn’t the correct approach. From Wikipedia’s article on the HSL and HSV color spaces,

There are two factors that determine the “strength” of the saturation. The first is the distance between the different RGB values. The closer they are together, the more the RGB values neutralize each other, and the less the hue is emphasized, thus lowering the saturation. (R = G = B means no saturation at all.) The second factor is the distance the RGB values are from the midpoint. This is because the closer they are to 0, the darker they are until they are totally black (and thus no saturation); and, the closer they get to MAX value, the brighter they are until they are totally white (and once again no saturation).

Note that grayscale pixels (R = G = B) are considered totally unsaturated, and this could easily lead to problematic cases – most obviously a black design on a white background.

Moving on, I realized that lightness was a better indicator, and it was surprisingly easy to calculate: l = 0.5(max + min). I decided to use a fixed background color of white, just to make things easier, so based upon the lightness of the background (1.0) and the foreground color (supplied by the user), I computed the minimum (minL = lightness of the darker, foreground color) and computed the lightness value of each pixel in the image. In general, lightness values should increase as you go from the foreground color to the background color, and they should be in the range [minL, 1]. I then did a simple linear scale to the [0,1] range, and did a few checks for pixels that fell outside it (caused by rogue pixels that were darker than the foreground color). I then computed the alpha value, alpha = 1.0 – l, and that was it. You can see the result below.

transparency extract

transparency extract 2
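The per-pixel procedure described above can be sketched like this (Python; this is my reconstruction from the description, not the actual code, which hasn’t been posted):

```python
def extract_alpha(rgb, fg_rgb):
    """Alpha from lightness, assuming a white background.
    rgb and fg_rgb are (r, g, b) tuples with components in [0, 255]."""
    def lightness(c):
        vals = [v / 255.0 for v in c]
        return 0.5 * (max(vals) + min(vals))  # l = 0.5(max + min)

    min_l = lightness(fg_rgb)   # lightness of the darker, foreground color
    l = lightness(rgb)
    # Linearly rescale [min_l, 1] -> [0, 1], then clamp rogue pixels
    # that are darker than the foreground color.
    scaled = (l - min_l) / (1.0 - min_l) if min_l < 1.0 else 1.0
    scaled = min(max(scaled, 0.0), 1.0)
    return 1.0 - scaled         # fully opaque at the foreground color
```

Pixels at the foreground color come out fully opaque, pure white comes out fully transparent, and the transition pixels land in between – which is exactly what kills the halo.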

It’s not perfect. If you zoom in, you will see a whitish halo, but it’s certainly good enough for many cases. The algorithm could also be refined by replacing the simple linear scaling from [minL, 1] to [0, 1] with a more “aggressive” function, skewing the lightness values toward 1.0, which could minimize or possibly eliminate the halo effect.
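One candidate for that more “aggressive” function, assuming the lightness has already been rescaled to [0, 1]: raise it to a power γ < 1 before inverting. The exponent here is a hypothetical tuning knob, not part of the original algorithm:

```python
def skewed_alpha(scaled_lightness, gamma=0.6):
    """More 'aggressive' mapping than alpha = 1 - l: raising the scaled
    lightness (in [0, 1]) to a power gamma < 1 pushes mid-range values
    toward 1, so near-background pixels become more transparent and the
    halo shrinks. gamma is a made-up tuning parameter."""
    return 1.0 - scaled_lightness ** gamma
```

With γ = 0.6, for example, a scaled lightness of 0.9 maps to an alpha of about 0.06 instead of the linear 0.1, so near-background pixels fade out faster.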

Will post code and application soon. Hopefully, I can also spare some time to work on improving this.


The problem with forensics

I stopped watching CSI a long time ago, but I remember being annoyed by a stupid piece of dialog where Grissom says “terminal velocity is 9.8 meters per second squared.” (9.8 m/s² is, of course, the acceleration due to gravity, not a velocity.) Turns out, lack of scientific knowledge is not that far off from real forensic “science” either.

csi

Forensic science was not developed by scientists. It was mostly created by cops, who were guided by little more than common sense. And as hundreds of criminal cases begin to unravel, many established forensic practices are coming under fire.

From CSI Myths: The Shaky Science Behind Forensics in Popular Mechanics.


VirtualBox

I recently looked into VirtualBox as an alternative to my current virtualization solution (MS Virtual PC). I was particularly convinced to give it a try after reading this article on Ars Technica detailing some of the new features in version 3.

virtualbox with openSUSE

A quick rundown of the good and the bad (so far):

  • 64-bit guest within a 32-bit host. Since my primary OS is a 32-bit version of Windows XP, I loved that I could virtualize the 64-bit version of Windows 7, and I was able to get pretty decent performance. (FYI, I do have a 64-bit CPU with hardware virtualization support.)
  • Unable to install openSUSE 11. I have no clue why; it booted from the CD image, but froze once I selected the menu option to start setup. However, I was able to successfully install openSUSE 11.1 without any issues.
  • Constant CPU usage. I noticed some of my VMs would constantly push their CPU/core to 100%. However, I think this is related to the following issue…
  • Networking problems. The default virtual ethernet adapter (PCnet) seems to cause certain VMs to freeze. I encountered this with openSUSE and Ubuntu. A solution can be found here; basically, just switch to the “Intel PRO/1000 T Server” adapter.
  • Problems installing guest additions on openSUSE. I wanted the guest additions primarily to be able to dynamically adjust the guest’s resolution, which is an incredibly powerful feature. Getting the guest additions installed is not quick and easy. Partial solutions can be found here and here. In short, first do an update to make sure you have the same kernel and kernel sources (needed for the guest additions): sudo zypper dist-upgrade. Then add/update the following components: sudo zypper install gcc make automake autoconf kernel-source. All of this can take a while. After all installs/updates are done, restart, then run the appropriate script off the CD image.
  • Dynamic resizing doesn’t work. If you got the guest additions installed following the steps above, you’ll find that dynamic resizing doesn’t work (however, your resolution will jump to 1024×768 from the default 800×600). There’s a message after installation of the additions that alludes to this. You’ll have to edit xorg.conf (/etc/X11/xorg.conf). See here. In the Monitor section, remove or comment out any “PreferredMode” options. In the Screen section, remove or comment out any line with “Modes”. Restart, and dynamic resizing should now work. (Note: one minor quirk I noticed is that the openSUSE taskbar jumped from the bottom of the screen to the top when a resize was done. I’m not sure if this is an issue with VirtualBox or openSUSE.)

Overall, I’m both impressed and disappointed. VirtualBox has an impressive feature set, but the “out-of-the-box” experience leaves a lot to be desired.


Improper usage

baby and bag

I love this icon. It was on the plastic wrapping from a new keyboard I recently bought.


Scribbles

I was cleaning and found this piece of paper. I have a habit of doing stuff like this, especially when I’m on the phone; this is how I take “notes.”

scribbles

There has to be some hidden meaning in there, right?