RAF museum

26.01.2010 22:16

I spent the better part of the day in the Royal Air Force Museum in London (and the rest of the day traveling to and from it via various combinations of walking, trains, subways and taxis).

Needless to say, there's a lot of amazing technology piled up in there and I would recommend a visit to anyone interested in aircraft. In fact, there are so many machines crammed into the old hangars that it's quite a challenge to take a good photograph of any single one of them.

The order of the exhibits is a bit chaotic timeline-wise. But that can be a plus - it's interesting to compare the size of a modern jet fighter with a Spitfire.

Eurofighter Typhoon

The nose of Eurofighter Typhoon.

Handley Page Victor

Handley Page Victor

Avro Vulcan

Bomb bay of Avro Vulcan

Boeing B17G

Boeing B17G

Posted by Tomaž | Categories: Life | Comments »

Boiler conclusions

23.01.2010 23:09

Here are a couple of conclusions of my electrical water heater monitoring project.

  • On average I use 3 kWh of electrical energy per day on hot water. That's 90 kWh per month or roughly half my monthly electricity bill.
  • For comparison, space heating consumes 1120 kWh of heat per month in winter (if I can believe what the district heating company is billing me). According to my last year's car calculations, I burned up 740 kWh worth of chemical energy per month in my car.
  • Shifting all of the water heater's consumption to night time would save me 2.2€ per month or 26€ yearly.

Surprisingly, providing hot water seems to require an order of magnitude less energy than my daily car commute and more than an order of magnitude less than providing moderately warm living quarters. That's something I didn't really expect.

Accordingly, the savings are low too. A timer would perhaps pay for itself in two years or so, if I don't count the time I'd need to install it. And I doubt I'll still be in the same apartment two years from now.

So it's not really worth doing anything right now. At least not until there's a bigger difference between day and night electricity prices or the smart grid becomes a reality.

Posted by Tomaž | Categories: Life | Comments »

Tube clock

22.01.2010 18:51

I finally found the time to finish assembling the "Russian" tube clock I got from Dedek Mraz. It seems I'm starting to gather quite a collection of weird timekeeping devices.

Assembled Ice Tube Clock from Adafruit.

Assembling the kit was a no-brainer - the instructions include three pictures for every component you need to solder and have you check parts of the circuit before continuing. The only slightly tricky bit was getting the tube to align nicely with the casing before soldering its (many) pins.

Considering the kit comes from the US, it was a nice touch to include the 24 h European time and date format (yes, the clock can also show the day of the week and the current date). The power brick only had a US power plug though. I replaced it with an EU one - the circuit itself already supported 230 V.

Also, the pictures don't really show what the display looks like. Mine glows in a blue-green color and isn't particularly bright (I have the brightness set to 55). If you remember the old video recorders that used to blink "12:00" (they also used VFDs) - that's what the tube actually looks like.

One annoying thing I noticed is that it's hard to set the clock to the second. You can't wait for the wall clock to catch up with the frozen seconds display because the menu will time out in less than a minute.

This means it's harder to assess the accuracy. Also, this circuit doesn't provide a trimmer capacitor to fine-tune the oscillator frequency. I had the idea of adding a DCF77 receiver (it appears there's a contact provided on the PCB for that). However, the FAQ page says the boost converter is too noisy for such a receiver to work near the clock. If that's true, I wonder what other devices are also affected by this interference.

Posted by Tomaž | Categories: Digital | Comments »

Avatar

20.01.2010 23:45

I went to see Avatar last week.

Yes, if you ignore the visuals it would be boring. But if anything, this is the kind of movie to see for the pretty moving images alone. This also means it was still worth watching even after hearing all kinds of spoilers - it's been in theaters for a month now and I had overheard most of the story in various conversations before I even went to the cinema. Still, I very much enjoyed it. The pseudo-3D-induced headache that followed, not so much.

Actually, one of the reasons I didn't go watch it sooner was that I didn't find the still frames on the posters very appealing. Interesting how the perception changes when things are moving.

One pleasant surprise was that nothing I saw in the movie was outrageously outside the domain of the possible. Slower-than-light travel, no artificial gravity, a planet with an unbreathable atmosphere and aliens that don't speak English (and no universal translator) are all rare nice touches in mainstream Hollywood science fiction. OK, floating mountains are stretching that a bit, but the explanation with the Meissner effect at least passes the first mental plausibility test.

The movie obviously features as much fictional biology as it does technology. There I found some weirdness harder to ignore. One thing that caught my attention was that the principles of evolution seemed kind of broken. Take a chunk of Earth and the larger lifeforms walking around (including any humans) will look pretty similar: you know, four limbs, fur, two eyes, a mouth, etc. But on Pandora no animals are seen with fur, they breathe through their stomachs and have varying numbers of limbs and eyes. The Na'vi, with their hair and human-like bodies, seem out of place in that scheme.

Talking about the blue folk, it's interesting how their feline traits (large eyes, ears, tails) on screen seem to unrealistically amplify your perception of emotion on their faces. I wonder why that is. Reading other people's feelings is basically an image recognition task, and that's something the brain is very well adapted for. I guess such a face combines features from expressions you instinctively recognize in both animals (ears) and humans (facial gestures). Since these two never appear on the same individual, you can't experience their combined effect in real life. With the bit of additional exaggeration that's possible with CGI (for example pupil dilation), it's no wonder some scenes feel like emotion overload.

In the end I left the theater thinking that a realistic sequel to the story would be very short. In 10 years, when the next ship from Earth arrives, somebody says "Nuke them from orbit. It's the only way to be sure." But I guess that would be bad for ticket sales.

Posted by Tomaž | Categories: Life | Comments »

VFD porn

18.01.2010 18:33

Vacuum Fluorescent Display from Adafruit's Ice Tube Clock

Vacuum Fluorescent Display tube from Adafruit's Ice Tube Clock, which I'm assembling right now.

Posted by Tomaž | Categories: Digital | Comments »

Image recognition 101

11.01.2010 19:14

The camera setup I mentioned in my last post left me with a week's worth of video of the electrical water heater in my bathroom. Instead of watching the boring video myself, I used a simple program to extract a single bit of information from each frame: is the heater operating or not?

Here are four typical frames from the video - all four combinations of states of the heater and the light. The latter obviously causes the largest undesirable changes in the image, which I needed to filter out.

Four typical frames in the video

The first obstacle was how to actually get the raw image data back from the MJPEG encoded frames stored in the AVI container. Raw video is surprisingly hard to get at, since the whole GStreamer framework seems to be built around the idea of keeping you away from it. Finally I found the fdsink element which, while not perfect, was good enough for the job. Some trial and error and I came up with the following pipeline:

gst-launch-0.10 --gst-debug-disable filesrc location=boiler.avi ! \
jpegdec ! video/x-raw-yuv,width=128,height=512,format="(fourcc)I420" ! \
fdsink fd=1 | ./get_pixel

This decodes the MJPEG (jpegdec element) into a YUV format variant called I420 and pipes it through GStreamer's standard output (file descriptor 1) into my program's (get_pixel) standard input.

The YUV format is just what I needed for this purpose, since I'm only interested in luminance values.

Here's the C source for get_pixel:

#include <stdio.h>
#include <stdlib.h>

#define HEIGHT  512
#define WIDTH   128

/* Luminance of the pixel at (x, y) in the Y plane of an I420 frame. */
static unsigned char get_yvalue(unsigned char *frame, int x, int y) {
        return frame[x + y * WIDTH];
}

int main() {
        /* I420 layout: full-size Y plane followed by two
         * quarter-size chroma (U and V) planes. */
        size_t yplane_size = WIDTH*HEIGHT*sizeof(unsigned char);
        size_t uvplane_size = WIDTH*HEIGHT*sizeof(unsigned char)/4;

        size_t frame_size = yplane_size + uvplane_size * 2;
        unsigned char *frame = malloc(frame_size);

        /* Uncomment to skip the 26 bytes of garbage GStreamer
         * sometimes writes at the start of the stream. */
        /* fread(frame, 26, 1, stdin); */

        while(fread(frame, frame_size, 1, stdin) == 1) {
                /* Reference point on the white boiler surface... */
                unsigned char b = get_yvalue(frame, 97, 246);
                /* ...and a point at the heater indicator light. */
                unsigned char l = get_yvalue(frame, 97, 446);
                printf("%d\t%d\n", b, l);
        }

        free(frame);
        return 0;
}

This reads the video frame by frame from the standard input into a buffer. It then reads two luminance values from each image: one centered on the heater light (l) and one for reference on the white boiler surface (b), and writes both values to the standard output.

The commented-out fread() call is there because GStreamer sometimes writes 26 bytes of garbage at the start (actually it looks like a debug message that every once in a while doesn't end up in its proper standard error stream). It's probably a bug somewhere in GStreamer, but I didn't look into it much since this is such a one-off program.

So, this program gave me a list of luminance values and it was then trivial to find a function that discriminates between heater on and heater off states regardless of the ambient light.

Here's what the luminance values look like for a typical 24 hour day:

Graph of pixel luminosities versus time

The red dots show the reference value (b) and blue dots show the heater light luminance (l). The shaded areas show the time intervals where heater was detected to be operating.

The function used was:

f(b, l) = 1 if l - b/2 > 35, and 0 otherwise

Where the b and l values range from 16 to 235 (and my question of why that is so still stands).

As you can see, the bathroom is mostly in the dark during the day and only around noon does some daylight manage to shine into it. I manually checked some random points and the detection seems nearly perfect. The only exceptions appear to be the first frames taken after the light has been turned on or off, when the camera hasn't yet adapted and the frames are grossly over- or underexposed.

Posted by Tomaž | Categories: Code | Comments »

Midnight Tivoli

09.01.2010 1:09

It's been snowing on and off for the past week and we had continuous heavy snowfall today in Ljubljana, creating a delightful chaos in the city. I deserted my car under a pile of snow in front of the office - feet are just about the only thing that will get you around in weather like this.

Empty walkway in Tivoli, Ljubljana

So I guess the conditions were just about right for a stroll through knee-deep snow in Tivoli at an ungodly hour.

Sculpture of a sitting person, covered in snow, Tivoli, Ljubljana

At least the snowman is keeping him company.

Creepy giant snowman, Tivoli, Ljubljana

Talk about an uncanny feeling. I would swear this thing was standing under the other streetlight when I was going back.

Posted by Tomaž | Categories: Life | Comments »

Camera in a bathroom

07.01.2010 21:53

Hot water in my apartment is supplied by a storage-type electrical boiler. As uneconomical as it is to use electricity for heating in a big city, I can't really do much about that. However, what I did notice is that the insulation on the boiler is pretty good - the water will remain hot enough to use even if the heater has been turned off for a couple of days. So maybe I could optimize the time the heater is working: instead of a simple thermostat, I could add a timer that would only allow the heater to work at night, when electrical energy costs around half as much as during the day.

However, before I start tearing the boiler apart I want to have at least a rough approximation of how much energy the boiler actually uses and how much such a modification would save me per month in lower electricity bills.

Unfortunately I don't have the proper equipment at hand to directly measure and record electrical power (plus that would mean more messing with the wiring in the bathroom, and that's a can of worms I want to open as rarely as possible).

So I used another approach. I took my EeePC, put it on a high shelf across from the boiler and trained the camera on the indicator that lights up when the heater is operating. From the recording it will be obvious at what times the heater was on, and since I know its approximate power I'll be able to calculate the energy consumption and cost.

Of course, the thought has crossed my mind that having a camera operating in a bathroom can be a really bad idea. Even more so if it's connected to a Wi-Fi capable laptop. So I took some extra precautions to prevent anyone ending up on YouTube (i.e. I disabled the wireless adapter).

I used the following GStreamer pipeline to record the video:

gst-launch -v \
v4l2src ! video/x-raw-yuv,width=640,height=480,fps=5 ! ffmpegcolorspace ! \
videocrop bottom=112 left=288 right=288 top=112 ! \
videoscale ! videorate ! video/x-raw-yuv,width=128,height=512,framerate=1/4 ! \
clockoverlay ! \
jpegenc ! avimux ! filesink location=boiler.avi

Everything up to ffmpegcolorspace appears to be necessary for the camera to operate properly. videocrop crops the video so that only the boiler can be seen on the image (makes for a boring video, but that's what I'm aiming for here). videorate drops 19 frames out of 20 to save disk space.

I also added a clock overlay so that each frame has the exact time recorded, just in case I couldn't later deduce the time by other, more sophisticated means. I had to scale the video width up by 2 (hence the videoscale element) to make it wide enough for the clock display to fit.

The video is saved in the simple motion-JPEG format. Not really the most efficient encoding out there, but everything else seemed to produce unusable results. I guess the default settings for Theora and friends don't work very well at 0.25 fps.

I left this setup running for a week and it produced around 3.5 MB of video data per hour - no problem for the EeePC's somewhat limited storage.

If run at a normal 25 frames per second, this video shows a day in 15 minutes, so if I can't extract the data automatically I'll still be able to watch it and record the times manually (though I'm confident it won't come to that).

Posted by Tomaž | Categories: Ideas | Comments »

Darker than black

05.01.2010 21:54

For some reason (which will perhaps be the topic of some other post), I've been trying to extract raw data from a video stream using GStreamer this afternoon. I got to the point where I thought I understood what a raw YUV stream looks like, but I kept getting non-zero luminance values for the completely black screen produced by videotestsrc with pattern=black.

So I went and asked a stupid question on #gstreamer: is their idea of black different from mine?

Interestingly, the answer was that digital YUV video traditionally has the 8-bit luma value scaled to the 16-235 range (i.e. 16 being black and 235 being white). Some searching around the web turned up a lot of questions and confusion about this, but no definite answer as to why this is a good thing.

I was told by the GStreamer guys that this convention remains from analog video. There is some logic in this: YUV is how color is encoded in the analog signal, and the black level there is above the reference 0 V (that level being used for signal synchronization). However, I'm not convinced by this explanation. If you scale the analog reference voltages to 8 bits, you get a different range. And besides, why would someone want to encode synchronization signals in digital video?

Wikipedia gives a slightly different, but equally cryptic answer. It mentions that this is because of the MPEG standards and that this scale is convenient when doing fixed-point arithmetic with the value.

Is there anyone out there who can explain what significant benefit justified this additional complication in digital video (as if the topic wasn't complicated enough already)?

Posted by Tomaž | Categories: Code | Comments »