Drive-by contributions

28.07.2012 21:52

I like free and open source software and I use it almost exclusively in my day-to-day computing tasks. One of the reasons why I prefer it over proprietary applications is because I can go in and change things. Nothing is more frustrating than when one little bug in an otherwise perfectly fine piece of programming makes it unusable for a task at hand and you can do absolutely nothing about it. Or when you need one feature which you know must be a one-line change, and you are left pleading through some narrow communication channel with a disinterested customer support person on the other side.

Not surprisingly, such tinkering has led me to the current state where many of the programs I use every day are custom compiled with this or that little change I made. Over the years I got good at maintaining these changes in a somewhat painless way (in short, quilt and/or git). But software upgrades are still painful, as each upgrade means either dropping my custom patches or wasting time porting them to the new version. This is why I prefer Debian and its slow release cycle for the computers I use.

If it's not a one-off hack though, but some application I will use at least twice, I always take the time to roll up the change into a presentable patch and send it to the upstream developers along with a nice description. Having a patch accepted upstream is of course always better than maintaining my own private version of the code. It means I no longer waste time each release cycle, and other people can benefit from my work, which gives me a warm and fuzzy feeling.

The problem however is that very few of these patches actually get applied by the maintainers of the project. In my experience, the best chances are with active, small or one-man projects. Larger projects usually just let such contributions linger indefinitely in the mailing list archives or the ticket tracker. I'm not even talking about them rejecting the patch - in that case I usually take the time to correct any problems maintainers found as that is of course beneficial to me as well. In most cases there is simply no reply at all.

A bunch of rusty gears.

Jono Bacon calls this kind of participation in free software projects drive-by contributions and explains that they are expensive in terms of maintainer effort and that communities should strive to convert such random contributors into active members. I can't argue with that logic. Obviously a patch from someone who has an established track record within the community requires less careful review before it is accepted into the official repository. That said, the patches I'm talking about are always relatively small fixes, not big architectural changes that would require broad consensus among developers.

The problem is that I simply don't have time to be actively involved with every project developing each piece of software that runs on my computers. I usually join the mailing list of a project I submit a patch to, at least for a while. But beyond that there are simply not enough hours in the day, and I have enough projects of my own.

Another problem is dead projects. Is it better to leave a series of patches in an abandoned public tracker, or to accept all the overhead that comes with releasing my own version?

Sean Dague praises GitHub as a solution to such problems. In my experience GitHub is a double-edged sword. On the one hand, pull requests are easy to make and provide a standard, hassle-free way of accepting such contributions from random users. On the other hand, if a project isn't already using GitHub, this leads to a confusing situation where you have a network of forks on GitHub in addition to an official repository, and that network may or may not be completely ignored by the original developers. Often researching the fragmented fork tree takes more time than implementing the fix yourself, and again, it doesn't help much with the fact that you have to maintain your own fork, even with all the magic of git-merge.

I know I'm talking in a very general way about many diverse projects, but that is the general pattern I seem to keep noticing. Each time I get involved in some job that uses open source software I preach how it's important to contribute back to the community and not keep a fork of the code in isolation. But recently I often ask myself if I'm doing something wrong or if it's actually not worth my time to keep trying to contribute quality patches when a quick hack I keep in private achieves much the same result.

Posted by Tomaž | Categories: Life | Comments »

EnergyCount 3000, part 2

15.07.2012 20:18

A few months ago I was writing about my attempts to reverse engineer radio transmissions from Voltcraft EnergyCount 3000 devices. Since then a lot of the things I was speculating about got much clearer, mostly thanks to reverse engineering efforts on the forum. Yesterday there was an Open Data Hackday in Kiberpipa and, while this wasn't strictly on topic, I finally managed to integrate all of the little pieces of the puzzle together into one working, simple to use Python module.

Voltcraft Energycount 3000 energy logger

So today, I'm pleased to announce the first release of ec3k, a software receiver for EnergyCount 3000 devices. It's built around GNU Radio and currently works with cheap, RTL2832U-based receivers. You can fetch it from the Cheese Shop or directly from GitHub. It's licensed under GPL version 3 or later.

The biggest obstacle you need to clear before using it is installing GNU Radio, rtl-sdr and gr-osmosdr. A good starting point is the Osmocom Wiki which has a quick overview and links to other relevant documents. When GNU Radio is working for you, only the standard Python distutils dance remains:

$ python setup.py install
$ python setup.py test

The package includes a simple command-line utility which prints all received packets to the standard output:

$ ec3k_recv 
linux; GNU C++ version 4.4.5; Boost_104200; UHD_003.004.002-128-g12f7a5c9

gr-osmosdr supported device types: file fcd rtl rtl_tcp uhd 
Using 16 buffers of size 262144.
Using device #0: ezcap USB 2.0 DVB-T/DAB/FM dongle
Found Elonics E4000 tuner
>>> gr_fir_ccf: using SSE
Using Volk machine: avx_64
Noise level: -27.3 dB
Noise level: -27.4 dB
Noise level: -27.5 dB
id              : f100
uptime          : 4004 seconds
since last reset: 37809 seconds
energy          : 385300 Ws
current power   : 1.3 W
max power       : 2.3 W
energy          : 4579600 Ws
Noise level: -26.7 dB
Noise level: -26.8 dB

Here is how you can use the ec3k module from your own Python code:

import time
import ec3k

def callback(state):
    print state

my_ec3k = ec3k.EnergyCount3K(callback=callback)
my_ec3k.start()

while not want_stop:
    time.sleep(2)
    print "Noise level: %.1f dB" % (my_ec3k.noise_level,)

my_ec3k.stop()

As shipped, the receiver uses around 20% of one CPU core on a 2.7 GHz Intel i5 CPU. You can improve that by using my C implementation of the base-band decoder from the am433 project. Just compile the capture binary (version 0.0.4 or later) and drop it somewhere in your PATH. ec3k should find it automatically and use it instead of the built-in pure Python implementation.
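
That automatic detection presumably amounts to nothing more than a search along PATH. A minimal sketch of the mechanism (find_capture is my name for it, not necessarily ec3k's actual code):

```python
import os

def find_capture(name="capture"):
    """Search each directory in PATH for an executable with the given name."""
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        candidate = os.path.join(directory, name)
        if os.path.isfile(candidate) and os.access(candidate, os.X_OK):
            return candidate
    return None  # caller falls back to the pure Python decoder
```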

For details on the API, please check the included README and docstrings. README also lists a couple of known problems that are still present in the code. Any help regarding those, or otherwise, is most welcome as always.

Again, thanks go to the people on the forum for figuring out how to descramble the packets, and to Gašper for providing the hardware, the Python base-band decoding code and testing.

Posted by Tomaž | Categories: Code | Comments »

Broken C++ compilation on Debian

13.07.2012 9:24

I've lately noticed an issue with GNU C++ compilation on a few machines running mostly Debian Squeeze. I say mostly, because each one also has a few cherry-picked testing and backports packages installed over the basic Squeeze system.

The symptoms are that linking all C++ programs will fail with an error message like this:

/usr/bin/ld: /usr/lib/gcc/x86_64-linux-gnu/4.4.5/libstdc++.a(functexcept.o):
relocation R_X86_64_32 against `std::bad_typeid::~bad_typeid()' can not be used
when making a shared object; recompile with -fPIC
/usr/lib/gcc/x86_64-linux-gnu/4.4.5/libstdc++.a: could not read symbols: Bad value

This error points in a completely wrong direction though and made me lose a few hours hunting down its true cause. So for future reference, here is what's really causing it.

For some reason, this symlink for the standard C++ library gets broken:

/usr/lib/gcc/x86_64-linux-gnu/4.4.5/libstdc++.so -> ../../../x86_64-linux-gnu/libstdc++.so.6

It seems that GCC sees the broken link and quietly decides the dynamic version of libstdc++ isn't available and attempts to link programs with the static one, which fails. I'm not sure exactly which package upgrade breaks the link, but since I saw this on more than one machine I'm guessing it's a bug somewhere and not just some quirk specific to me.
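
Incidentally, such a dangling symlink is easy to spot programmatically, because os.path.exists() follows links and reports False when the target is gone. A small Python check:

```python
import os

def is_dangling(path):
    """True for a symlink whose target no longer exists."""
    return os.path.islink(path) and not os.path.exists(path)
```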

Once you know it, the fix is simple:

$ cd /usr/lib/gcc/x86_64-linux-gnu/4.4.5
$ sudo ln -sf ../../../x86_64-linux-gnu/libstdc++.so.6 libstdc++.so

Posted by Tomaž | Categories: Code | Comments »

Noise versus temperature

11.07.2012 20:26

I mentioned energy detection and spectrum sensing before. The idea is that you try to detect if any radio transmissions are present in your neighborhood by simply detecting the energy the transmitter emits into the electromagnetic field. This is the most basic way of detection - any signal must carry energy. It's also the most robust as detection doesn't depend on the sensor knowing any details about the transmission.

There is a catch, of course. Parts of your own detector are unavoidably emitting radio frequency energy as well. Anything with a temperature above absolute zero has various charged particles moving in random ways which emit their own electrical signals, and this thermal glow effectively blinds you to weaker signals. By definition the energy detector can't distinguish between a valid transmission and random noise.
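
To put a number on that thermal glow: the available thermal noise power in a bandwidth B is kTB. For a receiver at the standard reference temperature of 290 K and the 1.7 MHz channel bandwidth used below, a quick calculation gives roughly -112 dBm:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 290.0          # standard reference temperature, K
B = 1.7e6          # channel bandwidth, Hz

noise_w = k * T * B                          # thermal noise power, watts
noise_dbm = 10 * math.log10(noise_w / 1e-3)  # roughly -112 dBm
```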

SNE-CREWTV, spectrum sensor for UHF and VHF bands

With some statistics however you can still probabilistically detect signals that lie somewhat below the noise floor. The more accurately you can model your own noise, the weaker the signals you can detect and the higher the probability of correct detection. This is why I tried to find a good statistical model of the noise present in the spectrum sensing receiver I designed at the Institute.
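
The principle can be illustrated with a toy simulation: averaging the measured power over many samples shrinks the variance of the estimate, so a threshold placed a few standard errors above the modeled noise floor catches a signal that is well below the floor itself. A sketch with made-up numbers, not a model of the actual receiver:

```python
import math
import random

random.seed(0)
N = 100000
noise_power = 1.0   # normalized noise floor

def avg_power(samples):
    return sum(s * s for s in samples) / len(samples)

# Model the noise from a signal-free measurement
noise_est = avg_power([random.gauss(0, 1) for _ in range(N)])

# The power estimate's relative standard error shrinks as sqrt(2/N),
# so place the threshold three standard errors above the estimate
threshold = noise_est * (1 + 3 * math.sqrt(2.0 / N))

# A sine wave 13 dB below the noise floor, buried in fresh noise
amp = math.sqrt(2 * noise_power / 20)
rx = [random.gauss(0, 1) + amp * math.cos(0.1 * n) for n in range(N)]
detected = avg_power(rx) > threshold
```

With these numbers the signal adds only about 5 % to the measured power, but the threshold sits just 1.3 % above the noise estimate, so the detection still succeeds.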

Perhaps not surprisingly, I found out that the characteristics of the noise slowly change with time, which complicates things a bit. In electronics if some property is slowly changing it's usually connected to changes in temperature. The TDA18219HN tuner I'm using has an integrated die temperature sensor, so it was a straightforward experiment to check how the average noise level changes as the silicon heats up or cools down.

Measured power with no input signal versus die temperature

The graph above was measured with no input signal (with a terminator on the antenna connector), at 1.7 MHz channel bandwidth. It shows another surprise - the noise level actually decreases with temperature! This result is suspect, as noise usually increases with temperature. For instance, in this temperature range, the thermal noise of the terminating resistor alone would increase by 0.4 dB.
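
The 0.4 dB figure follows directly from thermal noise power being proportional to absolute temperature. Assuming the die swept roughly from 25 °C to 55 °C (my guess at the range; the exact endpoints don't change the argument much):

```python
import math

t_cold = 273.15 + 25.0   # K
t_hot = 273.15 + 55.0    # K

# kTB noise scales linearly with absolute temperature, so in dB:
delta_db = 10 * math.log10(t_hot / t_cold)   # roughly 0.4 dB
```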

Obviously some other effect is at play here, so I repeated the same measurement, but this time with the receiver directly connected to a signal generator. I set the generator to a constant sine wave with an amplitude around 20 dB above the noise floor.

Measured power with -80 dBm CW input signal versus die temperature

As you can see, this signal shows a similar decrease in measured power. So obviously it's not the noise itself that decreases with temperature but the sensitivity of the receiver.

At this moment it's hard to say which parts of the receiver are responsible for this. It might be the many amplification stages or the power detector itself. Also interesting is that noise amplitude is falling faster than the amplitude of an external signal (approximately -2 dB change versus -1.5 dB over the measured range). I would expect just the opposite, as the increase in thermal noise should counter the loss of sensitivity.

In any case, most likely I can't do anything about this at the hardware level. The noise model will simply have to either compensate for the die temperature or only be valid after the receiver heats up.
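
The simplest such compensation would be a linear correction of the measured level against die temperature, with coefficients obtained from a calibration run. A sketch with hypothetical calibration data, not measurements from the actual receiver:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of ys ≈ a + b * xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Hypothetical calibration: noise level drops ~2 dB over a 30 °C sweep
temps = [25.0, 35.0, 45.0, 55.0]          # die temperature, °C
noise = [-27.0, -27.7, -28.3, -29.0]      # measured noise level, dB
a, b = fit_line(temps, noise)

def compensated(measured_db, temp):
    """Refer a measurement back to the 25 °C baseline."""
    return measured_db - b * (temp - 25.0)
```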

Posted by Tomaž | Categories: Analog | Comments »

USRP spurious emissions

06.07.2012 20:47

As a side effect of a project I am working on at the Jožef Stefan Institute I have a USRP N210 on my desk. Together with an expensive stack of Rohde & Schwarz lab equipment it made for a fun pastime today after work hours while waiting in the office for the rain to stop. It is also an interesting subject for an investigation into those pesky practicalities of software defined radio.

It seems a lot of people forget that any software defined radio still depends on an old-fashioned analogue RF front-end. While the digital signal you feed to it might be perfect in every way, once it passes from the digital into the domain of D/A converters, mixers and amplifiers it is still vulnerable to non-linear effects, saturations and other such worldly imperfections.

Ettus Research USRP N210 without top cover.

In the case of this USRP, the first experimental transmissions we did with it were quite an awful sight on the spectrum analyzer, with the desired modulated signal merely 20 dB above a sea of spurious emissions spanning most of the RF front-end's 40 MHz bandwidth. After a bit of tweaking of signal amplitudes and RF gains though, the transmission did get into somewhat useful territory.

It's interesting to see what happens when a single, continuous-wave sine signal is fed to the USRP on the digital baseband interface, as the spurious spectral lines that are produced offer some insight into what's happening behind the scenes in the RF front-end.

In the video below, the USRP N210 was fitted with the SBX daughterboard which was configured for 600.50 MHz central frequency and 0 dB RF gain. It was connected to a PC running GNU Radio software which was producing a quadrature signal with the frequency of around 100 kHz, amplitude of 0.5 and a sample rate of 1 Msample/s. In the first part of the video I varied the frequency of the baseband sine wave while in the second part I varied the ratio between the amplitudes of the I and Q components of the quadrature signal.

The signal analyzer was centered at the same RF frequency as the USRP, with a span of 500 kHz (50 kHz per division).

The highest peak on the screen, at around -50 dBm, is the desired signal. Everything else above the noise floor is unwanted. The most obvious of these spurious emissions are an image signal at minus the desired frequency, some 40 dB below it, and a third harmonic at -3 times the frequency. The image signal is usually a sign of bad image rejection in the mixer and the harmonic is probably created by non-linearities somewhere in the pipeline.

An interesting thing to notice is that the local oscillator (the peak that doesn't move) isn't at the center of the frame, but 6 kHz lower. It's hard to see on this video, but the desired signal is actually positioned correctly, which means that the signal frequency is finely adjusted before the final mixing to account for this offset, either in the digital or the analogue domain. The local oscillator is likely based on a fractional-N PLL, which of course doesn't have infinite precision but can only be tuned in discrete steps, a fact that is hidden from you in the GNU Radio interface. This offset differs when you change the RF frequency and disappears at round settings (for instance 600 MHz), which confirms this theory.
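
The offset is just quantization: the synthesizer can only land on an integer multiple of its tuning step. With a made-up 16 kHz step (the SBX's real step size is different) the behaviour described above falls out directly:

```python
def pll_tune(f_target, f_step):
    """Nearest frequency a synthesizer with a discrete tuning grid can hit."""
    return round(f_target / f_step) * f_step

# An "odd" setting like 600.50 MHz ends up a few kHz off target,
# while a round setting like 600.00 MHz lands exactly on the grid
offset_odd = pll_tune(600.50e6, 16e3) - 600.50e6
offset_round = pll_tune(600.00e6, 16e3) - 600.00e6
```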

Another interesting result happens when I mess with the amplitudes of the I and Q signals on the digital side. As expected, another image frequency appears. But since this imbalance appears on the digital side, it is mirrored around the true central frequency and is offset from the image produced by the local oscillator by twice the 6 kHz offset. The two images even mix somewhat in a non-linearity somewhere, producing a small spike on the other side.
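
The image produced by an I/Q gain imbalance is easy to reproduce numerically. A complex exponential with, say, 10 % less gain on the Q arm splits into the wanted tone plus a conjugate tone at minus the frequency (the 10 % figure is arbitrary, chosen for illustration):

```python
import cmath
import math

N = 1000
f = 50   # tone frequency, in cycles per N samples

def dft_bin(x, k):
    """Normalized DFT magnitude of sequence x at bin k."""
    return abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                   for n in range(N))) / N

tone = [cmath.exp(2j * math.pi * f * n / N) for n in range(N)]
imbalanced = [complex(s.real, 0.9 * s.imag) for s in tone]  # 10 % Q gain error

image_clean = dft_bin(tone, -f)        # essentially zero
image_dirty = dft_bin(imbalanced, -f)  # 0.05, about 26 dB below the tone
```

Algebraically, cos θ + 0.9j sin θ = 0.95 e^(jθ) + 0.05 e^(-jθ), which is exactly the image the analyzer shows.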

Posted by Tomaž | Categories: Analog | Comments »

Joules and calories

01.07.2012 21:20

I recently finished reading The Windup Girl, a science fiction novel by Paolo Bacigalupi. The story is set in a future where the world has all but run out of fossil fuels and where mechanical energy is provided mostly by either physical workers or draft animals. This makes the economy revolve around calories and joules, used to measure chemical energy bound in food versus work done by muscles and springs. It's all mixed up with copious amounts of biotechnology that tries to make the conversion from the former to the latter as efficient as possible. This backstory alone makes the book fascinating to read and makes you think about the energy footprint of modern living.

I've browsed back in my records and compiled the sum of all my consumption of general-purpose energy sources I bought in the last year:

  • 5.9 MWh, heat
  • 3.3 MWh, fuel
  • 2.4 MWh, electricity

Obviously, this includes only energy sources I bought directly to use as I please and doesn't include the energy needed to manufacture any products I consumed or the use of public transportation where I didn't directly buy the fuel. So at best this is at the very low end of the energy footprint estimate.

The numbers above add up to 11.6 MWh per year or an average of 31.8 kWh per day or 1.3 kW of continuous power. To put this into perspective with some back-of-the-envelope calculations:

  • Two draft horses would need to be working around the clock to cover this consumption.
  • If perfect conversion from food energy to work were possible, this would require 6.7 kg of carbohydrates per day.
  • Or, it would require around 100 m2 of photovoltaic cells (a technology curiously absent in the novel).
  • Or, 70 s of exclusive use of our national nuclear power plant's output per year.
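
Those estimates are easy to verify. Assuming carbohydrates at 4 kcal/g and roughly 600 MW for the power plant's output (both are my assumptions for the arithmetic, not exact figures):

```python
total_mwh = 5.9 + 3.3 + 2.4              # MWh bought per year

per_day_kwh = total_mwh * 1000 / 365     # roughly 31.8 kWh per day
avg_power_kw = per_day_kwh / 24          # roughly 1.3 kW continuous

# Carbohydrates at 4 kcal/g (1 kcal = 4184 J, 1 kWh = 3.6e6 J)
carbs_kg = per_day_kwh * 3.6e6 / (4 * 4184) / 1000

# Share of a roughly 600 MW nuclear plant's output, in seconds per year
plant_seconds = total_mwh / 600 * 3600
```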

Energy consumption is often considered one of the indicators of progress and I don't subscribe to any movement that says we should return to pre-industrial energy footprints. I am all for using resources responsibly and as efficiently as possible though. And such little calculations do show what kind of power is available from your wall socket compared to what was available merely a century or two ago.

Posted by Tomaž | Categories: Life | Comments »