Button, button. Who's got the button?

19.01.2016 19:07

As much as I try, I can't switch away from GNOME. Every once in a while I will try to use some other desktop environment in Debian and I invariably switch back after a month. The fact is that, for me, GNOME seems to come with the fewest little annoyances that need manual fixing or digging through configuration files or source code. Or it might be that I'm just too used to all the small details of how GNOME does things on the user interface side. It's also possible that I understand enough about GNOME internals at this point that when things stop working I usually find the solution pretty quickly.

That does not mean, however, that GNOME no longer manages to surprise me. This is a story about the latest such occurrence. It's a story about a button. This button in particular:

Rotation lock button in GNOME 3.14

I noticed this button after upgrading my work laptop to Debian Jessie, which comes with GNOME 3.14. The most curious thing about it was that it did not appear on my home desktop computer with a supposedly identical Debian Jessie setup. What makes my laptop so special that it sprouted an extra button in this prime piece of screen real estate?

Being the kind of person who looks into the documentation first, I opened the GNOME 3.14 Release Notes. However, they contain no mention of any mysterious new buttons in the click-top-right-corner menu. I was not surprised though, since the button might have been added in any of the several GNOME versions released between the previous Debian Stable and now.

The button is marked with a circular arrow icon, the kind often used to distinguish a reboot from a shutdown when placed next to a power-switch icon. Like, for instance, on this Ubuntu shutdown dialog:

Screenshot of Ubuntu shutdown dialog.

I don't think it was unreasonable then to assume that this is a reboot button. Anyway, when documentation about a thing fails you, the best next step is usually to poke the thing with a stick. Preferably a long and non-conductive one.

Unfortunately, this approach failed to uncover the purpose of the mysterious button. Upon closer inspection, the button did change a bit after clicking on it, but seemingly nothing else happened. On second thought, it would be kind of weird for GNOME to have a reboot button when at one point it lacked even a shutdown option, for a reason nobody quite understood.

In a final attempt to discover the purpose of the catching-its-own-tail arrow, I hovered the mouse cursor over it. Of course, no helpful yellow strip of text showed up. Tooltips are obsolete now that everything is a smartphone.

At that point I ran out of ideas. Since I had no text associated with the button, it seemed impossible to turn to a web search for help. In fact, I realized I didn't even know what the menu that holds the button is called these days. I complained about it on IRC and someone suggested doing a Google Image search for the button. At first this sounded like a brilliant idea, but it soon turned out that visual search for graphical user interface widgets just isn't there yet:

Results of a search for visually similar images.

Of course, I was stubborn enough to keep searching. In the end, a (text) query led me to a thread on StackExchange that solved the conundrum. I don't remember what I typed into Google when I first saw that result, but once I knew what to search for, the web suddenly turned out to be full of confused GNOME users. Well, at least I was not alone in being baffled by this.

In the end, I only wonder how many of the people who went through this same experience also immediately started swinging their laptops around to see whether the screen would actually rotate, and were disappointed when, regardless of the rotation lock button, the desktop stubbornly remained in landscape mode.

Posted by Tomaž | Categories: Life | Comments »

End of a project

15.11.2015 11:38

Two weeks ago, the CREW project concluded with a public workshop and a closed meeting with reviewers invited by the European Commission. Thus ended five years of work on testbeds for wireless communications by an international team from Belgium, Ireland, Germany, France and Slovenia. I've been working on the project for nearly four of these years as the technical lead for the Jožef Stefan Institute. While I've been involved in several other projects, the work related to CREW and the testbed clusters we built in Slovenia has occupied most of my time at the Institute.

Spectrum Wars demo at the Wireless Community meeting.

Image by Ingrid Moerman

It's been four years of periodical conference calls, plenary and review meetings, joint experiments at the testbeds, giving talks and demonstrations at scientific conferences, writing deliverables for the Commission, internal reports and articles for publication, chasing deadlines and, of course, the actual work: writing software, developing electronics, debugging both, making measurements in the field and analyzing data, climbing up lamp posts and figuring out municipal power grids from outdated wiring plans.

It's funny how, when I'm thinking back, writing is the first thing that comes to mind. I estimate I've written about 20 thousand words of documents per year, which does not seem all that much. It's less than the amount of text I publish yearly on this blog. Counting everyone involved, we produced almost 100 peer-reviewed papers related to the project, which does seem like a lot. The project also resulted in my first experience working with a book publisher and in a best paper award from the Digital Avionics Systems Conference that I have hanging above my desk.

Measuring radio interference inside an Airbus cabin mockup.

Image by Christoph Heller

Remote writing collaboration has definitely been something new for me. It is surprising that what works best in the end is Word documents in emails and lots of manual editing and merging. A web-based document management system helped with keeping inboxes within IMAP quotas, but little else. Some of this is certainly due to technical shortcomings in various tools, but the biggest reason, I believe, is simply that Word plus email is the lowest common denominator that can be expected of a large group of people of varying professions and organizations with various internal IT policies.

Teleconferencing these days is survivable, but barely so. I suspect Alexander Graham Bell got better audio quality than some of the GoToMeetings I attended. Which is where face-to-face meetings come into the picture. I've been on so many trips during these four years that I finally came to the point where flying became a necessary evil instead of a rare opportunity to be in an airplane. Most of the meetings have been pleasant, if somewhat exhausting, three-day experiences. It is always nice to meet the people you're exchanging emails with in person and see what their institutions look like. Highlights for me were definitely experiments and tours of other facilities. The most impressive that come to mind were Imec's RF labs and clean rooms in Leuven, the w-iLab.t Zwijnaarde testbed in Ghent and the Airbus headquarters near Munich.

Instrumented Roombas at w-iLab.t

(Click to watch Instrumented Roombas at w-iLab.t video)

One of the closing comments at the review was about identifying the main achievement of the project. The most publicly visible results of the work in Slovenia are definitely the little boxes with antennas you can see above streets in Logatec and around our campus. I have somewhat mixed feelings about them. On one hand, it's been a big learning experience: planning and designing a network and, most of all, seeing how electronics fails when you leave it hanging outside all year round; designing a programming interface that would be simple enough to learn and powerful enough to be useful. If I were to do it again, I would certainly do a lot of things differently, software- and hardware-wise.

Different kinds of sensor nodes mounted on street lights.

While a number of experiments were done on the testbed, practically all required a lot of hands-on support. We are still far from having a fully automated remote testbed as a service that would be useful to academics and industry alike. Another thing that was very much underestimated was the amount of continuous effort needed to maintain such a testbed in operation. It requires much more than one full-time person to keep something as complex as this up and running. The percentage of working nodes in the network was often not something I could be proud of.

For me personally, the biggest takeaway from the project has been the opportunity to study practical radio technology in depth - something I didn't get to do much during my undergraduate studies. I've had the chance to work with fancy (and expensive) equipment I would not have had otherwise. I studied in detail the Texas Instruments CC series of integrated transceivers, which resulted in some interesting hacks. I've designed, built and tested three generations of custom VHF/UHF receivers. These were the most ambitious electronic designs I've made so far. Their capabilities compared to other hardware are encouraging, and right now it seems I will continue to work on them in the next year, either as part of my post-graduate studies or under other projects.

SNE-ESHTER analog radio front-end.

I have heard it said several times that CREW was one of the best performing European projects in this field. I can't really give a fair comparison since this is the only such project I've been deeply involved in so far. I was disappointed when I heard of servers being shut down and files deleted as the project wound down, but that is how I hear things usually go when focus shifts to other sources of funding. It is one thing to satisfy one's own curiosity and desire to learn, but seeing your work being genuinely useful is even better. I learned a lot and got some references, but I was expecting the end result as a whole to be something more tangible, something that would be more directly usable outside of the immediate circle involved in the project.

Posted by Tomaž | Categories: Life | Comments »

TEMPerHUM USB hygrometers

11.10.2015 14:19

Last month at a project meeting in Ghent, Hans Cappelle told me about PCsensor TEMPerHUM hygrometers and thermometers. At around 23 EUR per piece, these are relatively cheap, Chinese-made USB-connected sensors. I was looking for something like that to put into a few machines I have running at different places. Knowing the room temperature can be handy. For instance, I once noticed that the CPU and motherboard temperatures on one server were rising even though fan RPMs did not change. Was it because of a change in the room temperature, or was it a sign that the heat sinks were getting clogged with dust? Having a reading from a sensor outside the server's box would reveal the reason without my having to go and have a look.

TEMPerHUM v1.2 USB hygrometers

When I took them out of the box, I was a bit surprised at their weight. I assumed that the shiny surface was metallized plastic, but in fact it looks like the case is actually made of metal. Each came with a short USB extension cable so you can place it a bit away from the warm computer it is plugged into.

My devices identify as RDing TEMPERHUM1V1.2 on the USB bus (vendor ID 0x0c45, product ID 0x7402). They register two interfaces with the host: a HID keyboard and a HID mouse. The blue circle with TXT in it is actually a button. When you press it, the sensor starts simulating someone manually typing in periodic sensor readouts. As far as I understand, you can change some settings in this mode by playing with the Caps Lock and Num Lock LEDs. I haven't looked into it further since I don't currently care about this feature, but it is good to know that the sensors can be used with only a basic USB HID keyboard driver and a text editor.

Text editor with TEMPerHUM readings typed in.

On Linux, udev creates two hidraw devices for the sensor. To get a sensor readout from those without having to press the button, Frode Austvik wrote a nice little utility. For v1.2 sensors, you need this patched version.

The utility is written in C and is easy to compile for more exotic devices. The only real dependency is HID API. HID API is only packaged for Debian Jessie, but the 0.8.0 source package compiles on Wheezy as well without any modifications. OpenWRT has a package for HID API. The sensor, however, does not work on my TP-Link router, because the sensor is a USB v1.10 device and the router's USB ports apparently only support USB v2.x devices (see OpenWRT ticket #15194).
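For illustration, decoding a readout could look roughly like this. This is a hedged sketch, not something verified against the device: the 8-byte report layout (big-endian raw values in bytes 2-5) is modeled on what the temper utility reads, and the conversion formulas are the ones from the Si7021 datasheet.

```python
import struct

def parse_temperhum_report(report):
    """Decode one 8-byte hidraw report into (temperature in deg C, RH in %).

    Assumed layout: big-endian raw Si7021 register values in bytes 2-3
    (temperature) and 4-5 (humidity), converted with the formulas from
    the Si7021 datasheet.
    """
    raw_t, raw_h = struct.unpack_from(">HH", report, 2)
    temperature = 175.72 * raw_t / 65536.0 - 46.85  # Si7021 datasheet formula
    humidity = 125.0 * raw_h / 65536.0 - 6.0        # Si7021 datasheet formula
    return temperature, humidity
```

Actually obtaining the report means sending the query command to the right /dev/hidraw device first; that part is device-specific, so I left it out here.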

By default, udev creates hidraw device files that only root can read. To enable any member of the temperhum group to read the sensor, I use the following in /etc/udev/rules.d/20-temperhum.rules:

SUBSYSTEM=="hidraw", ATTRS{idVendor}=="0c45", ATTRS{idProduct}=="7402", MODE="0660", GROUP="temperhum"

Update: Previously, I made the hidraw device world-writable. That might be a security problem, although probably a very unlikely one.

Finally, what about the accuracy? I left one of the sensors (connected via an extension cable) to sit overnight in a room next to an Arduino with a DS18B20 and the old wireless weather monitor I use at home. The two spikes you see on the graphs are from when I opened the windows in the morning.

Temperature versus time for three thermometers.

The DS18B20 has a declared accuracy of ±0.5°C, while the Si7021 sensor that TEMPerHUM is supposedly using should be accurate to ±0.4°C. However, the TEMPerHUM shows readings between 1.5°C and 2.0°C higher than the DS18B20. This might be because the USB fob is heating itself a little, or it might be some numerical error (or they might be using off-spec chips). Other people have reported that similar sensors tend to overestimate the temperature.
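Since the error looks like a roughly constant offset, a pragmatic workaround is to calibrate it out against a trusted reference. A minimal sketch (the readings below are made up for the example):

```python
def estimate_offset(sensor, reference):
    """Mean difference between paired sensor and reference readings."""
    assert len(sensor) == len(reference)
    return sum(s - r for s, r in zip(sensor, reference)) / len(sensor)

# Made-up overnight readings (deg C), paired in time:
temperhum = [23.9, 24.1, 24.0, 23.8]
ds18b20 = [22.1, 22.4, 22.2, 22.0]

offset = estimate_offset(temperhum, ds18b20)
corrected = [t - offset for t in temperhum]
```

This of course only helps if the offset really is constant over the temperature range of interest, which a single overnight run can't confirm.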

Humidity versus time for two hygrometers.

For relative humidity measurements I don't have a good reference. The difference between my weather monitor and TEMPerHUM is around 10% according to this graph. Si7021 should be accurate to within 3% according to its datasheet. My weather monitor is certainly worse than that. So I can't really say anything about the humidity measurement except that it isn't obviously wrong.

In conclusion, except for the incompatibility with some Atheros SoCs I mentioned above, these sensors seem to be exactly what they say they are. Not super accurate but probably good enough for some basic monitoring of the environment.

Posted by Tomaž | Categories: Life | Comments »

Spectrum Wars

25.07.2015 14:42

SpectrumWars will be a programming game where players compete for bandwidth on a limited piece of radio spectrum. It's also apparently a title that tends to draw attention from bored conference attendees and European commissioners alike.

SpectrumWars XKCD spoof.

with apologies to Randall Munroe

With the CREW project nearing completion, it was decided that one of the last things created in the project will be a game. This might seem an odd conclusion to five years of work by 8 European institutions that previously included little in terms of what most would consider entertainment. We built testing environments for experimentation with modern wireless communications, aimed at the exclusive club of people who know what CSMA/CA and PCFICH stand for. In the end, I'm not sure we managed to convince them that what we did was useful to their efforts. Much less, I guess, the broader public and potential students who might like to study this field.

What better way to correct this than to present the need for smart radio testbeds in the form of a fun game? The end of a project is also the time when you want to show off your work in a way that reaches the broadest possible audience. Everything to raise the possibility that the powers-that-be grant you funding for another project.

In any case, SpectrumWars is a fun little thing to work on and I gladly picked it up in the general lethargy that typically suffuses European projects on their home stretch. The idea stems from a combination of the venerable RobotWar game concept, DARPA Spectrum Challenge and some previous demonstrations by the CREW project at various conferences. Instead of writing a program for a robot that fights off other robots, players program radios. Turning the turret becomes tuning the radio to a desired frequency. A spectrum analyzer replaces the radar and the battle arena is a few MHz of vacant spectrum. Whoever manages to be best at transferring data from their transmitter to their receiver in the presence of competition, wins.

A desk-top setup for SpectrumWars with VESNA sensor nodes.

In contrast to the dueling robots, the radios are not simulated: their transmissions happen in the real world and might share the spectrum with Wi-Fi and the broken microwave next door. To keep it fun, things are simplified, of course. The game abstracts the control of the radios to the bare minimum so that a basic player fits into 10 lines of code, and a pretty sophisticated one into 50. This is not the DARPA Challenge where you need a whole team to have a go (unfortunately, there's no five-figure prize fund either).
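To illustrate the concept, here is a toy, purely simulated round between two ten-line players. This is not the real SpectrumWars interface - the player signature, the scoring rule and the collision model are all invented for the example - but it captures the basic idea: pick a channel each turn, and a packet only gets through if nobody else transmits on it.

```python
import random

NUM_CHANNELS = 10

def hopper(turn, last_ok):
    # Hop to a pseudo-random channel every turn, ignoring feedback.
    return random.randrange(NUM_CHANNELS)

def stubborn(turn, last_ok):
    # Sit on channel 3 no matter what happens.
    return 3

def play(players, turns=100):
    """Score one toy game: a packet counts only if its channel was uncontested.

    Each player is a function (turn, last_ok) -> channel, where last_ok is
    the only feedback it gets: whether its previous packet went through.
    """
    scores = [0] * len(players)
    last_ok = [True] * len(players)
    for turn in range(turns):
        choices = [p(turn, last_ok[i]) for i, p in enumerate(players)]
        for i, ch in enumerate(choices):
            ok = choices.count(ch) == 1  # collision if the channel is shared
            scores[i] += ok
            last_ok[i] = ok
    return scores

random.seed(0)
scores = play([hopper, stubborn])
print(scores)
```

A smarter player would use the last_ok feedback to move away from contested channels - which is more or less the whole point of the game.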

So far I've made a hardware-agnostic game controller implementation in Python that takes care of all the dirty little details of running the game and defines the player's interface. The radios are VESNA nodes with Texas Instruments CC2500 transceivers and special firmware, but other CREW partners are currently working on supporting their own hardware. A USRP is used as a spectrum sensor that is shared by all users. While a lot of things are still missing (a user-friendly front-end in particular), it has been enough to confirm that playing with the algorithms in this way is indeed fun (well, at least for my limited idea of fun).

I gave some more details in a talk at the Institute last Friday (slides), or you can check the draft documentation I wrote, or the source, if you feel particularly adventurous.

Title slide from the SpectrumWars talk.

In the end, SpectrumWars is supposed to look like a web site where tournaments between algorithms run continuously, where one can submit an entry and where a scoreboard of sorts is kept. I'm not optimistic that it will in fact be finished to the point of being presentable to the general public in the couple of months left in the project. However, I think it would be a shame if it went directly into the bit bucket together with many other things when the project funding runs out.

While the future might be uncertain at the moment, I do have a tentative plan to maybe set up an instance of SpectrumWars at the CCC congress in December. The inherently crowded 2.4 GHz spectrum there would certainly present an interesting challenge for the players and the congress does have a history of unusual programming games.

Posted by Tomaž | Categories: Life | Comments »

Weather radar image artifacts

16.06.2015 21:49

The Slovenian environment agency (ARSO) operates a 5 GHz Doppler weather radar on a hill (Lisca) in the eastern part of the country. They publish a map with a rate-of-rainfall estimate based on radar returns every 10 minutes.

Gašper of the opendata.si group has been archiving these rainfall maps for several years now as part of the Slovenian weather API project. A few days ago I was playing with the part of this archive that I happened to have on my disk, and noticed some interesting properties of the radar image that show up when you look at statistics generated from many individual images.

This is a sum of 17676 individual rainfall maps, taken between December 2012 and April 2013:

Sum of 17676 weather radar images from ARSO.

Technically what this image shows is how often each pixel indicated non-zero rainfall during this span of time. In other words, the rainiest parts, shown in red, had rain around 18% of the time according to the radar. I ignored the information about the rate of rainfall.
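Assuming each archived map has been decoded into a 2-D array of rainfall rates, the per-pixel statistic behind the image above can be computed along these lines:

```python
import numpy as np

def rain_frequency(maps):
    """Fraction of maps in which each pixel reports non-zero rainfall.

    `maps` is an iterable of equally-shaped 2-D arrays of rainfall rate;
    the rate itself is ignored, only whether it is non-zero counts.
    """
    count = None
    total = 0
    for m in maps:
        wet = np.asarray(m) > 0
        count = wet.astype(np.int64) if count is None else count + wet
        total += 1
    return count / total

# Tiny synthetic example: three 2x2 "maps"
maps = [np.array([[0, 1], [0, 2]]),
        np.array([[0, 3], [0, 0]]),
        np.array([[5, 1], [0, 0]])]
freq = rain_frequency(maps)
```

Accumulating map by map rather than stacking everything into one big array keeps memory use constant, which matters with tens of thousands of images.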

More interesting is how some artifacts become visible:

  • There is a blind spot around the actual location of the radar. The radar doesn't scan above a certain elevation angle and is blind for weather directly above it.

  • There are triangular shadows centered on the radar location. These are probably due to physical obstructions of the radar beam by terrain. I've also heard that some of these radars automatically decrease sensitivity in directions where interference is detected (weather radars share the 5 GHz band with wireless LAN). So it is also possible that those directions have higher noise levels.

  • Concentric bands are clearly visible. The image is probably composed of several 360° sweeps at different elevation angles of the antenna, although it is curious that their spacing decreases with distance. They might also be the result of some processing algorithm that adjusts for the distance in discrete steps.

  • It's obvious that sensitivity falls with distance. Some parts of the map in the north-western corner never show any rain.

  • Some moiré-like patterns are visible in the darker parts of the image. They are probably the result of interpolation. The map has rectangular pixels aligned with latitude and longitude while the radar produces an image in polar coordinates.
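The interpolation effect in the last point can be reproduced with a toy nearest-neighbour resampler from the radar's polar grid onto a rectangular one. The grid sizes here are made up, not ARSO's actual parameters; the point is that the repeating quantization of range and azimuth bins on the rectangular grid is exactly what produces moiré-like patterns.

```python
import numpy as np

def polar_to_cartesian(polar, r_max, size):
    """Nearest-neighbour resampling of a polar image onto a square grid.

    `polar` has shape (n_azimuths, n_ranges), covering 0-360 degrees
    and ranges 0 to r_max.
    """
    n_az, n_r = polar.shape
    y, x = np.mgrid[0:size, 0:size]
    cx = cy = (size - 1) / 2.0
    dx, dy = x - cx, y - cy
    # Range and azimuth of each rectangular pixel, radar at the center.
    r = np.hypot(dx, dy) * (r_max / (size / 2.0))
    az = np.mod(np.arctan2(dy, dx), 2 * np.pi)
    # Quantize to the nearest polar bin - the source of the moire patterns.
    ri = np.clip((r / r_max * n_r).astype(int), 0, n_r - 1)
    ai = np.clip((az / (2 * np.pi) * n_az).astype(int), 0, n_az - 1)
    out = polar[ai, ri]
    out[r > r_max] = 0  # outside radar coverage
    return out

# 360 azimuth bins, 100 range bins of uniform "rain"
cart = polar_to_cartesian(np.ones((360, 100)), r_max=100.0, size=64)
```

Feeding it a uniform polar image and looking at which polar bins map to adjacent pixels makes the aliasing pattern easy to see.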

I should mention that ARSO's weather radar system got upgraded in 2014 with a second radar. Newer rainfall maps are calculated from combined data from both radars. They likely have much fewer artifacts. Unfortunately, the format of the images published on the web changed slightly several times during that year, which makes them a bit less convenient to process and compare.

Posted by Tomaž | Categories: Life | Comments »

On cyclostationary detectors in literature

01.03.2015 21:50

Recently, I've been studying cyclostationary detectors. These methods of detecting the presence of a radio transmission seem to be quite popular in academic literature. On the other hand, I kind of glossed over them in my seminar on spectrum sensing. There were several reasons for that: one of them was certainly that I find the theory of cyclostationary signals and statistical spectral analysis hard to wrap my head around. Another reason, however, was that I couldn't find any papers that describe a practical cyclostationary detector in sufficient detail to allow reproduction of the published results. At the very least I wanted to compare its performance against other (non-cyclostationary) detectors in a laboratory experiment before writing about it in any depth.

Here are some of the problems I stumbled upon when trying to understand and reproduce results from scientific papers published on this topic. I don't want to point to any particular work. The example below is just a recent one that crossed my desktop (and happens to nicely illustrate all of the problems I write about). I encountered similar difficulties when reading many other research papers.

Example of unreadable mathematical typesetting.

(from Wireless Microphone Sensing Using Cyclostationary Detector presented at the INCT 2012 conference)

Perhaps the most trivial thing that makes reading unnecessarily hard is bad mathematical typesetting. It's sometimes bad enough that it is impossible to follow derivations without first rewriting them in a more consistent form. A part of the blame here goes to the publishers who prefer manuscripts in Word. I have seen some beautiful articles produced in Word, but the fact is that it is much harder to mess up formatting in LaTeX. Not that it is impossible though. I have had to deal with my share of buggy LaTeX styles.

Another common problem that is simple to fix is inconsistent mathematical notation. For instance, the constant Ac²/4 gets dropped somewhere in the middle of the derivation above. Also, x might (or might not) mean two different things. I've heard people argue strongly that such trivial errors are of no concern, since obviously everyone in the field knows what the author meant. The problem is that mathematical sloppiness creates confusion for readers who are not intimately familiar with the topic. Why even have a theoretical overview section in the paper, if the assumption is that the reader already knows it by heart?

After you manage to bite your way through the theoretical sections to the practical evaluation, it is surprisingly common that various parameters necessary to reproduce the results are left out. For instance, a lot of cyclostationary detectors use a method of estimating the cyclic spectrum called the FFT accumulation method. This method takes several parameters, for instance the sizes of two Fourier transforms that determine its time and frequency resolution. These are often omitted. Another omission is the exact waveform used to test the detector. The paper above gives only the frequency deviation used by the wireless microphone, but not the nature of the modulation signal.
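The omission matters because those two FFT lengths fix the detector's resolution. Following the usual description of the FFT accumulation method, the channelizer FFT of length N' sets the spectral resolution to roughly fs/N', while the second FFT of length P over channelizer outputs spaced L samples apart sets the cycle-frequency resolution to roughly fs/(P·L). A small helper, with made-up example numbers (this reflects the textbook relations, not any particular paper's setup):

```python
def fam_resolutions(fs, n_channelizer, n_second_fft, hop=None):
    """Resolutions of the FFT accumulation method.

    fs:            sample rate in Hz
    n_channelizer: length N' of the first (channelizer) FFT
    n_second_fft:  length P of the second FFT over channelizer outputs
    hop:           decimation L between channelizer windows (commonly N'/4)

    Returns (delta_f, delta_alpha): spectral resolution fs/N' and
    cycle-frequency resolution fs/(P*L).
    """
    if hop is None:
        hop = n_channelizer // 4
    delta_f = fs / n_channelizer
    delta_alpha = fs / (n_second_fft * hop)
    return delta_f, delta_alpha

# e.g. 2 MHz sampling, N' = 256, P = 128
df, da = fam_resolutions(2e6, 256, 128)
```

Two published detectors with the same block diagram but different N' and P can thus behave very differently, which is exactly why leaving them out makes results irreproducible.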

Having such difficulty reproducing results from papers based on pure numerical simulations is especially frustrating. I imagine the results are based on a few hundred lines of Matlab code which, if published, would quickly remove any doubts about how the results were obtained. However, publishing source code together with an article is still a rare exception.

Some of the difficulties I've been having are definitely due to my unfamiliarity with the field. As I keep studying the topic, some of the lines of thought of the authors who wrote these papers become clearer, and some mysteries of how they managed to come up with their results deepen. Undoubtedly some of my own writings aren't as clear, or as error-free, as I would like them to be. However, there are literally hundreds of articles written on the topic of cyclostationary detectors and I'm continuously surprised by how much time it is taking me to find a single reproducible example.

Posted by Tomaž | Categories: Life | Comments »


10.01.2015 23:00

I guess too many things happen in a year to be able to mark the whole of it as good or bad in good conscience. For me, 2014 certainly had its ups and downs. Looking back at it though, I can't avoid the feeling that the latter were more abundant and that it's been an overall difficult year to weather.

My life in 2014 has been dominated by a bunch of personal problems that surfaced almost exactly at the start of the year. I only wish I could say they are completely behind me. I am not comfortable discussing them here, except to say that I am immensely grateful for the help and support I received, despite the worries and grief I caused. Foremost from my parents and also from professional counselors I met over the past twelve months.

I have completed, even if just barely catching the deadline, the first year of a postgraduate programme at the Jožef Stefan International Postgraduate School. Although I have been encouraged by my employer and others to pursue a PhD, I am not that optimistic that I will complete the study. After being a passive observer of the school for two years and a student for one I'm not convinced that I can make a meaningful contribution to science in this environment. I feel I am too old to sign attendance at lectures, study a topic only for the sake of passing an exam and giving seminar presentations to empty rooms.

Last year has been the time when I completed what is probably my most elaborate electronics project yet. The new UHF receiver for VESNA has been an interesting challenge, a welcome refreshment of some parts of electrical engineering I haven't touched in many years and provided a wonderful sense of achievement when everything fell in place. Still, looking at it now, the success is bitter sweet. It's been work done in an isolated garden. I've done all electronics design, firmware and higher level software development myself and it's too little too late. What I made is a product of European funding and the ecosystem at the Institute. Outside these walls it has precious little advantages over something as mediocre as a rtl-sdr stick in a Raspberry Pi.

Given the above, it's no wonder that 2014 has been kind of dry season for technical side projects. I always tended to have some piled on my table, but this time few deserve mention. I've spent perhaps the most time playing with CubieTruck, getting Debian and a web server running smoothly on it and discovering some of its whims. jsonmerge was another project that had me occupied beyond the hackday that sparked it. While it turned out less generally useful than I initially thought, it was a nice excuse to update my Python skills somewhat.

Speaking of Python, visiting Berlin in summer for EuroPython has definitely been one of the highlights of the year. As was manning the hardware hacking table at the WebCamp Ljubljana.

It's hard to sum up 2014. In some way it's been full of serious work and worries that have at times removed the joy of working in my profession. On the other hand, it's been also about facing problems that went unaddressed for too long and doing things that I might not have imagined doing a few years back. It's hard to say which of them has been a distraction from which. It has left me wishing for change and wanting back times when I could spend evenings hacking away on some fun project without deadlines attached.

Posted by Tomaž | Categories: Life | Comments »

Continuous spectrum monitoring

23.12.2014 19:40

Just before the holidays and with some help from my colleagues at the Institute I managed to finally deploy the first spectrum sensor based on my new UHF receiver design. A handful of similar deployments are planned for the next year. At the end of January I'm traveling to London to mount two similar sensors that will monitor the TV White Spaces pilot in UK as part of a cooperation between King's College London and the CREW project I'm working for.

VESNA stand-alone spectrum sensor in a metal box.

The sensor is based on the VESNA sensor node and is mounted in a small, weather-proof metal box. The previous generation of spectrum sensors (like those deployed in our testbed in Logatec) was designed to work in a wireless sensor network. However, the sensor network's low bit rate and reliability, and the interference caused by having another radio next to a sensitive receiver, proved to be very limiting. So for this device I went with wired Ethernet, even though that somewhat reduces the deployment possibilities. Ethernet now provides a sufficient bit rate to allow continuous monitoring of the complete UHF band with the increased precision that is possible with the new receiver.

In some modes of operation however, the bandwidth from the sensor is still a limiting factor. The Digi Connect ME Ethernet module used here only allows for streaming data up to around 230 kbit/s, which is around 2-3 times slower than what the VESNA's CPU can handle. This is not a problem if you do signal processing on-board the sensor and only send some test statistics back over the network. In that case other things, like the local oscillator and AGC settle time, are usually the limiting factor. However if you want to stream raw signal samples, the network becomes the bottleneck.

Cluster of antennas mounted on JSI building C.

The antenna for the sensor is mounted on top of one of the buildings at the Jožef Stefan Institute campus in Ljubljana. Most of the setup, like power and the Ethernet connection, is shared with WLAN Slovenija, who already had some equipment mounted there. In the picture above, my UHF antenna (Super Scan Stick from Moonraker) is second from the left (the one with three radials). I'm still using SO-239 connectors and RG-58 coax, which I know is not optimal, but they were what I had in stock.

There is a direct line of sight to a DVB-T transmitter near Ljubljana Castle around 1500 m away and to two cellular towers on nearby buildings, so there are plenty of strong transmissions to look at. On the same mast there is also a big omni-directional antenna for 2.4 GHz Wi-Fi. This might cause some interference with sensing in the 800 MHz band (2.4 GHz is the third harmonic of an 800 MHz LO signal), but I don't think that will be a problem. RG-58 cable is bad enough even at sub-1 GHz frequencies and that part of the spectrum seems to be occupied by a strong LTE signal anyway.

Screenshot of the real-time spectrogram plot.

If you click on the image above, you should see a live visualization in your browser of the data stream coming from the sensor.

At the time of writing, the receiver is sweeping the band between 470 MHz and 861 MHz in 1 MHz increments. For each 1 MHz channel, it takes 25000 samples of the baseband signal. From these samples, it calculates the baseband signal power and a vector of sample covariances. It then sends these statistics back to a server that logs them to a hard drive and also provides the visualization (using a somewhat convoluted setup).
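The per-channel processing described above can be sketched in a few lines of Python. This is only an illustration, not the actual VESNA firmware; in particular, the number of covariance lags is a made-up parameter, since the text doesn't state how many the sensor sends:

```python
import numpy as np

def channel_statistics(x, n_lags=20):
    """Per-channel statistics: baseband signal power and sample covariances.

    x is a vector of baseband samples for one 1 MHz channel. n_lags is
    purely illustrative; the lag count used by the real sensor is not
    stated here.
    """
    x = x - x.mean()                            # remove any DC offset
    n = len(x)
    power_db = 10 * np.log10(np.mean(x ** 2))   # relative power in dB
    # Sample autocovariances at lags 0..n_lags-1
    cov = np.array([np.dot(x[:n - l], x[l:]) / n for l in range(n_lags)])
    return power_db, cov

# 25000 samples per channel, as in the sweep described above
rng = np.random.default_rng(0)
x = rng.standard_normal(25000)
power_db, cov = channel_statistics(x)
```

Sending only `power_db` and `cov` (a handful of values) instead of 25000 raw samples is what makes continuous monitoring feasible over the slow Ethernet link.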

The signal power is shown on the spectrogram plot on a logarithmic scale. Currently on the spectrogram you can see four DVB-T multiplexes at 482, 522, 562 and 610 MHz and some LTE activity around 800 MHz. Note that the power level is in relative units: 0 dB is the maximum ADC level, which can correspond to different absolute power levels due to automatic changes in gain. The sensor can provide calibrated absolute power figures as well, but they are not shown at the moment.

Sample covariances are currently used by a maximum auto-correlation detector to detect whether a channel is vacant or not. This is shown on the live graph below the spectrogram. Note though that this detector works best for narrow-band transmissions, like wireless microphones, and often marks a channel vacant when it is in fact occupied by a wide-band transmission like DVB-T (see my seminar on spectrum sensing methods for details and references).
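In its simplest form, such a detector normalizes the covariances by the lag-0 value (the signal power), which makes the statistic independent of receiver gain, and compares the largest magnitude at non-zero lag against a threshold. A minimal sketch, with a made-up threshold rather than the sensor's actual parameters:

```python
import numpy as np

def sample_covariances(x, n_lags=10):
    """Sample autocovariances of x at lags 0..n_lags-1."""
    x = x - x.mean()
    n = len(x)
    return np.array([np.dot(x[:n - l], x[l:]) / n for l in range(n_lags)])

def max_autocorr_detect(cov, threshold=0.05):
    """Return True if the channel appears occupied.

    Under white noise, all normalized covariances at non-zero lags are
    near zero; a narrow-band signal makes at least one of them large.
    The threshold is illustrative; in practice it would be set from a
    target false-alarm probability.
    """
    return np.max(np.abs(cov[1:])) / cov[0] > threshold

rng = np.random.default_rng(1)
noise = rng.standard_normal(25000)
tone = noise + np.sin(0.2 * np.arange(25000))   # narrow-band signal in noise

print(max_autocorr_detect(sample_covariances(noise)))  # False (vacant)
print(max_autocorr_detect(sample_covariances(tone)))   # True (occupied)
```

The normalization is also why a wide-band signal like DVB-T is easy to miss: its autocorrelation decays quickly with lag, so it looks much like noise to this statistic.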

Posted by Tomaž | Categories: Life | Comments »

Arts & Crafts

19.12.2014 22:44

I have in front of me two sketchbooks. Between them, they contain around a hundred pages and a bundle of loose sheets. Thumbing through them, they start with embarrassingly clumsy pencil sketches, half way through change to line marker drawings and end with digitally shaded print-outs. The first page is dated April 2013. A year and a half later, I'm trying to come up with a reasonable story behind all of this.

The fact is that free-hand drawing is one skill I never really learned. I am pretty handy with a pencil, mind you. I am old enough to have had technical drawing lessons in school that did not involve anything fancier than a compass and a straightedge. I take the majority of my notes in a paper notebook and still prefer to do plans and schematics for my home projects with a pen instead of a mouse. However, the way one draws objects that do not consist of right angles and straight lines has always eluded me.

Drawing is really hard.

-- Randall Munroe in a recent interview

I have grown up with characters drawn by Miki Muster and I have kept my appetite for comics ever since. I don't think I ever attempted to draw one though. I remember the art class I attended in elementary school. It seems the local artist that held it considered it his job to protect our unspoiled childhood creativity. Any art inspired by popular culture was frowned upon and a reason for a lecture on how completely devoid of imagination our generation was. I guess we stuck to whatever kind of imaginary things children were supposed to draw back then and I was more interested in the room full of ZX Spectrums next door anyway.

So from that point of view, attempting to draw cartoon characters has been an interesting new challenge, something very different from my usual pursuits. Maybe it shouldn't be surprising that I picked up something like that now. I was writing software at home while studying electronics at the Faculty, and doing electronics at home while writing software at work. These days I spend most of my time doing everything from electronics design and software development to writing and giving talks for the Institute. That leaves little space to decompress after work, and doing some non-digital creative work is refreshing.

A working pony, a pony of science.

The other part of this story is, of course, that I've been spending increasing amounts of time on forums and events that have something to do with cartoonish horses and other such things. For better or for worse, it's been a common pastime and means of escape for me this past year. I certainly wouldn't have gone out and bought a sketchbook and a set of pencils if I hadn't witnessed a few people from science and engineering professions try their hand at drawing, sometimes with really nice results. It seems that whatever community I happen to meet, I can't manage to stay just a passive observer for long.

If nothing else, that teacher from elementary school was probably right regarding the lack of imagination. I don't pretend that what I've made is very original, nor that I've managed to teach myself anything beyond the basics. On the other hand, learning anything necessarily involves some degree of copying. I don't feel this has been a waste, even if a crowd on the Internet is doing similar things.

The point is that to be truly adept at designing something, you have to understand how it works. [Otherwise] you're drawing ponies. Don't draw ponies.

-- David Kadavy in Design for Hackers

Several years and two schools after that art class I mentioned, I found myself at an art history lecture. It was taught by a bitter old lady who, despite her quite obvious disappointment at the world, was still very much determined to teach us ignorant youth. In addition to explaining historic European architecture styles to us, she also commonly managed to slip in her observations about life in general. On one such occasion she told us how often men fall into the trap of irrationally obsessing over one thing. I remember duly noting it in my notebook.

Maybe this is what she had in mind, maybe not. The fact is that I enjoyed doing it, and learning to draw made me look at things around me in new ways. If someone else liked the few results that I shared on the Internet, so much the better. Some say it might look terribly silly a few years on, but I'm sure I won't regret gaining a new skill.

Posted by Tomaž | Categories: Life | Comments »

2.4 GHz band occupancy survey

09.10.2014 19:36

The 100 MHz of spectrum around 2.45 GHz is shared by all sorts of technologies, from wireless LAN and Bluetooth, through video streaming to the yesterday's meatloaf you are heating up in the microwave oven. It's not hard to see the potential for it being overused. Couple this with ubiquitous complaints about non-working Wi-Fi at conferences and overuse is generally taken as a fact.

The assumption that existing unlicensed spectrum, including the 2.4 GHz band, is not enough to support all the igadgets of tomorrow is pretty much central in all sorts of efforts that push for new radio technologies. These try to introduce regulatory changes or develop smarter radios. While I don't have anything against these projects (in fact, some of them pay for my lunch), it seems there's a lack of up-to-date surveys of how much the band is actually used in the real world. It's always nice to double-check the assumptions before building upon them.

Back in April I wrote about using VESNA sensor nodes to monitor the usage of radio spectrum. Since then I have placed my stand-alone sensor at several more locations in and around Ljubljana and recorded spectrogram data for intervals ranging from a few hours to a few months. You might remember the sensor box and my lightning talk about it from WebCamp Ljubljana. All together it resulted in a pretty comprehensive dataset that covers some typical indoor environments where you usually hear the most complaints about bad quality of service.

(At this point, I would like to thank everyone who ranted about their Wi-Fi and allowed me to put an ugly plastic spy box in their living room for a week. You know who you are.)

In-door occupancy survey of the 2.4 GHz band

A few weeks ago I finally managed to put together a relatively comprehensive report on these measurements. Typically, such surveys are done with professional equipment in the five-digit price range rather than with cheap sensor nodes. Because of that, a lot of the paper is dedicated to ensuring that the results are trustworthy. While there are still some unknowns regarding how spectrum measurement with the CC2500 behaves, I'm pretty confident at this point that what's presented is not completely wrong.

To spare you the reading if you are in a hurry, here's the relevant paragraph from the conclusion. Please bear in mind that I'm talking about the physical layer here. Whether or not various upper-layer protocols were able to efficiently use this spectrum is another matter.

According to our study, more than 90% of spectrum is available more than 95% of the time in residential areas in Ljubljana, Slovenia. Daily variations in occupancy exist, but are limited to approximately 2%. In a conference environment, overall occupancy reached at most 40%.
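Percentages like these come from thresholding the recorded spectrograms: a channel counts as occupied at a given moment if its measured power exceeds an energy-detection threshold. A minimal sketch of that bookkeeping on synthetic data (the threshold, noise level and channel layout are all made up for illustration):

```python
import numpy as np

def occupancy_stats(spectrogram_db, threshold_db):
    """spectrogram_db: 2D array, rows are time samples, columns are channels.

    Returns per-channel occupancy (fraction of time above the threshold)
    and the fraction of channels free at least 95% of the time.
    """
    occupied = spectrogram_db > threshold_db
    per_channel = occupied.mean(axis=0)
    mostly_free = float(np.mean(per_channel < 0.05))
    return per_channel, mostly_free

# Synthetic example: 1000 sweeps over 100 channels of receiver noise,
# with one channel carrying a constant strong transmission.
rng = np.random.default_rng(2)
s = -100 + 2 * rng.standard_normal((1000, 100))
s[:, 10] = -60
per_channel, mostly_free = occupancy_stats(s, threshold_db=-90)
print(per_channel[10], mostly_free)  # 1.0 0.99
```

The choice of threshold is the crux in a real survey, which is why so much of the paper deals with the trustworthiness of the underlying power measurements.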

For another view of this data set, check also animated histograms on YouTube.

Posted by Tomaž | Categories: Life | Comments »

Checking hygrometer calibration

06.10.2014 22:08

Several years ago I picked an old wireless temperature and humidity sensor out of the trash. I fixed a bad solder joint on its radio transmitter and then used it many times simply as a dummy AM transmitter when playing with 433 MHz super-regenerative receivers and packet decoders. Recently though, I've been using it for its original purpose: to monitor outside air temperature and humidity. I've thrown together a receiver from some old parts I had lying around, a packet decoder running on an Arduino and a Munin plug-in.

Looking at the relative air humidity measurements I gathered over the past months, however, I wondered how accurate they are. The hygrometer is now probably close to 10 years old and of course hasn't been calibrated since it left the factory. Considering this is a fairly low-cost product, I doubt it was very precise even when new.

Weather station humidity and temperature sensors.

These are the sensors on the circuit board: the green bulb on the right is a thermistor and the big black box on the left is the humidity sensor, probably some kind of a resistive type. There are no markings on it, but the HR202 looks very similar. The sensor reports relative humidity with 1% resolution and temperature with 0.1°C resolution.

Resistive sensors are sensitive to temperature as well as humidity. Since the unit has a thermometer, I'm guessing the on-board controller compensates for the changes in resistance due to temperature variations. It shows the same value on an LCD screen as it sends over the radio, so the compensation definitely isn't left to the receiver.

Calibrating a hygrometer using a saturated salt solution.

To check the accuracy of the humidity measurements reported by the sensor, I made two reference environments with known humidity in small, airtight Tupperware containers:

  • 75% relative humidity above a saturated solution of sodium chloride, and
  • 100% relative humidity above a soaked paper towel.

I don't have a temperature stabilized oven at home and I wanted to measure at least three different humidity and temperature points. The humidity in my containers took around 24 hours to stabilize after sealing, so I couldn't just heat them up. In the end, I decided to only take the measurements at the room temperature (which didn't change a lot) and in the fridge. Surprisingly, the receiver picked up 433 MHz transmission from within the metal fridge without any special tweaking.

Here are the measurements:

T [°C] | Rh reference [%] | Rh measured [%] | ΔRh [%]

So, from this simple experiment it seems that the measurements are consistently a bit too low.

The 6% step between 22 and 24°C is interesting - it happens abruptly when the temperature sensor reading goes over 23°C. I'm pretty sure it's due to temperature compensation in the controller. Probably it does not do any interpolation between values in its calibration table.
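The abrupt step is consistent with a compensation table indexed by whole degrees and read without interpolation. A toy illustration of the difference follows; the table values are entirely made up, and only the behavior, a jump when the temperature crosses a table boundary, matches what the sensor shows:

```python
# Hypothetical compensation table: temperature (°C) -> Rh correction (%).
# The values are invented for illustration only.
TABLE = {20: -3.0, 23: 3.0, 26: 6.0}

def compensate_lookup(raw_rh, temp_c):
    """Nearest-lower table entry, no interpolation (the suspected behavior)."""
    key = max(t for t in TABLE if t <= temp_c)
    return raw_rh + TABLE[key]

def compensate_interp(raw_rh, temp_c):
    """Linear interpolation between table entries, for comparison."""
    lo = max(t for t in TABLE if t <= temp_c)
    hi = min((t for t in TABLE if t > temp_c), default=lo)
    if hi == lo:
        return raw_rh + TABLE[lo]
    frac = (temp_c - lo) / (hi - lo)
    return raw_rh + TABLE[lo] + frac * (TABLE[hi] - TABLE[lo])

# Crossing 23 °C: the lookup version jumps by 6 %, interpolation doesn't.
print(compensate_lookup(70, 22.9), compensate_lookup(70, 23.1))   # 67.0 73.0
print(round(compensate_interp(70, 22.9), 1),
      round(compensate_interp(70, 23.1), 1))                      # 72.8 73.1
```

Skipping interpolation is a reasonable trade-off on a cheap 8-bit controller, which would explain why the manufacturer settled for it.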

Rh and temperature readings above a saturated solution at room temperature.

From a quick look through various datasheets, it seems these sensors typically have ±5% accuracy. The range I saw here is +0/-15%, so it's a bit worse. However, considering its age and the fact that the sensor has been sitting on a dusty shelf for a few years without a cover, I would say it's still relatively accurate.

I've seen some cheap hygrometer calibration kits for sale that contain salt mixtures for different humidity references. It would be interesting to try that and get a better picture of how the response of the sensor changed, but I think buying a new, better calibrated sensor makes much more sense at this point.

Posted by Tomaž | Categories: Life | Comments »

Seminar on covariance-based spectrum sensing

29.09.2014 20:05

Here are the slides from my seminar on a practical test of covariance-based spectrum sensing methods. It was presented behind closed doors and was worth five science collaboration points.

Covariance-based spectrum sensing methods in practice title slide

The basic idea behind spectrum sensing is for a sensor to detect which frequency channels are occupied and which are vacant. This requires detecting very weak transmissions, typically below the noise floor of the receiver. For practical reasons, you want such a sensor to be small, cheap, robust and capable of detecting a wide range of signals.

Covariance-based and eigenvalue-based detectors are a relatively recent development in this field. Simulations show that they are capable of detecting a wide range of realistic transmissions, are immune to noise power changes and can detect signals at relatively low signal-to-noise ratios. They are also interesting because they are not hard to implement on low-cost hardware with limited capabilities.
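One representative member of the eigenvalue-based family is the maximum-minimum eigenvalue (MME) detector: build a small sample covariance matrix from overlapping windows of the received samples and compare the ratio of its largest to smallest eigenvalue against a threshold. For white noise alone, all eigenvalues sit near the noise variance, so the ratio stays close to one regardless of the absolute noise power. A rough sketch with illustrative parameters (not taken from the papers or from SNE-ESHTER):

```python
import numpy as np

def mme_detect(x, L=8, threshold=1.5):
    """Maximum-minimum eigenvalue detector (illustrative parameters).

    x: vector of baseband samples; L: smoothing factor, i.e. the size of
    the sample covariance matrix. The threshold would normally be derived
    from L, the sample count and a target false-alarm probability.
    """
    windows = np.lib.stride_tricks.sliding_window_view(x, L)  # (n, L)
    R = windows.T @ windows / len(windows)   # L x L sample covariance
    eig = np.linalg.eigvalsh(R)              # ascending eigenvalues
    return eig[-1] / eig[0] > threshold      # True = signal present

rng = np.random.default_rng(3)
noise = rng.standard_normal(25000)
signal = noise + np.sin(0.3 * np.arange(25000))

print(mme_detect(noise))    # False: eigenvalue ratio near 1
print(mme_detect(signal))   # True: the sinusoid spreads the eigenvalues
```

Because the test statistic is a ratio of eigenvalues, it cancels out any uncertainty in the absolute noise power, which is exactly the property that makes these detectors immune to noise power changes.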

Over the summer I performed several table-top experiments with an RF signal generator and a few radio receivers. I implemented a few of the methods I found in various published papers and checked how well they perform in practice. I was also interested in what kind of characteristics are important when designing a receiver specifically for such a use case - when using a receiver for sensing, noise properties that you mostly don't care about for data reception start to matter.

This work more or less builds upon my earlier seminar on spectrum sensing methods and my work on a new UHF receiver for VESNA. In fact, I have performed similar experiments with the Ettus Research USRP specifically to see how well my receiver would work with such methods before finalizing the design. Since I now finally have a few precious prototypes of SNE-ESHTER on my desk, I was able to actually check its performance. While I don't have conclusive results yet, these latest tests do hint that it does well compared to the bog-standard USRP.

A paper describing the details has yet to be published, so unfortunately I'm told it is under embargo (I'm happy to share details in person if anyone is interested, though). But the actual code, measurements and a few IPython notebooks with analysis of the measurements are already on GitHub. Feel free to try to replicate my results in your own lab.

Posted by Tomaž | Categories: Life | Comments »

Disassembling Tek P5050 probe

03.09.2014 18:59

We have a big and noisy 4-channel Tektronix TDS 5000 oscilloscope at work that is used in our lab and around the department. Recently, one of its 500 MHz probes stopped working for an unknown reason, as it's bound to happen to any equipment that is used daily by a diverse group of people.

Tektronix P5050 oscilloscope probe.

This is an old Tektronix P5050 voltage probe. You can't buy new ones any more, and a similar probe will apparently set the taxpayers back around $500. So it seemed reasonable to spend some time looking into fixing it before ordering a replacement.

This specimen doesn't appear to be getting any signal to the scope. The cable is probably fine, since I can see some resistance between the tip and the BNC connector on the other end. My guess is that the problem is most likely in the compensation circuit inside the box at the oscilloscope end.

So, how does one disassemble it? It's not like you want to apply the usual remove-the-labels-and-jam-the-screwdriver-under-the-plastic approach to this thing. I couldn't find any documentation on the web, so here's a quick guide. It's nothing complicated, but when working with delicate, expensive gadgets (that are not even mine in the first place) I usually feel much better if I see that someone else has managed to open it before me.

The first step is to remove the metal cable strain relief and the BNC connector. I used a wrench to unscrew the washer at the back of the BNC connector, while the strain relief was loose enough to remove by hand.

Tek P5050 probe partially disassembled.

The circuit case consists of two plastic halves on the outside and two metal shield halves on the inside that also carry the front and aft windings for the strain relief and the BNC connector. There are no screws or glue. The plastic halves firmly latch into grooves on the two broad sides of the metal ground shield (you can see one of the grooves on the photo above).

You can pry off the plastic shell by carefully lifting the sides. I found that it's best to start at the holes for the windings. Bracing a flat screwdriver against the metal at that point allows you to lift the plastic parts with minimal damage.

After you remove the plastic, the metal parts should come apart by themselves. The cable is not removable without soldering.

Circuit board inside Tek P5050 probe.

Finally, here's the small circuit that is hidden inside the box. Vias suggest that it's a two-sided board. Unfortunately you can't remove it from the shield without cutting off the metal rivets.

The trimmer capacitor in the center is user-accessible through the hole in the casing. The two potentiometers on the side appear to be factory set. From a quick series of pokes with a multimeter it appears one of the ceramic capacitors is shorted, however I want to study this a bit more before I put a soldering iron to it.

Posted by Tomaž | Categories: Life | Comments »

On cartoon horses and their lawyers

15.08.2014 19:14

GalaCon is an annual event that is about celebrating pastel colored ponies of all shapes and forms, from animation to traditional art and writing. It's one of the European counterparts to similar events that have popped up on the other side of the Atlantic in recent years. These gatherings were created in the unexpected wake of the amateur creativity that was inspired by Lauren Faust's reimagining of Hasbro's My Little Pony franchise. For the third year in a row GalaCon attracted people from as far away as New Zealand. It's a place where a sizable portion of the attendees wear at least a set of pony ears and a tail, if not a more elaborate equestrian-inspired attire. Needless to say, it can be a somewhat frightful experience at first and definitely not for everyone.

For most people it seems to be a place to get away from the social norms that typically prevent adults from obsessing over stories and imagery meant for an audience half their age and often of the opposite gender. While I find the worshiping of the creative talents behind the show a bit off-putting, I'm still fascinated by the amateur creations of this community. The artists' booths were a bit high on kitsch ("Incredible. Incredibly expensive" was one comment I overheard), but if you look in the right places on-line, there are still enjoyable and thoughtful stories, art and music to be found.

Meeting people I knew from their creations on the web was a fun experience. However for me this year's GalaCon was also a sobering insight into what kind of a strange mix of creativity, marketing psychology and legal matters goes into creating a cartoon for children these days.

GalaCon and Bronies e.V. flags at Forum am Schlosspark.

A highlight of the event was a panel by M. A. Larson, one of the writers behind the cartoon series. By going step by step through a thread of actual emails exchanged between himself, Lauren Faust and the Hasbro office he demonstrated the process behind creating a script for a single episode.

The exact topic of the panel was not announced beforehand, however, and all recording of the screen was prohibited, with staff patrolling the aisles looking for cameras. I don't know how much of that was for dramatic effect and how much was due to real legal requirements. However, even before the panel began, it gave a strong impression of the kind of atmosphere a project like this is created in, especially considering that the episode he was discussing aired more than three years ago. I'm sure a lot of people in the audience could quote parts of that script by heart. It has been transcribed, analyzed to the last pixel, remixed and in general picked apart on the Internet years ago.

My Little Pony has been called the end of the creator-driven era in animation. So far I had thought that marketing departments dictated what products should appear on the screen and which characters should be retired to make room for new toy lines. I was surprised to hear that sometimes the Hasbro office gets involved even in details like which scene should appear last before the end of an act and the commercial break. That fact was even more surprising since this apparently happened in one of the earliest episodes, where the general consensus seems to be that the show was not yet ruined by corporate control over creative talent.

A similar amount of thought seemed to go into the possibility of lawsuits. Larson mentioned their self-censorship of the idea to have characters go paragliding, making them do zip lining instead. Is it really harder to argue in court that some child has been hurt trying to imitate horses sliding along a wire than horses soaring under a parachute?

GalaCon 2014 opening ceremony.

The signs of the absurdity of intellectual property protection these days could also be seen throughout the event. Considering that Bronies e.V. paid license fees for the public performance of the show, it was ridiculous that they were using low-quality videos from United States TV broadcasts for projection on the big cinema screen, complete with pop-up advertisements that didn't make sense.

Similarly, the love-hate relationship between copyright holders and non-commercial amateur works is nothing new to report. There were a lot of examples where rabid law firms, tasked with copyright protection and with only tenuous connections back to the mothership, used various extortion tactics to remove remixed content from the web. I still don't understand though what kind of a law justifies cease-and-desist letters for works inspired by unnamed background characters that only appeared for a couple of seconds in the original show.

Evening in front of the Forum am Schlosspark.

In general, GalaCon was a somewhat more chaotic experience than I would have wished for, and I left it with mixed feelings. Cartoon ponies on the Internet are full of contradictions. While the stories they tell are inspiring and a welcome getaway from daily life, the money-grabs behind them are often depressing. I still believe in the good intentions behind these events, but the extravagant money-throwing at the charity auction made me question a lot of things. With extra fees for faster queues, photos and autographs, this year's event felt more like a commercial enterprise than a grassroots community event.

Posted by Tomaž | Categories: Life | Comments »

EuroPython 2014 wrap-up

29.07.2014 14:08

I spent the last week at the EuroPython conference. Since my last visit three years ago, the largest European meeting of the Python community has moved from Florence in Italy to the familiar Berlin Congress Center at Alexanderplatz. The last seven days were chock-full of talks about everything related to Python, friendly conversations with fellow developers and hacking on this and that odd part of the free software ecosystem. So much so, actually, that it was a welcome change to spend a day outside, winding down in one of Berlin's numerous parks.

EuroPython 2014 schedule poster in the basement of the BCC.

It was an odd feeling to be back in the BCC. So far I had known it only from the Chaos Communication Congresses it hosted before they moved back to Hamburg. I was half-expecting to see the signature Fairy dust rocket and black Datenpirat flags flying in front. Instead, part of the front yard was converted into a pleasant outdoor lounge area with an ice cream stand, while a bunch of sponsor booths took the place of the hackcenter.

Traces of the German hacker scene and the community around the CCC could still be easily seen though. The ubiquitous Club Mate was served on each floor. Blinking All Colors Are Beautiful and Mate Light installations were placed on two floors of the conference center. A few speakers reminisced about the congresses and the CCC Video Operations Center took care of the video recordings with the efficiency we are used to from the Congresses. You could even hear a public request to use more bandwidth now and then.

Constanze Kurz talking about surveillance industry.

There were several talks with a surprisingly political topic for a conference centered around a programming language. I felt odd listening to the opening keynote from Constanze Kurz about one year of Snowden revelations and the involvement of big Internet companies while those same companies were hunting for new employees just outside the lecture hall. Still, I don't think it hurts to remind people that even purely technical work can have consequences in the society.

From the keynotes, it's also worth mentioning Emily Bache and her views on test driven development and the history and future of NumPy development by Travis Oliphant.

Even though I'm not doing as much Python development as I did in my previous job, there were still plenty of interesting presentations. In the context of scientific computing in Python, I found the talks about the GR plotting framework, simulations with SimPy and probabilistic programming most interesting. Some interesting scientific projects relevant to my work were presented in the poster session as well, although I later had problems finding them on the web to learn more about them. In particular, the work on propagating measurement errors in numerical calculations done by Robert Froeschl at CERN caught my eye.

Park in front of the BCC for EuroPython attendants.

I also attended the matplotlib training, although I was a bit disappointed with it. I was hoping to learn more about advanced usage of this Python library I use daily in my work. I enjoyed the historical examples of data visualization and general advice on how to present graphs in a readable way. However most of the technical details behind plotting and data manipulation weren't new to me and we ran out of time before digging into more advanced topics. I guess it was my fault though, since I now see that the training has a novice tag.

Going away from the topics strictly connected to my work, the talks about automated testing with a simple robot and using Kinect in Python left me with several crazy ideas for potential future projects.

Open Contracting table at EuroPython 2014 sprints.

During the weekend I joined the Open Contracting coding sprint. Jure and I represented our small Open Data group in helping to develop tools for governments to publish information about public tenders and contracts in a standard format. My result from these two days is the jsonmerge library, which deserves a blog post of its own.

In conclusion, I enjoyed EuroPython 2014 immensely. In contrast with 30C3 last year, for instance, it was again one of those events where I got to talk with an unusually large number of people during coffee breaks and social events.

On the other hand, attending it made me realize how much my interests have changed since I left Zemanta. I visited Florence during my last days at that job, when I was intimately familiar with all the nuances of the Python interpreter, and there I joined the sprint hacking on the CPython distribution. This year, however, I was more interested in what Python can offer me as a tool and not so much in developing Python itself.

Posted by Tomaž | Categories: Life | Comments »