Google's paper on DRAM

31.10.2009 18:27

A while ago Google and the University of Toronto published an article (PDF) with the results of a two-and-a-half-year study of DRAM reliability.

The bottom line of their findings is that they saw, on average, 25 to 70 thousand errors per 10⁹ operating hours per Mbit, and that about 8% of the memory modules were affected by some kind of error in the course of a year.

Those are some pretty large numbers. My server currently sports 1 GB of non-ECC RAM. At that error rate I should be seeing bit-flips every day. I'm pretty confident that's not happening. Of course, without ECC I have no way of being sure, but with bit-flips that frequent, I'm sure the software would be crashing much more often than it is.
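
To spell that out with a quick back-of-envelope (naively assuming the average rate applies uniformly to every module): 1 GB is 8192 Mbit, so even at the lower bound of 25 thousand errors per 10⁹ hours per Mbit,

8192 \mathrm{Mbit} \cdot \frac{25000}{10^9 \mathrm{h} \cdot \mathrm{Mbit}} \approx 0.2 \frac{\mathrm{errors}}{\mathrm{h}} \approx 5 \frac{\mathrm{errors}}{\mathrm{day}}

so if anything, the daily estimate above is on the conservative side.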

Of course, with only 8% of the population affected by errors, it can be that I just got lucky.

But still, non-ECC modules with these kinds of error rates would be unusable. Does that mean that the ECC modules being sold are actually of lower quality than non-ECC ones? I can imagine that an ECC module with an occasional recoverable error would be easier to pass through quality control than a non-ECC one with similar reliability.

Posted by Tomaž | Categories: Digital | Comments »

Apple Time Capsule

29.10.2009 17:39

Apple users at Zemanta used to back up their machines to an Apple Time Capsule. I say used to because it went belly-up a while ago, probably for the same reason as many others of its kind. On Friday we did some clean-up in the office and I took the white brick home to see if I can still find some use for it.

Apple Time Capsule with rubber seal removed

Disassembly went quite smoothly. The approach was actually somewhat similar to what I did with Piki the Pleo. And similarly irreversible.

Apple Time Capsule without the cover

Inside is a normal hard drive and an embedded computer which I hear is running NetBSD. As far as my web searches have shown, nobody has yet managed to run any kind of custom software on it, so I guess that option is out.

It also becomes obvious why these things are dying of overheating. What kind of thermal design is that? The fan does nothing - it just stirs hot air around inside the case, since there are hardly any holes in it. There's also precious little air to actually stir, because the case is so packed. Interestingly, there seems to be a temperature sensor glued to the hard drive.

So far I haven't done much more than poke around for a couple of minutes, so right now I have no idea why it's simulating a brick or whether it's fixable. Nothing is obviously blown out.

Posted by Tomaž | Categories: Digital | Comments »

Experiments with DLP-232PC

27.10.2009 17:15

DLP-232PC is described as a low-cost USB data acquisition module for process monitoring and control. In practice that means a tiny circuit board that fits into a 300 mil DIP18 socket, sports a mini-B USB connector and has 14 digital GPIO channels, 8 of which can also be switched to analog read-out. From the USB host's point of view it presents itself as a USB-to-serial converter and is controlled by sending single-byte commands through the virtual serial port (e.g. sending ASCII '1' puts pin 1 high).

DLP-232PC

Now imagine some old hardware that connects to the PC's parallel port and used to be controlled by old-fashioned DOS software bit-banging the printer port's pins. The DLP-232PC sounds like a good tool for easily porting that old software to a modern operating system running on a modern computer that doesn't have a parallel port (USB-to-parallel converters don't work for this, but that's a whole other story). Simply replace the parallel port writes with appropriate writes to the virtual serial port, recompile for Windows and everything should work fine.
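
The serial side of that really is simple. Here's a minimal, untested sketch of what it looks like on Linux with plain termios - the dlp_open/dlp_cmd names are just made up for this example and the device path is whatever your USB-serial port shows up as:

#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

/* Open the DLP-232PC's virtual serial port in raw mode at 460800 baud. */
int dlp_open(const char *dev)
{
	struct termios t;
	int fd = open(dev, O_RDWR | O_NOCTTY);
	if (fd < 0) return -1;

	tcgetattr(fd, &t);
	cfmakeraw(&t);			/* raw bytes: no echo, no line discipline */
	cfsetispeed(&t, B460800);
	cfsetospeed(&t, B460800);
	tcsetattr(fd, TCSANOW, &t);

	return fd;
}

/* Send one single-byte command, e.g. '1' to put pin 1 high. */
int dlp_cmd(int fd, char c)
{
	return write(fd, &c, 1) == 1 ? 0 : -1;
}

The parallel-port pokes in the old code would then simply become dlp_cmd() calls.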

Well, some experiments showed that in practice it's not that simple. The data-sheet says that the serial port baud rate is set to 460800, so in theory it should be able to take 46080 single-byte commands per second (with 8N1 framing each byte takes 10 bit times, i.e. around one state change per 20 μs). Unfortunately the DLP-232PC is actually implemented with an 18F2410 PIC microcontroller that reads commands from the UART and controls its GPIO pins in software.

Here's the first, simple experiment: produce a burst of 8 impulses on channel 3 by writing a block of 16 bytes (8 repetitions of ASCII "3E") to the device (e.g. /dev/ttyUSBx on Linux or COMxx on Windows).
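
In code, reusing the made-up dlp_open() helper from the sketch above and reading "3"/"E" as the high/low pair for channel 3, the whole burst is handed to the driver in a single write, so the USB and serial layers decide how it actually gets chopped into transfers:

#include <unistd.h>

int dlp_open(const char *dev);		/* from the sketch above */

int main(void)
{
	char burst[16];
	int i, fd = dlp_open("/dev/ttyUSB0");

	for (i = 0; i < 8; i++) {
		burst[2 * i] = '3';		/* channel 3 high */
		burst[2 * i + 1] = 'E';		/* channel 3 low again */
	}
	write(fd, burst, sizeof(burst));

	return 0;
}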

DLP-232PC 8 impulses burst

This is the result the oscilloscope showed. You can see that doing anything that requires real-time determinism is going to be hard. The first two impulses are ca. 80 μs wide, while the rest are closer to 25 μs. The pattern of pulse widths changed with different block lengths. This was the result when the DLP-232PC was controlled from a Windows XP host. An equivalent test with my Linux-running EeePC gave a more regular waveform, where all impulses were 80 μs wide (i.e. around 4 times the theoretical minimum).

I'm not sure what is causing this. Maybe it's a quirk in the USB transport, or in the Windows drivers. But it does make me a little skeptical about whether you could reliably bit-bang, say, an I2C protocol with this.

DLP-232PC continuous square wave on two channels

This is another test, to see how it handles two channels. It was done by continuously writing "36EY" to the device as fast as possible, which cycles the states of channels 3 and 6. Again, running on Linux gives this deterministic waveform, with the pulse width around twice that of the first test (as expected). On the Windows machine things look much more chaotic.

One other interesting feature of this device showed itself during testing: during the two-channel test the DLP-232PC simply froze. From the oscilloscope trace it appeared that the IO pins were put into the high-Z state, and the device stopped responding to any commands from the PC (even to the reset and ping commands). Disconnecting and reconnecting it didn't fix it, although the USB handshake worked normally (we did notice that the diagnostic LED blinked a slightly different pattern than usual on power-on). Only after being disconnected from power for a couple of minutes did it unexpectedly return to life.

My only explanation is that the software in the PIC somehow crashed (maybe a receive buffer overflow) and the microcontroller didn't reset properly when power was removed only briefly.

So, the first impression says this is not exactly fly-by-wire grade material. Beware of hardware that is actually software in disguise.

Posted by Tomaž | Categories: Digital | Comments »

Liquid semiconducting nuclear batteries

26.10.2009 0:17

Electronics lab blog writes about an interesting paper published in Applied Physics Letters back in July. It's about a new kind of nuclear battery that is tiny and (of course) has the potential for a far higher energy density than your ordinary chemical battery. The actual article is Radioisotope microbattery based on liquid semiconductor by T. Wacharasindhu et al. (PDF).

It shows a very interesting approach. Your ordinary nuclear battery uses the same principle as a photovoltaic cell - a PN junction is exposed to α- or β-particles which generate free charge-carrier pairs in the depletion zone. These holes and electrons are then separated by the electric field in the junction and can drive a current through an attached load. The problem is that over time the radiation also damages the semiconductor by knocking whole atoms out of the crystal lattice - this, rather than the half-life of the radiation source, is usually the main factor determining the lifetime of such a device.

What they did here to fix this problem was to replace one side of the junction with a liquid. Since atoms in the liquid can move freely, the radiation doesn't cause any lasting damage when it knocks some atoms around. An additional gain is that the radioactive source can be mixed into the semiconductor and so be present directly in the junction itself.

Apparently, selenium retains semiconducting properties in the liquid state, so they used a Schottky junction with P-doped selenium on one side and aluminum on the other. The P-dopant was an isotope of sulfur, which was also the radiation source.

Don't expect this in a laptop anytime soon though. It gives out around 1 V open-circuit, but the power is on the order of tens of nanowatts.

An interesting exercise is to calculate how much of the energy of the isotope they actually managed to convert to useful work: they used 166 MBq of ³⁵S, which they report has an average decay energy of 49 keV.

P = A \cdot E_{decay} = 166 \mathrm{MBq} \cdot 49 \mathrm{keV} = 8.1 \cdot 10^{12} \frac{\mathrm{eV}}{\mathrm{s}} = 1.3 \mathrm{\mu W}

Which gives an efficiency of around 1% for their cell. By volume, I'd say around 90 thousand of these would fit into the space occupied by the battery of my Eee PC. Even with 100% efficiency, that would still be just around 100 milliwatts. So, great for tiny devices, not so great for your consumer electronics.
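
Spelling those estimates out (taking roughly 10 nW as a stand-in for "tens of nanowatts" - I'm not quoting the paper's exact output figure here):

\eta = \frac{P_{\mathrm{out}}}{P_{\mathrm{in}}} \approx \frac{10 \mathrm{nW}}{1.3 \mathrm{\mu W}} \approx 1 \%

The Eee PC figure then follows from 90000 \cdot 1.3 \mathrm{\mu W} \approx 120 \mathrm{mW} of total decay power.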

Posted by Tomaž | Categories: Life | Comments »

Urbana

24.10.2009 13:01

City buses in Ljubljana finally got a modern way of paying the fare: instead of tokens and tickets that had to be visually inspected by the driver, we now have an RFID card, similar to London's Oyster (although not as innovatively named).

Urbana, enotna mestna kartica

I haven't seen any mention of how the system is implemented. Is there just a unique identifier on the card, with the bus looking up the credit in some database over a wireless link? Or is everything stored on the card itself?

I know the buses have a data link. A while ago they were all equipped with fleet-monitoring equipment that has a GPS receiver and a GSM modem, so the central control knows the location of all buses at all times (there's even a semi-official page predicting arrival times). I'm guessing that plugging this new payment system on top of that infrastructure wouldn't be too hard.

However, the way you top up the card on a self-service machine suggests that the amount of credit is actually stored on the card itself. The procedure goes like this: touch the card to the reader, insert money, touch the card to the reader again. If only an ID were stored on the card, why would the machine have to communicate with the card a second time?

It's possible that the card only works as a cache - the real-time check is done against the credit amount on the card and the central database only verifies later that it's correct. I've heard somewhere that this is how ABC, the Slovenian automatic toll collection system, worked.

But if it works that way, and they later find a discrepancy between the credit reported on the card and the checkpoint in the database, what can they do about it? Unlike car owners, these card users are anonymous. Or at least they should be. They actually demanded a personal ID if you wanted to get the card during the first month (which was, in my opinion, outrageous), but now you can buy the pre-paid version anonymously.

Anyone from Slovenia seen any more details on this?

Posted by Tomaž | Categories: Life | Comments »

Flying

22.10.2009 21:16

This post has been sitting in my drafts folder for quite a while now. The photographs were taken the old-fashioned way - you know, by catching photons in silver compounds instead of silicon - so it took a little longer than usual to put them on-line. Better late than never, though.

So, back in August I went on vacation to Bovec in the Julian Alps. There's a small airfield there, and parked around it were a couple of gliders. You can imagine I didn't have to search long to find a pilot who would happily take me for a couple of circles if I paid for the tow. It's one of those things I had always wanted to try but never got around to actually doing.

Preparing a LET L-13 Blanik for take-off

The aircraft was a LET L-13 Blaník, a two-seat trainer manufactured in the Czech Republic. Nothing high-tech, really. I believe the only electrical device on board was a hand-held radio for communicating with ground control (which, as it turned out, was a similar radio, positioned curiously on a table near a couple of kegs of beer).

In the cockpit of a LET L-13

It's interesting that you have to wear a parachute inside one of these. Not that I was worried, but somehow I doubt it would have done me any good in case of an emergency - we were going ridge soaring, which meant flying really low near the slopes surrounding Bovec. Even at our top height of 750 m above the valley, you would only have a good 10 seconds of free fall.

The flight itself was much smoother than I expected. I thought such a small aircraft would bounce around in the turbulence, but in reality the ride felt just as smooth as on a jet liner. In fact, even in the tightest turns the g-forces were nothing compared to what you feel at take-off in the cabin of an Airbus.

On the other hand, I had always imagined there would be silence in a glider, considering there's no engine. It soon became obvious that that's far from the truth: I had to shout quite loudly for the pilot to understand me over the noise the wind was making.

Javoršček above Bovec

Although noisy, the wind wasn't as good as it could have been and we didn't manage to gain much height over the city. It also meant the flight was cut a little short; we spent most of the time just above the tree tops. Actually, now that I think about it, I would hate to have missed the experience of flying through a narrow valley and seeing the mountains close by on three sides of the airplane while traveling at 100 km/h. So perhaps the weather was just right.

I have to do this again some time.

Posted by Tomaž | Categories: Life | Comments »

Pirate what?

20.10.2009 22:08

There was a panel on politics and the Internet at Kiberpipa today. More interestingly, the Slovenian version of the Pirate Party was supposed to present itself there. I missed the first half of the panel and left before the end because I just couldn't stand listening to it anymore.

I was excited when I first heard that there was a political party in the making in Slovenia that wants to imitate the notorious Swedish and German ones. For some reason I actually believed there were people here able to follow in those footsteps. Now that I've seen the performance of their representatives on the panel, I have very little hope left. This is a typical Slovenian political failure: people who don't actually know what they are talking about and don't give concrete answers to questions from the audience. And I foolishly thought there would finally be an entry on the next ballot that I could choose with confidence.

Also, if I were moderating the discussion, I would politely ask the guy with the mobile phone to leave the stage. I don't care who he was. Call me old-fashioned, but I don't think all those people came to watch him read text messages from his wife and listen to his comments about them. What kind of etiquette is that?

Anyway, there will probably be a video recording available soon so you can make your own judgment.

Posted by Tomaž | Categories: Life | Comments »

Symphonic Shades

17.10.2009 23:08

I got my copy of the Symphonic Shades CD album a few days ago. It's the recording of a concert that took place back in August 2008 in Cologne, so this is possibly old news, but I only found out about it a couple of weeks ago.

The album contains orchestral arrangements of music composed in the 1980s by Chris Hülsbeck for Commodore 64 and Amiga video games. The music was mostly rearranged from the original sound-chip synthesis and 8-bit samples by Jonne Valtonen (a.k.a. Purple Motion of Future Crew, author of the first half of the Second Reality soundtrack) and played by the Cologne radio symphony orchestra.

The result is really impressive. There's a bit of choral singing here and there, a drum solo, a piano suite. I'm not used to writing about music, so I'm a bit short on vocabulary here. It often sounds like a good movie soundtrack, but it doesn't feel like one (I usually find listening to movie scores a bit weird, because I know the visual part is missing).

It's funny that these melodies don't actually ring any bells for me, so I guess a feeling of nostalgia isn't necessary to enjoy them. I grew up with a Spectrum, so I probably missed most of the games featured here. I almost certainly did play R-Type, but I guess the limited sound capabilities of my machine compared to the Commodore 64 didn't make for an impressive performance back then. I do, however, remember hearing one particular melody (Giana Sisters) in an old Machinae Supremacy track.

So yes, I would definitely recommend it if you like this kind of music. In fact, with a little bit of searching you can find most of the video recordings of the original concert on YouTube - but I got the CD because I think the authors really deserve the support (plus the cover art is cool).

Posted by Tomaž | Categories: Life | Comments »

Box of goodies

15.10.2009 16:12

The mailman just brought me a box of goodies. Among other things, a couple of these were inside:

Arduino Duemilanove

I've never worked with an Arduino before and thought I might try playing with one, considering all the buzz that's been going on around them. And I might have just the right place to use it.

Posted by Tomaž | Categories: Digital | Comments »

Days of Physics 2009

11.10.2009 22:46

I caught the last day of this year's Days of Physics, an event that took place in the Technical Museum of Slovenia in Bistra. Days of Physics is a collaboration of Slovenian physics students from both of our largest universities and several other institutions, meant to show the field in a fun and very hands-on way.

Friction lock (Leonardo's) bridge

This simple experiment was the first thing that caught the eye, since it was placed in front of the hall entrance. It's a friction-locked bridge (they were calling it Leonardo's bridge, since the principle was supposedly invented by Leonardo da Vinci). The point is that wooden planks form a bridge that can carry a person without any kind of fasteners - the only thing holding it together is friction.

I tested it myself, and although it doesn't really make for sure footing, I did make it across. I guess this is not really useful if you need to cross a chasm while stranded on a deserted island, though - it's impossible to build if you only have access to one side.

It would be interesting to know what the limiting factors in this design are. I'm guessing the thickness of the planks (which affects the curvature) and the coefficient of friction play a role.

Inside the hall there were ten or so experiments on show. Here are a few of the most interesting ones:

Infrared image

You could have an infrared portrait of yourself made. Not really ground-breaking, but infrared cameras are still fun to play with.

Demonstration of the principle of Maglev train operation

This was a demonstration of the principle behind magnetic levitation (maglev) trains: a pellet of high-temperature superconductor floating above a track constructed out of bar magnets.

An interesting trick I hadn't seen before is to cool the superconductor below the critical temperature while it is inside the magnetic field (e.g. at some known elevation above the track). In that case the pellet will not only float above the track, but will also resist when you try to pull it higher. This makes it possible to actually turn the track upside-down without the "train" falling off.

Gauss cannon

Your garden variety Gauss cannon. Watching carefully, I now understand where the extra kinetic energy that the last ball carries away comes from: the incoming ball is accelerated towards the magnet before impact, while the last ball leaves from behind the row of balls, further from the magnet, so it isn't pulled back as strongly - the difference in magnetic potential energy ends up as kinetic energy.

Fun with plastic bottles and liquid nitrogen

After the show there was some left-over liquid nitrogen. And you probably know what happens when you combine liquid nitrogen, physics students and a supply of empty plastic water bottles (this video pretty much sums it up if you haven't encountered the phenomenon before).

The concluding event was a talk about the Apollo project by Dr. Dušan Petrač, a Slovenian physicist working at JPL.

It was a fun way to spend an October Sunday, and judging from the expressions on the faces of kids (and adults) around the hall, I'd say the project achieved its goal of making physics just a little bit more interesting for everyone.

Posted by Tomaž | Categories: Life | Comments »

One more thing about HHK

08.10.2009 23:16

Actually, I wanted to add two more concluding notes to my saga about the Happy Hacking Keyboard.

First, about the unreliable keys. Trimming the plastic left over from the molding process off the bottom of the keys and thoroughly cleaning the membranes only helped for the escape key. The cursor keys were still pretty unreliable, often failing to register keystrokes.

The final solution, found after disassembling the keyboard something like ten times and holding up for two weeks now, is several strategically placed pieces of cardboard that help support the metal plate under the keys. My theory is that the plastic stubs that press it in place weren't manufactured precisely enough and the plate buckled somewhat, especially in the corner under the cursor keys. I guess the tolerances are pretty tight if you want to make a reliable switch with such a membrane.

USB ports on the Happy Hacking Keyboard

The second thing I wanted to add is a comment about the usability of the built-in USB switch. I thought that, together with the keyboard's small size, it was a nice way of reducing the clutter on the desktop. Well, the switch is pretty useless for any device that doesn't come with its own USB extension cable: the connectors are recessed into the casing and the openings are narrow enough that none of my USB keys fit into them.

Finally, considering my experience with this Happy Hacking Keyboard Lite2, I can firmly say that I don't recommend buying one. It appears, however, that some people are happy with it - see for example Unr3a1's comments on my first post.

Posted by Tomaž | Categories: Digital | Comments »

More about wchar_t

06.10.2009 19:26

I guess there was something missing from yesterday's post. I mentioned that wide strings as defined by the standard C library can hold characters in any encoding. Just about the only thing you do know is that every wchar_t value holds exactly one character. Your C code can make no other assumptions.

I said that Python's PyUnicode_FromWideChar fails at that step. But how can you do the right thing and convert wide character strings to a known encoding?

One way is to use the iconv() character-set conversion functions from the standard library. At least in the GNU C library, iconv supports WCHAR_T as an input and output encoding, which allows you to transform wide strings into any other encoding it knows about.

Here's a simple example (that lacks all kinds of error handling):

#include <stdio.h>
#include <wchar.h>
#include <iconv.h>

int main() {
	wchar_t input[] = L"Toma\u017e \u0160olc";
	char output[128];

	char* pi = (char*) input;
	char* po = output;

	size_t ni = wcslen(input) * sizeof(wchar_t);	/* input size in bytes */
	size_t no = 127;				/* leave room for the terminating zero */

	/* convert from the platform's wchar_t representation to ISO-8859-2 */
	iconv_t cd = iconv_open("ISO-8859-2", "WCHAR_T");
	while(ni > 0) iconv(cd, &pi, &ni, &po, &no);
	iconv_close(cd);

	*po = 0;
	puts(output);

	return 0;
}

Posted by Tomaž | Categories: Code | Comments »

Amazing adventures of Py_UNICODE in Pythonland

05.10.2009 20:08

Today I ported some Python modules written in C from Python 2.4 to 2.5 and got stuck when compilation failed on the translation between Py_UNICODE and standard C wide strings. By now it is probably obvious that I won't let a piece of software out of my hands until it handles Unicode properly, even when I have to dig into dirty implementation details. So here are all the skeletons I dug out of Python's back yard.

Regarding Unicode support, there are actually two versions of Python: one stores strings encoded as UCS-2 (a string is serialized into a sequence of 16-bit words, one word per character) while the other uses UCS-4 (a sequence of 32-bit words). Since these two internal formats are obviously incompatible with each other, modules compiled for one version of Python will not work with the other. The first one is more memory efficient, while the second one allows the use of characters outside the Unicode Basic Multilingual Plane (i.e. characters with code points above 0xFFFF). I'm not sure why the interpreter keeps supporting both variants, but at least on Linux most distributions seem to stick with the UCS-4 variant. By the way, this is controlled by the --enable-unicode=ucs4 argument to the configure script when compiling Python and is apparent from the value of the Py_UNICODE_SIZE constant.
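
If you have the Python development headers around, a trivial check (my own sketch, not an official recipe) shows which kind of build you are compiling against:

#include <Python.h>
#include <stdio.h>

int main(void)
{
	/* Py_UNICODE_SIZE is 2 on a UCS-2 build and 4 on a UCS-4 build */
	printf("Py_UNICODE_SIZE = %d (%s)\n", Py_UNICODE_SIZE,
		Py_UNICODE_SIZE == 4 ? "UCS-4" : "UCS-2");
	return 0;
}

From within Python itself, sys.maxunicode tells the same story: 0xFFFF on a UCS-2 build and 0x10FFFF on a UCS-4 one.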

This settles the question of the lowest-level representation - the mapping from abstract code points in the Unicode code space to concrete sequences of ones and zeros in the computer's memory. Apart from this distinction, all Python interpreters use the same lowest-level representation of strings, regardless of what is considered the "native" form on the system.

Now comes the question of how this string representation is presented to the C compiler. This is important because it affects the way algorithms are expressed in C (Python and its standard library contain many functions that have to deal with encoded Unicode strings). For example, if a compiler sees a UCS-2 representation as an array of unsigned 16-bit values, C code that deals with characters above 0x8000 will be different than if an array of signed values is used: in the first case these characters will be interpreted as positive values and in the second as negative, which affects comparison operators.
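
A small illustration of that point, outside of Python entirely - the same 16-bit pattern compares differently depending on whether the compiler treats it as signed or unsigned:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
	uint16_t u = 0x8000;			/* some character above 0x8000 */
	int16_t s = (int16_t) 0x8000;		/* the same bits, seen as signed */

	/* prints "1 0": as unsigned the character sorts after 0x7FFF,
	   as signed it sorts before it */
	printf("%d %d\n", u > 0x7FFF, s > 0x7FFF);

	return 0;
}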

Pythonistas decided on using unsigned values, which is a sensible decision, since Unicode code points start at 0 and negative values don't make much sense. So, at this point we know that Python uses the UCS-2 or UCS-4 encoding, presented at the C level as an array of unsigned 16- or 32-bit values. And this is exactly what the various arrays of type Py_UNICODE* in the Python API refer to.

Things get complicated when you want to pass strings from the Python API to code that uses the standard C library. C has its own equivalent of Py_UNICODE* called wchar_t*, which is what various string-handling functions accept and return. The problem is that wchar_t is defined as platform dependent, which translates as "go away - we don't want to deal with all this mess". It can be either signed or unsigned and the only thing you know about it is that it's at least 8 bits wide.

For example, on this Debian x86 system it's defined as a 32-bit signed value and contains UCS-4 (which also depends on the POSIX locale settings). This appears to be the most common setup on the Linux machines I've seen, while I hear Windows uses a 16-bit wchar_t. This ambiguity in C is also the reason you have to dodge all those gchars, xmlChars and similar types all the time, as authors of other libraries try to work around the problem.

So apparently, in this common case the standard C library and Python share the same basic UCS-4 representation for strings, except that their views of it differ slightly: the C library interprets the values as signed numbers and Python as unsigned. But both write the same sequence of ones and zeros to RAM for the same abstract Unicode string. Obviously, the translation between the two representations is then nothing more complicated than a C pointer type cast.
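
In other words, on such a system something like the following sketch is all that's needed - with the big caveat that it is only valid when the sizes and encodings really do match (the helper name is made up for this example):

#include <Python.h>
#include <wchar.h>

/* Wrap a wchar_t string into a Python unicode object by reinterpreting the
   same bytes as Py_UNICODE. Only valid when sizeof(wchar_t) == Py_UNICODE_SIZE
   and both sides agree on UCS-4. */
PyObject *unicode_from_wchar(const wchar_t *w, Py_ssize_t len)
{
	return PyUnicode_FromUnicode((const Py_UNICODE *) w, len);
}

This is just the happy path, of course - the rest of the post is about what happens when that assumption doesn't hold.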

Python in fact checks whether wchar_t matches its world view and, in that case, makes Py_UNICODE an alias for wchar_t (to enhance native platform compatibility, as the manual says). It just so happened that Python 2.4 had a slightly broken check that didn't take into account the signedness of the type, so on Python 2.4 Py_UNICODE could be set to wchar_t even when the latter was a signed type.

However, nothing says that this match of word size and encoding isn't just a lucky coincidence. For the cases when C's and Python's ideas do differ, the Python API actually has routines that at first glance appear to lend you a helping hand: PyUnicode_FromWideChar and PyUnicode_AsWideChar. But these really just wait to push you off the cliff. Under the hood they basically do this:

Py_UNICODE *u;
wchar_t *w;
Py_ssize_t i;

for (i = size; i > 0; i--)
    *u++ = *w++;

Given the large range of sizes and encodings that wide C strings can hold, this simple element-by-element copy can break character encodings in a wonderful number of ways. For example, with a 16-bit UTF-16 wchar_t (as on Windows) and a UCS-4 Python build, a character outside the Basic Multilingual Plane gets copied as two lone surrogates instead of one code point.

Update: Here's the correct way to convert wide strings to a known encoding using iconv().

Posted by Tomaž | Categories: Code | Comments »