I recently picked up a Miniware LA104, a small 4-channel logic analyzer. I thought a stand-alone, battery-powered logic analyzer might be useful in some cases, especially since it comes with built-in protocol decoding for I2C. I have a few random USB-connected logic analyzers in a drawer somewhere, but I found that I rarely use them. When I do need a logic analyzer, either I don't have a PC at hand with all the necessary software set up, I don't want to risk the laptop by connecting it to a possibly malfunctioning device, or there's a suitable oscilloscope nearby that works well enough for the purpose.
The first test I did with the LA104 fresh out of the box was to connect all four of its channels to a 100 kHz square-wave signal and run a capture. I was disappointed to see that the displayed digital waveform had occasional random glitches that were certainly not there in the real signal. The 100 kHz frequency of the input signal was well below the LA104's 100 MHz sampling frequency, so no undersampling should have occurred either:

Interestingly, when I exported the captured waveform to a comma-separated values (CSV) file using the function built into the stock firmware and displayed it in PulseView, the glitches were no longer visible:

This suggested that the waveform capture was working correctly and that the glitches were somehow introduced only when drawing the waveform onto the LA104's LCD screen. Considering that this device was released back in 2018, it's unlikely there will be another firmware update from the manufacturer. However, since the source code for the firmware is publicly available, I thought I might look into this and see if there's a simple fix for the bug.
For the sake of completeness, I was using firmware version 1.03, which came preloaded on my device and is also the latest publicly available version from Miniware at this time. I'm aware that an alternative open-source firmware for the LA104 by Gabonator also exists on GitHub. However, that project seems to be focused on various other, non-logic-analyzer uses for the hardware. A quick diff of the logic analyzer app in Gabonator's firmware against Miniware's source release showed that the logic analyzer functionality hasn't received any significant changes or bug fixes.
Looking at the Save_Csv() function in the firmware source, it appears that the CSV export directly dumps the internal representation of the waveform into a file. The exported file for the 100 kHz waveform looked something like the following:
Time(nS), CH1, CH2, CH3, CH4,
0000004990,00,00,00,00,
0000004990,01,01,01,01,
0000005000,00,00,00,00,
0000005000,01,01,01,01,
...
The first column is a time interval in nanoseconds, while the rest of the columns show the logic states of the four input channels. It's interesting that the first column is not a timestamp. The CSV also does not contain all input samples; rather, it only lists the detected changes in input state. The first column contains the time interval for which the channels were held in the listed state after the last change. In other words, the LA104 internally compresses the captured samples using run-length encoding.
In the example above, this means that all four channels were in the logic 0 state for 4990 ns, followed by all four channels in the logic 1 state for 4990 ns, and so on. This timing checks out: two such intervals add up to a period of roughly 10 µs, which corresponds to the 100 kHz frequency of the input signal. Note also that all intervals are multiples of 10 ns. This makes sense since the 100 MHz sampling rate makes for a sampling period of 10 ns. The export function simply multiplies the sample run length by 10 to get the time interval in ns.
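Since the format is this simple, converting the export back into absolute timestamps is straightforward. Here is a minimal sketch in C of how that could be done off-line on a PC. The file name and the parsing details are my assumptions, not something taken from the stock firmware:

#include <stdio.h>

/* Minimal sketch: convert the LA104's run-length encoded CSV export
   into absolute timestamps. The assumed format, an interval in ns
   followed by four channel states, matches the excerpt above. */
int main(void)
{
    FILE *f = fopen("capture.csv", "r");  /* example file name */
    char line[128];
    unsigned long long t = 0;  /* absolute time in ns */

    if (f == NULL)
        return 1;

    while (fgets(line, sizeof(line), f)) {
        unsigned long long interval;
        unsigned ch[4];

        /* This also skips the header line, since it doesn't parse. */
        if (sscanf(line, "%llu,%u,%u,%u,%u",
                   &interval, &ch[0], &ch[1], &ch[2], &ch[3]) != 5)
            continue;

        /* The listed state was held for `interval` ns, starting at t. */
        printf("%llu ns: CH1=%u CH2=%u CH3=%u CH4=%u\n",
               t, ch[0], ch[1], ch[2], ch[3]);

        t += interval;
    }

    fclose(f);
    return 0;
}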
Inspecting the part of the CSV export where the LA104 displays the glitches gives a hint as to where the problem might be:
0000005000,00,00,00,00,
0000005000,01,01,01,01,
0000000010,00,00,01,01,
0000005000,00,00,00,00,
0000005000,01,01,01,01,
...
0000005000,01,01,01,01,
0000004990,00,00,00,00,
0000000010,00,01,00,01,
0000005000,01,01,01,01,
0000005000,00,00,00,00,
0000005000,01,01,01,01,
The lines with the 10 ns intervals show that the logic analyzer detected some channels changing state one sample time before the others. That in itself is fine and not very surprising. I'm not expecting all channels to change state perfectly in sync down to the nanosecond scale. This can be due to slightly different wire lengths, stray capacitances and so on.
It seems, however, that the waveform drawing routines on the LA104 do not handle this situation well. At the scale of the display, the 10 ns delay is much shorter than one display pixel, so it should not even be visible. Yet for some reason the LA104 chooses to display this short transitional state and then ignores the next, much longer stable state altogether. This leads to the waveform seemingly skipping cycles on the display: in the first instance, channels 3 and 4 stay in the logic 1 state for a cycle when they should go to logic 0; in the second instance, channels 1 and 3 stay in the logic 0 state for a cycle when they should go to logic 1.
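I don't know how the drawing code actually works, but conceptually the correct behavior doesn't seem hard to implement. The following is a minimal sketch of how a renderer could handle runs shorter than one display pixel, assuming a run-length encoded sample buffer like the one seen in the CSV export. All of the names and the LCD helper functions here are hypothetical:

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical LCD helpers, standing in for whatever drawing
   primitives the firmware's display driver actually provides. */
void lcd_draw_edge(int x, int ch);
void lcd_draw_level(int x, int ch, bool high);

/* Hypothetical run-length encoded record, mirroring the CSV export:
   the channel states were held for `length` sample periods. */
struct rle_run {
    uint32_t length;
    uint8_t  state;    /* one bit per channel */
};

/* Sketch of drawing one channel: consume all runs that fall within
   each pixel column and remember whether the channel was ever high
   and ever low there. A column that saw both levels gets a vertical
   edge instead of silently dropping the longer run that follows. */
void draw_channel(const struct rle_run *runs, int n_runs,
                  int ch, uint32_t samples_per_pixel, int width)
{
    int i = 0;
    uint32_t left = (n_runs > 0) ? runs[0].length : 0;

    for (int x = 0; x < width && i < n_runs; x++) {
        uint32_t budget = samples_per_pixel;
        bool seen_high = false, seen_low = false;

        /* Consume runs, or parts of runs, covering this column. */
        while (budget > 0 && i < n_runs) {
            if (runs[i].state & (1u << ch))
                seen_high = true;
            else
                seen_low = true;

            uint32_t step = (left < budget) ? left : budget;
            left   -= step;
            budget -= step;

            if (left == 0 && ++i < n_runs)
                left = runs[i].length;
        }

        if (seen_high && seen_low)
            lcd_draw_edge(x, ch);
        else
            lcd_draw_level(x, ch, seen_high);
    }
}

The important part is that every run falling within a pixel column contributes to that column, so a sub-pixel transition shows up as an edge instead of displacing the longer run after it.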

In fact, the glitch disappears if I zoom in far enough on the part of the waveform where the missing edge should be. This suggests that this is indeed some kind of failed attempt at skipping over signal transitions that are shorter than one display pixel:

Looking deeper into what might be causing this led me to the ShowWaveToLCD() function, which seems to be called whenever the waveform on the screen needs to be refreshed. Unfortunately, this function only seems to send some instructions to the FPGA and not much else.
The LA104 consists of an STM32F103-series ARM microcontroller and an FPGA. The FPGA is used for waveform capture while the microcontroller handles the user interface. According to the public schematic, both the FPGA and the ARM are connected via a shared bus to the LCD. Hence it seems plausible that the waveform is drawn directly by the FPGA. This was still a bit surprising to me, since everything else on the screen seems to be drawn by the code running on the ARM. Probably drawing the waveform directly from the FPGA was faster and led to a more responsive user interface.
Unfortunately, this means that fixing the waveform display bug would require modifying the FPGA bitstream. Its source is not public as far as I know, and even if it were, building the bitstream apparently requires a bunch of proprietary software. In any case, there's not much hope of getting this fixed in a simple way. It should be possible to write a new function that does the waveform drawing from the ARM, as sketched below, but that would require more effort than I'm prepared to sink into fixing this issue. In the end it might turn out to be too slow anyway.
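For a rough idea of what that would involve, such a replacement might look like the skeleton below, building on the draw_channel() sketch from earlier. The buffer read-out from the FPGA, the buffer size and the LCD helper are again my assumptions about the hardware interface, not functions from the actual firmware:

#include <stdint.h>

/* Reuses struct rle_run and draw_channel() from the sketch above.
   Everything else here is hypothetical. */
extern int  fpga_read_runs(struct rle_run *buf, int max_runs);
extern void lcd_clear_waveform_area(void);

#define WAVE_WIDTH 320   /* assumed width of the waveform area */
#define NUM_CH     4

void redraw_waveform(uint32_t samples_per_pixel)
{
    static struct rle_run runs[4096];   /* assumed capture depth */
    int n_runs = fpga_read_runs(runs, 4096);

    lcd_clear_waveform_area();

    for (int ch = 0; ch < NUM_CH; ch++)
        draw_channel(runs, n_runs, ch, samples_per_pixel, WAVE_WIDTH);
}

Redrawing the whole waveform area over the shared LCD bus on every pan or zoom is exactly the part that might turn out to be too slow on the ARM.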
I'm afraid the LA104 will end up gathering dust in a drawer with my other logic analyzers. A logic analyzer for me is primarily a debugging tool, and a tool that misleads me by showing problems that are not there is worse than no tool at all. I might still find some use for it for capturing signal traces for later off-line analysis on a PC, since so far that seems to work reliably. Other than that, sadly, I don't see myself using it much in the future.