On black magic

27.09.2011 17:42

A while ago, during a lunch break at a BarCamp, I sat at a table with some of my colleagues from Zemanta. There was an iPhone 4 on the table and the conversation touched on its infamous antenna issue. Gašper said "You know, I heard antenna design is kind of a black magic", to which I replied that somewhere an RF engineer is probably saying to her colleagues "You know, I heard software development is kind of a black magic".

While I was only half serious at the time, it did get me thinking. I believe that design in all fields of engineering has parts that can be calculated analytically. Parts where the laws of nature, either directly or through simplification and abstraction, permit parameters to be calculated with formulas and algorithms to some useful engineering precision. There are also parts where the only solution is an experiment. Those are the parts that involve hard-to-predict parameters, chaotic systems or simply systems so complex that an analytical solution either doesn't exist or costs more to obtain than a series of experiments leading to the same result.

I understand black magic as a problem that favors an extremely experiment-oriented approach. A problem where a solution can only be obtained through a heuristic process. One that requires the intuition of a designer who has over the years accumulated and internalized a large number of failed and successful experiments. A designer with enough experience that, when presented with a problem, he can intuitively recall what approach worked well in a similar case, even if he can't explain why.

The two approaches, the purely analytical and the purely experimental, overlap to a large degree. For example, when designing analog circuits you can go pretty far calculating exact values down to the last resistor in the system, accounting for stray capacitances and so on. Or you can take a more experimental approach: calculate some basic parameters with wide enough error margins, build a prototype circuit or a simulation and tweak it until it works to your satisfaction. I would argue that the sweet spot lies somewhere between the two extremes, both in terms of least effort and quality of the solution. Of course, hitting that sweet spot, knowing when you have gone too deep into theory, is also where intuition and experience come into play. It's a different kind of intuition though, and arguably one that is more broadly applicable than the black magic kind mentioned above.
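
To make the contrast concrete, here is a toy sketch in Python (my own illustration, not from any real design; the RC low-pass filter, component values and tolerances are assumptions picked just for this example). The analytical path solves the cutoff-frequency formula for the resistor value directly; the experimental path starts from a generous bracket and tweaks the value in a crude simulation until the response looks right.

    import math

    # Toy example: size the resistor of a first-order RC low-pass filter
    # for a -3 dB cutoff at 1 kHz, with the capacitor fixed at 100 nF.
    # (All values here are hypothetical, picked only for illustration.)
    F_CUTOFF = 1000.0   # desired cutoff frequency [Hz]
    C = 100e-9          # fixed capacitor value [F]

    def gain_at(f, r, c):
        # Magnitude response of an ideal first-order RC low-pass filter.
        return 1.0 / math.sqrt(1.0 + (2 * math.pi * f * r * c) ** 2)

    # Analytical approach: solve f_c = 1 / (2*pi*R*C) for R directly.
    r_analytical = 1.0 / (2 * math.pi * F_CUTOFF * C)

    # Experimental approach: bracket R with a wide margin and tweak it
    # in a "simulation" until the gain at the cutoff frequency settles
    # at the -3 dB point (1/sqrt(2)).
    target = 1.0 / math.sqrt(2.0)
    lo, hi = 100.0, 100e3         # generous initial bracket [ohm]
    for _ in range(60):
        mid = math.sqrt(lo * hi)  # geometric midpoint suits log-spaced values
        if gain_at(F_CUTOFF, mid, C) > target:
            lo = mid              # gain too high: cutoff too high, R too small
        else:
            hi = mid
    r_tweaked = math.sqrt(lo * hi)

    print("analytical R: %.1f ohm" % r_analytical)
    print("tweaked    R: %.1f ohm" % r_tweaked)

Both paths end at roughly the same 1.6 kilo-ohms; the point is only that the second one never needed the closed-form solution, just a cheap way to repeat the experiment many times.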

[Image: Moving work from design to implementation. Original by Paul A. Hoadley, CC BY-SA 2.5]

It's much the same with software engineering. Just replace pages upon pages of differential equations with design documentation in a waterfall model, or incremental building and optimization with the coveted agile approach.

In fact, my growing dissatisfaction with the software development world comes from a mindset that is moving ever more towards extremely experimental design. This is most apparent in the fresh field of web application programming. There is little to no strict documentation for much of the modern frameworks that hold the web together. The few well-defined interfaces have specifications that are too complex to fully understand and are riddled with implementation bugs. It's a field that favors the magicians with enough mileage to recite individual web browser CSS bugs by heart in the middle of the night, ordered backwards by issue number. With continuous deployment and the growing distaste for version numbers and release cycles, this bias is only becoming more prevalent throughout the software industry.

It's an interesting contrast to the inherently analytical nature of computer programs, and it certainly doesn't look like the fictional RF engineer's statement would be any less well grounded than my colleague's.

Posted by Tomaž | Categories: Ideas

Comments

Software development is done in great part by open source people, working on it as a hobby. They will most of the time go the black magic way, because it's a way to try something you don't know about. On one side, this may lead to lost time by missing a known solution and an easy path with a known formal approach; but on the other hand, it allows for some creativity and new designs, either by pure chance after a long time of tweaking, or by throwing in ideas from another field of competence.

It happened to electronics as well in the past. Ever heard of Clive Sinclair rediscovering the binary representation of numbers and arithmetic long after the start of computer science? I think it's a way to discover new things. Researchers need some experimental questions (like "why does this work, and that doesn't?") to get started, then they add a lot of experiments, tweaking parameters until they can extract the theoretical explanation. Sometimes it's useful, sometimes it's so complicated that it's easier to keep tweaking parameters. But in the end, just applying known equations will never bring you anything new. Sure, you can also tweak the equations themselves :)

Black magic happens at the start of anything, then, eventually, it clears out or moves further. Give software engineering some more time and it'll get better. You're just facing the move from computer science, an already formalized field, to software engineering, which isn't yet, so it may seem to go backwards. I hope it doesn't :)

Thanks for the insightful comment, PulkoMandy. Experiments are, of course, the basis of research. No argument there. I just find it sad that we have made software such a complicated moving target that each individual software engineer must re-apply the methods of basic research to discover how to use the part he is interested in.

I didn't know Clive Sinclair reinvented binary. Sounds like an interesting story. Do you know where I could read more about it?

Posted by Tomaž

Not sure where to read more about it, a teacher told me that story some years ago.
You can find some mentions of it, like here: http://www.nvg.ntnu.no/sinclair/sinclair/life_of_clive.htm

As for experiments in software design, I feel there is just no method with a proper balance yet. When designing stuff for analogue electronics, you have several years of research and work behind you, and people now know which factors are important and which can be neglected. So in some fields it's OK to follow the rules. On the other hand, as the power supply voltage goes down in digital devices, factors that were ignored before become more important. This applies to noise in the circuits, as well as to the formula for computing the power use of a circuit, for example. The same applies to antennas: as the wave frequencies increase, you have more and more factors to take into account. The antenna must have a precise length that matches the wavelength, but it must also be connected the right way to the integrated circuits. Increasing the wave frequency reveals new problems that may be solved by "black magic".
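
(To spell out the power-formula point with the standard textbook approximation, which is my addition rather than the commenter's: the first-order formula for dynamic power in CMOS is P ≈ a·C·V²·f, with switching activity a, switched capacitance C, supply voltage V and clock frequency f. It only captures the dynamic part; as V scales down, contributions the formula ignores, such as leakage current, become relatively more important, which is exactly why the old rules stop being enough.)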

In software design, you can face the same problems. For simple software, the good old V-cycle is fine, but as the software gets bigger and more complex, you can't think about everything before writing a line of code. Unforeseen problems will be found as you write the actual code. Agile methods, on the other hand, seem to fail at capitalizing on experience and building a theory/workflow from it. Ways of working are actually emerging; for example, the use of Java makes people work with references and garbage collection to manage memory, while C++ would allow many other ways to handle it. It is not used as much for production code...

Tomaž, after 20 years I started programming again, at 66. C# after Pascal. I expected more progress, but the distributed environment has obviously complicated software. Electronics progresses more linearly, until it hits the limits of atoms.

The antenna is the simplest electrical element, a monopole! The magic is in the radio radiation.

In 50+ years of electronics and computing I've had the most fun with experimentation, but in my old age with simulations :-)

Best regards, MMM S56A
