Just as a word of warning, I'm neither an RF expert nor do I pretend to be one. I spent most of my days at university ditching classes, hanging out at coffee shops, and attending dance rehearsals. I spent the next decade and a half pretending to work at my various jobs. I'm the furthest thing you can get from a PhD'd RF engineer. But if you're willing to accept all that and are still willing to believe me, then read on:

I made a very interesting discovery recently while I was trying to improve the range of my 2.4 GHz wireless boards. I was getting limited range out of the onboard SMD antennas I was using, so I was testing multiple boards to see whether it was due to variance in the discrete RF components or an inherent property of the system.

The funny thing was that in my sample of about five boards, one board consistently demonstrated a high received signal and transmitter output, almost twice that of the other boards. I was keeping detailed notes of the component values on all the boards, so I copied the exact same component values onto another board but couldn't duplicate the results. The complete matching circuit was identical, and I even replaced the radio and the balun to see if those were causing the discrepancy.

I spent about two days investigating, trying to get to the bottom of why that one board was consistently outperforming the rest. I finally found the answer, and it had been staring me in the face.

On all of my radio boards, I have two antenna options: the onboard SMD antenna and a right-angle SMA connector that can interface to external antennas. You choose between the options by moving a capacitor to either the SMD antenna path or the SMA path. The board that was consistently outperforming the others had the capacitor in the SMA path rather than the SMD path. The strange thing was that there was no external antenna connected to the SMA.

Now that I’ve finished teaching my first microcontroller class at Tokyo Hackerspace, I must say that it was quite an enlightening experience. To put it mildly, I was pretty nervous about teaching it. Microcontrollers are a complex topic that combines software skill, hardware skill, toolchains, and hardware platforms. It’s not a class for the faint of heart to teach, and most schools take three to six months to cover the subject…to engineering majors. So trying to cram all of that into a four-hour class and teach it to people of differing backgrounds gave me loads of anxiety.

I think the difficult part was trying to figure out where to start. The famous saying, “a journey of a thousand miles…”, was useless to me because I couldn’t even decide where to take my first step. I eventually decided that I wouldn’t cover anything. I wanted to run things as simply as possible and get people toggling LEDs, moving motors, and reading sensors quickly.

To say that I wouldn’t cover anything is being a bit overdramatic. I structured things into a series of labs and prepared detailed lab notes on handouts that discussed everything I wanted to say. The lab notes also contained step-by-step instructions on how to perform the lab. Rather than being an instructor, I wanted to try being more of a lab assistant and have the class be a guided tutorial. The results were very impressive.
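To give a flavor of what "toggling LEDs quickly" looks like, here's a minimal sketch of the kind of code a first lab typically centers on. This is an illustration, not the actual lab material: on real hardware, `PORT` would be a memory-mapped register (e.g. `PORTB` on an AVR), but here it's a plain variable so the bit-flipping logic can run anywhere.

```c
#include <stdint.h>

/* Fake port register. On a real MCU this would be a memory-mapped
 * I/O register; a plain variable lets the logic run on any host. */
static uint8_t PORT = 0x00;

#define LED_PIN 0  /* the LED sits on bit 0 of the port (illustrative) */

void led_toggle(void)
{
    PORT ^= (1u << LED_PIN);  /* XOR flips exactly that one bit */
}

uint8_t port_state(void)
{
    return PORT;  /* accessor so the port state can be inspected */
}
```

On an actual board, the lab would wrap `led_toggle()` in a loop with a delay, which is usually enough to get a complete beginner blinking an LED within the first hour.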

So much attention has been paid to digital technology in the past couple of decades that it's easy to forget our electronic roots. However, I've been seeing an interesting trend in the past few years that seems to be accelerating. Just like bell-bottoms and hippie gear, it looks like analog electronics is starting to become fashionable again.

Analog is the bane of electrical engineering students. For me, it conjures up bad flashbacks of analyzing useless circuit diagrams composed of passive components in bizarre configurations and trying to remember equivalent circuits for the different types of transistor signal analysis. It’s even worse for non-electrical-engineering students, to whom analog is a mysterious form of black magic that only bearded old men understand. Incidentally, all of those flashbacks were eliminated after I discovered two magical tools: SPICE simulation and Matlab.

It’s no wonder that analog has taken a backseat over the past several years as breakthrough after breakthrough in digital technology kept inundating us, showering us with cheap hardware, free software, and boatloads of information. But a peculiar trend seems to be emerging from the increasing tech savvy of everyday users combined with the mountains of information now available at everyone’s fingertips. That, along with the recent popularity of DIY hardware, hardware hacking, circuit bending, electronic art installations, and environmental sensing, is starting to spur curiosity about how technology can fit into our everyday world. All of a sudden, analog electronics is back in vogue, and it seems to be coming from a culmination of multiple factors.

Well, I finally did it. I mentioned before that I was working on something when I was in the US to while away the time at the coffee shops. That something was an ultra-simple, ultra-small wireless stack that could just be used to transmit and receive data. It’s something that came out of working with the people at Tokyo Hackerspace. A lot of them were interested in being able to control things wirelessly, but I didn’t really have an answer for them. I felt that Zigbee was too complex, and even 802.15.4 had a fairly steep learning curve due to the need to do passive/active scans and associate with other nodes.

What most of them were looking for was just something that they could transmit and receive data with. I’ve been thinking about how to implement something like that for a while because wireless stacks, even simple ones, can get complicated due to wireless being an unreliable medium. Hence you have to build in complexity to handle timeouts, retries, acknowledgements, etc.
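The core of that complexity can be sketched in a few lines. This is a hypothetical illustration of the retry/ack pattern such a stack needs, not the actual API of my stack: `send_frame()`, `recv_ack()`, and the timeout value are made-up names, and the "radio" here is faked so the logic is self-contained.

```c
#include <stdbool.h>

/* Bounded retries: the essential defense against an unreliable medium. */
#define MAX_RETRIES 3

static int attempts = 0;     /* frames sent so far */
static int acks_after = 0;   /* fake radio: ack only once attempts exceeds this */

void radio_setup(int failures)  /* test hook to configure the fake radio */
{
    attempts = 0;
    acks_after = failures;
}

static bool send_frame(const char *data)
{
    (void)data;  /* a real driver would hand this frame to the radio */
    attempts++;
    return true;
}

static bool recv_ack(int timeout_ms)
{
    (void)timeout_ms;  /* a real stack would block for up to this long */
    return attempts > acks_after;
}

/* Core pattern: transmit, wait for an ack with a timeout, retry a
 * bounded number of times, then give up and report the failure. */
bool tx_with_retry(const char *data)
{
    for (int i = 0; i < MAX_RETRIES; i++) {
        send_frame(data);
        if (recv_ack(100))  /* 100 ms ack timeout (arbitrary choice) */
            return true;    /* acked: done */
    }
    return false;           /* out of retries: let the caller decide */
}

int attempt_count(void) { return attempts; }
```

Everything beyond this skeleton (sequence numbers, duplicate rejection, backoff) is refinement of the same loop, which is why even "simple" stacks grow complicated.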

I stumbled upon an interesting article from Embedded.com today called "Real Men Program in C". It highlights one of the biggest paradoxes in the embedded space: the embedded field is growing, C is becoming the dominant language, and the number of programmers who can program in C is shrinking. The reason is that C is rarely taught in universities anymore, most likely because C requires an understanding of both the hardware and memory architecture as well as computer science constructs such as data structures and algorithms. In short, C has become the new Pascal in university education.

The interesting thing is that I can actually see this problem in real life now since I've become part of Tokyo Hackerspace. The majority of the techies involved in the hackerspace program in languages like Python and PHP and want to get more involved in embedded projects. While there isn't anything wrong with those languages, a lot of the low-level details are abstracted away, which is great for learning the computer science part of things but horrible if you ever want to program embedded systems.

I've recently set up the hackerspace webshop and had to do some PHP hacking to get things working, like modifying the shipping cost tables for the Japanese postal system. Since it has a C-like syntax, it wasn't a real problem writing the simple PHP functions required to implement the cost tables. However, I was a bit shocked to see that the language doesn't require strong typing. You don't even need to declare a local variable: you can just instantiate one anywhere in the function, and it will hold strings, integers, floats, or anything else. There's also not a strong concept of scope that I could see (ha ha ha...get it?). Variables are either local to the function or global, and by global, I mean visible to all files in the project. Any C programmer would cringe at the use of global variables because they lead to problems like non-reentrancy, random errors, and really weird behavior in general.

My first thought was, "wow, this language is pretty simple and easy to use". But thinking about it, it's a big handicap if a PHP programmer goes in the opposite direction and wants to move to a language like C. C requires a lot of discipline about data types, scope, and memory usage. That's not to disrespect PHP programmers, since they have to deal with all sorts of nastiness like tracking down bugs from a PHP file included in some random file located within the 500 or so files that come with whatever content management system they're using. And I haven't seen a good debug infrastructure for PHP other than logging error messages (although I could be wrong).
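For contrast, here's a rough sketch of the discipline C enforces that PHP doesn't. Every name is declared with a type, scope is nested and block-level, and a "global" can be confined to a single file with `static`. The names here are made up purely for illustration.

```c
/* 'static' at file scope: visible in this file only, unlike a PHP
 * global, which is visible across the whole project. */
static int file_counter = 0;

int scope_demo(void)
{
    int local = 5;                 /* must be declared with a type before use */
    {
        int doubled = local * 2;   /* block scope: gone at the closing brace */
        file_counter += doubled;
    }
    /* Referencing 'doubled' here would be a compile error, not a
     * silently-created new variable. */
    return file_counter;
}
```

It's exactly this kind of compile-time strictness, tedious as it feels coming from a scripting language, that makes C programs predictable on hardware with no safety net.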

I think this is one of the reasons why the Arduino has become quite popular. It's a good way to introduce people to embedded systems and projects without having to deal with a lot of the dirty details like chasing wild pointers and memory leaks. And of course, you can say you program with Arduinos and still maintain a certain amount of dignity. It's more difficult if you're using a language like PICBASIC. 

But the eventual barrier that I see a lot of Arduino users running into is that projects are limited to the AVR processor. Normally, I like AVRs, but in terms of the embedded space, you lose a lot of the richness and variety of the different flavors of MCUs if you stick with just one processor. There's the integrated signal chain and low power of the MSP430s, the endless variations of the Microchip PICs, the configurability of the Cypress PSOCs, and let's not forget playing around with the ARM-based SOCs that are dotting the chip world these days. In order to harness the rich variety of all these different chips, you would need to learn how to program in C. Otherwise, you'll end up with Arduino shields piled on top of each other like the Leaning Tower of Pisa and still be limited to the Arduino API.

Anyways, this post isn't really to rail against the Arduino, which I think did a great job of introducing an entirely new generation to the embedded world. Nor am I trying to complain about PHP, Python, or any other scripting language, which are doing a great job of powering content-driven websites and even making this site's creation and maintenance possible by a complete web n00b. The main issue is that the embedded field is expanding and there's a huge interest in combining embedded technologies and web technologies. This can be seen in the growing popularity of publications like Make Magazine, the numerous hackerspaces popping up all over the world, and of course my involvement in Tokyo Hackerspace. However, there isn't a whole lot being done about the paradox that C is a dying language in the academic world, but very alive and kickin' in the real world.

Hmmm...I guess this counts towards my quota of one rant on the lack of C education per year...

Protothreads are the building blocks of the Contiki OS, which drives many wireless sensor network protocol stacks: SICSLowPAN (using uIP), Rime, and of course FreakZ, my Zigbee stack. Larry Ruane has written his own version of protothreads for Unix, based on Adam Dunkels's original version, and it includes its own protothread scheduler. Altogether, the implementation runs 400 lines of C source code and provides users with deterministic thread scheduling that can be used in many applications. What's more interesting is that the protothread implementation can be run in its own Unix thread, so you can have a protothreaded application running within a threaded application. Fun fun fun...

As an aside, Contiki OS also natively supports protothreads on x86 and can be run in Linux or Windows threads. But the more the merrier...
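The trick behind protothreads can be sketched in a handful of macros. This is a simplified illustration in the spirit of Adam Dunkels's design, not Ruane's or Contiki's actual code: the "thread context" is just a stored line number, and a blocking wait is a switch/case jump back into the middle of the function.

```c
typedef struct { int lc; } pt_t;   /* lc = "local continuation" */

#define PT_INIT(pt)   ((pt)->lc = 0)
#define PT_BEGIN(pt)  switch ((pt)->lc) { case 0:
#define PT_WAIT_UNTIL(pt, cond)             \
    do {                                    \
        (pt)->lc = __LINE__; case __LINE__: \
        if (!(cond)) return 0;  /* "block" by returning to the scheduler */ \
    } while (0)
#define PT_END(pt)    } (pt)->lc = 0; return 1;

/* Example: a protothread that waits for a flag, consumes it, and counts. */
static int flag = 0;
static int count = 0;
static pt_t ctx;

void set_flag(void)  { flag = 1; }
int  get_count(void) { return count; }

static int consumer(pt_t *pt)
{
    PT_BEGIN(pt);
    for (;;) {
        PT_WAIT_UNTIL(pt, flag);  /* yields until someone sets the flag */
        flag = 0;
        count++;
    }
    PT_END(pt);
}

void consumer_init(void) { PT_INIT(&ctx); }
void consumer_run(void)  { (void)consumer(&ctx); }
```

A scheduler is then just a loop that calls `consumer_run()` over and over; since a blocked protothread simply returns, the whole thing stays stackless and deterministic, which is what makes it so cheap on small MCUs.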

Here's a link to the sourceforge project:


You can't see it much from the blog, but I've actually been pretty busy lately. I've been wanting to post some of the stuff I've been doing, but it's been hard to find the time. The PCBs that I made for the part-time consulting job came back last week, and I've been busy assembling them. Hand assembly is definitely not very fun. As soon as I have an extra $30k lying around, I'm going to buy myself a pick and place machine. Then my apartment is gonna be totally cool!

For soldering, a lot of my friends ask me how to hand solder QFPs and QFNs. I learned by hanging out with the technicians at my old jobs, but you can also read about it in tutorials on the internet. The Sparkfun one is especially good. One thing that people don't tell you is that it's best to use the fat wave-soldering tips. Where I see most people mess up is in using the skinny fine tips. Those have really poor heat conduction due to their small surface area, and in soldering, heat conduction is everything. I recommend the fat tips and water-soluble flux. The fat tip has excellent heat conduction, and if you make a bridge, you just brush on some flux and use the fat tip to wick away the solder. The solder travels to the tip due to its larger surface area. I would have included more detailed photos of soldering the QFN, but my camera doesn't do good closeups. If I can ever figure out how to do closeups with a standard digital camera, I'll post some soldering tutorials of my own.

Without further ado, here are the pix...

{gallery}2008-10-28 PCBs{/gallery}

For people who buy their own parts and build their own boards, you normally do price searches on Digi-Key, which is dramatically overpriced. I've forgiven them for this because they're convenient and they cater to individuals and hobbyists, so I figure their handling costs are higher. But lately, I've been liking the chip price search engines and thought I'd share two of them that I use a lot:

Findchips - Findchips does a search over many distributors (mostly US based) including the big ones for the hobbyists (Digi-Key, Jameco, Mouser...). They have a decent interface and show which distributors have the part you're looking for as well as the price. 

Octopart - I just found this one recently. It has a search engine and also a category tree that you can use to look for stuff you might not know to search for. It's still quite a new site, and many of the categories are empty (similar to my site?). Also, Digi-Key is currently suing them to get their prices taken off the site, although I don't know why they'd turn down extra traffic and sales. It's a good site, though, and when you do searches, they even search the semiconductor manufacturers for price listings. You can test this out by doing a search for CC2420. Note: Manufacturers should help them build up their product category trees by listing their own products. It couldn't hurt sales and might even improve them.

Anyways, I just thought I would share those with everyone because they're quite useful. 

Man, if I had a dollar for every article I read that claims C is a dying skill, I'd probably have around five dollars by now. Hmmm...that would include assembly language programming, I think. Well, here's another one, where C made the top 10 dying skills list at ComputerWorld. The article is a bit old, but C ranked just below cc:Mail in the list. With C as the second most popular programming language in the world, why is it that people think of it as a dying language?

That's an interesting question. I think it's probably because it's not really taught in universities anymore, since they are focusing on scripting/interpreted languages. The reason behind that is that C is a difficult language to understand, especially for undergraduates. Having college students track down memory leaks and pointer problems is an easy way to make them cry, and kind of mean too. As Wikipedia puts it: "...the safe, effective use of C requires more programmer skill, experience, effort, and attention to detail than is required for some other programming languages". With the tech companies in the US shouting at politicians and universities that they aren't turning out enough computer science engineers, the schools seem to be watering down the curriculum to make sure that fewer people drop out. Is it bad? Is it good?

In general, I don't think it's bad. Having more Java and Python programmers in the world can't hurt. You can do a lot of interesting things with those languages and make many useful applications; take Google, where Python is one of the official languages. However, I can say that it's bad for the future of the embedded industry. Anybody in the semiconductor or electrical engineering industry today knows that there is a shortage of embedded engineers. One of the main reasons for this shortage is that C is rarely taught anymore. Even C++ is slowly being faded out of the curriculum at many schools. And in the embedded realm, C is king.

At the semiconductor startups that I've worked at previously, software drivers were the main difference between being able to sell a chip or not. That translates into revenue for the company. Having software drivers available enabled a former startup company I worked at to generate $10M+ in sales off of one chip. Not having software drivers available also forced the company into bankruptcy. Well, that and a couple of other factors. The reason the software couldn't be finished in time and fit the customer requirements was that it's hard to find good embedded software engineers. These days, IC design is mostly stitching together different IP cores and running tests at the top level. But that level of design automation and reuse hasn't hit software yet, so it's still largely a craftsman-based skill. Unfortunately, the available pool of craftsmen is shrinking because there are fewer places to learn the language, and nobody wants to give on-the-job training to a noob who, with one wrong pointer operation, can bring a whole system to its knees.

So the main point of my post, which I seemingly forgot, is that C is still very much needed and is anything but a dying language in this industry (embedded). And the benefit of knowing it often comes in the form of a six-figure salary; at least if you find the right company.

Someone posted a link recently on Slashdot entitled "Obsolete Technical Skills", one of which was assembly language programming. Actually, I'd have to agree that it's obsolete to program in assembly language, in every industry except embedded. Also obsolete is trying to fit your code into less than 64K of memory or trying to get your program to run in 2K of RAM, in every industry except embedded. Why do we need to struggle with these issues? Because the universal design principle of embedded engineering is:

Design it to be as cheap as possible and still be functional

I like to call this the "Cheapskate Postulate", and it's universal in almost any manufacturing industry. This means that if you can use a $1 8-bit 8051 instead of a $10 32-bit ARM, then you should.

And what's the best way to reduce your code size and RAM usage, hence minimizing cost? And what's the best way to improve performance when you have to use a slower processor? Hint: NOP, JMP, JNE.

The second best way to reduce your code size and improve your performance is to write your software in C. I'm not sure if anyone's ever tried to write a Java OS, but I'm sure it'd be slow as shit.