The APS layer, also known as the application support sublayer, is the top of the main data path in a Zigbee stack. The only things that lie above it are the application objects, which are implementation specific. The APS layer handles data transmission and reception as well as table management. The main tables located in the APS layer are the binding, discovery cache, address map, and endpoint grouping tables. We'll discuss the tables later on when they get implemented. Right now, I'd just like to focus on reliable and unreliable data transmission.

The APS data service is the main vehicle for device discovery and management. The application objects use this service to communicate descriptors to each other, handle client/server communications, and perform remote management and provisioning. The data path for this layer can get a bit complicated due to the many options available for transmission and reception. The binding, grouping, and address tables are all used when building the frames. However, on a fundamental level, there are only two types of transmissions: reliable and unreliable. Reliable transmission requires an APS-level acknowledgement from the destination device. Unreliable transmission is when you just send the frame out and don't care whether it arrives or not.

The IEEE 802.15.4 spec already implements a form of acknowledgement, but this is a MAC-level acknowledgement. Since most 802.15.4 radios generate the ACK in hardware, the 802.15.4 ACK just says that the frame was received into the chip's FIFO properly. The APS-level ACK says that the frame was processed correctly as well, which is very important. Many things could go wrong between the PHY and the APS layer. Some examples of a frame getting dropped between the PHY and APS processing are:

1) The received frame requires routing, but no route can be found for it.
2) The received frame goes to a router with no routing resources and no tree routing ability.
3) The MAC doesn't pull the received frame out of the chip's FIFO quickly enough and it gets overwritten by another frame.

If a reliable transmission is needed, then an APS frame with the ACK required bit set is probably the safest way to send it. However, it's also costly in terms of performance.
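To make the distinction concrete, here's a rough sketch in C of what the two options might look like at the API level. The names (aps_data_req_t, APS_TX_OPT_ACK, apsde_data_request) are invented for illustration and don't come from any particular stack; the point is just that reliable delivery is the same data request with the ACK bit set in the TX options.

```c
/* Hypothetical sketch of reliable vs. unreliable APS transmission.
 * All names here are made up for illustration; a real stack defines
 * its own equivalents.
 */
#include <stdint.h>

#define APS_TX_OPT_ACK  0x04    /* request an APS-level acknowledgement */

typedef struct
{
    uint16_t dest_addr;     /* destination short address              */
    uint8_t  dest_ep;       /* destination endpoint                   */
    uint8_t  src_ep;        /* source endpoint                        */
    uint16_t profile_id;    /* application profile                    */
    uint16_t cluster_id;    /* cluster being addressed                */
    uint8_t  tx_options;    /* ACK, security, fragmentation, etc.     */
    uint8_t  radius;        /* max number of hops                     */
    uint8_t  len;           /* payload length                         */
    uint8_t  *payload;      /* pointer to the application payload     */
} aps_data_req_t;

/* assumed to be provided by the stack's APS data service */
void apsde_data_request(aps_data_req_t *req);

/* reliable: set the ACK bit so the stack retries until the destination's
 * APS layer confirms it actually processed the frame.
 */
void send_reliable(aps_data_req_t *req)
{
    req->tx_options |= APS_TX_OPT_ACK;
    apsde_data_request(req);
}

/* unreliable: fire and forget - only the hop-by-hop MAC ACK applies. */
void send_unreliable(aps_data_req_t *req)
{
    req->tx_options &= (uint8_t)~APS_TX_OPT_ACK;
    apsde_data_request(req);
}
```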

I normally try to update my blog every day. However, I'm getting killed by the pollen levels in Tokyo. It feels like I'm getting hit by a mutant form of the common cold. I'm just going to curl up with a beer and watch Entourage on the internet.

The Wireless Sensor Networks Blog just posted an interesting link to an article at Popular Science. Microsoft unveiled a prototype sensor module reference design at Microsoft TechFest that they are currently using to monitor the temperature of their server hardware. I can only assume that it came from Microsoft's Networked Embedded Computing division at the Microsoft Research Labs. Well, I have nothing against Microsoft, and in fact, I enjoy using Windows (XP, not Vista). However, the article does claim that the sensor modules are cheap, so I thought I would give a best guesstimate of how much the reference hardware bill of materials costs.

Luckily, the article provides an excellent clue when it says that they are using a TI transceiver that costs about $3. Well, that sounds exactly like the good ol' TI (Chipcon) CC2420. Why not the CC2520, you ask? Well, the CC2520 was just released recently (see my review of that chip), and I am assuming that for them to make the PCB, write the software, create a nice plastic enclosure, and already have it running in their server rooms, the project must be more than a few months old. Hence, I am going with my initial assumption that they are using the CC2420.

Man, if I had a dollar for every article I read that claims C is a dying skill, I'd probably have around five dollars by now. Hmmm...that would include assembly language programming, I think. Well, here's another one, where C made the top 10 dying skills list at ComputerWorld. The article is a bit old, but C ranked just below cc:Mail programming in the list. With C programming as the second most popular language in the world, why is it that people think of C as a dying language?

That's an interesting question. I think it's probably because it's not really taught in universities anymore, since they are focusing on scripting/interpreted languages. The reason behind that is that C is a difficult language to understand, especially for undergraduates. Having college students track down memory leaks and pointer problems is an easy way to make them cry, and kind of mean too. As Wikipedia put it: "...the safe, effective use of C requires more programmer skill, experience, effort, and attention to detail than is required for some other programming languages". With the tech companies in the US shouting at politicians and universities that they aren't turning out enough computer science engineers, the schools seem to be watering down the curriculum to make sure that fewer people drop out. Is it bad? Is it good?

In general, I don't think it's bad. Having more Java and Python programmers in the world can't hurt. You can do a lot of interesting things with those languages and make many useful applications; take Google, for example, where Python is one of the official languages. However, I can say that it's bad for the future of the embedded industry. Anybody in the semiconductor or electrical engineering industry today knows that there is a shortage of embedded engineers. One of the main reasons for this shortage is that C is rarely taught anymore. Even C++ is slowly being faded out of the curriculum at many schools. And in the embedded realm, C is king.

At the semiconductor startups that I've worked at previously, software drivers were the main difference between being able to sell a chip or not. That translates into revenue for the company. Having software drivers available enabled a former startup company I worked at to generate $10M+ in sales off of one chip. Not having software drivers available also forced the company into bankruptcy. Well, that and a couple of other factors. The reason the software couldn't be finished in time and fit the customer requirements was that it's hard to find good embedded software engineers. These days, IC design is mostly stitching together different IP cores and running tests at the top level. But that level of design automation and reuse hasn't hit software yet, so it's still largely a craftsman-based skill. Unfortunately, the available pool of craftsmen is shrinking because there are fewer places to learn the language, and nobody wants to give on-the-job training to a noob who, with one wrong pointer operation, can bring a whole system to its knees.

So the main point of my post, which I seemingly forgot, is that C is still very much needed and anything but a dying language in this industry (embedded). And the benefit of knowing it often comes in the form of a six-figure salary; at least if you find the right company.

With the buffer management system chosen and the frame buffer pool finished, the next job was to pass a frame up and down the stack. This meant that code needed to be written to process the headers.

There are three headers that need to be handled: the MAC, NWK, and APS headers. Going downstream, the headers need to be built from the information passed into the function. Going upstream, the headers need to be stripped off the frame and processed so that they can be put into a structure for easy access.
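As a rough illustration of the upstream case, here's a sketch of stripping the mandatory NWK fields into a struct. The nwk_hdr_t layout and names are mine for illustration only, and the optional fields (extended addresses, multicast control, source route subframe) are left out.

```c
/* Illustrative sketch of parsing the mandatory NWK header fields into a
 * struct on the way up the stack. Names are made up; optional fields are
 * omitted for brevity.
 */
#include <stdint.h>

typedef struct
{
    uint16_t frame_ctrl;
    uint16_t dest_addr;
    uint16_t src_addr;
    uint8_t  radius;
    uint8_t  seq_num;
} nwk_hdr_t;

/* parse the mandatory NWK fields and return the number of bytes consumed */
uint8_t nwk_parse_hdr(const uint8_t *buf, nwk_hdr_t *hdr)
{
    uint8_t idx = 0;

    /* Zigbee fields are little-endian on the air */
    hdr->frame_ctrl = buf[idx] | (buf[idx + 1] << 8);    idx += 2;
    hdr->dest_addr  = buf[idx] | (buf[idx + 1] << 8);    idx += 2;
    hdr->src_addr   = buf[idx] | (buf[idx + 1] << 8);    idx += 2;
    hdr->radius     = buf[idx++];
    hdr->seq_num    = buf[idx++];

    return idx;    /* caller advances its data pointer past the header */
}
```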

Many of the people who work with TCP/IP stacks would laugh at putting the headers into a structure. That's because TCP and IP headers (and Ethernet headers, for that matter) are fixed sizes. This means that you can just create a structure pointer (i.e., a pointer to an IP header struct) and point it at the start of the header in the buffer. That way, you can process all the fields in the buffer instead of copying them into a separate struct. Hope I didn't lose anyone with that explanation. In-buffer processing saves space because you don't need RAM to hold actual header structures. Zigbee's NWK and APS headers, on the other hand, are variable length, which is why I copy the fields out into structures instead.
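For comparison, here's roughly what that fixed-size-header trick looks like. The ipv4_hdr_t below is a simplified stand-in rather than a production definition, and on some architectures you'd also have to worry about alignment when casting a raw buffer pointer like this.

```c
/* Quick illustration of in-buffer processing for a fixed-size header:
 * instead of copying fields out, cast a struct pointer at the buffer and
 * read them in place.
 */
#include <stdint.h>

typedef struct __attribute__((packed))
{
    uint8_t  ver_ihl;       /* version and header length            */
    uint8_t  tos;
    uint16_t total_len;     /* network byte order                   */
    uint16_t id;
    uint16_t flags_frag;
    uint8_t  ttl;
    uint8_t  protocol;
    uint16_t checksum;
    uint32_t src_addr;
    uint32_t dst_addr;
} ipv4_hdr_t;

uint8_t get_ttl(const uint8_t *rx_buf)
{
    /* no copy, no extra RAM - just point the struct at the buffer */
    const ipv4_hdr_t *ip = (const ipv4_hdr_t *)rx_buf;
    return ip->ttl;
}
```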

After I got the dummy functions in the data path going, I needed to create the data structures that would be traveling through this path. The data structures consist of a frame buffer pool and the request/indication parameter structures that conform to the Zigbee/802.15.4 specifications. It's at this point that you need to decide on a buffer management strategy. Yeah, I can just hear those yawns. I know buffer management strategy is not the most exciting thing in the world, but it's important. Really! So here's a brief and probably incomplete discussion on buffer management...
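To give a feel for what I mean by a frame buffer pool, here's a bare-bones static pool in C. The sizes and names are arbitrary; a real implementation would also track a data pointer inside each buffer so headers can be prepended as the frame moves down the stack.

```c
/* Minimal static frame buffer pool sketch. Sizes and names are arbitrary
 * and chosen for illustration only.
 */
#include <stdint.h>
#include <stdbool.h>
#include <stddef.h>

#define FRAME_POOL_SIZE   8
#define MAX_FRAME_LEN     127    /* max 802.15.4 PHY payload */

typedef struct
{
    bool    in_use;
    uint8_t len;
    uint8_t data[MAX_FRAME_LEN];
} frame_buf_t;

static frame_buf_t frame_pool[FRAME_POOL_SIZE];

/* grab a free buffer, or return NULL if the pool is exhausted */
frame_buf_t *frame_buf_alloc(void)
{
    for (uint8_t i = 0; i < FRAME_POOL_SIZE; i++)
    {
        if (!frame_pool[i].in_use)
        {
            frame_pool[i].in_use = true;
            frame_pool[i].len = 0;
            return &frame_pool[i];
        }
    }
    return NULL;
}

/* return a buffer to the pool */
void frame_buf_free(frame_buf_t *buf)
{
    if (buf != NULL)
    {
        buf->in_use = false;
    }
}
```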

A recent comment on my blog (there are only two right now) kind of got me thinking about the purpose of protocol wars. The commenter was a 6LoWPAN backer and was taking a lot of jabs at Zigbee. Anyone who knows me knows that I'm not a diehard fan of anything. I never really subscribed to the whole "whose side are you on" type of thing, not even for presidential elections (go Obama). However, I did point out some holes in his arguments.

Protocol wars are basically stupid. They are part of the same genus as format wars. So as an overplayed example, let's take Blu-Ray vs HD-DVD. Toshiba was touting the benefits of HD-DVD and preaching to all who would listen that the world would be a better place if we adopted it as the standard. Sony was preaching the same message, except about Blu-Ray. They both kept one-upping each other on specs:

HD-DVD - Backwards compatibility with DVD players

Blu-Ray - Higher storage capacities

HD-DVD - Lower manufacturing cost

Blu-Ray - More DRM options

Yada yada...

However, the consumer doesn't give a shit about the storage density of the disc or compatibility with DVD players. What they wanted was a disc they could watch HD content on. Could Blu-Ray do it? Yes. Could HD-DVD do it? Yes. Did the consumer care whether Blu-Ray or HD-DVD won? I'm pretty sure the answer was no. The whole consumer market just wanted a decision made. That was it. But the whole time the format war was going on, damage was being done to the entire industry. Consumers didn't know which player to buy, or whether to buy one at all. The studios couldn't decide if they should release HD content. The manufacturers had to make expensive players that were compatible with both. And finally, how did Sony win the format war? Was it by having a better spec? No. They paid off Warner. There you go, folks.