I moved from 16 to 32 bits. So can you. - Embedded.com

One of the more enjoyable parts of my job is when a vendor sends me one of their development kits to “play” with. I like to put it through its paces, generally putting myself in the position of the typical design engineer, if there is such a thing.

I had to look at the most recent development kit I received from a different perspective. You may be aware that Microchip recently announced a new architecture, one that migrates from the company's popular 16-bit architecture into the world of 32 bits. I've used different variations of Microchip's 16-bit kit many times over the years and was quite comfortable with its use.

I assumed there are thousands of designers out there in the same boat: people who were familiar with the 16-bit kit but wary of moving up the ladder to 32 bits. The Microchip tech folks assured me that the learning curve to get up and running on the 32-bit development kit, assuming you were familiar with the 16-bit variant, was not a steep one.

Being the skeptic that I am, I decided that the only way to find out for sure was to dive in and see for myself. First, consider why people would make the move from 16 to 32 bits. One reason is that it opens up a longer list of potential peripherals. And even peripherals that don't fundamentally change, like a UART, for example, can gain additional operating modes. A second reason to move to 32 bits is the ability to run higher-performance applications, such as graphics and TCP/IP.

One of the best features that Microchip put in was the pin compatibility between the families. While I'm not minimizing the effort required to write your code, once you do get that code written and recompiled, you can just plug the 32-bit microcontroller into the same spot on the board that was occupied by the 16-bit microcontroller. Other than a few minor tweaks, that's all that's required on the hardware side.

That said, the simplicity of the kit (and the hardware in general) means that most developers will spend the majority of their time writing software. The complexity of that task is dependent on the application. While writing code for the 32-bit part clearly differs from writing for 16 bits, it doesn't necessarily have to be harder, given the array of tools that are now at our disposal.

The development kit I tested used a modular approach, where you could easily swap the two microcontrollers in and out. In addition, you can either take advantage of the peripherals offered by Microchip or develop your own. I opted for the easy way out and used a couple of the ones Microchip provides.

All in all, I spent the better part of two days toying around with the 32-bit development kit. I was able to do some simple tasks, obviously limited by my programming knowledge. But for a seasoned developer, my guess is that the move from 16 to 32 bits would be a relatively easy one.
