
Python is better than C! (Or is it the other way round?)

If you have a quick Google for something like “Python vs. C,” you will find lots of comparisons out there. Sad to relate, however, trying to work out which is the “best” language is well-nigh impossible for many reasons, not the least that it's extremely difficult to define what one means by “best” in the first place.

One aspect of all this that doesn’t seem to garner quite so much discussion is Python versus C in the context of embedded systems — especially small microcontroller (MCU)-based applications for “things” that are intended to be connected to the Internet of Things (IoT) — so that's what I'm poised to ponder here, but first…

…it's interesting to note that there are literally hundreds and hundreds of different programming languages out there. If you check this Wikipedia page, for example, you'll find 54 languages that start with the letter 'A' alone (don’t get me started on how many start with 'C' or 'P'), and this list doesn’t even include the more esoteric languages, such as Whitespace, which uses only whitespace characters (space, tab, and return), ignoring all other characters, but we digress…

The reason I'm waffling on about this here is that I just finished reading a brilliant book called Learn to Program with Minecraft by Craig Richardson (see my review). This little scamp (the book, not Craig) focuses on teaching the Python programming language, and it offers the most user-friendly, intuitive, and innovative approach I've seen for any language.

As part of my review I said: “Now, I don’t wish to wander off into the weeds debating the pros and cons of languages like C and Python here — that's a separate column in its own right.” Well, I'm not outrageously surprised to discover that I was 100% correct, because this is indeed a separate column in its own right (LOL).

Now, I'm not an expert programmer by any stretch of the imagination, but I do dabble enough to be dangerous, and I think I know enough about both Python and C to be very dangerous indeed. There are myriad comparisons that can be made between these two languages; the problem, oftentimes, is working out what these comparisons actually mean. It's very common to hear that C is statically typed while Python is dynamically typed, for example, but even getting folks to agree on what these terms mean can be somewhat problematical.

Some folks might say: “A language is statically typed if the types of any variables are known at compile time; by comparison, it's dynamically typed if the types of any variables are interpreted at runtime.” Others, like the folks at Cunningham & Cunningham (I can never tell those two apart), might respond that static typing actually means that “…a value is manifestly (which is not the same as at compile time) constrained with respect to the type of the value it can denote, and that the language implementation, whether it is a compiler or an interpreter, both enforces and uses these constraints as much as possible.” Well, I'm glad we've cleared that up (LOL).

Another comparison we commonly hear is that C is weakly typed while Python is strongly typed. In reality, weak versus strong typing is more of a continuum than a Boolean categorization. If you read enough articles, for example, you will see C being described as both weakly and strongly typed depending on the author's point of view. Furthermore, if you accept the definition of strong typing as being “The type of a value doesn’t suddenly change,” then how do you explain the fact that you can do the following in Python:

    bob = 6
    bob = 6.5
    bob = "My name is Bob"

In reality, what we mean by “strongly typed” is that, for example, a string containing only digits (e.g., “123456”) cannot magically transmogrify into a number without our performing an explicit operation to make it do so (unlike in Perl, for example). In the case of the above code snippet, all we're saying is that the variable bob can be used to represent different things at different times. If we called the built-in function type(bob) after bob = 6, then it would return int (integer); after bob = 6.5 it would return float (floating-point number); and after bob = "My name is Bob" it would return str (string).
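As a quick illustration (a minimal sketch using nothing beyond the built-in type() and int() functions), here's what the interpreter reports as the name bob is rebound:

```python
# Rebinding the name bob to values of different types; type() reports
# the type of whatever object bob currently refers to.
bob = 6
print(type(bob).__name__)    # int

bob = 6.5
print(type(bob).__name__)    # float

bob = "My name is Bob"
print(type(bob).__name__)    # str

# "Strong typing" in action: turning "123456" into a number requires
# an explicit conversion; Python won't coerce the string behind our backs.
print(int("123456") + 1)     # 123457
```

Note that it's the name bob being rebound to new objects; each object's type never changes.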

One thing on which we can all agree is that C doesn’t force you to use indentation while — putting it simplistically — Python does. This is another topic people can get really passionate about, but I personally think we can summarize things by saying that (a) If you try, you can write incredibly obfuscated C code (there's even an International Obfuscated C Code Contest — need I say more) and (b) Python forces you to use the indentation you would have (should have) used anyway, which can’t be a bad thing.
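To make the indentation point concrete, here's a minimal sketch (the classify function is made up purely for illustration) showing that in Python the indented suite *is* the block, where C would use braces:

```python
# The indented lines after the colon form the block; there are no braces,
# so the code's visual structure and its logical structure always match.
def classify(n):
    if n % 2 == 0:
        return "even"    # inside the if-block by virtue of indentation
    return "odd"         # dedented, so outside the if-block

print(classify(4))   # even
print(classify(7))   # odd
```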

Another thing we can agree on is that C is compiled while Python is interpreted (let's not wander off into the weeds with just-in-time (JIT) compilation here). On the one hand, this means that a program in Python will typically run slower than an equivalent program in C, but this isn’t the whole story because — in many cases — that program won’t be speed/performance bound. This is especially true in the case of applications running on small MCUs, such as may be found lurking in the “things” portion of the IoT.

I feel like I've already set myself up for a sound shellacking with what I've written so far, so let's go for broke with a few images that depict the way I think of things and the way I think other people think of things, if you see what I mean. Let's start with the way I think of things prior to Python circa the late 1980s and early 1990s. At that time, I tended to visualize the programming landscape as looking something like the following:

The way I used to think of things circa 1990 (Source: Max Maxfield)

Again, I know that there were a lot of other languages around, but — for the purposes of this portion of our discussions — we're focusing on assembly language and C. At that time, a lot of embedded designers captured their code in assembly language. There were several reasons for this, not the least that many early MCU architectures were not geared up to obtaining optimum results from C compilers.

Next, let's turn our attention to today and consider the way I think other people tend to think of things today with respect to Python and C. Obviously processors have gotten bigger and faster across the board — plus we have multi-core processors and suchlike — but we're taking a “big picture” view here. As part of this, we might note that — generally speaking — assembly now tends to be used only in the smallest MCUs that contain only minuscule amounts of memory.

The way I think other people think of things circa 2016 (Source: Max Maxfield)

In turn, this leads us to the fact that C is often described as being a low-level language. The term “low-level” may seem a bit disparaging, but — in computer science — it actually refers to a programming language that provides little or no abstraction from a computer's underlying hardware architecture. By comparison, Python is often said to be a high-level language, which means that it is abstracted from the nitty-gritty details of the underlying system.

Now, it's certainly true that the C model for pointers and memory and suchlike maps closely onto typical processor architectures. It's also true that — although it does support bitwise operations — pure Python doesn’t natively allow you to do things like peek and poke MCU registers and memory locations. Having said this, if you are using Python on an MCU, then there will also be a hardware abstraction layer (HAL) that provides an interface allowing the Python application to communicate directly with the underlying hardware.
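Python's bitwise operators are a perfectly good fit for the set/clear/test idioms of register work, even though the register itself has to be reached through a HAL. Here's a minimal sketch — the register value and the mask names are stand-ins for illustration, not a real HAL API:

```python
# Hypothetical bit masks for an imaginary 8-bit control register.
LED_ENABLE = 1 << 3    # bit 3
UART_READY = 1 << 5    # bit 5

reg = 0x00
reg |= LED_ENABLE             # set bit 3
reg |= UART_READY             # set bit 5
reg &= ~LED_ENABLE & 0xFF     # clear bit 3, masking to keep the value 8 bits wide

print(f"{reg:#010b}")         # 0b00100000
```

On a real Python-on-MCU platform, the final step would be handing that value to whatever register-access call the HAL provides, rather than just printing it.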

One example of Python being used in embedded systems can be found in the RF Modules from the folks at Synapse Wireless that are used to implement low-power wireless mesh networks. This deployment also provides a great basis for comparison with C-based counterparts.

In the case of a ZigBee wireless stack implemented in C, where any applications will also typically be implemented in C, for example, the stack itself can easily occupy ~100KB of Flash memory, and then you have to consider the additional memory required for the applications (more sophisticated applications could easily push you to a more-expensive 256KB MCU). Also, you are typically going to have to compile the C-based stack in conjunction with your C-based application into one honking executable, which you then have to load into your wireless node. Furthermore, you will have to recompile your stack-application combo for each target MCU (but I'm not bitter).

By comparison, Synapse's stack, which — like ZigBee — sits on top of the IEEE 802.15.4 physical and media access control layers, consumes only ~55KB of Flash memory, and this includes a Python virtual machine (VM). This means that if you opt to use a low-cost 128KB MCU, you still have 73KB free for your Python-based applications.

And, speaking of these Python-based applications, they are interpreted into bytecode, which runs on the Python virtual machine. Since each bytecode equates to between 1 and 10 machine opcodes — let's average this out at 5 — this means that your 73KB of application memory is really sort of equivalent to 73 x 5 = 365KB. Furthermore, the same bytecode application will run on any target MCU that's using Synapse's stack.
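Desktop CPython isn't Synapse's VM, and the 1-to-10 expansion figure above is theirs, but the standard dis module gives a feel for how compact bytecode is. In this sketch, toggle_led is a hypothetical function that is referenced but never actually called, because dis only inspects the compiled code object:

```python
import dis

def blink(count):
    for _ in range(count):
        toggle_led()    # hypothetical; never executed, so it needn't exist

dis.dis(blink)          # prints one line per bytecode instruction
print(len(blink.__code__.co_code), "bytes of bytecode")
```

The whole loop compiles down to a few dozen bytes; each of those bytecode instructions would expand into several machine opcodes if compiled natively.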

As part of my ponderings, I also asked my chum David Ewing about his views on the C versus Python debate. David — who is the creator of our ESC Collectible wireless mesh networked “Hello There!” Badges and “Hello There!” Robots — is the CTO over at Synapse Wireless and, unlike yours truly, he is an expert programmer. David responded as follows:

C and Python are both fantastic languages and I love them both dearly. There are of course numerous technical, syntactic, and semantic differences — static vs. dynamic typing, compiled versus interpreted, etc. — but the gist of it all is this:

— C is a “close to the metal” compiled language. It is the “universal assembler.” It is clean and elegant. My favorite quote about C is from the back of my 30-year-old K&R book: “C is not a large language, and it is not well served by a large book.”

— Python, with its “dynamic typing” etc., reduces “accidental complexity.” Python is interpreted (or JIT compiled), so you can have silly errors that aren’t discovered until runtime. However, compilers don’t generally catch the hard, non-trivial bugs. For those, only testing will suffice; a solution must be rigorously tested, regardless of the implementation language.

David went on to say:

If a problem can be solved in Python, then it can also be solved in C; the reverse is not always true. However, if the problem can be solved in Python:

— The solution (source code) will be simpler than the corresponding C code.
— It will be more “readable.”
— Perhaps more importantly, it will be more “writeable” (this is an oft-overlooked quality!).

Due to the qualities noted above, the solution will have fewer bugs and be much faster to develop, and these are the real reasons to opt for Python over C for many tasks.

I'm like David (except I sport much better Hawaiian shirts) in that I appreciate the pros associated with both languages. I like the clever things you can do with pointers in C, for example, and I also appreciate the more intuitive, easy-to-use syntax of Python.

So, which language is best for embedded applications? I'm hard-pushed to say. To a large extent this depends on what you want (need) to get out of your applications. You know what I'm going to say next, don't you? What do you think about all of this? Which language do you favor between C and Python (a) in general and (b) for embedded applications? Also, if we were to widen the scope, is there another language you prefer to use in Embedded Space (where no one can hear you scream)?

34 thoughts on “Python is better than C! (Or is it the other way round?)”

  1. “In my opinion, the one big defect in python is using indentation to define blocks. Try emailing a code snippet to someone and see what happens. Try editing with a different editor that changes indentation and see what happens. Most other languages

  2. “I know exactly what will happen if I email a snippet of my code to anyone — they'll tell me I'm an idiot who shouldn't be let anywhere near a computer LOL”

  3. “Significant white space can also be a real headache when trying to do meta-programming. Then again, maybe that's a good thing, since too much meta-programming normally doesn't make for maintainable code.”

  4. “If I knew what meta-programming was, this is the point where I would make an incredibly intelligent and thought-provoking comment… but I don't, so I won't :-)”

  5. “Meta-programming is having your program create programs (well, typically functions or bits of code)…look at Python's exec function. It can be very useful when used with care…”

  6. “That Python can be used on real world embedded microcontrollers is great news. Instead of comparing C and Python, how about telling us about how the PVM is supported on a resource-limited embedded platform? How much FLASH and RAM does it require? This wil

  7. “Steve, I would highly recommend looking at the Micropython project. It's been running on ST32Fxxx MCUs for years, and just completed a successful kickstarter for doing a better port to the ESP8266. Another option for memory constrained MCUs is Lua/e

  8. “Python does not have types, so things like bytes are not defined. Even finding out the size of an integer gets weird. It is not a concept in the language. When you are accessing hardware, doing bit set and clear, the operations get… well, odd. Try this

  9. “It's true, python's generalization of the concept of a “number” is quite different than what we're accustomed to as embedded C programmers. Python does indeed have types – in the example above, type(byte) reveals that it's [type 'int']. True, there are

  10. “Nice introduction to Python Max, thanks. What I find interesting is that I don't hear much discussion about experience, libraries and reliability when I hear discussions about tool selection. While using the latest “hammer” may appear “cutting edge”

  11. “All good points — C has been the mainstay of the industry for decades now — and that comes with experience — but we also learned a lot since C was conceived — there are a lot of things I like about Python — but at the end of the day I'm a hardware gu

  12. “As always, it's all about using an appropriate tool for the job. Python (like many other dynamic languages such as Lua) interfaces very well with C and C++. So using Python for high level logic and C code for low level is very common. Python is also

  13. “Also, although I don't know how well you can apply this to embedded, but Python has a strong testing culture (unit testing & test driven development).”

  14. “If a problem can be solved in Python, then it can also be solved in C; the reverse is not always true. However, if the problem can be solved in Python: The solution will be simpler than the corresponding C code (let me add: both to understand and to imple

  15. “Speaking from my own experience as a seasoned embedded C programmer, learning python has certainly made me more capable. The comment you reference was really about using the most appropriate tool for a particular job – similar to the tradeoff of C vs Asse

  16. “Love this post. Nicely put. As a low level driver programmer, I could not see myself using Python for that; not that anyone has really suggested that, and though I love Python.”

  17. “!!! this is the 1 reason I dropped being interested in python !!! IMO big design fail, they fixed something that wasn't broken and introduced much broken design in the end: Overview of the problem: python afaik is the only mainstream programming la

  18. “For embedded I would pick LUA day and night over PYTHON. In fact I would pick LUA instead of any other dynamic lang anytime 🙂 Yes I am looking at you Javascript :P”

  19. “Lua has a somewhat different “feel” than Python. Based on the success of Python vs Lua, and MicroPython vs eLua, I'd say more programmers prefer the Python approach, but it all depends… Lua has a fair amount of success, too (especially in mobile gam

  20. “Forth and Lisp are better than either of them. :-) Actually I like Python a lot, but find C to be insufferably tedious. Cheers, - vic”

  21. “One important aspect to be considered in the Embedded world is power consumption. Which code (C or Python) in an end product consumes less power? And multiply it by the billions of microcontrollers that are in use. For servers and data centers, speed and

  22. “Yes, there will always be applications whose market economics justify the additional development expense of lower-level coding, even when the technical constraints don't necessarily mandate it. You can write even faster/tighter code in assembly language,

  23. “Taking the closely related matter of C being obsolete, I'd like to express some thoughts. We need, as a business, a faster language than C and C++ to develop in; there's a lot of money wasted on reinventing the wheel over and over again. But as engineers,

  24. “@Fanl: “Like Vala, but it would have its own standard library.” Don't be modest. You should mention that you have written a blog, “Vala Applications in Embedded Linux.””

