Bill Gatliff, Consultant

Freelance embedded developer and consultant who specializes in adapting Linux, Android, and GNU tools to custom hardware and applications. He advises the Embedded Systems Conference, is a noted author and speaker, and counts several Fortune 500 companies among his clients.

Bill Gatliff's contributions
    • An original Linux developer's claims that the OS is huge and scary are not based on fact.

    • What is and what isn't an embedded system these days may not be worth sorting out. It's time to embrace that blurry line.

    • Many embedded systems depend on obscurity to achieve security. We often design systems to download unsigned or unencrypted firmware upgrades or store unencrypted user data, a practice we justify because it's invisible to the end user and makes our lives easier. The stealthy practice, however, is no longer kosher. With the help of this public-domain encryption algorithm, we can clean up our act.
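The listing above doesn't name the algorithm the article used. As an illustration only, here is a sketch of one well-known public-domain block cipher that fits small systems, XTEA (Wheeler and Needham's design), in portable C; whether this is the cipher the article describes is an assumption.

```c
#include <stdint.h>

/* XTEA: 64-bit block, 128-bit key, 32 rounds. Public-domain design. */
#define XTEA_DELTA 0x9E3779B9u
#define XTEA_ROUNDS 32u

void xtea_encrypt(uint32_t v[2], const uint32_t key[4])
{
    uint32_t v0 = v[0], v1 = v[1], sum = 0;
    for (unsigned i = 0; i < XTEA_ROUNDS; i++) {
        v0 += (((v1 << 4) ^ (v1 >> 5)) + v1) ^ (sum + key[sum & 3]);
        sum += XTEA_DELTA;
        v1 += (((v0 << 4) ^ (v0 >> 5)) + v0) ^ (sum + key[(sum >> 11) & 3]);
    }
    v[0] = v0;
    v[1] = v1;
}

void xtea_decrypt(uint32_t v[2], const uint32_t key[4])
{
    uint32_t v0 = v[0], v1 = v[1];
    uint32_t sum = XTEA_DELTA * XTEA_ROUNDS;   /* run the schedule backward */
    for (unsigned i = 0; i < XTEA_ROUNDS; i++) {
        v1 -= (((v0 << 4) ^ (v0 >> 5)) + v0) ^ (sum + key[(sum >> 11) & 3]);
        sum -= XTEA_DELTA;
        v0 -= (((v1 << 4) ^ (v1 >> 5)) + v1) ^ (sum + key[sum & 3]);
    }
    v[0] = v0;
    v[1] = v1;
}
```

Encrypting a firmware image then becomes a loop over 8-byte blocks; the hard part in practice is key storage, not the cipher itself.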

    • The Universal Serial Bus is a useful communications interface and more popular than ever. Here are three approaches to adding USB support to an embedded system running the Linux operating system.

    • Looking for a way to update flash-based firmware in a design? This article describes a handy software architecture that will help you avoid common mistakes.

    • The first part of this article introduced newlib, a C runtime library for embedded software built using GNU tools. This time, Bill shows you how to integrate newlib into a multithreaded runtime environment that features Jean Labrosse's µC/OS.

    • newlib is often the most appropriate choice for a C runtime library in an embedded system. Read on to find out why.

    • Your mission, if you choose to accept it: design and build a pair of satellites to measure the effects of atmospheric lightning, on a shoestring budget of $120,000. Think you could? Someone else already has.

    • Managing the addition of oil to a diesel engine's fuel-oil mix turns out to be a tricky problem. Here's one design that worked well.

    • "Although Android itself is processor agnostic (it already runs in x86, ARM, and MIPS processors), Android apps are not." For certain definitions of "not", anyway. Last I heard, 2/3 of apps in the Google Play application repository had no native code, ergo they would be expected to run correctly on an Android platform regardless of the native instruction set architecture of the host machine. Furthermore, Android is perfectly happy with applications containing multiple runtime libraries, each optimized for a different instruction set architecture. So applications that require native code are not a priori prevented from being instruction set-agnostic as well.

    • After working two decades in this business, I have encountered only a handful of projects where the underlying instruction set architecture made a difference AND was also being utilized to its fullest. For the overwhelming majority, the peripheral mix of the SoC, power consumption, interrupt latency, and computational throughput were far more important than what flavor of assembly language the processor was speaking.

      A lot of the resistance to change here is due to the expense of new tools, combined with the testing effort needed to ferret out code that makes unnecessary and/or unintended assumptions about data representation: big-endian to little-endian conversion, suspicious unions, and so on. There isn't a lot one can do about the former, but writing good code (which you should do anyway) fully cures the latter. Come to think of it, writing good code addresses a large part of the tool problem too: despite the fact that I'm a hard-core Linux kernel and device driver author, I need a JTAG adapter about twice a year: to program virgin flash chips. If you routinely need to step code on the hardware, you are doing something wrong.

      Long story short: you are right that software is motivating the collapse to a two-instruction-set-architecture world: BAD software. Coalescing around one or two ISA winners won't fix that problem.

    • There is no practical way to assure that an embedded system isn't infected with code that possesses malicious intent. The only long-term solution is to deploy systems with architectures that naturally limit damage when that infection occurs. That means that in addition to being better code monkeys, we embedded developers have to really improve our system design and analysis skills. And our ability to do failure modes and effects analysis. "The what?", I hear you say. Exactly.

    • For situations where you need wide ranges AND fine precision, scaled integer arithmetic is the way to go. But then you might be working with 256-bit numbers or larger, something that usually requires a runtime library because I'm not aware of a C compiler that offers something like that as a native type. It sounds onerous, but scaled integer arithmetic can be fast--- faster than floating point in many cases. It just isn't as convenient as floating point. Maybe Jack will do an article on arbitrary precision maths with scaled integers at some point (maybe he already has, and I missed it).
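A minimal sketch of the scaled-integer idea in single-word form (the same principle extends to multi-word 256-bit values, which is where a runtime library comes in). The Q16.16 format and the helper names here are chosen for illustration, not taken from any particular article:

```c
#include <stdint.h>

typedef int32_t q16_16;            /* signed fixed point: 16 integer bits, 16 fraction bits */
#define Q16_ONE (1 << 16)          /* the value 1.0 in Q16.16 */

static inline q16_16 q16_from_int(int32_t n) { return n * Q16_ONE; }
static inline int32_t q16_to_int(q16_16 x)   { return x >> 16; }

/* Multiply: widen to 64 bits so the intermediate doesn't overflow,
 * then shift away the doubled-up fraction bits. */
static inline q16_16 q16_mul(q16_16 a, q16_16 b)
{
    return (q16_16)(((int64_t)a * b) >> 16);
}

/* Divide: pre-shift the dividend so the quotient keeps its scale. */
static inline q16_16 q16_div(q16_16 a, q16_16 b)
{
    return (q16_16)(((int64_t)a << 16) / b);
}
```

The multiply and divide each cost one widening operation and one shift, which is why this often beats software floating point on small cores.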

    • Hi Miro! One of the motivations for using Dan's approach is that it lets you potentially catch errors in accessing peripheral control registers at compile time. And if you look at the assembly language that comes out of Dan's code, you'll see that it's virtually zero-overhead. Win-win.
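Dan's exact code isn't reproduced in this comment, but the flavor of compile-time checking it refers to can be sketched in C++ with a struct overlay whose read-only registers are const-qualified. The UART register names and layout below are invented for illustration:

```cpp
#include <cstdint>

// Hypothetical UART register block. Each register is a distinct, typed
// member, so a typo in a register name fails to compile rather than
// silently poking the wrong offset, and writes to read-only registers
// are rejected by the compiler via const.
struct uart_regs {
    std::uint32_t const status;   // read-only: writing it is a compile error
    std::uint32_t       data;     // read/write
    std::uint32_t       control;  // read/write
};

// On real hardware the block would sit at a fixed address, e.g.:
//   auto &uart0 = *reinterpret_cast<volatile uart_regs *>(0x4000C000);
// Member accesses then compile to plain loads and stores -- no runtime
// penalty versus raw pointer arithmetic.
```

A host-side object of this type behaves the same way minus the volatile qualifier, which is what makes the technique testable off target.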

    • willc2010: I'm happy to write such an article as a rebuttal to yours. Just let me know when it shows up in print! :)

    • The written word is INCREDIBLY important. If you can't write clearly and concisely, how can I assume that you can even THINK clearly and concisely? A person's use of language is an important insight into their thought processes. And a clear thought process is key to good problem-solving ability. And problem-solving is what engineers do! I won't hire an engineer who can't communicate clearly. Period. The only exceptions I'll make are when the candidate can demonstrate to me the desire to develop their communications skills with the same vigor as their other essential skills. Otherwise, I prefer to make them someone else's liability.

    • TsJ: what "silly mistakes" did you spot? We'd love to hear them...

    • Abstracting hardware probably doesn't lead to better testing. Rather, what leads to better testing is making it easy to generate input data sets that exercise as much of your code as possible before it hits actual hardware. You aren't trying to "abstract" the hardware, you are trying to REPLACE it with an object that creates and consumes the same data, but is easier to work with than the actual hardware itself. A subtle but important difference.
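A minimal sketch of that "replace, don't abstract" idea, with the interface and names invented for illustration: the code under test pulls bytes through a function pointer, and the test build swaps in a fake that replays a canned buffer instead of touching hardware.

```c
#include <stddef.h>
#include <stdint.h>

/* The code under test consumes bytes through this interface. On target,
 * read_byte would point at a real UART read; in tests it points at a
 * scripted fake that produces the same kind of data. */
struct byte_source {
    int (*read_byte)(struct byte_source *self);   /* returns -1 when exhausted */
};

/* Code under test: count bytes up to a terminating newline. */
static size_t line_length(struct byte_source *src)
{
    size_t n = 0;
    int c;
    while ((c = src->read_byte(src)) >= 0 && c != 0x0A)
        n++;
    return n;
}

/* Fake source that replays a canned buffer -- no hardware involved. */
struct fake_source {
    struct byte_source base;    /* must be first, so the cast below is valid */
    const uint8_t *buf;
    size_t len, pos;
};

static int fake_read(struct byte_source *self)
{
    struct fake_source *f = (struct fake_source *)self;
    return (f->pos < f->len) ? f->buf[f->pos++] : -1;
}
```

The fake isn't an "abstraction" of the UART; it is a drop-in replacement that manufactures the same byte stream, so the parsing logic can be exercised exhaustively before it ever meets silicon.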

    • When you say "stdio", I assume you mean open/close/read/write/ioctl, right? That's a poor model for a timer peripheral, since a timer neither manufactures nor consumes data--- and stdio is all about moving bytes from one place to another in a consistent way.