
What multicore and longitude have in common

In a column a year ago (www.embedded.com/201800330), Bernie Cole compared the state of multicore software tools to the Charge of the Light Brigade: current tools “are as outdated and useless as the sabers, one-shot pistols, and horses of the ill-fated 600 were against cannon, repeating rifles, and mechanized equipment they rode against.”

While it's not a bad comparison, a better analogy may be navigating by sea. First, we figured out that the world was round. Then we developed tools to measure longitude, so we could actually figure out where we had come from, where we were, and where we were going. Prior to that, sailors mostly kept within sight of shore or risked getting lost and foundering on some unexpected rocky coast. Those famous sailors, such as Columbus, Magellan, and Cook, who successfully navigated large distances over open water owed their success to luck and pluck more than to knowing with certainty where they were and where they were going. And for all the successes we learned about in our history books, there were many more navigational failures.

In addition to death and human suffering due to scurvy, as Dava Sobel points out in her book Longitude, "the global ignorance of longitude wreaked economic havoc on the grandest scale." Ships crossing the oceans were confined to a few well-known passages, routes as familiar to pirates and navies as to the merchants.

Similarly with multicore, designers have kept close to shore, adding more processing power only to dedicate it to a single new application. No sharing of those resources, because maybe the earth is flat. And, if we share processor and memory resources, the monsters of the deep will devour us.

We know the electronics world is round, and there are real dangers to navigating its open water, but they aren't imaginary monsters. Now it's time to develop the metrics, tools, and methodology that will help software developers navigate the multicore waters. Otherwise, the consequences of not being able to fully utilize multicore platform resources could be catastrophic, delaying new generations of products that help drive the global economy.

What kinds of tools are needed? Everything from parallel programming tools to new multicore debuggers to analysis and profiling tools. Let's focus here on the debugging and testing of code on multicore platforms. Currently, code is run either on hardware prototypes or on the final hardware itself. On-chip debug tools have been hot for the last few years, but to use them, the software team has to wait until the hardware is available. Virtual platforms, or virtualized software development environments, have been proposed as a way to get software running earlier in the development process. These have not gained widespread traction in the community, perhaps due to cost, proprietary languages, slow simulation speed, or the lack of necessary tools running on these platforms. This is akin to knowing where you are on the ocean, but not knowing where the dangers lurk or how to escape trouble.

According to the recent Embedded Market Survey done by Embedded Systems Design Magazine, testing takes up an increasingly large share of development time. Debug tools are among the most important tools that developers use. In addition, growing code bases and increasing code complexity are key concerns. These problems will only get worse as multicore platforms become more broadly adopted. The next generation of tools should be developed from the ground up with multicore systems in mind.

These tools should provide:

  • A unified multicore software development environment with integrated simulator and debugger so that individual processors, threads and subsystems can be stopped, started and stepped independently from the rest of the platform.
  • Deterministic simulation, so that bugs, once found, can be reproduced, fixed, and checked for again in regression tests.
  • Multi-level debugging applicable to low-level drivers, operating systems and applications with insight into peripherals in addition to the processors.
  • Heterogeneity. It shouldn't matter whether the processors are the same or different, or whether the platform is SMP, AMP or a mix.
  • Software verification tools for automated testing of typical multicore bugs and for user-defined checks of functionality specific to the system.

There are some real monsters of the multicore deep: race conditions, locking problems and more, most of them involving shared resources. These are well-known problems, difficult but well known, and tools should be able to test for them.
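To make the first of those monsters concrete, here is a minimal sketch (not from the original column) of the classic data race: two threads incrementing a shared counter. Because the read-modify-write is not atomic, the unsynchronized version usually reports a final count well below the expected total; setting use_lock to 1 guards the increment with a mutex and makes the result deterministic.

/* Minimal sketch of a classic multicore data race (pthreads).
 * Two threads each increment a shared counter ITERATIONS times.
 * Without the mutex, the read-modify-write operations interleave
 * and the final count is usually less than 2 * ITERATIONS. */
#include <pthread.h>
#include <stdio.h>

#define ITERATIONS 1000000

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    int use_lock = *(int *)arg;
    for (long i = 0; i < ITERATIONS; i++) {
        if (use_lock) pthread_mutex_lock(&lock);
        counter++;                     /* unsynchronized when use_lock == 0 */
        if (use_lock) pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    int use_lock = 0;                  /* set to 1 to fix the race */
    pthread_t t1, t2;

    pthread_create(&t1, NULL, worker, &use_lock);
    pthread_create(&t2, NULL, worker, &use_lock);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);

    printf("expected %d, got %ld\n", 2 * ITERATIONS, counter);
    return 0;
}

Build with the -pthread flag; a dynamic race detector such as ThreadSanitizer (the -fsanitize=thread option in gcc or clang) will flag the unsynchronized increment, which is exactly the kind of automated checking the tool wish list above calls for.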

An interesting note regarding virtual platforms surfaces here. Just as in the real hardware, the performance bottleneck is not the processor but the communications between processors and memories. A virtual platform for multicore must not only have good processor modeling and simulation technology; it must also handle memories of all varieties, and handle them smoothly, easily, and with speed. Speed matters: hundreds of millions of instructions per second are required, in fact. Whether using pre-defined tests for special conditions or just running random stimuli, many "cycles" need to be run to shake out all the bugs from this next generation of multicore software.

It was the development of a reliable, mechanical, non-pendulum-based clock that enabled the measurement of longitude at sea, ushered in the next generation of navigation, and truly started us on the path to a global economy.

It is clear that multicore is the key to the next generation of embedded systems; now we need the next generation of tools to enable us to navigate the multicore oceans.

Larry Lapides is vice president of sales at Imperas. Prior to joining Imperas, he ran sales at Averant and Calypto Design Systems, and was vice president of worldwide sales at Verisity. He was recently an Entrepreneur-in-Residence at Clark University's Graduate School of Management, and holds an MBA from Clark University. He also holds an MS in Applied & Engineering Physics from Cornell University and a BA in Physics from the University of California, Berkeley.
