DesignCon 2015

Single core to multicore: Addressing the system design paradigm shift with project management and software instrumentation

Don Harbin, Mentor Graphics Corporation

February 18, 2014


You are a software/systems development lead on a complex embedded development project. There are many requirements to be met in order to satisfy the project specifications as well as an aggressive delivery timeline. The project is entering the integration phase. The functionality seems to be working well and you’re feeling pretty good about things.

But then it happens: initial tests show your system performing at 1000% over its requirements! Or, as you integrate the disparate components and begin applying stress tests, resets occur so frequently that your system looks like a reboot test.

More functionality into faster, more powerful devices
With the exponential growth in the complexity of embedded systems, the above scenario is becoming all too common. Consider the smartphones and tablets now hitting the market with four processor cores plus a GPU core, such as Qualcomm's Snapdragon, while other suppliers such as Samsung are advertising eight-core (heterogeneous) devices for next-generation mobile products.

Then there are higher-end devices such as the LSI Axxia Communication Processors (supporting 16 ARM Cortex-A15 cores) for use in networking/telecom applications. It's safe to assume this trend toward more functionality will not slow down any time soon.

Figure 1 shows an example of such a system and its possible components. This example could be a tablet, a mobile device, or even an automotive infotainment system. The demands on devices from handheld to high-end are converging, and these systems are being asked to play Flash video, stream applications over Bluetooth, perform on-the-fly security tasks, be ready to take incoming calls, and more, all while users in many cases expect the user interface (UI) not to lose any responsiveness to touch gestures.


Figure 1: The inbound and outbound flow of data among devices continues to converge at alarming rates; system functionality needs to keep up with the increased demand for more and more data.

Multi-threaded, multicore, and even multi-OS hardware/software embedded systems lead to extremely difficult-to-diagnose interdependent issues such as non-optimized use of shared resources, including the processors themselves! In some cases, problems may not arise until integration starts, and some of these may have the potential to kill a project.
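
To make the shared-resource problem concrete, here is a minimal, hypothetical C sketch (not from any real project) in which one coarse-grained lock serializes four worker threads. A quad-core device gains almost nothing over a single core, yet nothing fails in a way a functional test would catch:

    #include <pthread.h>

    /* Hypothetical example: one coarse lock around shared state
     * serializes every core; the quad-core system runs little
     * faster than a single-core one, and no functional test fails. */
    static pthread_mutex_t shared_lock = PTHREAD_MUTEX_INITIALIZER;
    static long shared_counter = 0;

    static void *worker(void *arg)
    {
        (void)arg;
        for (long i = 0; i < 1000000; i++) {
            pthread_mutex_lock(&shared_lock);   /* all cores queue here */
            shared_counter++;                   /* tiny critical section */
            pthread_mutex_unlock(&shared_lock);
        }
        return NULL;
    }

    int main(void)
    {
        pthread_t t[4];
        for (int i = 0; i < 4; i++)
            pthread_create(&t[i], NULL, worker, NULL);
        for (int i = 0; i < 4; i++)
            pthread_join(t[i], NULL);
        return 0;
    }

A debugger paused at a breakpoint shows each thread doing exactly what it should; only a time-ordered view of the system reveals that the cores spend most of their time waiting on one another.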

But in this article we suggest another option: pairing process with technology. That is, leverage a new technical solution for solving these problems, then merge that solution into the project's software development processes to maximize the benefits.

Mitigating risk
Sound project management includes up-front risk mitigation plans. Thus, if you agree that what has been shared so far is an inherent risk in your upcoming projects, read on.

What has the mitigation for such risks been historically? Answer: people. Mitigation plans often add developers during the later phases of development to fix the issues. But bringing in a team at that point has its own set of risks: new team members need ramp-up time, and the current team must set aside cycles to train them.

Couple that with the fact that some of these issues will be very difficult to root-cause, demanding the attention of the team's experts, and adding developers can actually cause a project to lose ground. To give those senior developers every advantage in solving these complex issues, traditional software debugging techniques are no longer adequate on their own; the experts need new methods to resolve these problems efficiently.

Leveraging instrumentation
The new method proposed herein is the incorporation of software instrumentation to analyze the behavior of the system and help debug complex issues in a way that complements traditional software debug. Instrumentation, in this case, is defined as the insertion of code that generates trace data, which in turn reveals important information about the state and flow of a software application.
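
As a minimal sketch of what such instrumentation can look like in C (the trace_event() helper and event names here are hypothetical; commercial tools typically log compact binary records to a RAM buffer rather than text to a console):

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical trace point: records what happened, when, and an
     * associated value, revealing the state and flow of the software. */
    static void trace_event(const char *event, long value)
    {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        fprintf(stderr, "TRACE %ld.%09ld %s value=%ld\n",
                (long)ts.tv_sec, (long)ts.tv_nsec, event, value);
    }

    static void handle_request(long req_id)
    {
        trace_event("request_enter", req_id);  /* flow: entered handler */
        /* ... actual processing ... */
        trace_event("request_exit", req_id);   /* flow: left handler */
    }

    int main(void)
    {
        handle_request(42);
        return 0;
    }

Post-processing the timestamped records, or feeding them to a trace analysis tool, exposes ordering, latency, and contention that a breakpoint debugger cannot show without perturbing the very timing under investigation.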

Though instrumentation has been used informally for many years, it has matured greatly since the days of "printf", and its inclusion in formal software development processes for complex system analysis is long overdue. It is, however, an investment that must be designed in from inception if a project is to maximize the benefits. A developer's insight from a debug session is lost when the target is powered off; when a project invests in instrumentation, the team captures much of what it learns in the code itself, where it can be leveraged over the life of the program.
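
Designing instrumentation in from inception can be as simple as a build-time switch, so trace points remain in the source for the life of the program and compile away to nothing in release builds. A hedged sketch, building on the hypothetical trace_event() helper above:

    void trace_event(const char *event, long value);  /* from the sketch above */

    /* Hypothetical build-time switch: define TRACE_ENABLED in debug
     * builds; in release builds the macro expands to nothing. */
    #ifdef TRACE_ENABLED
    #define TRACE(event, value) trace_event((event), (value))
    #else
    #define TRACE(event, value) \
        do { (void)(event); (void)(value); } while (0)  /* compiled out */
    #endif

    void motor_set_speed(int rpm)
    {
        TRACE("motor_set_speed", rpm);  /* lesson captured in the code */
        /* ... drive the motor ... */
    }

Because the trace points stay in the codebase, the hard-won knowledge of where the interesting state transitions live is preserved for every future integration and maintenance cycle, not just for one debug session.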
