Simulation Takes Off with Hardware
Some systems are too dangerous or expensive to be toyed with. Simulation can help you develop the firmware in advance for a safe and successful integration.
Engineers are continually searching for ways to shorten the new product development cycle. One way to reach this goal is to develop the hardware and software in parallel. Traditionally, this approach involves separate hardware and software development groups that perform their work simultaneously and independently. When prototype hardware and a substantial portion of the embedded code become available, the hardware and software are combined in a system integration phase and testing begins.
Too frequently, serious problems arise during the system integration phase, problems that require significant hardware rework or software workarounds. Projects can stall during system integration, bogged down with growing lists of problems, ballooning costs, and major schedule delays. Sometimes projects are cancelled as a result. Clearly, we need a better way to deal with these issues.
One approach that has proven effective is hardware-in-the-loop (HIL) simulation. This technique enables testing of the embedded software at a much earlier stage of the development cycle. By the time the system integration phase begins, the embedded software has been tested much more thoroughly than it would have been in the traditional approach. And the earlier problems are identified, the less they cost to fix.
This article describes an embedded software development project that used HIL simulation. The goal of this project was to develop and test embedded control software for an experimental dynamic "helicopter" system. Using HIL simulation, I was able to design, implement, and test the controller software without using any actual hardware beyond the embedded processor and its I/O interfaces.
When I connected the actual system hardware to the embedded controller running the new software, it worked correctly on the first attempt. The only additional work during the hardware/software integration phase was some minor controller parameter tuning due to variations between the actual system hardware and its simulated representation.
This project involved developing controller software for the Quanser 3-degree-of-freedom (3DOF) helicopter [1]. This is a tabletop electromechanical system with three rotational axes of motion controlled by two independent electrical motors, each driving a propeller. Figure 1 is a diagram of the helicopter system and its axes of motion.
Figure 1 Helicopter front and side views
Assuming that the pitch axis is near the zero angle, a large voltage applied equally to both motors will cause the helicopter to move upward in the positive elevation direction. A difference in the voltage applied to the motors will cause rotation about the pitch axis. To move the helicopter about the travel axis, it is necessary to first pitch the helicopter to a nonzero angle and then apply voltage to both motors to generate a force in the desired travel direction.
Figure 2 Helicopter control system
As shown in Figure 2, the system uses a control computer with three position encoder input signals and two motor voltage output signals plus inputs from the user that allow mode selection and joystick control. The control computer receives the position encoder input signals via an interface card specifically designed for this purpose and generates analog output voltages that drive the two motors using digital-to-analog converters (DACs). The DAC outputs serve as input signals to power amplifiers that provide the necessary current to operate the motors.
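The per-frame structure of this loop (read the three encoder angles, apply the control law, write the two motor voltages) can be sketched in Python. The gains, function names, and toy sum/difference control law below are my own illustrative assumptions, not the article's implementation:

```python
COUNTS_PER_REV = 4096  # encoder resolution described in the article

def counts_to_degrees(counts):
    """Convert a raw encoder count to an angle in degrees."""
    return counts * 360.0 / COUNTS_PER_REV

def compute_motor_voltages(elev_deg, pitch_deg, elev_cmd, pitch_cmd,
                           k_elev=1.0, k_pitch=0.5):
    """Toy proportional control law (placeholder gains):
    elevation error drives the sum of the motor voltages,
    pitch error drives their difference."""
    v_sum = k_elev * (elev_cmd - elev_deg)
    v_diff = k_pitch * (pitch_cmd - pitch_deg)
    return v_sum + v_diff, v_sum - v_diff  # (front motor, back motor)

# One controller frame with sample encoder readings in place of real I/O
elev = counts_to_degrees(512)    # 45.0 degrees
pitch = counts_to_degrees(-64)   # -5.625 degrees
v_front, v_back = compute_motor_voltages(elev, pitch, elev_cmd=0.0, pitch_cmd=0.0)
```

In the real controller these values would come from the encoder interface card and go out through the DACs; here they are plain numbers so the frame logic can be exercised in isolation.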
The position encoders measure the motion about each of the three helicopter axes. These devices optically sense rotational motion and produce a digital angular position measurement. The position encoders quantize the measurements with a resolution of 4,096 steps per 360° of rotation, or 0.08789° per quantization step. The output signals from each encoder consist of two TTL-level signals, Phase A and Phase B, that toggle between high and low voltage levels as the axis turns. The two signals have a phase difference that enables determination of the direction of motion, as shown in Figure 3. The frequency of the pulses is proportional to the rotation rate of the axis.
Figure 3 Position encoder output signals
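The direction logic in Figure 3 can be illustrated with a small Python decoder (a sketch of my own, not part of the article's toolchain): the (Phase A, Phase B) pair steps through a four-state Gray-code sequence, and the order of the transitions reveals the direction of rotation.

```python
DEG_PER_COUNT = 360.0 / 4096  # 0.08789 degrees per quantization step

# Legal forward transitions of the (A, B) pair: 00 -> 01 -> 11 -> 10 -> 00.
# The reversed sequence indicates rotation in the opposite direction.
_FORWARD = {(0, 0): (0, 1), (0, 1): (1, 1), (1, 1): (1, 0), (1, 0): (0, 0)}

def decode_step(prev, curr):
    """Return the signed count change for one sampled (A, B) transition."""
    if curr == prev:
        return 0
    return +1 if _FORWARD[prev] == curr else -1

def count_position(samples):
    """Accumulate a signed position count from a stream of (A, B) samples."""
    return sum(decode_step(p, c) for p, c in zip(samples, samples[1:]))
```

Four forward transitions accumulate a count of +4, that is, 4 × 0.08789° of rotation; the same samples in reverse order accumulate -4.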
The performance goal for the helicopter controller is to move the travel and elevation axes to arbitrary commanded positions within a limited time, nominally ten seconds. In addition, the software for the helicopter controller must support several other modes of operation. The full set of controller modes is:
- Off: The controller software starts up in this mode, which has zero volts applied to both motors. Once the system leaves this mode, it can only be reentered from Null. When Off is entered from Null, the elevation axis is commanded to move slowly down to just above the table top and then the motor voltages are set to zero. This gently drops the motor assembly to the table.
- Null: When selected from Off, this mode powers on the motors; from any mode, it commands all axes to the zero position. The zero positions are the positions where the pitch and travel axes were at system startup, with the propeller assemblies lifted to the horizontal position in elevation, as shown in Figure 1.
- Random: At ten-second intervals, this mode generates new random values from a predefined range for the travel and elevation axis position commands. The controller software moves the helicopter to these commanded positions.
- Autopilot: In this mode, the joystick generates elevation and travel commands for the controller. Fore and aft joystick motion controls the elevation position command. Side-to-side motion controls the travel position command. The controller moves the helicopter to track the commanded positions.
- Manual: In Manual mode, the joystick directly generates sum and difference voltages to drive the motors. Fore and aft joystick motion controls the sum of the two motor voltages and side-to-side motion controls the difference between the motor voltages. The system is exceptionally difficult to control in this mode. The controller will automatically switch to the Null mode if any axis of motion exceeds a position limit. A limit violation usually occurs within a few seconds after entering this mode.
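The mode rules above amount to a small finite state machine. The sketch below is my own Python rendering of those rules; the transitions beyond what the article states explicitly (Off reachable only from Null, and a limit violation forcing Null) are assumptions for illustration.

```python
MODES = {"Off", "Null", "Random", "Autopilot", "Manual"}

def next_mode(mode, request=None, limit_violated=False):
    """One step of the mode-selection logic (assumed rules, see above)."""
    if limit_violated and mode != "Off":
        return "Null"                             # e.g. how Manual usually ends
    if request == "Off":
        return "Off" if mode == "Null" else mode  # Off only reachable from Null
    if request in MODES and request != mode:
        return request
    return mode
```

A request to enter Off from Random, for example, is rejected, while a position-limit violation in Manual immediately switches the system to Null.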
Given these requirements for system capabilities and performance, the implementation and testing of the controller software could proceed. Simulation techniques were used where possible to speed the development and testing of the helicopter controller software.
To perform realistic HIL simulation testing of the embedded software, it was necessary to employ the embedded processor and its associated I/O devices. For many embedded systems, this is a small subset of the entire system that can be assembled early in the development process. I constructed a simulation of the helicopter hardware and its interactions with the external environment and interfaced this simulation to the embedded controller through the controller's I/O interfaces. Both the embedded controller and the simulation of the helicopter operated as real-time systems.
Early in the development cycle of a complex embedded product, it is common to simulate the complete system operating in its intended environment. This simulation is usually a non-real-time application developed using a dynamic system simulation tool such as Simulink, and it can serve as the basis for an HIL simulation. It is sometimes necessary to simplify and optimize the models contained in the simulation to make them suitable for use in a real-time simulation. In this project, no modifications were necessary to make the simulation run in real time.
Because simulations of complex systems require many sophisticated numerical algorithms, specialized software tools have been developed to simplify the task:
- Simulink is an add-on to MATLAB that enables dynamic system simulation in a block diagram-oriented graphical environment. Developing a simulation in Simulink involves dragging blocks from a palette onto a drawing area and connecting the blocks with lines that represent signal flows. Figure 4 is a Simulink block diagram of the position encoder model used in the helicopter project. This model accepts an angular position in radians as its input and generates the Phase A and Phase B signals as outputs. It also outputs an Index signal, which indicates when the axis is at its zero location. The helicopter position encoders do not produce Index signal outputs, so this output from the Simulink model was not used.
- Stateflow is an add-on to Simulink that allows the implementation of finite state machine models. In the helicopter project, a Stateflow model implemented the helicopter mode selection logic.
- Real-Time Workshop generates C code from Simulink block diagrams. This code is used by other tools that provide compilation and execution targets. In this project, these additional tools were Real-Time Windows Target and xPC Target.
- Real-Time Windows Target allows a simulation to be compiled and executed as a real-time process on a PC running Windows. It can run simultaneously with the Windows operating system. For this project, Real-Time Windows Target performed the HIL system simulation and executed on the host computer where the helicopter software was developed and controlled.
- xPC Target enables the execution of a simulation on a PC that functions as a dedicated real-time controller. It provides a real-time multitasking kernel for use on an embedded processor with limited hardware resources. In this project, xPC Target was used to generate and execute the real-time code for the helicopter controller on a separate PC, which served as the "embedded" controller.
Figure 4 Position encoder model
All of these tools are from The MathWorks (www.mathworks.com).
The first step in the development of the controller software was to implement a simulation of the complete helicopter-controller system. The top-level diagram of this simulation is shown in Figure 5. The two large blocks represent the helicopter system itself and the digital controller. The two smaller blocks labeled "Joystick" and "Mode Command" provide the user inputs to the controller.
Figure 5 Helicopter and controller model
The "Helicopter" block in Figure 5 contains the Simulink model of the dynamic behavior of the helicopter, which appears in Figure 6. This model uses several Simulink blocks such as transfer functions, summing junctions, and integrators. The block labeled "Limited Motion" contains a model of the elevation axis motion that is limited in the downward direction by the tabletop. When the simulated helicopter hits the tabletop, the velocity in all three axes of motion is set to zero, which approximates the behavior of the real helicopter. The three quantizers near the right side represent the quantization effects of the position encoders.
Figure 6 Helicopter dynamics model
The "Limited Motion" block is a subsystem. Subsystem blocks allow hierarchical sets of diagrams to control complexity during simulation development. Subsystems can be nested to any level in much the same manner that function calls can be nested within other functions.
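The tabletop limit can be approximated in a few lines of Python. This is a deliberately simplified sketch of the "Limited Motion" behavior, not the article's Simulink model, and the limit angle is an assumed value:

```python
TABLE_LIMIT_DEG = -27.5  # assumed lower elevation limit; not from the article

def limited_motion_step(state, accel, h):
    """One Euler step per axis with a tabletop floor on elevation.
    state: {axis: (angle_deg, rate_deg_s)}, accel: {axis: deg_per_s2}."""
    new = {}
    for axis, (ang, rate) in state.items():
        rate += accel[axis] * h
        new[axis] = (ang + rate * h, rate)
    if new["elevation"][0] <= TABLE_LIMIT_DEG:
        # hitting the table clamps elevation and stops all three axes
        new = {axis: (a, 0.0) for axis, (a, _) in new.items()}
        new["elevation"] = (TABLE_LIMIT_DEG, 0.0)
    return new
```

When the simulated elevation crosses the floor, the velocity in every axis is zeroed, approximating the abrupt stop of the real helicopter described above.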
Figure 7 Helicopter controller model
The contents of the "Controller" subsystem in Figure 5 are shown in Figure 7. The three primary controller inputs are the quantized angle measurements for the three axes; the controller outputs are the drive voltages for the two motors. Major blocks in this diagram include the "Autopilot," which drives the helicopter to a commanded position, the "Command Generator," which produces travel and elevation position commands in the various operation modes, and the "Mode Control" block, which implements a finite state machine for selecting the different helicopter operational modes.
Figure 8 Helicopter mode control diagram
Figure 8 shows the Stateflow diagram contained within the "Mode Control" block. This diagram contains logic to perform calibration of the joystick at system startup, mode changes under user control, automatic switching to the Null mode if a position limit is violated, and control of system shutdown.
The contents of the "Controller" block in Figure 5 provide a complete implementation of the embedded software. A common approach is to perform the embedded software development as a separate process that uses the simulation as an executable representation of the software requirements. However, it is much more efficient to use the implementation of the controller in the simulation as the "source code" for the embedded software.
In this project, I copied the "Controller" block from Figure 5 into a new Simulink project and added the appropriate I/O device blocks to the diagram. I then invoked Real-Time Workshop to generate the C code, compile it, and download it to the "embedded" PC controller. This completed the development of the embedded software.
Using the non-real-time Simulink simulation of the helicopter and the controller, I began developing the HIL simulation. First, I created a new Simulink project and copied the block labeled "Helicopter" from Figure 5 into it. This simulation models the helicopter dynamics and includes appropriate I/O device interfaces. Real-Time Windows Target supports a variety of I/O devices. The I/O requirements for the HIL simulation include two ADC inputs (to receive the motor command voltages from the controller) and six TTL digital outputs (to generate the Phase A and Phase B signals for each of the three simulated position encoders).
A laptop PC running Windows is the host system for this application, so an I/O device meeting the above requirements in the PCMCIA format was needed. The National Instruments DAQCard-1200 meets these requirements and includes a ribbon cable that connects the interface card in the computer to a separate connector block where cable connections can be made to the embedded PC.
The helicopter simulation executes at a fixed frame rate and the TTL outputs that simulate the Phase A and Phase B signals are updated once per simulation frame. Since the pulse rate of the position encoder signals is proportional to the angular rate of motion about the axis, the simulation frame rate will limit the maximum angular rate that can be accurately represented.
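One simple way to drive those TTL outputs (a sketch of my own, assuming the simulation quantizes the angle once per frame) is to map the current encoder count to the four-state quadrature pattern:

```python
COUNTS_PER_REV = 4096

# (Phase A, Phase B) levels indexed by count modulo 4
_QUAD = [(0, 0), (0, 1), (1, 1), (1, 0)]

def encoder_outputs(angle_deg):
    """Phase A / Phase B levels to output for the current simulation frame."""
    count = int(round(angle_deg * COUNTS_PER_REV / 360.0))
    return _QUAD[count % 4]
```

Because the count can advance by at most one step per frame before information is lost, the frame rate bounds the angular rate the simulation can represent, which the following paragraphs quantify.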
Using this approach to modeling the position encoder signals, the highest angular rate that can be simulated occurs when the Phase A and Phase B signals are each toggled on alternate simulation frames, which advances the encoder output by one quantization step (0.08789°) per frame. Under this assumption, Equation 1 gives the maximum angular rate ωmax (in degrees per second) in terms of the simulation update interval h (in seconds):

ωmax = 0.08789°/h (Equation 1)
From the results of digital simulations of the helicopter behavior, it is apparent that the pitch axis has the largest peak angular rates and that these rates rarely exceed 100°/sec. It is desirable to have h be no smaller than necessary so that the HIL simulation does not tax the capabilities of the computer on which it runs. By balancing these requirements, a value of 500µs was chosen for h, which results in an update rate of 2,000 frames per second. This gives a maximum simulated angular rate of 175.8°/sec, which comfortably exceeds the maximum expected angular rate.
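The arithmetic behind this choice, using the numbers from the text, is brief:

```python
DEG_PER_COUNT = 360.0 / 4096   # 0.08789 degrees per encoder step
h = 500e-6                     # simulation update interval: 500 microseconds
frame_rate = 1.0 / h           # 2,000 frames per second
w_max = DEG_PER_COUNT / h      # maximum representable rate: ~175.8 deg/sec
```

With one quantization step allowed per frame, halving h would double the representable rate at the cost of doubling the computational load.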
The 2,000fps update rate of the helicopter simulation is much higher than required for accurate modeling of the helicopter's dynamics. Because of this, it is not necessary to use a high-order integration algorithm to achieve accurate results. Good accuracy can be obtained using a relatively simple second-order integration algorithm. The Simulink "ode2" method, a Heun (explicit trapezoidal) integrator, was selected for this simulation. This makes the execution of the simulation somewhat more efficient than it would be if a higher-order, more complex integration algorithm had been used instead.
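For reference, a second-order step of this kind looks like the following in Python. This is a generic sketch of Heun's (explicit trapezoidal) method, not code from the project:

```python
def ode2_step(f, t, y, h):
    """One fixed-step Heun (explicit trapezoidal) step for y' = f(t, y):
    predict with Euler, then average the slopes at both ends of the step."""
    k1 = f(t, y)
    y_predict = y + h * k1
    k2 = f(t + h, y_predict)
    return y + 0.5 * h * (k1 + k2)
```

For y' = t the step is exact: one step of length 1 from (0, 0) returns 0.5, the area under the line, which is why the method is accurate enough at a 500µs step despite its simplicity.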
To download and run the embedded software on the target PC, I connected the host and target computers together with a serial cable and booted the target system kernel from a floppy disk. From the Simulink diagram of the Controller, I could then download and run the software for the embedded controller. After connecting the I/O devices in the target system to the appropriate terminals of the DAQCard-1200, I started the Simulink simulation of the helicopter running in Real-Time Windows Target on the host computer system. From the Simulink diagram, I could then send commands to the embedded controller to start its operation and to "fly" the simulated helicopter.
In the HIL simulation mode of operation, I was able to exercise all aspects of the embedded software and fix some problems with its design and implementation. I could perform all this testing without any actual moving hardware. At the conclusion of this round of HIL simulation testing, I had an embedded application that had been thoroughly tested and was likely to undergo a quick and successful integration with the actual hardware.
I intentionally avoided running the embedded software with the helicopter hardware until the software had undergone HIL testing, both to demonstrate the value of HIL simulation and to reduce the risk of damaging the hardware. After the HIL testing was completed, I disconnected the cables leading to the DAQCard-1200 interface and connected them to the helicopter hardware. I powered the system up and commanded the helicopter to the Null position and then into the Random mode, where it flew to randomly generated travel and elevation positions every ten seconds. It worked reasonably well on this first try, although there was a bit more oscillation and overshoot in response to the commands than had been indicated by the HIL simulation.
Some tuning of the controller gains was necessary to get acceptable system performance in all modes of operation. The HIL simulation did not quite match the behavior of the actual system because the simulation of the helicopter was simplified in some ways and the system mass properties used in the simulation did not quite match those of the actual system.
It is always necessary to simplify to some degree when developing a simulation because it is not possible to perfectly model every factor influencing the behavior of a real world system. The differences between the simulation and the actual system proved to be minor and the parameter tuning required for the embedded software was a straightforward procedure.
In this project, the HIL simulation provided major benefits to the development process. The entire embedded application had been thoroughly tested in a realistic environment before it ran for the first time with the system hardware. This avoided risks of damaging the hardware and made it easier to identify and repair problems with the embedded software. The integration process turned out to be merely a matter of tweaking a few parameters instead of the much larger task of attempting to get a large, untested embedded software application functioning on new hardware, which typically contains its own set of problems.
This project demonstrated the value of HIL simulation techniques in developing software for complex embedded systems. HIL simulation techniques enable thorough testing of embedded software early in the development process and reduce the risks involved with running untested software on valuable prototype hardware. Proper application of HIL simulation techniques leads to higher quality products developed in less time than traditional development approaches.
Jim Ledin is a consulting electrical engineer. He has developed several HIL simulations for surface-to-air and air-to-air missile systems at the Naval Air Warfare Center in Point Mugu, CA. He is the author of Simulation Engineering (CMP Books, 2001). Jim welcomes questions and comments from readers and can be reached at email@example.com.
[1] Apkarian, Jacob. "Systematic Controller Design and Rapid Prototyping," available at www.mathworks.com/company/digest/dec98/systematic.shtml