
Building better memory management for high performance wired/wireless networks: Part 2

To evaluate the performance of the proposed variable-size memory pool scheme described in Part 1 of this series, particularly as it compares with fixed-size pools, we used discrete-event simulation to generate random requests to the memory pool.

The requests are generated in accordance with a Poisson process with a given rate. The request size is also randomly chosen from a specified range of values. The requested buffer is held for a random period of time and then released.
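For illustration, here is a minimal ANSI C sketch of how such a request stream could be generated. The function names, the use of rand(), and the exponential holding-time model are assumptions made for the sake of the example, not the exact generator used in this study.

#include <stdlib.h>
#include <math.h>

/* Exponentially distributed inter-arrival time for a Poisson process
   with the given rate (requests per unit time). */
static double next_interarrival(double rate)
{
    double u = (rand() + 1.0) / ((double)RAND_MAX + 2.0);  /* u in (0,1) */
    return -log(u) / rate;
}

/* Request size drawn uniformly from [min_size, max_size] bytes. */
static size_t next_request_size(size_t min_size, size_t max_size)
{
    return min_size + ((size_t)rand() % (max_size - min_size + 1));
}

/* One simulated request: arrival time, size, and release time. The
   holding time is assumed exponential with the given mean. */
typedef struct {
    double arrival_time;
    double release_time;
    size_t size;
} request_t;

static request_t next_request(double now, double rate,
                              size_t min_size, size_t max_size,
                              double mean_hold)
{
    request_t r;
    r.arrival_time = now + next_interarrival(rate);
    r.size         = next_request_size(min_size, max_size);
    r.release_time = r.arrival_time + next_interarrival(1.0 / mean_hold);
    return r;
}

In such a simulator, each generated request is placed on an event queue, an allocation is attempted at its arrival time, and the buffer (if granted) is returned at its release time.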

Each scenario is run until 1 million requests have been generated. We execute multiple experiments to assess the performance of the schemes under different loads, request patterns, and pool configurations: the number of partitions and the partition size for the variable-size pools, and the number of buffers and the buffer size for the fixed-size pools. The following metrics are used to capture the performance effects:

Bandwidth Admission Probability: This quantity is the ratio between the sum of the sizes of admitted allocation requests and the sum of the sizes of all allocation requests. It directly affects the overall system throughput, where the throughput is the total requested rate times the bandwidth admission probability.

Memory Utilization: This quantity reflects the extent to which the overall pool memory is utilized. It accounts for all the memory used by the pool, including the overheads associated with the supporting structures and metadata.

It is obtained by summing, over all admitted requests, the request size times the holding time of the allocated PG/buffer, and dividing by the total simulation time multiplied by the overall memory size associated with the pool structure. A related quantity, the pool memory utilization, counts only the memory available to the pool user and excludes the overheads.
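As a concrete illustration, the following C sketch shows one way both metrics could be tallied during the simulation; the structure and accumulator names are assumptions made for this example rather than the actual simulator code.

/* Running tallies, updated as the simulator processes allocation
   attempts and releases. */
typedef struct {
    double requested_bytes;  /* sum of sizes of all allocation requests */
    double admitted_bytes;   /* sum of sizes of admitted requests       */
    double byte_time;        /* sum of (admitted size * holding time)   */
} sim_stats_t;

static void record_request(sim_stats_t *s, double size, int admitted)
{
    s->requested_bytes += size;
    if (admitted)
        s->admitted_bytes += size;
}

static void record_release(sim_stats_t *s, double size, double holding_time)
{
    s->byte_time += size * holding_time;
}

/* Bandwidth admission probability: admitted bytes over requested bytes.
   Throughput is the total requested rate times this probability. */
static double admission_probability(const sim_stats_t *s)
{
    return s->admitted_bytes / s->requested_bytes;
}

/* Overall memory utilization: accumulated byte-time divided by the total
   simulated time times the overall pool memory (including overheads).
   Passing only the user-allocatable size instead gives the pool memory
   utilization. */
static double memory_utilization(const sim_stats_t *s,
                                 double total_time, double pool_bytes)
{
    return s->byte_time / (total_time * pool_bytes);
}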

Testing variable- vs fixed-size memory pools
In the first scenario we examine the performance of the fixed- and variable-size pools. We generate allocation requests with a Poisson process, varying the average inter-arrival time from 0.0005 to 0.0035 time units in increments of 0.0005 time units.

The memory resource is reserved for an average of 1 time unit. The allocation request size varies uniformly between a minimum of 80 bytes and a maximum of 2000 bytes. The fixed-size pool has 272 buffers of 2048 bytes each, whereas the variable-size pool consists of 1024 partitions of 512 bytes each.

These configurations result in a total of 557056 bytes and 524288 bytes of user-allocatable space in the fixed- and variable-size pools, respectively. The sizes are carefully chosen such that the overall pool memory size, including overheads, is almost equal in the two cases. The three performance metrics comparing the two schemes are provided in Figure 5 below.

Figure 5: Performance of the variable-size (a, top) versus fixed-size (b, bottom) memory pools.

As can be seen from the figure, the variable-size pool consistently outperforms the fixed-size pool in terms of bandwidth admission and both overall and pool memory utilization, and its combined utility is also higher.

Moreover, it satisfies requests at a faster rate, as indicated by the larger slope of the line in Figure 5a. In other words, it reaches 100% request satisfaction earlier than the fixed-size pool (at an inter-arrival time of 0.003 units, the variable-size pool admits 100% of requests whereas the fixed-size pool is still at 80%).

Figure 5b reveals an interesting behavior: there is a point at which the memory utilization and overall system utility are maximized, beyond which the system is over-engineered and memory sits idle. The designer needs to select the optimum size in light of the expected load and the required throughput, which is governed by the bandwidth admission.

Effect of the Minimum Request Size
To examine the performance behavior when the minimum request size changes, we generate requests at a rate of 500 requests per unit time with a maximum request size of 2000 bytes, and vary the minimum request size over the set {100, 200, 400, 800, 1600} bytes.

We inspect the performance for a variable-size pool of 1024 partitions of 512 bytes each and a fixed-size pool of 272 buffers of 2048 bytes each. The results are shown in Figures 6a and 6b below.

The fixed-pool admission probability does not change much, since the request size is always less than the buffer size. Its memory utilization increases as expected, since larger requests occupy more of each buffer. For the variable-size pool, the larger the minimum request size, the smaller the admission probability.

Beyond a certain size, the admission probability of the fixed-size pool is better. For the variable-size pool, the utilization increases as the minimum request size increases; beyond a certain value, the utilization of the fixed-size pool is better since it admits more requests.

The issue the designer needs to inspect here is the traffic mix. If requests tend to be close to the maximum buffer size, a fixed-size pool is probably more efficient.

Figure 6: Performance of the variable-size (a, top) and fixed-size (b, bottom) pools as the minimum request size changes.

Effect of the Maximum Request Size
In this section we complement the previous study by inspecting the effect of varying the maximum request size. We fix the minimum request size at 100 bytes, vary the maximum request size over the set {1000, 1200, 1400, 1600, 1800, 2000} bytes, and keep the other request parameters and pool configurations the same as in the previous section. The results are shown in Figure 7 below.

Figure 7: Performance of the variable-size pools (a, top) and fixed-size pools (b, bottom).

We note that the maximum request size also affects the performance of the variable-size pool; however, it always remains better than the fixed-size pool with regard to admission probability and memory utilization. The fixed-size pool has an admission probability that is almost constant at 54%. For some values of the maximum request size, the variable-size pool attains 100% admission and exhibits memory utilization in the vicinity of 50-60%, which is reasonable.

Our conclusion here is that, once again, if there is large variation in request sizes, the variable-size pool is the more efficient choice.

Estimating the effect of partition size
In this experiment we fix the traffic parameters and study the performance of the variable-size pool as the partition size increases.

We use two configurations: in the first, the total pool memory of the variable-size pool is fixed while the partition size varies; in the second, the number of partitions is fixed at 4096 while the partition size increases.

Requests are generated at a rate of 1000 per unit time with a minimum request size of 1000 bytes and a maximum request size of 3000 bytes. We select the partition size from the set {256, 512, 1024, 2048, 4096} bytes. The results are shown in Figure 8 below.

As can be seen, with the total pool memory fixed, smaller partition sizes result in better performance in terms of admission probability and overall memory utilization.

However, very small partition sizes incur large overheads and memory utilization suffers (as shown by the drop in utilization at a partition size of 256 bytes). For the configuration with 4096 partitions, increasing the partition size increases the overall memory, so more requests can be admitted, but at the expense of utilizing memory less efficiently.

This study clearly shows that the smaller the partition size, the better the utilization, up to the point where the per-partition overheads dominate and very small partitions yield low overall memory utilization. Once again, the designer should analyze the situation carefully.

Figure 8: Performance of the variable-size pools (a, top) and fixed-size pools (b, bottom).

Overhead cost of variable vs fixed size pools
In this last experiment we evaluate the overheads associated with the variable-size and fixed-size pools. We select two sizes: the first is a pool with 64 Kbytes of usable space and the second a pool with 512 Kbytes.

In both cases we vary the segment size in the variable-size pool and the buffer size in the fixed-size pool from 16 bytes to 1024 bytes. The overhead is calculated as the ratio of the additional space needed for the supporting data structures to the space available for allocation by the end user.
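A small sketch of this calculation is shown below. The per-buffer and per-partition metadata costs (FIXED_META and VAR_META) are placeholder assumptions, since the actual figures depend on the supporting structures of the specific pool implementation.

#include <stdio.h>

/* Assumed bookkeeping cost per element, in bytes; placeholders only. */
#define FIXED_META 4U   /* per fixed-size buffer                */
#define VAR_META   8U   /* per variable-size pool partition/PG  */

/* Overhead ratio: supporting-structure bytes / user-allocatable bytes. */
static double fixed_overhead(unsigned usable_bytes, unsigned buffer_size)
{
    unsigned buffers = usable_bytes / buffer_size;
    return (double)(buffers * FIXED_META) / usable_bytes;
}

static double variable_overhead(unsigned usable_bytes, unsigned partition_size)
{
    unsigned partitions = usable_bytes / partition_size;
    return (double)(partitions * VAR_META) / usable_bytes;
}

int main(void)
{
    unsigned size;
    /* Sweep the segment/buffer size from 16 to 1024 bytes for a 64-Kbyte pool. */
    for (size = 16; size <= 1024; size *= 2)
        printf("%4u bytes: fixed %5.2f%%  variable %5.2f%%\n", size,
               100.0 * fixed_overhead(64 * 1024, size),
               100.0 * variable_overhead(64 * 1024, size));
    return 0;
}

With these assumed per-element costs, the sketch reproduces the qualitative trend discussed next: the overhead grows as the element size shrinks, and it is larger for the variable-size pool whenever its per-partition metadata is larger than the fixed pool's per-buffer metadata.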

As can be seen from Figure 9 below, the smaller the segment/buffer size, the larger the overhead for a given overall pool size. It is also clear that fixed-size pools always have smaller overhead than variable-size pools.

Figure 9: Overheads for Variable and Fixed Pools.

Conclusions
Variable-size pools offer an attractive alternative for memory management in deeply embedded systems and protocol stack implementations. The performance advantage over fixed-size pools is particularly apparent when there is large variation in allocation sizes and under high load. Smaller partition sizes allow fine-granularity allocations but typically result in larger associated overheads.

The designer should consider all these dimensions in the design: 1) how much memory is needed; 2) how the memory is structured (fixed versus variable); and 3) if a variable-size pool is chosen, what the appropriate partition size is, or alternatively, if a fixed-size pool is chosen, what the appropriate buffer size is.

The results described in this article should be used to complement, not replace, the designer's own judgment about the best configuration for the system. For best results, a simulation tool such as the one used here can be handy for experimenting with the system before implementation.

The ideas discussed here have been implemented in ANSI C and are part of a platform-independent, RTOS-independent operating system abstraction layer used extensively at SySDSoft for protocol stack implementations of standards such as IEEE 802.16e/WiMAX and LTE.

Field tests have confirmed the good performance of the variable-size pools, which underscores the validity of the approach for highly efficient protocol stack implementations.

Many enhancements are possible; further overhead reduction and performance gains could be achieved by using more efficient structures than a linked list to maintain the FPG.

To read Part 1, go to: Variable versus fixed-size pools

Khaled Fouad Elsayed is the CTO of the embedded wireless business unit at SySDSoft (http://www.sysdsoft.com/) and a full professor of communication networks at Cairo University. Khaled received his B.Sc. degree with honors and M.Sc. from Cairo University and his Ph.D. in Computer Science and Engineering from North Carolina State University. He has published over 50 technical papers, has served as chair or technical committee member for various IEEE and IFIP conferences, and was a technical editor for IEEE Communications Magazine.

Mohamed Galal is a Software Specialist in the embedded wireless business unit at SySDSoft, where he has worked since 2004. Currently he is a member of the LTE User Equipment (UE) project team, and his main role is designing, developing, and testing features in the Non-Access Stratum (NAS) layer. Previously, he was the Software Lead of the Mobile WiMAX Medium Access Control (MAC) program for both the Mobile Station (MS) and the Base Station (BS). Mohamed received his B.Sc. in Communications and Electronics Engineering from Cairo University.
