Using OVM to reuse vital verification knowledge

Reuse of legacy directed test environments is common practice but, with each generation of reuse, the number of tests grows and with it the overhead of maintaining the environment across multiple projects. Another concern lies in the fragility of directed tests. With each change to a design, some of the directed tests will fail outright, while others will continue to pass but no longer verify their intended feature. Careful, tedious, time-consuming auditing is the only way to identify and fix these tests.
During the design of a fifth-generation SAS device at LSI, it was clear that our testbench needed significant updates to verify the new features. The environment surrounding the SAS expander design had become cumbersome to manage. With each generation, many new features had been added to the design. These new features complicated the existing environment, which, although flexible, had not been designed to verify them. And, because each new feature meant adding directed tests to the library, after four generations the library contained thousands upon thousands of tests.
Of course, vital verification knowledge was embedded in the existing test patterns, and we wanted to preserve it. However, the documentation associating the tests with the relevant sections of the design specification needed improvement, and porting the tests directly would also involve many tedious hours (that our team did not have). As if all of these factors weren't problematic enough, all of this work would need to be repeated for future generations of the device.
Clearly, we needed a new approach: one that could encapsulate the verification knowledge in a portable, documented form that could move easily through future generations of the project without manual intervention. The new approach also needed to show which features of the device had been verified, not just list the directed tests that had been written.
We used the Open Verification Methodology (OVM) with verification management and functional coverage to satisfy these requirements. Verification management allowed us to track and analyze coverage and test data, and functional coverage gave us unequivocal data on what had been exercised in the design. By focusing our efforts on functional coverage measurement development, we eliminated the effort of porting an extremely large number of tests.
Verification technology has taken a leap forward with the introduction of the OVM and of more sophisticated, effective coverage mechanisms, in the form of both SystemVerilog assertions (SVA) and verification IP (VIP) with built-in coverage features. Using these new technologies allowed us to focus on the real task of verifying the design rather than the administrative overhead of managing and maintaining a library of directed tests. These new technologies are far less sensitive to specification changes than older directed test environments because they allow for predominantly modular verification environments. This is especially useful in storage designs, like ours, where market pressures often lead to specification changes late in the design cycle.
We split the team: half focused on understanding the existing directed tests and converting them to coverage measurements, assertions, or monitors, while the other half developed the new verification environment.
The team capturing the information from the directed tests needed an auditable way to ensure that all of the information had been captured. The first step toward doing this was to take each directed test and translate it into either a cover group or cover directive. Some verification tasks could not be translated into SystemVerilog coverage constructs. These were added to the scoreboard requirements for the new OVM-based verification environment. For the very few cases where a test could not be translated into SVA or a scoreboard check, the directed test was ported to the new environment.
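To illustrate the kind of translation involved, the sketch below shows how the intent of a hypothetical directed test (one that exercised each frame type at each link rate) might be recaptured as a SystemVerilog covergroup plus a cover directive. All module, signal, and bin names here are illustrative assumptions, not taken from the actual LSI design.

```systemverilog
// Hypothetical coverage module standing in for a directed test's intent.
module sas_frame_coverage (
  input logic       clk,
  input logic       frame_valid,
  input logic [1:0] link_rate,   // illustrative encoding of link speeds
  input logic [3:0] frame_type
);

  // Covergroup: sample every frame type at every link rate,
  // replacing a family of directed tests that swept these cases.
  covergroup cg_frames @(posedge clk iff frame_valid);
    cp_rate : coverpoint link_rate { bins rates[] = {[0:2]}; }
    cp_type : coverpoint frame_type;
    rate_x_type : cross cp_rate, cp_type;
  endgroup

  cg_frames cg = new();

  // Cover directive: confirm that back-to-back frames were observed,
  // a scenario a directed test might previously have forced explicitly.
  cover property (@(posedge clk) frame_valid ##1 frame_valid);

endmodule
```

A construct like this is auditable in a way a directed test is not: the coverage report shows directly which specification cases have been exercised, regardless of which stimulus produced them.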