A closer look at analog verification - Embedded.com

Behavioral modeling has caught on quickly in the analog verification community. An RTL-like description of analog, RF, and mixed-signal blocks has opened up more possibilities for thorough top-level verification of these cores. The power and finesse of digital verification is now being brought into the analog domain with the aid of improved modeling and testbench methodologies influenced by the “digital way” of doing verification. This influence has been both a help and a hindrance to analog verification.

First things first: digital verification is much more mature and time-tested than the more recent practice of analog verification. The digital design methodology of RTL, equivalence checking, and formal methods complements the verification process beautifully, and everything fits together. Most often, digital design engineers double as verification engineers and run extensive test cases on their own designs. I strongly believe this tight bond between digital design and verification is what makes the process run smoothly.

Analog verification is a different beast. The designers and verification engineers don’t play well together, mostly because of incompatible methodologies. While the analog verification camp is beginning to believe that digital verification techniques (such as monitors, assertions, and testbenches) have to be adopted, they are involuntarily alienating analog designers, who are not concerned with these aspects of verification. Analog design engineers are super-smart, keen, and intuitive. Most of the time, however, they are engrossed in the performance of their own designs. They don’t worry about the integration issues their block or module might face with a digital core or other analog cores; they offload that worry to module/integration leads, which I believe is not necessarily the right thing to do. While a verification engineer worries about capturing the connectivity and functionality issues pertaining to the whole chip, design engineers worry about the performance of their blocks. Analog verification engineers, on the other hand, often lack an appreciation for analog design: there is a chasm between what they think analog design is and what it really is. Because of this lack of understanding and appreciation, there are communication issues between the modeling/verification teams and the design teams.

Modeling and verification can only succeed when the design and verification engineers are part of the same team. There should be chemistry between the two, who are, after all, working for the same cause: a successful product. In my personal experience, remote interaction between a modeling/verification engineer and a design engineer does not work well, because continuous interaction is required to make the process flow smoothly. Verification teams typically raise a number of concerns, like:

•   We cannot start modeling early. The schematics and symbols are not stable.
•   There is too much model churn because of design changes, and it keeps us occupied.
•   Design engineers do not respond to us in time.
•   There are no specifications to base the models on.
•   There is no appreciation for verification efforts.

Similarly, design teams have several concerns of their own:

•   It is too much work to support the modeling.
•   We have to spend time to explain the functionality and later spend time to validate the models.
•   We already run verification on our blocks; we don’t need models for that. Verification, for us, is more than connectivity; it is performance as well.

Both viewpoints are perfectly valid. Let us start with the design team’s viewpoint. Most of their concerns come from a lack of understanding of how modeling fits into the larger scheme of things, and from the history of how analog modeling and verification has evolved over the past three to four years. Initially, Verilog-A models were created for analog blocks that captured both functionality and performance. These Verilog-A models fit seamlessly into the design flow, and design engineers could co-simulate the models with other transistor-level circuitry. Life was getting better for them; there was value in this process. As modeling and verification slowly targeted top-level simulations and complex test scenarios, the modeling methodology changed to Verilog-AMS, which still allowed co-simulation, a big deal for design engineers. Then Verilog-D with wreals came into the picture, which needed connect modules to convert wreal to electrical and vice versa, and later pure Verilog models using PLI extensions appeared. While this transition was logical for verification engineers, it slowly started pulling design engineers out of their comfort zones. Now the testbenches became more complex, co-simulation with analog blocks was no longer straightforward, and instead of the verification methodology fitting seamlessly into design, it bifurcated into an alternate discipline.

If you look at the verification teams’ viewpoints listed above, their problems also stem from the same root causes. In an organization where the analog design and verification teams behave as separate entities, successful verification efforts are not possible. Modeling and verification engineers should also have empathy for the design flow. They cannot behave like contract programmers who just code up a specification and consider their jobs done. The mantra for analog modeling and verification should be “make it easy for designers to do their jobs.” Unlike digital design, where the designer himself creates the RTL, in analog design the models are NOT created by the designer. This brings me to another point: who should write behavioral models?

I strongly feel that designers should start writing behavioral models with the support of design automation and EDA tools. Today, there are a number of internal and external tools that generate a behavioral model template for a design block. Close to 70 percent of behavioral model generation can be automated without much worry. There are only a finite number of classes of analog blocks, and their input-output relationships are known. If the pins are classified accurately, the code fragments can be generated automatically. The biggest challenge is generating the digital functionality, which is also not impossible to automate. Every organization follows some sort of internal guideline and naming conventions that can be used to semi-automate model generation. Once the designer automatically generates the models to start with, the modeling and verification team can maintain them. Scripts that report changes periodically can handle other challenges, like model churn due to changing designs. A lot is possible with a little understanding of the flow, some empathy toward cross-functional teams, and plenty of help from design automation.
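The pin-classification idea above can be sketched in a few lines of script: given a block’s pins tagged by class, emit a Verilog-AMS model skeleton for the designer to fill in. This is a minimal illustration only; the pin classes, the `emit_template` helper, and the template format are assumptions for the sketch, not any particular internal or EDA vendor tool’s conventions.

```python
# Sketch: generate a behavioral model template from a classified pin list.
# Pin classes and template layout are illustrative assumptions, not a
# specific tool's format.

PIN_DECL = {
    "analog_in":   "input {name};\n  electrical {name};",
    "analog_out":  "output {name};\n  electrical {name};",
    "digital_in":  "input {name};",
    "digital_out": "output reg {name};",
    "supply":      "inout {name};\n  electrical {name};",
}

def emit_template(block, pins):
    """pins: list of (name, pin_class) tuples; returns a module skeleton."""
    ports = ", ".join(name for name, _ in pins)
    decls = "\n  ".join(
        PIN_DECL[cls].format(name=name) for name, cls in pins
    )
    return (
        f"module {block} ({ports});\n"
        f"  {decls}\n"
        f"  // TODO: designer fills in behavior\n"
        f"endmodule\n"
    )

print(emit_template("bandgap", [
    ("vref", "analog_out"),
    ("en",   "digital_in"),
    ("vdd",  "supply"),
]))
```

A real flow would read the pin list and classes straight from the schematic database or symbol properties, which is where the organization’s naming conventions come in; the generation step itself stays this simple.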

Analog verification is still in its early stages. There are great opportunities and enormous possibilities to develop robust methodologies. It is important that we also focus on organizational dynamics and operational issues in addition to technological advances, because without that, real progress is not possible.

About the author:
Saranyan Vigraham is a senior engineer with the RF, Analog team at Qualcomm, Austin. These viewpoints are strictly his and not Qualcomm’s. He has a PhD in Computer Engineering. Before joining Qualcomm, he co-founded a tech startup out of graduate school. He has published extensively in peer-reviewed journals and conferences. His expertise is in top-down design methodologies, analog modeling and verification, and design automation.
