Forget About the Verification Gap

I always find the aftermath of DVCon interesting. I’ve never been to the conference but it always seems to be well covered. Between people live tweeting different sessions and others blogging, it always feels like I can be near there without being there.

The panel session that caught my eye this year was about something called the verification gap. In an article posted last week called Pointing Fingers in Verification, Brian Bailey made it sound like EDA representatives and users were doing their best to defer responsibility and deflect criticism when it comes to creating and closing the verification gap (it’s a good article… you should go read it when you’re done here).

I like these discussions so I’d like to add my 2 cents :).

Because people in our industry think of design and verification as truly separate responsibilities fulfilled by people with truly different skill-sets, it’s easy to perceive the gap between them. Commonly, designers are way out in front building incredible new products that are constantly doubling in size while we verification engineers are well behind, struggling to keep up. EDA vendors do their best to close the gap between the two… or, depending on who you talk to, take advantage of the gap to sell more tools (fwiw… I think EDA vendors are genuine with the tools they build. I don’t think they’re out to get us).

That’s a fun debate with lots of room for opinion, but I think it’s a false debate. I don’t think it’s constructive to focus on the verification gap. Here’s why…

Where some see a verification gap, all I see is designers being much better at building defects than verification engineers are at finding them. That’s it. To make sure of it, we use a process that guarantees designers excel at creating defects: we enforce deadlines for RTL complete, we make it clear that testing it is not their responsibility and we wait until weeks after their first line of RTL is written before we finally get around to testing it. All the while, we verification engineers like to wonder what fancy tools and techniques we can use to find defects faster in these giant designs. For good measure, we like to throw in a lot of testbench defects of our own. In fact, we’re all very good at building defects, both designers and verifiers. We’re both at fault. We use similar coding practices to write poor quality code. We accept defects. Ask anyone and they’ll tell you defects are inevitable. I think that’s kind of sad considering we are the people responsible for the open-loop coding practices that guarantee them.

I think spending time focusing on a verification gap is misplaced. Instead, we should be debating something called the quality gap. The quality gap opens when an industry ignores the cost of producing defects. The quality gap is not someone else’s fault (EDA: you’re off the hook). The quality gap is also not inevitable; it is, however, irresponsible. The only acceptable way to prevent a quality gap is to reduce defect rates with the ultimate goal of no defects at all. No negotiation. Until no defects becomes a proper goal for development teams – until we start taking defects personally – we’ll continue to analyze a verification gap we have no serious intention of addressing.

Spotting a quality gap shouldn’t be that hard. Here are a few obvious signs I can think of (if you have other signs, please add them in the comments):

  • you have goals like RTL complete or testbench complete that don’t require exhaustive tests;
  • code goes untested for longer than a week or so;
  • people are discouraged from verifying their own code prior to releasing it;
  • you need a defect database (i.e. you don’t fix defects immediately); and/or
  • you have committees that review and decide what defects get fixed.

If you’re interested in preventing the quality gap, I’ve suggested test-driven development as a technique that will decrease defect rates significantly. I continue to believe this is a good place to start. Incremental development is another that I’ve had success with. Basically, any practice meant to validate progress at the earliest possible moment should be up for consideration.
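To make the test-driven idea a little more concrete, here’s a minimal sketch of the red-green loop. It’s in Python purely as an illustration (the parity example and all the names in it are hypothetical, not from any real design): write a failing test first, then write only enough code to make it pass, then repeat for the next small behaviour. The same loop applies whether the code under test is RTL, a model or a testbench component.

```python
# A minimal sketch of the test-driven loop, assuming a hypothetical
# parity function we want to add to a design model. The names are
# illustrative only; the point is the order of operations:
# 1) write a failing test, 2) write just enough code to pass it,
# 3) refactor, then repeat for the next small behaviour.

import unittest


def even_parity(data: int, width: int = 8) -> int:
    """Return the even-parity bit for the low `width` bits of data."""
    # Written *after* the tests below were failing; only enough
    # logic to make them pass.
    parity = 0
    for i in range(width):
        parity ^= (data >> i) & 1
    return parity


class EvenParityTest(unittest.TestCase):
    # Each test was added (and seen to fail) before the behaviour
    # it checks was implemented.
    def test_all_zeros_has_zero_parity(self):
        self.assertEqual(even_parity(0x00), 0)

    def test_single_bit_set_flips_parity(self):
        self.assertEqual(even_parity(0x01), 1)

    def test_two_bits_set_cancel_out(self):
        self.assertEqual(even_parity(0x11), 0)


if __name__ == "__main__":
    unittest.main()
```

The mechanics are trivial; the value is in the feedback loop. A defect written in this style is caught minutes after it’s created instead of weeks later, which is exactly the open loop I’m complaining about above.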

Of course, maybe I’m wrong and this verification gap is worthy of discussion. It wouldn’t be the first time! Maybe the right thing to do is continue producing defects at a rate we can’t currently keep up with while attempting to compensate by inventing newer, better ways to find them. It’s keeping us all busy so why not carry on!

…though I can’t help but wonder what happens to this debate if the defects we’re so good at building eventually disappear.

-neil

One thought on “Forget About the Verification Gap”

  1. One of the problems with the traditional approach to EDA is that it enforces a “waterfall” development flow with defined phases and handover points between distinct teams. This is demonstrably flawed in several ways – there’s no flexibility to adapt to new information, it’s almost impossible to balance resources perfectly and it inevitably leads to “gaps” between code being written and code being verified because of the small number of synchronisation points.

    One of the lessons from Agile that I think we really need to appreciate is helping to promote cross-functional skills. We shouldn’t have to wait until a particular engineer is ready before verification can commence! In a really productive team, the designer does some of the verification and the software team also does some of the verification. The verification engineers should primarily be responsible for providing metrics and information on how “verified” a design is, not being the only engineers working on actual verification. The burden of testing, integrating and bugfixing should fall relatively evenly across the entire team.

    One of my bugbears with UVM is that it hampers this cross-functional model. RTL design engineers find it too complicated and cumbersome to create testbenches, and software engineers find it too nonsensical and ugly to bother with, so we artificially introduce a bottleneck by ensuring only a small subset of the project team can productively contribute.
