How UVM-1.1d Makes The Case for Unit Testing

The point of the open-source UVM-UTest project we’ve been working on is to demonstrate how unit tests can be used to lock down the functionality of legacy code. Being able to run the unit tests means that when you’re changing code, you can verify existing features won’t break during maintenance. We think that’s valuable. A nice side benefit of the project is that we can also show unit testing to be effective at increasing the quality of the code hardware engineers write. How can we say that? Because over a 5-week period we found 10 defects in the UVM 1.1d release.

Here’s a snapshot of those defects. If you go to the Mantis database at eda.org, you can look at each of these reports in more detail.

[Screenshot: a table of the 10 defect reports filed in Mantis]

The project has been great for us so far and we’re glad we’ve made it open-source so others can see what we’ve done with it. To date, we’ve written roughly 430 unit tests. We’re still working on the core functionality of the UVM. We’ve written unit tests for uvm_object, uvm_printer, uvm_printer_knobs, uvm_scope_stack, uvm_status_container and the global methods in uvm_misc.svh. As I wrote in Are You Ready for the UVM-UTest Challenge?, we’ve had colleagues challenge our test suite. They found a few holes that we’ve since filled.

As for specific lessons we’ve learned from this project:

  • We’ve attempted to lock down the UVM line-by-line. That means each of the unit tests we’ve written focuses on one very (very) specific feature of the UVM. At times, it’s been difficult to work at that level of detail, but we’ve seen that it’s necessary to exhaustively verify code.
  • Unit tests are a good mechanism for finding defects that other techniques haven’t detected. Admittedly, I don’t know much about the tests that are currently run against the UVM. By the class of defects we’ve found, however, I think it’s fairly evident that we’re working at a much finer granularity.
  • Locking down code line-by-line has given us the opportunity to learn about and critique the UVM line-by-line, similar to what you’d do in a code review.
  • It’s obvious from some of the defects we’ve found that there are features of the UVM that no one has ever used. Mantis 4602 is a good example: the uvm_printer offers the flexibility of user-specified scope separators; however, due to a very basic incompatibility between the uvm_printer and the uvm_scope_stack, that flexibility isn’t properly supported.
  • Unit tests are great for exposing functionality that is incomplete, which is particularly critical for IP like the UVM that’s intended to be used by many people and organizations. Mantis 4638 is an example of this. On the happy path, uvm_is_array will tell you whether or not a string is formatted as an array. If you’re a user that doesn’t always travel down the happy path, however, you can put in all kinds of poorly formatted input and the function will still tell you that you have an array.
  • In many places, we found it necessary to increase the level of visibility into a particular class or method to capture and verify internal interactions. We’ve created test doubles to do that. (Test doubles are a pretty big topic so rather than elaborate here, I’ll probably write more extensively about them in the next couple weeks.)
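The happy-path defect class described above is easy to sketch outside of SystemVerilog. The snippet below is a hypothetical Python analogy (not the actual UVM source): a naive check that, like the uvm_is_array behaviour reported in Mantis 4638, accepts garbage as long as it superficially looks like an array, alongside the kind of tightened check a unit test would push you toward. The function names are mine, invented for illustration.

```python
def naive_is_array(s: str) -> bool:
    # happy path only: anything wrapped in brackets "looks like" an array
    return s.startswith("[") and s.endswith("]")

def strict_is_array(s: str) -> bool:
    # tightened check: the brackets must enclose a decimal index
    return s.startswith("[") and s.endswith("]") and s[1:-1].isdigit()

# unit tests lock down the happy path AND the malformed inputs
assert naive_is_array("[5]")                # happy path: both checks agree
assert strict_is_array("[5]")
assert naive_is_array("[not_a_num]")        # naive check accepts garbage...
assert not strict_is_array("[not_a_num]")   # ...the strict check does not
```

The point isn’t the bracket parsing itself; it’s that a unit test forcing malformed input through the function is what exposes the gap between the two behaviours.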
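To give a flavour of the test doubles mentioned in the last bullet before the longer write-up arrives, here’s a minimal sketch in Python (an assumed example, not UVM code, and all class names are mine): a “spy” double stands in for a collaborator so the unit test can capture and verify an internal interaction that would otherwise be invisible.

```python
class Printer:
    """The real collaborator: writes lines to stdout."""
    def emit(self, line: str) -> None:
        print(line)

class Report:
    """Unit under test: delegates its output to a Printer."""
    def __init__(self, printer: Printer) -> None:
        self.printer = printer

    def publish(self, items) -> None:
        for item in items:
            self.printer.emit(f"item: {item}")

class SpyPrinter(Printer):
    """Test double: records every call instead of printing."""
    def __init__(self) -> None:
        self.lines = []

    def emit(self, line: str) -> None:
        self.lines.append(line)

# the unit test injects the spy, then verifies the internal interaction
spy = SpyPrinter()
Report(spy).publish(["a", "b"])
assert spy.lines == ["item: a", "item: b"]
```

Substituting the double is what “increases the level of visibility” into the class: the test asserts on exactly what the unit under test said to its collaborator, not just on its externally visible result.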

I’m sure there will be other lessons learned as we carry on. For now, I think the best way to summarize this project so far, regardless of what happens to it from here on, is that unit testing is a technique that hardware teams must consider adding to their development approach. If you’re familiar with the work that’s gone into UVM, you’ll know that it and its predecessors were developed cooperatively by a collection of very smart and recognizable people from relevant organizations – both product development and service organizations. And yet, we were able to find several basic defects in the framework over a short period of time.

I’ll be the first to admit I’m no genius – far from it, in fact – so I’m quite sure we didn’t find these defects with sheer brainpower. I credit the technique – unit testing – for exposing these defects. It’s a technique that enables the thorough and exhaustive testing we like to think we get from directed and/or constrained-random testing at the subsystem and chip levels…

…but we don’t… and I think 10 defects in 5 weeks with UVM-UTest shows that nicely.

-neil


About nosnhojn

I've been working in ASIC and FPGA development for more than 13 years at various IP and product development companies and now as a consultant with XtremeEDA Corp. In 2008 I took an interest in agile software development. I've found a massive amount of material out there related to agile development, all of it is interesting and most of it is applicable to hardware development in one form or another. So I'm here to find what agile concepts will work for hardware development and to help other developers use them successfully. I've been fortunate to have the chance to speak about agile hardware development at various conferences like Agile2011, Agile2012, Intel Lean/Agile Conference 2013 and SNUG. I also do lunch-n-learn talks for small groups and enjoy talking to anyone with an agile hardware story to tell! You can find me at neil.johnson@agilesoc.com.
This entry was posted in Functional Verification. Bookmark the permalink.

One Response to How UVM-1.1d Makes The Case for Unit Testing

  1. Gordon says:

    Unfortunately the big problem with UVM development right now isn’t finding bugs in the BCL, it is finding people willing to fix those bugs.
