If you need a good way to waste 36% of your day, debugging code is your best bet!
That’s what I figured Thursday while listening to Harry Foster’s analysis of the 2012 Wilson Research Group Functional Verification Survey commissioned by Mentor Graphics. The survey is meant to show design and functional verification trends in hardware development. It’s well done and well presented by Harry.
People who follow AgileSoC.com will know that I’ve been interested in one particular data point from the previous Wilson Research survey, and I was quite interested to see the updated statistics. It has to do with the amount of time verification engineers spend debugging code. Here are the numbers from the 2010 study.
Forget about everything except for debug because I don’t think any of the rest of this is significant. Debug is significant though and that’s why I love this slide. At 32%, a verification engineer in 2010 was wasting 2 hours, 33 minutes and 36 seconds a day, on average, debugging code. That’s fixing defects that we inject.
It’s two years later and what’s happened?
We’re up by 4%… that’s what’s happened. That means we’re now wasting an additional 19 minutes and 12 seconds a day debugging code compared to 2010. That’s almost 3 hours a day, more than we spend on any other activity. Horrid.
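For anyone who wants to check the arithmetic behind these figures, here’s a quick sketch. It assumes an 8-hour workday, which is my assumption, not something the survey defines:

```python
def debug_time(fraction, workday_hours=8):
    """Convert a fraction of an (assumed) 8-hour workday into (h, m, s)."""
    total_seconds = round(fraction * workday_hours * 3600)
    hours, rem = divmod(total_seconds, 3600)
    minutes, seconds = divmod(rem, 60)
    return hours, minutes, seconds

print(debug_time(0.32))  # 2010: 32% of the day -> (2, 33, 36)
print(debug_time(0.36))  # 2012: 36% of the day -> (2, 52, 48)
print(debug_time(0.04))  # the 4% increase -> (0, 19, 12)
```

Change `workday_hours` if your day runs longer than 8 hours; the percentages only get uglier from there.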
At first glance, it’s hard to blame that statistic on anything other than garbage code. As much as people would hate to admit it, I think it clearly shows that as an industry we have defect rates that are unacceptably high. Harder to explain, though, is why the number hasn’t improved.
One explanation: adopting advanced verification practices actually leads to higher defect rates, not lower. Why go there? Well… the survey data suggest that’s a possibility. Consider the following points…
Since the last survey there’s been a significant uptick in teams using UVM. There’s also been a significant uptick in teams using verification techniques like constrained random, functional coverage and assertions (NOTE: that last slide is a comparison with statistics from 2007, not 2010). This has all happened during a period where there’s been a slight increase in debug time. Hence my conclusion that advanced verification practices haven’t quite had the desired effect in the area of code quality.
Do I expect broad agreement with this analysis? Not really. Of course there are other valid arguments that take design and team size into account which could lead to a different conclusion. But whatever you might argue, as much as we want to believe we’re going in the right direction, we need to be able to talk openly about whether or not that’s actually the case. The bottom line is that blowing more than a third of a person’s time fixing defects is what it is: unacceptable. We have to ask whether the complexity we’re building into our development process is actually leading to better results. Yes or no? Is standardization leading to portable results? Yes or no? Is reuse leading to faster results? Yes or no? Are our best practices really the best we can do? Yes or no? Should we be looking elsewhere (i.e. agile development) for better best practices? Yes or no?
The answers to those questions will depend on who you are, but regardless, the best thing you can do is sit down, think about them and answer them for yourself. And don’t overlook the fact that with the possibility of earning back part of the 2 hours, 52 minutes and 48 seconds a day we’re wasting now, the potential payoff from a little reflection is huge!
(fyi… all the graphics in this post have come from Harry’s webinar. That’s the source; I’m just the messenger/armchair analyst. I’d recommend following along with Harry’s analysis as I assume he’ll be busy posting over the coming months. It was good reading last time around; I’m expecting the same this year.)