The Time It Takes To Find A Bug

How much time passes between writing a line of RTL or testbench code and knowing that it works? I know it varies, but on average, how long does it take? A line of code is written, time passes, then a test is run that either verifies it’s correct or discovers a bug. Does that time pass as minutes? As days? Maybe weeks? Months?

I’m preparing for a talk at DAC next week in the Verification Academy booth and in going through my material I’ve started asking myself that question. It’s the 3rd year in a row the folks at Mentor have invited me for a session in the booth. The last 2 years have been a lot of fun and I’m hoping for the same this year. This year’s session is called Add Unit Testing To Your Verification Toolbelt. As the name suggests, the focus will be on how we can use unit testing and test-driven development to improve the quality of the hardware we build.

I believe quite strongly in the value of unit testing and test-driven development; the quality of what I produce as a verification engineer is noticeably better because of them. I believe so strongly in TDD in particular that I know I get a little preachy at times. It’s the most important engineering skill I’ve ever learned and I don’t shy away from telling people about it.

I always think of rigour first when it comes to benefits. Testing code as you write it, a few lines at a time, is a rigorous process to say the least. But during the run-up to DAC I’m starting to think it’s timing that’s the real key. With TDD, the amount of time that passes between writing and testing a line of code is minutes, sometimes less. A design and test cycle measured in minutes is almost too short a time to think about anything else. I focus on getting exactly 1 thing right. There’s no thinking back, no back-and-forth with a teammate who found a bug, no flipping through my logbook for clues as to what I’ve done, no editing bug reports. In fact, with TDD there’s hardly any context switching at all. I write a line or 2, test them, fix them if they’re broken, then move to the next line. Most of the bugs I create are killed immediately, which coincidentally is the best time to kill them.
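To make that cycle concrete, here’s a minimal sketch of what one of those write-then-test steps might look like in SVUnit. The unit under test (a toy even-parity generator I’m calling parity_gen) and the checks themselves are hypothetical, there only to show the shape of the loop; the build/setup/teardown skeleton follows the standard SVUnit test template.

`include "svunit_defines.svh"

// Hypothetical unit test for a toy even-parity generator (parity_gen).
// Module and signal names are illustrative only.
module parity_gen_unit_test;
  import svunit_pkg::svunit_testcase;

  string name = "parity_gen_ut";
  svunit_testcase svunit_ut;

  // The unit under test: combinational logic computing parity of data
  logic [7:0] data;
  logic       parity;
  parity_gen uut (.data(data), .parity(parity));

  function void build();
    svunit_ut = new(name);
  endfunction

  task setup();
    svunit_ut.setup();
  endtask

  task teardown();
    svunit_ut.teardown();
  endtask

  `SVUNIT_TESTS_BEGIN

  // A test for the first line of RTL written: all-zeros data
  `SVTEST(zero_data_gives_zero_parity)
    data = 8'h00;
    #1; // let combinational logic settle
    `FAIL_UNLESS(parity === 1'b0)
  `SVTEST_END

  // The next line or 2 of RTL gets its own test right away
  `SVTEST(single_bit_gives_one_parity)
    data = 8'h01;
    #1;
    `FAIL_UNLESS(parity === 1'b1)
  `SVTEST_END

  `SVUNIT_TESTS_END

endmodule

Each `SVTEST runs in seconds when the testbench compiles in isolation, so the gap between writing the parity logic and knowing whether it works stays measured in minutes.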

I’ve seen a lot of people talk about how bugs become more costly to fix the longer they live. For me, TDD has been the most effective way to respond.

If you’re at DAC next week in Austin and using TDD to kill bugs fast sounds good – or great! – I hope to see you at Mentor’s Verification Academy booth at 11am on Monday. It’ll be a half-hour presentation with lots of time for discussion. We’ll talk about TDD, unit testing, SVUnit and all the frustration that magically vanishes because of them!

Here’s a link to the session info on VerificationAcademy.com for my session and the others in the booth.

See you at DAC!

-neil

You Can’t Automate Holistic Verification

For anyone stuck watching #48DAC via Twitter (like I was), and presumably for those who were there in person, it was easy to see that Cadence remains dedicated to the EDA360 vision it released over a year ago.

EDA360 has received a lot of attention. Some good… some bad. Put me in the camp that likes EDA360. I think it was an interesting move for Cadence to share their view of the future so openly, thereby inviting criticism from anyone with a little time on their hands. I don’t mind that some of it qualifies as marketing hype and I don’t mind that it’s not entirely original. After filtering the hype, EDA360 is full of good stuff. I don’t even care if Cadence ends up delivering the vision exactly as stated through a new line of smart tools and by creating a collaborative ecosystem. I’d rather see them set the bar high and miss than the alternative.

But (isn’t there always a ‘but’)… the one section that has bugged me since I first read it is the one on Holistic Verification. I’ve been pondering this for a while. At first, I liked the sound of it. Saying just those 2 words makes me think of a process where a team takes a step back from what they’ve always done so they can retool and rethink how they go about their business. Holistic makes it sound like every option is on the table. With a well-rounded view of what a team needs to accomplish, teammates build a strategy that is right for them and their product. They then arm themselves with the skills and tools they need to get it done.

To me, that is holistic verification.

The EDA360 view of holistic verification is very close, yet very different. From page 24 of EDA360: The Way Forward for Electronic Design:

“Holistic verification—use the best tool for the job. There are many different techniques for digital verification, including simulation, formal verification, and emulation. Approaches to analog simulation range from transistor-level SPICE simulation to analog behavioral modeling with the Verilog-AMS language. Working with the single goal of verifying the design intent, and utilizing a verification plan, EDA tools [the emphasis here is mine] must choose the best approach for any given phase of the verification process and feed this back to the verification plan. The result is a holistic approach to verification using the most productive methods for each task.”

It’s not too difficult to see where Cadence is going with this. They want to create smarter tools that offload as many monotonous tasks as possible, allowing teams to build test suites that are as comprehensive as possible, as quickly as possible. They want to be an enabler for teams looking for more efficient ways to verify designs… which is pretty much every team I know of. I get that. I’m happy to see them (and other EDA vendors) do that. My problem, though, is the suggestion that “EDA tools must choose the best approach for the job” (I actually had to read it a few times before I realized what I’d missed at first). Tool-driven decision making is something I have a problem with (for anyone who read why I think UVM Is Not A Methodology, that shouldn’t be a surprise).

I automatically question tools posing as solutions and that’s the feeling I get from EDA360 ‘holistic verification’. I hope an EDA360-like evolution is in the cards for the EDA industry as a whole, and I hope it leads to teams being able to automate everything possible save for one thing: thought. I’d prefer EDA companies leave that to their users.

To close, I appreciate Cadence setting the bar high and opening themselves up to public criticism, which at times has turned to ridicule. It was a gutsy move and I hope it pays off for them. I like EDA360, but here’s my bit of constructive criticism: Cadence (and others), please don’t attempt to automate holistic verification. Continue to build great tools, but leave the ‘holistic’ part to your users.

neil

Not A Problem… The “Fluffy Stuff” Isn’t That Important Anyway

There are 17 different stages at the Agile2011 conference in Salt Lake City that cover a wide variety of topics. Though there are obviously technical stages (it is a software conference after all), what would probably look odd to hardware developers browsing the program is the number of stages dedicated to the non-technical (aka: the “fluffy stuff”).

Yes, you did read stages dedicated to the “fluffy stuff”, not just individual sessions. The 5 that stick out to me are:

I know that these topics are important to agile developers so I wasn’t too surprised to see them. What did surprise me, though, is that not only are these topics covered at Agile2011, they’re well covered, with an average of 15 sessions per stage! Now, I’ve never been to previous Agile2xxx conferences, or any software conferences for that matter, so I know nothing of their content and quality (both of which I’m assuming are decent). But by the simple fact that these sessions take up almost a third of the program, the folks who put this on seem to think they’re worth the time and effort.

In another universe… could you imagine going to DVCon and seeing a track called Collaboration, Culture & Teams with 22 sessions? Would SNUG, User2User or CDNLive have a track called Coaching & Mentoring with 15 sessions? How about just 1 session? One bright light I do see is at DAC, which has a “Management Day” on June 7th. Other than that, topics stay pretty technical at all of the above. I list these conferences because they are reasonably accessible for the average joe and generally well attended. If there are other conferences I’ve missed that do fit the bill, I’d appreciate you letting me know.

I’ll venture a guess that the sessions I point out don’t get run at hardware conferences because we hardware guys don’t have much of an appetite for the “fluffy stuff”. Even though leadership is important, even though we could probably learn something by listening to how our peers work with customers, even though there are times when we need to coach and/or mentor colleagues, we probably rely more on what we’re born with and what we experience first hand than on what we learn. Or maybe as a group we figure that technical excellence will compensate for lacking the “fluffy stuff”? Maybe we already know what we need to know! Who knows for sure.

It is hard to argue against the fact that “fluffy stuff” plays a tremendous role in hardware development. There’s no denying it. And we could likely get better… much better… though it seems we’re a little behind when it comes to realizing it!

neil

Q: Got an opinion for why the agile software crowd spends so much time on “fluffy stuff” like leadership, coaching and mentoring while we hardware folks appear indifferent? I’d like to hear what you think!