By Example: Done vs. DONE

In a previous post, When Done Actually Means DONE, I shared a slide that I’ll present at Agile2011. I use it to illustrate the differences between waterfall and agile development models in the context of hardware development. After posting it, the first response I got was from AgileSoC guest blogger @rolfostergaard, suggesting that a few examples could make it even clearer.

Thanks Rolf. Good idea.

In case you haven’t read When Done Actually Means DONE, I’ve again included the slide that got things started in this post. I use it to show that there are different ways to describe how done you are depending on the development model you’re using. If you’re basing progress on tasks you’ve completed, you’re using done to measure progress. If you’re basing it on features, you’re using DONE.

What’s the difference? Being done means you’ve hit a milestone that won’t hold water mainly because there’s no way to objectively measure its quality. You may think you’re DONE, but without tests or some other reasonable way to measure quality, you’ll very likely need to come back to it. For that reason, done is misleading and it gets people into trouble.

DONE means you’ve hit a milestone that you can unambiguously quantify (or at least quantify with far less ambiguity). Here, you’re confident that what you’ve just finished will require very little or no follow-up because you can see and measure results.

In short, done isn’t done at all… but DONE is. Confused? Here’s where a few examples might help.

My RTL Is Done

Classic. Your design team is under pressure to meet a scheduled RTL-complete project milestone. As always, it’s a highly visible milestone – to the development team, management and possibly even the customer – because it comes with the connotation that the product is nearly finished… save for the minor details that it hasn’t been verified, nor has it been pushed through the back-end. The RTL is done though, so that’s great. Cross it off the list!

My Test Environment Is Done

This is a close second to my RTL is done. Your verification team has finished its test environment and supposedly all that’s left is writing and running tests. Of course, very little has been done to confirm that the test environment actually does what it’s supposed to do. That becomes immediately obvious when running the first test: the configurations are invalid, the stimulus transactions are poorly formed, the BFMs don’t obey their protocols and the model is outdated; all unfortunate because now people are anxiously expecting results that the environment can’t quite deliver yet! Sure, the test environment is done… except for everything that doesn’t work.

Feature <blah> Is DONE

Now we’re getting somewhere. No, your RTL isn’t done. No, your verification environment isn’t done. But who cares! You have something better: a small portion of each is DONE, and that’s enough to run tests and collect coverage results that verify feature <blah> is ready to go. No ambiguity there. The feature works and you have the passing tests to prove it. You’re delivering something that’s DONE.

(Ideally, you would have passed the design to the physical team as well. But given that you’ve made a big step forward in credibility and confidence relative to the first two examples, we’ll forget about the physical design for now.)
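To make that concrete, here’s a rough sketch of what a DONE feature can boil down to: a self-checking test that runs against working code and tells you, unambiguously, pass or fail. Everything in it is hypothetical – the timer block, its behaviour and the C model are stand-ins – but the principle is the point: DONE is a passing test, not a checked-off task.

```c
/* Hypothetical example: a self-checking directed test for one feature
 * (a simple down-counting timer) run against a C model of the block.
 * The model and its behaviour are invented for illustration only. */
#include <stdio.h>
#include <stdint.h>

/* --- stand-in C model of the timer block --- */
typedef struct {
    uint32_t load;     /* programmed count           */
    uint32_t count;    /* current count              */
    uint32_t irq;      /* set to 1 when count hits 0 */
} timer_model_t;

static void timer_write_load(timer_model_t *t, uint32_t value)
{
    t->load  = value;
    t->count = value;
    t->irq   = 0;
}

static void timer_tick(timer_model_t *t)
{
    if (t->count > 0 && --t->count == 0)
        t->irq = 1;
}

/* --- the directed test: feature "timer expiry raises irq" --- */
int main(void)
{
    timer_model_t t;
    timer_write_load(&t, 5);

    for (int i = 0; i < 5; i++)
        timer_tick(&t);

    if (t.irq == 1 && t.count == 0) {
        printf("PASS: timer expiry feature is DONE\n");
        return 0;
    }
    printf("FAIL: timer expiry feature is not DONE\n");
    return 1;
}
```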

The Software API Is Done

A hardware team normally implements an API according to a spec received from the software team. After the hardware team is done, it’s assumed that sometime later the software team will build drivers and applications on top of the API and release the finished product. The problem is, the initial API was a best guess from the software team, and early in their development the team finds that the API that’s been sitting done for a couple of months doesn’t give them the access to the hardware that they need. Sure, the API was done, but until it’s updated, system performance is seriously restricted and some functionality is completely absent.
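Here’s a hypothetical sketch of how that plays out in code. None of these names come from a real platform; the delivered API, the fake hardware behind it and the missing calls are all invented. The gap only becomes visible once driver code tries to build on the API – which is exactly why done isn’t worth much on its own.

```c
/* Hypothetical illustration of an API that is "done" but not DONE.
 * All names and behaviour are invented; the point is that the gap only
 * shows up once real driver code tries to build on top of the API. */
#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* Stand-in for the hardware: a small memory region behind the API. */
static uint32_t fake_hw_mem[256];

/* The API the hardware team delivered, per the original spec:
 * single-word accesses only. */
static uint32_t dma_read_word(uint32_t addr)
{
    return fake_hw_mem[addr];
}

/* What the driver team later finds it needs but doesn't have:
 *   void dma_read_burst(uint32_t addr, uint32_t *buf, size_t len);
 *   int  dma_transfer_complete(void);
 * Until the API is updated, the driver is stuck with word-at-a-time
 * copies -- functional, but nowhere near the required throughput. */
static void copy_buffer(uint32_t addr, uint32_t *buf, size_t len)
{
    for (size_t i = 0; i < len; i++)
        buf[i] = dma_read_word(addr + i);   /* one access per word */
}

int main(void)
{
    uint32_t buf[8];
    for (uint32_t i = 0; i < 256; i++)
        fake_hw_mem[i] = i;

    copy_buffer(16, buf, 8);
    printf("copied word 0 = %u\n", (unsigned)buf[0]);   /* prints 16 */
    return 0;
}
```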

The Software Demo Is DONE

An SoC, by definition, is part hardware, part software. So why settle for an API that’s done when the software is required for delivery? As the hardware team completes API functionality, give it to the software team so they can actually use it. Deliver it as a C model, an emulation platform or some other form that makes sense. Use this demo version of the entire solution (hardware + software) to judge whether or not you’re DONE.
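As a rough idea of what that demo could look like, here’s a minimal hardware + software sketch: a hypothetical driver routine running against a hypothetical C model of a UART. The names are made up, but the measure of DONE is real: software exercising the hardware, end to end, with visible output.

```c
/* Hypothetical hardware+software demo: a driver "hello" routine run
 * against a C model of a UART. Model, registers and driver API are all
 * invented for illustration. */
#include <stdio.h>
#include <stdint.h>

/* --- C model of the UART: "transmits" by printing to the console --- */
typedef struct {
    uint32_t tx;       /* last character written to the transmit register */
    uint32_t busy;     /* 1 while a character is being sent               */
} uart_model_t;

static uart_model_t uart;

static void uart_model_write_tx(uint32_t c)
{
    uart.busy = 1;
    uart.tx   = c;
    putchar((int)c);   /* the model's version of driving the TX pin */
    uart.busy = 0;
}

/* --- driver layer, written by the software team against the API --- */
static void uart_putc(char c)
{
    while (uart.busy)
        ;                              /* wait for the previous character */
    uart_model_write_tx((uint32_t)c);
}

static void uart_puts(const char *s)
{
    while (*s)
        uart_putc(*s++);
}

/* --- the demo: software driving the (modeled) hardware end to end --- */
int main(void)
{
    uart_puts("hello from the hw+sw demo\n");
    return 0;
}
```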

I’m Still 90% Done

I’ll end with a personal favorite that kind of fits into this discussion. Like everyone else, I’ve played this card many times. What does 90% done mean? It means you think you’re almost DONE but you really have no idea because there’s no way of knowing for sure. Before you say it again, do yourself a favor:

  1. admit that you don’t know how DONE you are
  2. find a way to measure what you think is DONE (see the sketch after this list)
  3. isolate what isn’t DONE
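For point 2, here’s a trivial sketch of what measuring DONE can reduce to: a per-feature record of whether a self-checking test passes, rolled up into a number you can actually defend. The feature list is invented, obviously.

```c
/* Trivial, hypothetical sketch of measuring DONE: a per-feature record
 * of whether a self-checking test passes, tallied into a percentage.
 * The feature names are invented for illustration. */
#include <stdio.h>

typedef struct {
    const char *feature;
    int         test_passing;   /* 1 = DONE (passing test), 0 = not */
} feature_status_t;

int main(void)
{
    feature_status_t status[] = {
        { "timer expiry interrupt", 1 },
        { "dma single-word read",   1 },
        { "dma burst read",         0 },   /* not DONE: no passing test yet */
        { "uart transmit",          1 },
    };
    int total = (int)(sizeof(status) / sizeof(status[0]));
    int done  = 0;

    for (int i = 0; i < total; i++)
        if (status[i].test_passing)
            done++;

    printf("%d of %d features DONE (%d%%)\n", done, total, 100 * done / total);
    return 0;
}
```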

I’m going to try and follow my own advice on that one :).

Done is a false milestone. It’s ambiguous. It’s one-dimensional; you may have written some code but that’s it. There’s no reliable way to measure done, and teams that measure progress in terms of done eventually find they’re not as DONE as they thought.

DONE, on the other hand, comes with results. It’s multi-dimensional; you’ve written some chunk of code and it’s been tested, so you know it works. DONE is measured in passing simulations, software demos and any other means that objectively confirm the code you’ve written is high quality. Teams that measure progress in terms of DONE know how far they’ve come and how far they have to go.

Done is a feeling. DONE is progress.

-neil

Q. What examples of done do you see in SoC development?
