Posts Tagged ‘Verification’

761 Days

Tuesday, March 29th, 2011

Clouds over San Francisco

761 days.

That’s 2 years, 1 month, and 3 days.

761 days ago, I hosted a small group of interested EDA folks, journalists, and bloggers in a small room at the Doubletree hotel on one of the evenings of DVCon.

Most of the discussion that year was around OVM and VMM: which methodology was going to win out, which was really open, and which simulator supported more of the SystemVerilog language. Well, all of that has been put to bed. This year at DVCon, 733 days later, we all sang Kumbaya as we sat around and our hearts were warmed by the UVM campfire.

But, back to that small group that I hosted 761 days ago. Those who attended this conclave had shrugged off all the OVM and VMM hoopla and decided to come hear this strange discussion about cloud computing and SaaS for EDA tools. Some, no doubt, thought there was going to be free booze served, and they were certainly disappointed. Those who stayed, however, heard a fiery discussion between individuals who were either visionaries or lunatics. For many, this was the first time they had heard the term cloud computing explained, and their heads spun as they tried to imagine what, if anything, would come of it for the EDA industry.

Over the 761 days since, the voices speaking of cloud computing for EDA, once very soft, slowly grew in volume. All the reasons it would not work were thrown about like arrows, and those objections continue. But slowly, over time, the voices in support of this model have grown to the point where the question is no longer “if” but “when”.

761 days, that’s when.

Yesterday, to the shock of many at SNUG San Jose, including many in attendance from Synopsys, Aart de Geus personally answered the question asked 761 days earlier. Indeed, those individuals gathered in that small room at the Doubletree were visionaries, not lunatics.

There are many reasons why Synopsys should not be offering its tools on the cloud via SaaS:

  • Customers will never let their precious proprietary data off-site
  • It will cannibalize longer term license sales
  • The internet connection is too slow and unreliable
  • There’s too much data to transfer
  • The cloud is not secure
  • It’s more expensive
  • It just won’t work

But, as it turns out, there are better reasons to do it:

  • Customers want it

Sure, there are other reasons: the opportunity to increase revenue by selling higher-priced, short-term, pay-as-you-go licenses; taking advantage of the parallelism inherent in the cloud; serving a new customer base that has very peaky needs.

But in the end, Aart did what he does best. He put on his vision goggles, gazed into the future, saw that the cloud was inevitable, and decided that Synopsys should lead rather than follow.

761 days. Now the race is on.

A Tale of Two Booths - Certess and Nusym

Tuesday, June 10th, 2008

I had successfully avoided the zoo that is Monday at DAC and spent Tuesday zig-zagging the exhibit halls looking for my target list of companies to visit (and former EDA colleagues, now another year older, greyer, and heavier). Interestingly enough, the first and last booths I visited on Tuesday seemed to offer opposite approaches to the same issue. It was the best of times, it was the worst of times.

A well-polished street magician first got my attention at the Certess booth. After a few card tricks, finding the card I had picked out of the deck, he told me that it was as easy for him to find the card as it was for Certess to find the bugs in my design. Very clever!!! Someone must have been pretty proud they came up with that one. In any case, I’d had some exposure to Certess previously and was interested enough to invest 15 minutes.

Certess’ tool does something they call functional qualification. It’s kinda like ATPG fault grading for your verification suite. Basically, it seeds your DUT with potential bugs, then considers a bug “qualified” if the verification suite would cause the bug to be controlled and observed by a checker or assertion. If you have unqualified bugs (i.e., aspects of your design that are not tested), then there are holes in your verification suite.
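
To make that concrete, here’s a toy sketch of the idea (my own illustration with made-up names, not an actual Certess flow): mutate one comparison in the RTL, then ask whether any existing test both activates the change and propagates it to a checker.

    module fifo_flags #(parameter int DEPTH = 16) (
      input  logic                   clk, rst_n,
      input  logic [$clog2(DEPTH):0] count,
      output logic                   full
    );
      // Original logic:
      assign full = (count == DEPTH);

      // A seeded potential bug (mutation) might replace it with:
      //   assign full = (count == DEPTH - 1);

      // The bug is "qualified" only if (a) some test drives count to
      // DEPTH-1, where the two versions differ, and (b) a checker can
      // observe the difference -- e.g. this assertion, which fires on
      // the mutant as soon as full asserts one entry early:
      assert property (@(posedge clk) disable iff (!rst_n)
        (count < DEPTH) |-> !full)
        else $error("full asserted before FIFO reached capacity");

      // If no test ever fills the FIFO that far, the bug goes
      // unqualified -- and you've found a hole in your suite.
    endmodule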

This is a potentially useful tool, since it shows you exactly where the holes are in your verification suite. What next? Write more tests and run more vectors to get to those unqualified bugs. Ugh… more tests? I was hoping this would reduce the work, not increase it!!! It might increase my confidence, but life was so much simpler when I could delude myself that my test suite was actually complete.

Whereas the magician caught my attention at the Certess booth, I almost missed the Nusym booth, as it was tucked away in the back corner of the Exhibit Hall. Actually, they did not really have a booth, just a few demo suites with a Nusymian guarding the entrance, armed with nothing more than an RFID reader and a box of Twinkies. (I did not have my camera, so you’ll have to use your imagination). After all the attention they had gotten at DVCon and from Cooley, I was surprised that “harry the ASIC guy” could just walk up and get a demo in the suite.

(Disclaimer: There was no NDA required and I asked if this was OK to blog about and was told “Yup”, so here goes…)

The cool technology behind Nusym is the ability to do on-the-fly (during simulation) coverage analysis and reactively focused vector generation. Imagine a standard SystemVerilog testbench with constrained random generators, checkers, and coverage groups defining your functional coverage goals. Using standard constrained random testing, the generators create patterns independent of what is inside the DUT and what is happening with the coverage monitors. Whether you hit the coverage monitors or not doesn’t matter; the generators will do what they will do, perhaps hitting the same coverage monitors over and over and missing others altogether. Result: lots of vectors run, insufficient functional coverage, more tests needed (random or directed).
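
For reference, here’s a minimal sketch of the kind of testbench being described (the names and bins are my own invention): the generator only knows its legality constraints, so nothing steers it toward the bins that haven’t been hit.

    module tb;
      logic        clk;
      logic [15:0] addr;
      logic [3:0]  len;

      class packet;
        rand bit [15:0] addr;
        rand bit [3:0]  len;
        constraint legal { len inside {[1:8]}; }  // basic legality only
      endclass

      covergroup cg @(posedge clk);
        coverpoint addr { bins corner = {16'hFFFF}; }  // a hard-to-hit bin
        coverpoint len;
      endgroup
      cg cov = new();

      initial begin
        clk = 0;
        forever #5 clk = ~clk;
      end

      // Classic constrained random: randomize() never looks at which
      // bins are already covered, so it pounds the easy bins over and
      // over -- 10,000 blind tries will usually miss addr == 16'hFFFF.
      initial begin
        packet p = new();
        repeat (10_000) begin
          void'(p.randomize());
          addr = p.addr;   // drive the DUT inputs (DUT itself omitted)
          len  = p.len;
          @(posedge clk);
        end
        $finish;
      end
    endmodule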

The Nusym tool (no name yet) understands the DUT and does on-the-fly coverage analysis. It builds an internal model that includes all of the branches in your DUT and all of your coverage monitors. The constraint solver then generates patterns that try to reach the coverage monitors intentionally. In this way, it can get to deeply nested and hard-to-reach coverage points in a few vectors, whereas constrained random may take a long time or never get there. Also, when you trigger a coverage monitor, it crosses that monitor off the list and knows it does not have to hit it again, so the next vectors will try to hit something new. Compared to Certess, this actually reduces the number of tests I need to write. In fact, they recommend just having a very simple generator that defines the basic constraints and focusing most of the energy on writing the coverage monitors. Result: far fewer vectors run, high functional coverage, no more tests needed.
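
And a toy illustration (again my own, not Nusym’s actual demo) of why DUT-awareness matters: a blind generator has to stumble into a deeply nested branch, while a solver that can see the branch conditions can aim for it directly.

    module dut (
      input  logic       clk,
      input  logic [7:0] a, b, c,
      output logic       err_state
    );
      // Three independent 8-bit matches: a blind random generator has
      // roughly a 1-in-2^24 chance per cycle of reaching this branch.
      always_ff @(posedge clk)
        if (a == 8'h5A)
          if (b == 8'hC3)
            if (c == 8'h7E)
              err_state <= 1'b1;  // the deeply nested point we care about

      // A coverage monitor on the same condition. A solver that has
      // extracted the branch conditions from the DUT can satisfy them
      // directly (a = 5A, b = C3, c = 7E) in a handful of vectors,
      // instead of ~16 million blind tries on average.
      cover property (@(posedge clk)
        a == 8'h5A && b == 8'hC3 && c == 8'h7E);
    endmodule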

It sounds too good to be true, but it was obvious that these guys really believe in this tool and that they have something special. They are taking it slow. Nusym does not have a released product yet, but they have core technology that they are working on with a few customers/partners. They are also focusing on the core of the market: Verilog DUTs with SystemVerilog testbenches. I would not throw out my current simulator just yet, but this seems like very unique and very powerful technology that can get to coverage closure orders of magnitude faster than current solutions.

If anyone else saw their demo or has any comments, please chime in.

harry the ASIC guy

Breaking News … Accellera Verification Working Group Forming

Thursday, April 24th, 2008

On her Standards Game Blog today, Karen Bartleson announced that Accellera is forming a subcommittee to define a standard for verification interoperability. That is, to try to settle the VMM/OVM war. As I have stated before in comments on JL Gray’s Cool Verification Blog, this is the right move because it gives us input into the process, rather than just the EDA vendors controlling the process for their own benefit. Also, as I argued in a previous post entitled “The Revolution Will Not Be Televised”, the influence and pressure of the verification community, and especially the Cool Verification Blog, were at least in part responsible.

Of course, Synopsys will tell you that they are just doing the right thing :-)

It’s not clear how Cadence and Mentor will respond. Hopefully they’ll join the effort. Let’s keep the pressure on.