Last week I initiated a poll of verification methodologies being used for functional verification of ASICs. Unlike other polls or surveys, this one was done in a very “open” fashion using a website that allows everyone to view the raw data. In this way, anyone can analyze the data and draw the conclusions that make sense to them, and those conclusions can be challenged and debated based on the data.
What happened next was interesting. Within 48 hours, the poll had received almost 200 responses from all over the world. It garnered the attention of the big EDA vendors, who solicited their supporters to vote, and, as a result, it became a focal point for shenanigans from over-zealous VMM and OVM fans. I spent several long nights digging through the data and now I am ready to present the results.
As promised, here is the raw data in PDF format and as an Excel workbook. The only change I have made is to remove the names of the 249 individual respondents.
In summary, the results are as follows:
(Note: The total is more than the 249 respondents because one respondent could be using more than one methodology.)
Regarding the big 3 vendors, the data shows a remarkable consistency with Gary Smith’s market share data. There are 85 respondents planning to use the Synopsys methodologies (VMM, RVM, or Vera) and 150 respondents planning to use the Mentor or Cadence methodologies (OVM, AVM, eRM, e). That works out to 36% for Synopsys and 64% for Mentor/Cadence. Gary’s data shows Synopsys with 34% market share, Mentor with 35%, and Cadence with 30%.
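For the record, those percentages are just the two respondent counts normalized to their sum. A quick sketch, with the counts copied from the paragraph above:

```python
# Normalize the two respondent counts from the poll to percentage shares.
synopsys = 85          # respondents planning to use VMM, RVM, or Vera
mentor_cadence = 150   # respondents planning to use OVM, AVM, eRM, or e
total = synopsys + mentor_cadence  # 235 (more than 249? no: some of the
                                   # 249 respondents chose other answers)

synopsys_share = round(100 * synopsys / total)
mentor_cadence_share = round(100 * mentor_cadence / total)
print(synopsys_share, mentor_cadence_share)  # prints: 36 64
```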
I’ll share some more insights in upcoming posts. In the meantime, please feel free to offer any insights that you have through your comments. Remember, you too have access to the raw data. This invitation includes the EDA vendors. And feel free to challenge my conclusions … but back it up with data!
Regarding the Verification Methodology Poll I started the other day, I was able to go through the log files and identify the obvious malicious activity. There was a string of deletes and changes of VMM votes to OVM/e votes. Then a string of deletes of OVM votes. I’m going to add back the original entries to make the data whole again.
In the meantime, the obvious malicious activity has subsided, and now there is only a trickle of clearly valid votes coming in. It’s just like listening for the popcorn to stop popping: when I see the votes slow down to a certain rate, I’ll do my tallies and publish the results.
There have been questions raised regarding my motivations for doing this poll. Some felt that I had some hidden agenda and some even thought that I was some sort of paid shill for one of the vendors. If you are a regular reader of my blog or if you know me, then you know that’s not true. If you don’t know me, then ask around.
At the risk of sounding defensive, my goal was purely to conduct an “open” survey of the verification methodologies being used: because this has been such a hot topic this past year, because DVCon is coming up and this would be good information, and because one of my readers suggested it and I thought it was a good idea. I chose Doodle so that everyone could view the raw data, something you rarely (if ever) get to see when vendors and other organizations conduct polls and then release only the results that suit them best. In this way, anyone could analyze the raw data and draw the conclusions that made sense, and those conclusions could be challenged based on the raw data. The mistake I made was not realizing how easily those who, unlike me, actually had an agenda could vandalize the data.
There have also been questions raised regarding the validity of this poll and how “scientific” it is after all that has occurred. I think they are valid concerns and certainly, if I had to do this over again, I’d fix some things to prevent multiple voting and malicious behavior. Still, as I look at the interim results, they are similar to what I had expected. Each vendor lobbied their constituencies, so the playing field is level. It will be interesting to compare this result to DVCon surveys from the vendors, from DVCon itself, and from John Cooley to see if there is consistency.
Finally, to those of you who legitimately voted, I thank you for participating openly and I apologize that the results will always be subject to some doubt. I hope you don’t feel you wasted your time.
The poll I set up the other day was getting interesting and meaningful responses related to the verification methodologies being used. FORTUNATELY, I saved a snapshot of this data as it was coming in.
UNFORTUNATELY, I apparently did not do enough due diligence with respect to the Doodle site and neglected to realize that there is a way to vandalize the data. Apparently, that is what started happening later in the day, to the point where this has now become a poll-war between the forces of OVM and the forces of VMM. I won’t go so far as to name names, but you know who you are.
I feel bad for those who provided honest data. Thank you for doing so and having faith in this poll. I have a snapshot that I feel is reasonably uncorrupted and I will still publish those results once I remove data that I feel was not entered in good faith.
I may have a way to find out if any of the EDA vendors were involved in this vandalism, so I encourage you to chill out and not make it any worse.
In response to a recent post regarding the verification survey on the DVCon website, Jeremy Ralph of PDTi said that he’d “be interested to know what proportion of the SV is OVM vs. VMM”, a question that was missing from the survey. Considering the whole kerfuffle concerning OVM and VMM over the last year, I thought this would be a good question for you, the ones really using the tools. I also thought it would be a good opportunity to try out this new Doodle survey tool I was told about.
So … I created the first ASIC guy survey on Doodle. That was very easy as was casting my own vote. Now it’s your turn.
Feel free to leave a pseudonym if you wish to be anonymous. Make it funny, but keep it clean. And please don’t impersonate someone else. I’ll know something is up if Aart votes for OVM.
Also, please let other people know about this poll and ask them to vote. The more votes we have, the more accurate the survey results. And it would be really cool if we can get more respondents online than DVCon had in person.
If this works well, I’ll continue to do this every so often. Feel free to provide suggestions for future polls.
I came across some interesting survey results from the 2007 and 2008 DVCon. Keep in mind that, as with any survey, the results are skewed by the attendees, who tend to be verification engineers using more advanced methods than the general population. Hence, I’d put more weight on the trends than on the absolute numbers.
1) Which is your primary design language?

                   2007   2008
   Verilog          56%    55%
   VHDL              9%    10%
   C/C++            13%    12%
   SystemC           9%     8%
   SystemVerilog    13%    15%
2) Which primary verification language do you use?

                   2007   2008
   C/C++            18%    18%
   e                 7%     5%
   OpenVera          4%     4%
   Verilog          28%    25%
   VHDL              7%     7%
   SystemC          13%    11%
   SystemVerilog    23%    30%
3) Which primary verification language do you plan to use for your next design?

                   2007   2008
   C/C++            16%    15%
   e                 5%     4%
   OpenVera          1%     2%
   Verilog          16%    16%
   VHDL              4%     5%
   SystemC          15%    11%
   SystemVerilog    43%    47%
4) Which primary property specification (assertion-based verification) language do you use?

                   2007   2008
   Verilog          31%    34%
   VHDL              7%     6%
   PSL              12%    10%
   SVA              49%    50%
Although not as hot a topic in the press and in the blogosphere, this represents a firm step forward in the standardization of the overall coverage driven verification methodology, whether you pray from the OVM or from the VMM hymnal. Whereas ratified or de facto standards already exist for the testbench languages, the requirements and coverage capture tools and formats are still proprietary to each of the 3 major vendors. This prevents the verification management tools of one vendor from being used with another vendor’s simulator. Having a UCDB standard will facilitate portability and enable more innovative solutions to be built by third parties on top of this standard.
Although Synopsys and Cadence each have their own coverage database formats, the basic elements of this standard should be much easier to agree upon without the political wrangling slowing the VIP subcommittee. I also think this is an opportunity for Synopsys, Mentor, and Cadence to show that they really can cooperate for the benefit of their customers and win back some of the goodwill lost in the OVM vs. VMM battle.
I had successfully avoided the zoo that is Monday at DAC and spent Tuesday zig-zagging the exhibit halls looking for my target list of companies to visit. (And former EDA colleagues, now another year older, greyer, and heavier). Interestingly enough, the first and last booths I visited on Tuesday seemed to offer opposite approaches to address the same issue. It was the best of times, it was the worst of times.
A well polished street magician got my attention at first at the Certess booth. After a few card tricks, finding the card I had picked out in the deck, he told me that it was as easy for him to find the card as it was for Certess to find the bugs in my design. Very clever!!! Someone must have been pretty proud they came up with that one. In any case, I’d had some exposure to Certess previously and was interested enough to invest 15 minutes.
Certess’ tool does something they call functional qualification. It’s kinda like ATPG fault grading for your verification suite. Basically, it seeds your DUT with potential bugs, then considers a bug “qualified” if the verification suite would cause the bug to be controlled and observed by a checker or assertion. If you have unqualified bugs (i.e. aspects of your design that are not tested), then there are holes in your verification suite.
This is a potentially useful tool since it helps you understand where the holes are in your verification suite. What next? Write more tests and run more vectors to get to those unqualified bugs. Ugh….more tests? I was hoping this would reduce the work, not increase it!!! This might be increasing my confidence, but life was so much simpler when I could delude myself that my test suite was actually complete.
Whereas the magician caught my attention at the Certess booth, I almost missed the Nusym booth as it was tucked away in the back corner of the Exhibit Hall. Actually, they did not really have a booth, just a few demo suites with a Nusymian guarding the entrance armed with nothing more than an RFID reader and a box of Twinkies. (I did not have my camera, so you’ll have to use your imagination). After all the attention they had gotten at DVCon and from Cooley, I was surprised that “harry the ASIC guy” could just walk up and get a demo in the suite.
(Disclaimer: There was no NDA required and I asked if this was OK to blog about and was told “Yup”, so here goes…)
The cool technology behind Nusym is the ability to do on-the-fly (during simulation) coverage analysis and reactively focused vector generation. Imagine a standard SystemVerilog testbench with constrained random generators, checkers, and coverage groups defining your functional coverage goal. Using standard constrained random testing, the generators create patterns independent of what is inside the DUT and what is happening with the coverage monitors. Whether or not you hit the coverage monitors doesn’t matter: the generators will do what they will do, perhaps hitting the same coverage monitors over and over and missing others altogether. Result: lots of vectors run, insufficient functional coverage, more tests needed (random or directed).
The Nusym tool (no name yet) understands the DUT and does on-the-fly coverage analysis. It builds an internal model that includes all of the branches in your DUT and all of your coverage monitors. The constraint solver then generates patterns that try to reach the coverage monitors intentionally. In this way, it can get to deeply nested and hard-to-reach coverage points in a few vectors, whereas constrained random may take a long time or never get there. Also, when you trigger a coverage monitor, it crosses that monitor off the list and knows it does not have to hit it again, so the next vectors will try to hit something new. As compared to Certess, this actually reduces the number of tests I need to write. In fact, they recommend just having a very simple generator that defines the basic constraints and focusing most of the energy on writing the coverage monitors. Result: far fewer vectors run, high functional coverage, no more tests needed.
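The payoff is easy to see in a toy model. This is not Nusym’s algorithm (there is no real constraint solver here), just a hypothetical sketch contrasting blind constrained-random generation with coverage-feedback generation over a small set of made-up bins:

```python
import random

# 8 hypothetical coverage bins: an (opcode, flag) cross.
bins = {(op, flag) for op in range(4) for flag in (0, 1)}

def blind_random(bins, rng):
    """Pure constrained-random: no feedback, may re-hit the same bins."""
    remaining, vectors = set(bins), 0
    while remaining:
        vec = (rng.randrange(4), rng.randrange(2))  # random stimulus
        vectors += 1
        remaining.discard(vec)  # progress only if we happened to hit a new bin
    return vectors

def coverage_directed(bins):
    """Feedback-driven: aim each vector at a bin not yet hit."""
    remaining, vectors = set(bins), 0
    while remaining:
        target = min(remaining)    # stand-in for "solve for an unhit bin"
        vectors += 1
        remaining.discard(target)  # monitor fired: cross it off the list
    return vectors

print("blind:", blind_random(bins, random.Random(42)))
print("directed:", coverage_directed(bins))  # always equals the bin count
```

The directed loop closes coverage in exactly one vector per bin, while the blind loop keeps burning vectors re-hitting bins it has already covered (the classic coupon-collector slowdown, which only gets worse as the bins get harder to reach by chance).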
It sounds too good to be true, but it was obvious that these guys really believe in this tool and that they have something special. They are taking it slow. Nusym does not have a released product yet, but they have core technology that they are working on with a few customers/partners. They are also focusing on the core of the market: Verilog DUT, SystemVerilog testbench. I would not throw out my current simulator just yet, but this seems like unique and very powerful technology that could achieve coverage closure orders of magnitude faster than current solutions.
If anyone else saw their demo or has any comments, please chime in.
OK … I’m registered to go to DAC for at least one day, maybe two. I’ll definitely be there on Tuesday and probably Wednesday evening for a Blogging “Birds-of-a-Feather” session that JL Gray is setting up. Besides hitting the forums and other activities, I’ll have about half a day to attack the exhibit floor or the “suites” to look at some new technology. If you want to meet up, drop me an email and we can arrange something.
Cadence won’t be there and I already talk to Synopsys and Mentor on a regular basis, so I’m planning on focusing on smaller companies with new technology. Here’s what’s on my list so far…
Nusym - They have some new “Path Tracing” technology that finds correlations between a constrained random testbench and hard-to-hit functional coverage points. With this knowledge, they claim to be able to modify the constraints to guide the simulation to hit the coverage points. The main benefit is in getting that last few % of functional coverage that can be difficult with unguided constrained random patterns.
Chip Estimate - Having been around for a few years and recently bought by Cadence, they are basically a portal where you can access 3rd party IP and use the information to do a rough chip floorplan. This allows you to estimate area, power, yield, etc. I’m real curious as to their business model and why Cadence bought them. At a minimum, it should be entertaining to see the hyper-competitive IP vendors present back-to-back at half hour intervals on the DAC floor.
I have a few others on my list, but there are so many small companies that it’s hard to go through them all and decide what to see. That’s where I need your help.
My friend Ron has a knack for recognizing revolutionary technologies before most of us. He was one of the first to appreciate the power of the browser and how it would transform the internet, previously used only by engineers and scientists. He was one of the first and best podcasters. And now he’s become a self-proclaimed New Media Evangelist, preaching the good news of Web 2.0 and making it accessible to “the rest of us”.
Most of us are familiar with mainstream Web 2.0 applications, whether we use them or our friends use them or our kids use them. Social and professional networks such as MySpace, Facebook, and LinkedIn. Podcasts in iTunes. Blogging sites on every topic. Virtual worlds such as Second Life. Collaboration tools such as Wikipedia. File sharing sites such as YouTube and Flickr. Social bookmarking sites such as Digg and Technorati. Open source publishing tools such as WordPress and Joomla. Using these technologies we’re having conversations, collaborating, and getting smarter in ways that were unimaginable just 5 years ago. Imagine, a rock climber in Oregon can share climbing techniques with a fellow climber in Alice Springs. And mostly for free, save for the cost of the internet connection.
When we think of Web 2.0, we tend to think of teenagers and young adults. But this technology was invented by us geeks and so it’s no surprise that the ASIC design world is also getting on-board. Here are some examples from the ASIC Design industry:
Social media is networking ASIC designer to ASIC designer, enabling us to be smarter faster. But that’s not all. Many forward-looking companies have recognized the opportunity to talk to their customers directly. About 6 months ago, Synopsys launched several blogs on its microsite. Xilinx also has a User Community and a blog. It’s great that this is happening, but does it really make much of a difference? Consider what I believe could be a watershed event:
A few months ago, JL Gray published a post on his Cool Verification blog entitled The Brewing Standards War - Verification Methodology. As expected, verification engineers chimed in and expressed their ardent opinions and viewpoints. What came next was not expected … stakeholders from Synopsys and Mentor joined the conversation. The chief VMM developer from Synopsys, Janick Bergeron, put forth information to refute certain statements that he felt were erroneous. A marketing manager from Mentor, Dennis Brophy, offered his views on why OVM was open and VMM was not. And Karen Bartleson, who participates in several standards committees for Synopsys, disclosed Synopsys’ plan to encourage a single standard by donating VMM to Accellera.
From what I’ve heard, this was one of the most viewed ASIC related blog postings ever (JL: Do you have any stats you can share?). But did it make a difference in changing the behavior of any of the protagonists? I think it did and here is why:
This week at the Synopsys Users Group meeting in San Jose, the VMM / OVM issues were the main topic of questioning for CEO Aart de Geus after his keynote address. And the questions picked up where they left off in the blog post… Will VMM ever be open and not just licensed? Is Synopsys trying to talk to Mentor and Cadence directly? If we have access to VMM, can we run it on other simulators besides VCS?
Speaking to several Synopsoids afterwards, I discovered that the verification marketing manager referenced this particular Cool Verification blog posting in an email to an internal Synopsys verification mailing list. It seems he approved of some of the comments and wanted to make others in Synopsys aware of these customer views. Evidently he sees these opinions as valuable and valid. Good for him.
Speaking to some at Synopsys who have a say in the future of VMM, I believe that Synopsys’ decision to donate VMM to Accellera has been influenced and pressured, at least in part, by the opinions expressed in the blog posting and the subsequent comments. Good for us.
I’d like to believe that the EDA companies and other suppliers are coming to recognize what mainstream companies have recognized … that the battle for customers is less and less being fought with advertisements, press releases, glossy brochures, and animated PowerPoint product pitches. Instead, as my friend Ron has pointed out, I am able to talk to “passionate content creators who know more about designing chips than any reporter could ever learn”, and find out what they think. Consider these paraphrased excerpts of The Cluetrain Manifesto: The End of Business as Usual:
The Internet is enabling conversations among human beings that were simply not possible in the era of mass media. As a result, markets are getting smarter, more informed, more organized.
People in networked markets have figured out that they get far better information and support from one another than from vendors.
There are no secrets. The networked market knows more than companies do about their own products. And whether the news is good or bad, they tell everyone.
Companies that don’t realize their markets are now networked person-to-person, getting smarter as a result and deeply joined in conversation are missing their best opportunity.
Companies can now communicate with their markets directly. If they blow it, it could be their last chance.
In short, this ASIC revolution will not be televised!!!