Posts Tagged ‘VMM’

An ASIC Guy Visits An FPGA World - Part II

Monday, June 22nd, 2009

[Image: Altera FPGA]

I mentioned a few weeks ago that I am wrapping up a project with one of my clients and beating the bushes for another project to take its place. As part of my search, I visited a former colleague who works at a small company in Southern California. This company designs a variety of products that use FPGAs exclusively (no ASICs), so I got a chance to understand a little bit more about the differences between ASIC and FPGA design. Here, then, is the follow-on to my previous post An ASIC Guy Visits An FPGA World.

Recall that the first 4 observations from my previous visit to FPGA World were:

Observation #1 - FPGA people put their pants on one leg at a time, just like me.

Observation #2 - I thought that behavioral synthesis had died, but apparently it was just hibernating.

Observation #3 - Physical design of FPGAs is getting like ASICs.

Observation #4 - Verification of FPGAs is getting like ASICs.

Now for the new observations:

Observation #5 - Parts are damn cheap - According to the CTO of this company, Altera Cyclone parts can cost as little as $10-$20 each in sufficient quantities. A product that requires thousands or even tens of thousands of parts will still cost less than a 90nm mask set. For many non-consumer products with quantities in this range, FPGAs are compelling from a cost standpoint.

True, the high-end parts can cost thousands or even tens of thousands of dollars each (e.g. the latest Xilinx Virtex-6). But considering that a Virtex-6 part is built on a 40nm process and has the equivalent of almost 10M logic gates, what would an equivalent ASIC cost?

Observation #6 - FPGA verification is different (at least for small to medium sized FPGAs) - Since it is so easy, fast, and inexpensive (compared to an ASIC) to synthesize and place-and-route an FPGA, much more of the functional verification is done in the lab on real hardware. Simulation is typically used to get a “warm and fuzzy” that the design is mostly functional, and then the rest of the verification is done in the lab with the actual FPGA. Tools like Xilinx ChipScope allow logic-analyzer-like access into the device, providing some, but not all, of the visibility that exists in simulation. And once bugs are found, they can be fixed with an RTL change and a reprogramming of the FPGA.
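To make the “warm and fuzzy” simulation step concrete, here is a minimal sketch of the kind of directed smoke test that might run before lab bring-up. The counter DUT and testbench are hypothetical and purely illustrative (not code from the company I visited); the idea is a quick pass/fail sanity check in simulation, leaving the deeper debug to the bench.

```systemverilog
// Hypothetical example: a trivial counter DUT and a directed smoke test.
// The goal is a quick sanity check in simulation before spending lab time
// with the real FPGA and ChipScope; it is nowhere near exhaustive.

module counter #(parameter WIDTH = 8) (
    input  wire              clk,
    input  wire              rst_n,
    output reg  [WIDTH-1:0]  count
);
    always @(posedge clk or negedge rst_n)
        if (!rst_n) count <= {WIDTH{1'b0}};
        else        count <= count + 1'b1;
endmodule

module tb_counter;
    reg        clk   = 1'b0;
    reg        rst_n = 1'b0;
    wire [7:0] count;

    counter dut (.clk(clk), .rst_n(rst_n), .count(count));

    always #5 clk = ~clk;   // free-running clock, simulation only

    initial begin
        // Hold reset across a few clock edges, then release between edges
        // to avoid racing the flop.
        repeat (4) @(posedge clk);
        @(negedge clk) rst_n = 1'b1;

        // Directed check: after ten more rising edges the counter reads 10.
        repeat (10) @(posedge clk);
        #1;  // let the non-blocking update settle before sampling
        if (count !== 8'd10) $display("FAIL: expected 10, got %0d", count);
        else                 $display("PASS: counter increments as expected");
        $finish;
    end
endmodule
```

Once a test like this passes, the same RTL goes through synthesis and place-and-route, and the remaining debug happens on the bench with the programmed part.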

One unique aspect of FPGA verification is that it can be done in phases or “spirals”. Perhaps only some of the requirements for the FPGA are complete, or only part of the RTL is available. No problem. One can implement just the portion of the design that is complete (for instance, just the dataplane processing) and program the part. Since the same part can be used over and over, the incremental cost of doing this is basically $0. Once the rest of the RTL is available, the part is simply reprogrammed.

Observation #7 - FPGA design tools are all free or dirt cheap - I think everybody knows this fact already, but it really hit home talking to this company. Almost all the tools they use for design are free or very inexpensive, yet they are more than capable of “getting the job done”. In fact, the company probably could not operate in the black if it had to make the kind of investment that ASIC design tools require.

Observation #8 - Many tools and methods common in the ASIC world are still uncommon in this FPGA world - For this company, there is no such thing as logical equivalence checking. Tools that perform formal verification of designs (formal proofs), SystemVerilog simulation, OVM, VMM… none of these are used at all. Perhaps they’ll be adopted for the larger designs, but right now the company is getting along fine without them.

__________

FPGA verification is clearly the area that is the most controversial. In one camp are the “old skool” FPGA designers who want to get the part into the lab as soon as possible and eschew simulation. In the other camp are the high-level verification proponents who espouse the merits of coverage-driven and metric-driven verification and recommend achieving complete coverage in simulation. I think it would really be fun to host a panel discussion with representatives from both camps and have them debate these points. I think we’d learn a lot.

Hmmm…

harry the ASIC guy

Setting The Record Straight

Thursday, February 19th, 2009

Since conducting the Verification Methodology Poll and publishing the raw results last week, I’ve been planning to follow up with a post that digs a little deeper into the numbers. Things have gotten rather busy in the meantime, both at work and with organizing the SaaS and Cloud Computing EDA Roundtable for next week at DVCon. So I’ve let it slip a little.

Well, I noticed today that the verification methodology poll was referenced in a Cadence blog post by Adam Sherer. The results were somewhat misinterpreted (in my opinion), which kicked my butt into posting my own interpretations to set the record straight. Says Adam:

According to the poll conducted by Harry Gries in his Harry the ASIC Guy blog, you should go “all in” on the OVM because it is the 2:1 favorite.

In fact, the raw results had VMM with 80 users and OVM with 125 users, a ratio of just over 1.5:1 (1.5625 to be exact). So the 2:1 ratio is not accurate. However, if you add in RVM/Vera users to the VMM numbers, and then add in AVM, eRM, and e users to the OVM numbers, that ratio is more like 1.8:1. Closer, but still not 2:1.
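For the record, here is the arithmetic behind those two ratios, using the 80/125 counts above and the 85/150 group totals from the results post further down this page:

\[
\frac{125}{80} = 1.5625 \approx 1.6{:}1,
\qquad
\frac{150}{85} \approx 1.76 \approx 1.8{:}1
\]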

Adam’s post also indicates that my poll says “you should go ‘all in’ on the OVM”. I never said that, nor does the poll say anything about what you “should” do. The data simply captures what people are planning to use next. If you are inclined to follow the majority, then perhaps OVM is the way to go. By contrast, there is nothing in the poll comparing the technical merits of the various methodologies. So, if you are inclined to make up your own mind, then you have some work to do and my poll won’t help you with that. You’re probably better off visiting JL Gray at Cool Verification.

No poll is perfect, and it will be interesting to compare these results with the DVCon and John Cooley polls to see if they are consistent. Here are a few other interesting stats that I pulled out of the poll results:

  • 91% of respondents are using some sort of SystemVerilog methodology
  • 10% are using both OVM and VMM (although I suspect many of these are consultants)
  • 27% are still using e or Vera (more e than Vera)
  • 4% are using ONLY VHDL or Verilog (this number may be low due to the skew of respondents towards advanced methodologies)

Again, I welcome you to download the raw data, which you can find in PDF format and as an Excel workbook, and draw your own conclusions.

harry the ASIC guy

Verification Methodology Poll Results

Wednesday, February 11th, 2009

Last week I initiated a poll of verification methodologies being used for functional verification of ASICs. Unlike other polls or surveys, this one was done in a very “open” fashion using a website that allows everyone to view the raw data. In this way, anyone can analyze the data and draw the conclusions that make sense to them, and those conclusions can be challenged and debated based on the data.

What happened next was interesting. Within 48 hours, the poll had received almost 200 responses from all over the world. It had garnered the attention of the big EDA vendors, who solicited their supporters to vote. And, as a result, it had become a focal point for shenanigans from over-zealous VMM and OVM fans. I had several long nights digging through the data, and now I am ready to present the results.

As promised, here is the raw data in PDF format and as an Excel workbook. The only change I have made is to remove the names of the 249 individual respondents.

In summary, the results are as follows:

[Table: Raw results from the Verification Methodology Poll]


(Note: The total is more than the 249 respondents because one respondent could be using more than one methodology.)

Regarding the big 3 vendors, the data shows a remarkable consistency with Gary Smith’s market share data. There are 85 respondents planning to use the Synopsys methodologies (VMM, RVM, or Vera) and 150 respondents planning to use the Mentor or Cadence methodologies (OVM, AVM, eRM, or e). That represents 36% for Synopsys and 64% for Mentor/Cadence. Gary’s data shows Synopsys with 34% market share, Mentor with 35%, and Cadence with 30%.
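Spelling that split out: the two groups together account for 235 methodology selections, so

\[
\frac{85}{235} \approx 36\%\ \text{(Synopsys)},
\qquad
\frac{150}{235} \approx 64\%\ \text{(Mentor/Cadence)}
\]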

[Chart: Methodology split]

[Chart: Gary Smith market share data]


I’ll share some more insights in upcoming posts. In the meantime, please feel free to offer any insights of your own in the comments. Remember, you too have access to the raw data. This invitation includes the EDA vendors. And feel free to challenge my conclusions … but back them up with data!

harry the ASIC guy

Quick Update On Verification Methodology Poll

Friday, February 6th, 2009

Quick update for everyone…

Regarding the Verification Methodology Poll I started the other day, I was able to go through the log files and identify the obvious malicious activity.  There was a string of deletes and changes of VMM votes to OVM/e votes. Then a string of deletes of OVM votes. I’m going to add back the original entries to make the data whole again.

In the meantime, the obvious malicious activity has subsided, and now there is only a trickle of clearly valid votes coming in. It’s just like listening for the popcorn to stop popping: when I see the votes slow to a certain rate, I’ll do my tallies and publish the results.

There have been questions raised regarding my motivations for doing this poll. Some felt that I had a hidden agenda, and some even thought that I was a paid shill for one of the vendors. If you are a regular reader of my blog or if you know me, then you know that’s not true. If you don’t know me, then ask around.

At the risk of sounding defensive, my goal was purely to conduct an “open” survey of the verification methodologies being used: because this has been such a hot topic this past year, because DVCon is coming up and this would be good information to have, and because one of my readers suggested it and I thought it was a good idea. The point of using Doodle was so that everyone could view the raw data, something you rarely (if ever) get to see when vendors and other organizations conduct polls and then release only the results that suit them best. That way, anyone could analyze the raw data and draw the conclusions that made sense, and those conclusions could be challenged based on that same data. The mistake I made was not realizing how easily those who, unlike me, actually had an agenda could vandalize the data.

There have also been questions raised regarding the validity of this poll and how “scientific” it is after all that has occurred. I think they are valid concerns and certainly, if I had to do this over again, I’d fix some things to prevent multiple voting and malicious behavior. Still, as I look at the interim results, they are similar to what I had expected. Each vendor lobbied their constituencies, so the playing field is level. It will be interesting to compare this result to DVCon surveys from the vendors, from DVCon itself, and from John Cooley to see if there is consistency.

Finally, to those of you who legitimately voted, I thank you for participating openly and I apologize that the results will always be subject to some doubt. I hope you don’t feel you wasted your time.

harry the ASIC guy

Verification Methodology Poll

Tuesday, February 3rd, 2009

In response to a recent post regarding the verification survey on the DVCon website, Jeremy Ralph of PDTi expressed that he’d “be interested to know what proportion of the SV is OVM vs. VMM”, a question that was missing from the survey. Considering the whole kerfuffle concerning OVM and VMM over the last year, I thought this would be a good question for you, the ones really using the tools. I also thought it would be a good opportunity to try out this new Doodle survey tool I was told about.

So … I created the first ASIC guy survey on Doodle. That was very easy as was casting my own vote. Now it’s your turn.

HERE’S THE LINK

Feel free to use a pseudonym if you wish to remain anonymous. Make it funny, but keep it clean. And please don’t impersonate someone else. I’ll know something is up if Aart votes for OVM :-)

Also, please let other people know about this poll and ask them to vote. The more votes we have, the more accurate the results. And it would be really cool if we could get more respondents online than DVCon had in person.

If this works well, I’ll continue to do this every so often. Feel free to provide suggestions for future polls.

harry the ASIC guy

VMM on Questa & IUS Redux? Anything New Here?

Friday, December 5th, 2008

Considering what I’ve been hearing about the status of the Accellera VIP Subcommittee’s activity regarding OVM / VMM integration, I was rather surprised to see the synchronized press releases that Mentor and Cadence issued yesterday.

As I understand it, the Accellera VIP Subcommittee has only recently begun tackling the real crux issues of integrating the two methodologies, such as the following (a rough sketch of the first issue appears after the list):

  • Casting of disparate types
  • Synchronization of the simulation phases
  • Message reporting
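To illustrate just the first of those issues, here is a rough, self-contained sketch that uses stand-in classes rather than the real OVM and VMM base classes (whose details I won’t reproduce here). The real vmm_data and ovm_object hierarchies, like these stand-ins, share no common ancestor, so a transaction can’t simply be $cast from one world to the other; some kind of hand-written wrapper or adapter is needed.

```systemverilog
// Stand-in classes only -- illustrative, not the actual library code.
class vmm_style_data;                 // stand-in for a VMM transaction base
  virtual function string psdisplay(string prefix = "");
    return {prefix, "vmm-style transaction"};
  endfunction
endclass

class ovm_style_object;               // stand-in for an OVM object base
  virtual function string convert2string();
    return "ovm-style object";
  endfunction
endclass

class my_vmm_packet extends vmm_style_data;
  rand bit [31:0] addr;
endclass

// One common workaround: wrap the VMM-style transaction inside an
// OVM-style object instead of trying to cast across unrelated hierarchies.
class vmm_packet_wrapper extends ovm_style_object;
  my_vmm_packet payload;
  function new(my_vmm_packet p);
    payload = p;
  endfunction
  virtual function string convert2string();
    return payload.psdisplay("wrapped: ");
  endfunction
endclass

module tb;
  initial begin
    my_vmm_packet      pkt  = new();
    vmm_packet_wrapper wrap = new(pkt);
    // $cast(some_ovm_style_handle, pkt) would be rejected here, because
    // the two class trees are completely unrelated.
    $display("%s", wrap.convert2string());
  end
endmodule
```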

My speculation is that Mentor and Cadence are just now formally announcing the availability of the “fixed up” VMM code that had previously leaked out in a blog post by JL Gray.

Does anyone out there know what’s really in this release? It would be good to hear directly from the vendors on this.

How about OVM on VCS? Has anybody been able to get that working?

harry the ASIC guy

Birth of an EDA Revolution

Friday, September 5th, 2008

I can’t sleep at night.

This Idea has been bouncing around in my head for the past few months. I can’t shake it. If you know me, then you’ve probably heard me talk about the Idea, ask your opinion of it, or ask whether I’m crazy. I’ve been itching to blog about this Idea, but haven’t been able to figure out the right way to approach it.

Then, the other day, Ron Ploof gave me a way to approach the Idea in my blog. Please read Ron’s post on the Birth of a New Media Revolution first before continuing. It’s damn good, you’ll get something out of it, and it gives context to this post.

OK … done? Good.

Ron’s main point is that a revolution can’t happen until all the enabling pieces are in place. New media required easy-to-use publishing tools, simple syndication (i.e. media distribution), and low-cost bandwidth.  Once those were in place, new media hit the tipping point.

Well, I’m going to go out on a limb today with a prediction:

The pieces are coming together for a revolution in EDA. Like most revolutions, it is starting small, hardly noticed by the big guys on the block. In the next 5 years, it will change our industry forever by leveling the playing field, allowing smaller EDA companies to compete with larger ones, giving customers greater flexibility on how and when they access tools and which vendor’s tools they use.

It’s going to happen.  And just as with new media, there are three barriers that will need to come down before we hit that tipping point.  They are:

  1. The high cost of sales, marketing, and support.
  2. Licensing models that lock in customers.
  3. Lack of comprehensive standards for tool interoperability.

If you’ve been staring at the EDA horizon like I have, you’ve already seen that all of these barriers are starting to come down:

  1. A week ago, a company called Xuropa launched an online tradeshow platform that could greatly reduce the cost of sales for EDA companies and give designers greater access to tools for evaluation.
  2. For several years now, Cadence has provided access to short-term licenses through their eDACard model and Synopsys will introduce a similar offering before the end of the year. Cadence also provides a service through their consulting organization called “hosted VCAD” whereby customers can access software and hardware on a Software-as-a-Service basis. How long before the other vendors follow?
  3. As Karen Bartleson noted on her blog yesterday, the EDA industry has moved into an “Age of Responsiveness” with regard to tool interoperability, where tools are expected to be open and inclusive. As witnessed in the latest OVM / VMM standards war, open standards are now the price of admission, and “woe be to those” who do not heed this call.

I’m a realist. This EDA revolution is just beginning and will take some time.  It won’t happen without a fight from those who stand to lose out. But I believe that the revolution is inexorable.  And the sooner the EDA companies learn to swim with the tide, the better off they will be after the revolution.

There’s a lot more that I need to say before I can sleep at night, but too much for one post.  Stay tuned.

harry the ASIC guy

Synopsys Calls, Mentor Raises

Thursday, July 24th, 2008

Not to be outdone, but with much less fanfare and ballyhoo than Synopsys’ donation of its Verification Methodology Manual (VMM) class library to the Accellera Verification IP (VIP) Technical Subcommittee, Mentor Graphics last week donated its Unified Coverage Database (UCDB) to the Accellera Unified Coverage Interoperability Standard (UCIS) Technical Subcommittee.

Although not as hot a topic in the press and the blogosphere, this represents a firm step forward in the standardization of the overall coverage-driven verification methodology, whether you pray from the OVM or the VMM hymnal. Whereas ratified or de facto standards already exist for the testbench languages, the requirements and coverage capture tools and formats are still proprietary to each of the 3 major vendors. This prevents the verification management tools of one vendor from being used with another vendor’s simulator. Having a UCDB standard will facilitate portability and enable more innovative solutions to be built by third parties on top of it.

Although Synopsys and Cadence each have their own coverage database formats, the basic elements of this standard should be much easier to agree upon without the political wrangling that is slowing the VIP subcommittee. I also think this is an opportunity for Synopsys, Mentor, and Cadence to show that they really can cooperate for the benefit of their customers and win back some of the goodwill lost in the OVM vs. VMM battle.

harry the ASIC guy

Breaking News … Accellera Verification Working Group Forming

Thursday, April 24th, 2008

On her Standards Game Blog today, Karen Bartleson announced that Accellera is forming a subcommittee to define a standard for verification interoperability. That is, to try to settle the VMM / OVM war. As I have stated before in comments on JL Gray’s Cool Verification Blog, this is the right move because it gives us input into the process, rather than just the EDA vendors controlling the process for their own benefit. Also, as I argued in a previous post entitled “The Revolution Will Not Be Televised”, the influence and pressure of the verification community, and especially the Cool Verification Blog, were at least in part responsible.

Of course, Synopsys will tell you that they are just doing the right thing :-)

It’s not clear how Cadence and Mentor will respond.  Hopefully they’ll join the effort.  Let’s keep the pressure on.

The Revolution Will Not Be Televised!!!

Thursday, April 3rd, 2008

My friend Ron has a knack for recognizing revolutionary technologies before most of us. He was one of the first to appreciate the power of the browser and how it would transform the internet, previously used only by engineers and scientists. He was one of the first and best podcasters. And now he’s become a self-proclaimed New Media Evangelist, preaching the good news of Web 2.0 and making it accessible to “the rest of us”.

Most of us are familiar with mainstream Web 2.0 applications, whether we use them, our friends use them, or our kids use them. Social and professional networks such as MySpace, Facebook, and LinkedIn. Podcasts in iTunes. Blogging sites on every topic. Virtual worlds such as Second Life. Collaboration tools such as Wikipedia. File sharing sites such as YouTube and Flickr. Social bookmarking sites such as Digg and Technorati. Open source publishing tools such as WordPress and Joomla. Using these technologies we’re having conversations, collaborating, and getting smarter in ways that were unimaginable just 5 years ago. Imagine: a rock climber in Oregon can share climbing techniques with a fellow climber in Alice Springs. And mostly for free, save for the cost of the internet connection.

When we think of Web 2.0, we tend to think of teenagers and young adults. But this technology was invented by us geeks, so it’s no surprise that the ASIC design world is also getting on board. Here are some examples from the ASIC design industry:

Social media is networking ASIC designers with one another, enabling us to get smarter faster. But that’s not all. Many forward-looking companies have recognized the opportunity to talk to their customers directly. About 6 months ago, Synopsys launched several blogs on its microsite. Xilinx also has a User Community and a blog. It’s great that this is happening, but does it really make much of a difference? Consider what I believe could be a watershed event:

A few months ago, JL Gray published a post on his Cool Verification blog entitled The Brewing Standards War - Verification Methodology. As expected, verification engineers chimed in and expressed their ardent opinions and viewpoints. What came next was not expected … stakeholders from Synopsys and Mentor joined the conversation. The chief VMM developer from Synopsys, Janick Bergeron, put forth information to refute certain statements that he felt were erroneous. A marketing manager from Mentor, Dennis Brophy, offered his views on why OVM was open and VMM was not. And Karen Bartleson, who participates in several standards committees for Synopsys, disclosed Synopsys’ plan to encourage a single standard by donating VMM to Accellera.

From what I’ve heard, this was one of the most viewed ASIC-related blog posts ever (JL: do you have any stats you can share?). But did it make a difference in changing the behavior of any of the protagonists? I think it did, and here is why:

  • This week at the Synopsys Users Group meeting in San Jose, the VMM / OVM issues were the main topic of questioning for CEO Aart de Geus after his keynote address. And the questions picked up where they left off in the blog post… Will VMM ever be open and not just licensed? Is Synopsys trying to talk to Mentor and Cadence directly? If we have access to VMM, can we run it on other simulators besides VCS?
  • Speaking to several Synopsoids afterwards, I discovered that the verification marketing manager referenced this particular Cool Verification blog posting in an email to an internal Synopsys verification mailing list. It seems he approved of some of the comments and wanted to make others in Synopsys aware of these customer views. Evidently he sees these opinions as valuable and valid. Good for him.
  • Speaking to some at Synopsys who have a say in the future of VMM, I believe that Synopsys’ decision to donate VMM to Accellera has been influenced and pressured, at least in part, by the opinions expressed in the blog posting and the subsequent comments. Good for us.

I’d like to believe that the EDA companies and other suppliers are coming to recognize what mainstream companies have recognized … that the battle for customers is decreasingly being fought with advertisements, press releases, glossy brochures, and animated PowerPoint product pitches. Instead, as my friend Ron has pointed out, I am able to talk to “passionate content creators who know more about designing chips than any reporter could ever learn”, and find out what they think. Consider these paraphrased excerpts from the cluetrain manifesto: the end of business as usual:

  • The Internet is enabling conversations among human beings that were simply not possible in the era of mass media. As a result, markets are getting smarter, more informed, more organized.
  • People in networked markets have figured out that they get far better information and support from one another than from vendors.
  • There are no secrets. The networked market knows more than companies do about their own products. And whether the news is good or bad, they tell everyone.
  • Companies that don’t realize their markets are now networked person-to-person, getting smarter as a result and deeply joined in conversation are missing their best opportunity.
  • Companies can now communicate with their markets directly. If they blow it, it could be their last chance.

In short, this ASIC revolution will not be televised!!!

harry the ASIC guy