Posts Tagged ‘Cadence’

Mentor Is Listening

Thursday, June 11th, 2009

My morning routine is pretty, well, routine.

Get up.  Wake the kids.

Check email.  Ask the kids to stop jumping on the couch.

Check Twitter. Tell the kids again to stop jumping on the couch.

Check my Google Reader. Glare at the kids with that “I’ve asked you for the last time” look.

You get the idea.

This Wednesday morning, somewhere in between conversations with my kids, walking the dog, and getting ready for work, I came across the following comment on a friend’s blog:

Ron, we are listening.

http://www.mentor.com/blogs

Ron Fuller
Web Manager, Mentor Graphics

For background, Ron Ploof is the guy who got the crazy idea almost 3 years ago that Synopsys should be doing something in this new world called social media. (Actually, I don’t think the term “social media” had even been coined back then). He evangelized this belief to the VP of Marketing at Synopsys and created for himself a job as Synopsys’ “New Media Evangelist” (actual title on his business card). He launched Synopsys’ first foray into social media, including podcasts, videos, and most prominently, blogs.

Synopsys’ success motivated Cadence to follow suit (something confided to me by Cadence’s former community manager). And it seems, according to the comment on Ron’s blog, it also motivated Mentor’s move into social media.

__________

I wanted to find out more about the Mentor blogs, so I set up some time to talk over lunch with Sonia Harrison at Mentor (see her sing at the Denali DAC party). Sonia had helped me set up my previous interview with Paul Hofstadler and had extended me an invitation to attend the Mentor User2User conference (which, unfortunately, I could not attend). As it turns out, Sonia was absolutely the right person to talk to.

Even though I had only just become aware of the Mentor blogs, Mentor had evidently coordinated their launch with the launch of its new website several months earlier. Sonia was quite humble, but it seems that she was the driving force behind the blogs and Mentor’s presence in other social media like Twitter. She had been watching what was going on for some time, hesitant to jump in without a good plan, and decided that now was the time.

According to Sonia, Mentor’s motivation for doing the blogs was to extend their “thought leadership” in the industry into a new medium, to draw customers to their website, and to exchange information with customers. Interestingly, Mentor did not hire an outside social media consultant or community manager like Cadence had. Rather, the project was homegrown. Sonia recruited various technical experts and others as bloggers. She developed “common sense” social media guidelines to make sure bloggers were informed of and played by social media rules (e.g. no sensitive or proprietary information, be polite, respect copyrights, give attribution).

According to Sonia, “one of the more difficult things was to get people to commit to blogging regularly. Writing takes time, it’s almost a full time job.” Despite this additional work burden, Mentor has no plans to bring in professional journalists as bloggers, as Cadence did with Richard Goering. And it doesn’t seem they need to. Simon Favre received a blog of the week award from System Level Design a few weeks ago, so they are doing quite well on their own.

Sonia does not have any specific measurable goals (page views, subscribers, etc.), which I think is a mistake, especially when her upper management comes asking for evidence that these efforts are paying off. My friend Ron likes to tell me that social media is the most measurable media ever and it’s a shame not to use the data.

I started playing with the site later in the afternoon and noticed a few things. First, when I added a comment to one of the blogs without registering, it did not show up right away, nor did I get a message that the comment was being moderated. It did show up later in the day, but it would be nice to at least be told that it was “awaiting moderation”. Better still, why moderate or require registration at all? The likelihood of getting inappropriate comments from engineering professionals is very low, and they can always be removed if need be. Moderation will also stop a hot topic in its tracks. I’ve personally had the experience of publishing a new blog post late at night and waking up to several comments, some addressing other comments. Had I moderated the blog, none of those comments would even have shown up until later in the day.

Second, there was no way to enter a URL or blog address when leaving a comment. It is pretty standard practice to have this feature to allow readers to “check out” the person leaving the comment. Hopefully they can add this.

On the positive side, the most important feature of a blog is the content and the content looks very good, especially the PCB blogs. Also, there is apparently no internal review or censorship of blog posts, so bloggers have the freedom to write whatever they want, within the social media guidelines of course.

 __________

It’s been almost 3 years since Ron made his first pitch to his manager. Who would have thought that the Big 3 and many others would adopt social media in such a short time? Meanwhile, my kids are still jumping on the couch.

GTG

harry the ASIC guy

Thoughts On Synopsys’ Q2 2009 Earnings Call

Thursday, May 21st, 2009

Last night you may have watched the NBA Playoff game in which the Orlando Magic came back to defeat the heavily favored Cleveland Cavaliers. Great game!!!

Or the finale of American Idol in which Kris Allen came back to defeat the heavily favored Adam Lambert. Great show!!!

What did I do last night? I listened to the Q2 2009 Synopsys earnings call. Great conference call!!!

(OK … I’ll admit it wasn’t as exciting and nail-biting as either of the other viewing options. Just think of it like this: I took on the work of listening to the call and summarizing it for you, in order to free you up to watch the game or Idol. You can thank me later :-) )

Here’s the summary. (You can read the full transcript here if you like).

Financials

On the up side, Synopsys had a good Q2, beating their revenue and earnings per share guidance slightly. On the down side, Synopsys lowered its revenue and cash flow guidance slightly for the rest of the year, allowing for potential customer bankruptcies, late payments, and reduced bookings. Customers are approaching Synopsys to “help them right now through this downturn”, i.e. to reduce their cost of software. It looks like the recession is finally catching up to them.

As I finish off this post on Thursday morning, it looks like the analysts agree. Synopsys shares are down 10%, so it seems they are getting punished for revising their forecast. 

Still, Synopsys is in very good financial health, with $877M in cash and short term investments. Their cash flow is going to go down the rest of the year, so they will eat into this fund, but they will still have plenty to selectively acquire strong technology that might add to their portfolio, as they did with the MIPS Analog Business Group.

Themes

There were 2 themes or phrases that kept recurring in the call that I am sure were points of emphasis for Aart.

First, the word “momentum” was used 6 times (by my count) during the call. Technology momentum. Customer momentum. Momentum in the company. Clearly, Synopsys is trying to portray an image of the company building up steam while the rest of the industry wallows in the recession.

Second, customers are “de-risking their supplier relationships”, i.e. looking to consolidate with an EDA vendor with strong financials who’ll still be there when the recession ends. Again, Synopsys is trying to portray itself as the safe choice for customers, hoping to woo customers away from less financially secure competitors like Cadence and Magma. This ties in with the flurry of “primary EDA vendor” relationships that Synopsys has announced recently.

The opportunity for Synopsys (and danger for the competition) is to pick up market share during this downturn, and it looks like that may be happening as companies “de-risk” by going with the company with the “momentum” and an “extraordinarily strong position”. Or at least that’s the message that Synopsys is sending.

Technology

Aart did rattle off the usual laundry list of technology that he wanted to highlight, including some introduced last year (e.g. Z-route). Of note were the following:

  • Multi-core technology in VCS with 2x speedup (is 2x a lot?)
  • Custom Designer, which Aart called “a viable alternative to the incumbent” (ya know marketing didn’t pick the word “viable”)
  • Analog IP via the MIPS Analog Business Group acquisition, especially highlighting how that complements the Custom Designer product (do I see “design kits” in the future?)
  • The Lynx Design System (see my 5-part series)
  • IC-Validator (smells like DRC fixing in IC Compiler - Webinar today, I’ll find out more)

__________

In summary, Synopsys had a good quarter, but they have finally acknowledged that they are not immune to the downturn and they expect to get impacted the next few quarters.

harry the ASIC guy

TSMC Challenges Lynx With Flow Of Their Own

Wednesday, May 6th, 2009

About a month and a half ago, I wrote a 5-part series of blog posts on the newly introduced Lynx Design System from Synopsys:

One key feature, the inclusion of pre-qualified technology and node specific libraries in the flow, was something I had pushed for when I was previously involved with Lynx (then called Pilot). These libraries would have made Lynx into a complete out-of-the-box foundry and node specific design kit … no technology specific worries. Indeed, everyone thought it was a good idea, and it would have happened had it not been for resistance from the foundries that were approached. Alas!

In the months before the announcement of Lynx, I heard that Synopsys had finally cracked that nut and that foundry libraries would be part of Lynx after all. Whilst speaking to Synopsys about Lynx in preparation for my posts, I asked whether this was the case. Given my expectations, I was rather surprised when I was told that no foundry libraries would be included as part of Lynx or as an option.

The explanation was that it proved too difficult to handle the many options that customers used. High Vt and low Vt. Regular and low power process. IO and RAM libraries from multiple vendors like ARM and Virage. Indeed, this was a very reasonable explanation to me since my experience was that all chips used some special libraries along the way. How could one QA a set of libraries for all the combinations? So, I left it at that. Besides, Synopsys offered a script that would build the Lynx node from the DesignWare TSMC Foundry Libraries.

Two weeks ago, at the TSMC Technology Symposium in San Jose, TSMC announced their own Integrated Sign-off Flow that competes with the Lynx flow, this one including their libraries. Now it seems to make sense. TSMC may have backed out of providing libraries to Synopsys to use with Lynx because they were cooking up a flow offering of their own. I don’t know this to be a fact, but I think it’s a reasonable explanation.

So, besides the libraries, how does the TSMC flow compare to the Synopsys Lynx flow? I’m glad you asked. Here are the salient details of the TSMC offering:

  • Complete RTL to GDSII flow much like Lynx
  • Node and process specific optimizations
  • Uses multiple EDA vendors’ tools (Synopsys mostly, but also Cadence, Mentor, and Azuro)
  • Available only for TSMC 65nm process node (at this time)
  • No cost (at least to early adopters … the press release is unclear whether TSMC will charge in the future)
  • And of course, libraries are included.

In comparison to Synopsys’ Lynx Design System, there were some notable features missing from the announcement:

  • No mention of anything like a Management Cockpit or Runtime Manager
  • No mention of how this was going to be supported
  • No mention of any chips or customers that have been through the flow

To be fair, just because these were not mentioned does not mean that they are really missing. I have not seen a demo of the flow or spoken to TSMC (you know how to reach me), and that would help a lot in evaluating how this compares to Lynx. Still, from what I know, I’d like to give you my initial assessment of the strengths of these offerings.

TSMC Integrated Signoff Flow

  • The flow includes EDA tools from multiple vendors. There is an assumption that TSMC has created a best-of-breed flow by picking the best tool for each step in the flow and making all the tools work together. Synopsys will claim that their tools are all best-of-breed and that other tools can be easily integrated. But TSMC’s flow comes that way with no additional work required. (Of course, you still need to go buy those other tools.)
  • Integrated libraries, as I’ve described above. Unfortunately, if you are using any 3rd-party libraries, it seems you’ll need to integrate them yourself.
  • Node and process specific optimizations should provide an extra boost in quality of results.
  • Free (at least for now)

Synopsys Lynx Design System

  • You can use the flow with any foundry or technology node. A big advantage unless you are set on TSMC 65nm (which a lot of people are).
  • Other libraries and tools are, I would think, easier to integrate into the flow. It’s not clear whether TSMC even supports hacking the flow for other nodes.
  • Support from the Synopsys field and support center. Recall, this is now a full-fledged product. Presumably, the price customers pay for Lynx will fund the support costs. If there is no cost for the TSMC flow, how will they fund supporting it? Perhaps they will take on the cost to get the silicon business, but that’s a business decision I am not privy to. And don’t underestimate the support effort. This is much like a flow that ASIC vendors (TI, Motorola/Freescale, LSI Logic), not foundries, would have offered. They had whole teams developing and QA’ing their flows. And then they would be tied to a specific set of tool releases and frozen.
  • Runtime Manager and Management Cockpit. Nice to have features.
  • Been used to create real chips before. As I said, the core flow in Lynx dates back almost 10 years and has been updated continuously. It’s not clear what the genesis of the new TSMC flow is. Is it a derivative of the TSMC reference flows? Is it something that has been used to create chips? Again, I don’t know, but I’ve got to give Synopsys the nod in terms of being “production proven”.

So, what do I recommend? Well, if you are not going to TSMC 65nm with TSMC standard cell libraries, then there is not much reason to look at the TSMC flow. However, if you are using the technology that TSMC currently supports, the appeal of a turnkey, optimized, and FREE flow is pretty strong. I’d at least do my due diligence and look at the TSMC flow. It might help you get better pricing from TSMC.

If anyone out there has actually seen or touched the TSMC flow, please add a comment below. Everyone would love to know what you think first hand.

harry the ASIC guy

EDA Merger Poll - What’d Be The Best Merger

Friday, May 1st, 2009

Rumors are flying concerning some big changes next week in EDA amongst the big players. It all got started by John Blyler on Twitter. Then Magma stock took off this week for no apparent reason. And rumors of a Cadence-Magma merger have been flying around for about a month, ever since Rajeev denied them.

Something may happen or nothing may happen. But it’s always fun to speculate. So, what do you think would be the best merger of the top 4 EDA companies?

Vote here or feel free to leave your comments below. We’ll see who, if anyone, is right :-)

harry the ASIC guy

What To Do With 1000 CPUs - The Answers

Wednesday, April 15th, 2009

I recall taking a course called The Counselor Salesperson when I was an AE at Synopsys. The course was very popular across the industry and was the basis for the book Win-Win Selling. It advocated a consultative approach to sales, one in which the salesperson tries to understand the customer’s problem first and provide the solution he needs second. Sounds obvious, but how often do you encounter a salesperson who knows he has what you need and then tries to convince you that you have a problem?

One of the techniques in the process is called the “Magic Wand” wherein the salesperson asks the customer “What would it be like if …”. This open-ended type of question is designed to free the customer’s mind to imagine solutions that he’d otherwise not consider due to real or imagined constraints. That’s the type of question I asked last week when I asked: What would you do with 1000 CPU’s? And boy did it free your minds!

Before I go into the responses, you may be wondering what my point was in asking the question in the first place. Well, not so surprisingly, I’m looking to understand better the possible applications of cloud computing to EDA and ASIC design. If a designer, design team, or company can affordably access a large number of CPUs for a short period of time, as needed, what would that mean? What would they be able to do with this magic wand that they would not even have thought of otherwise?

I received 8 separate responses, some of them dripping with humor, sarcasm, and even disdain. Good stuff! I’ve looked them over and noticed that they seem to fall into 4 groups, each of which highlights a different aspect or issue of this question.

“Rent Them Out”

Gabe Moretti had the best response along these lines: “(I’d) heat my house and pool while selling time to shivering engineers”. Jeremy Ralph of PDTi put some dollar value on the proposition, calculating that he could make $8.25M per month sub-licensing the licenses and CPUs. And Guarav Jalan pointed out that I’d also need to provide bandwidth to support this “pay-as-you-use” batch farm.

The opportunity is to aggregate users together to share hardware and software resources. If I buy a large quantity of hardware and software on a long-term basis at discounted rates, then I can rent it out on a shorter-term basis at higher rates and make money. The EDA company wins because they get a big sale at a low cost-of-sales. The customers win because they get access to tools on a pay-as-you-go basis at lower cost without a long-term commitment. And I win because I get to pocket the difference for taking the risk.
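
To make the arbitrage concrete, here’s the shape of the math in a few lines of Tcl (tclsh). All the rates below are hypothetical placeholders, not anyone’s actual pricing:

  # Hypothetical placeholder rates -- not anyone's actual pricing
  set yearly_cost 120000     ;# long-term discounted hardware+license cost, per seat-year
  set hourly_rate 25         ;# short-term rental rate, per seat-hour
  set hours_rented 8000      ;# seat-hours I actually manage to rent out in a year
  puts [expr {$hourly_rate * $hours_rented - $yearly_cost}]    ;# 80000 profit, if utilization holds

The whole model hinges on that utilization number, which is exactly the risk I’d be pocketing the difference for.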

“Philanthropy”

One of the reasons that Karen Bartleson and I get along so well is that we’ve both been around the EDA industry for some time (we’ll leave it at that). As a result, we not only feel connected to the industry, but also feel some sense of responsibility to give back. Karen would train university students on designing SoCs. I’d train displaced workers on tools that can help them find a new job.

Even though this is not really a business model, I think it is still something that the EDA vendors should consider. Mentor is already very active in promoting its Displaced Worker Program. Autodesk and SolidWorks are giving away free licenses to the unemployed. This type of program should be universal. Using cloud computing resources is an easy way to make it happen without investing in lots of hardware.

(On a side note: PLEASE, PLEASE encourage anyone you know at Synopsys and Cadence to follow Mentor’s lead. Synopsys did this in 2001 and Cadence once had a “Retool-To-Work” program that was similar. I truly believe that both companies have the same sense of corporate responsibility as Mentor, but for some reason they have not felt the urgency of the current situation. I am personally going to issue a daily challenge on Twitter to Synopsys and Cadence to follow suit until it happens. Please Retweet.)

“Do Nothing”

John Eaton pointed out that it is very difficult to use any additional capability offered as “pumpkinware” if you know it will evaporate within a month. It would take that long to set up a way to use it. And John McGehee stated that his client already has all the “beer, wine, and sangria” they can drink (New Yorkers - do you remember Beefsteak Charlie’s?), so he’d pass. John: Can you hook me up with your client :-) ?

Seriously, it certainly requires some planning to take advantage of this type of horsepower. You don’t just fire off more simulations or synthesis runs or place and route jobs without a plan. For design teams that might have access to this type of capability, it’s important to figure out ahead of time how you will use it and for how long you will need it. If you will be running more sims, which sims will they be? How will you randomize them? How will you target them at the riskiest parts of the design?

“Run Lots of Experiments”

Which brings us to Jeremy Ralph’s 2nd response. This one wins the prize as best response because it was well thought out and also addressed the intention of the magic wand question: what problem could you solve that you otherwise could not have solved? Jeremy would use the resources to explore many different candidate architectures for his IP (aka chiplet) and select the best one.

One of the key benefits of the cloud is that anyone can have affordable access to 1000 CPUs if they want it. If that is the case, what sorts of new approaches could be implemented by the EDA tools in addressing design challenges? Could we implement place and route on 1000 CPUs and have it finish in an hour on a 100M gate design? Could we partition formal verification problems into smaller problems and solve what was formerly unsolvable? Could we run lots more simulations to find the one key bug that will kill our chip? The cloud opens up a whole new set of possibilities.

__________

I’ve learned a lot from your responses. Some were expected and some were not. That’s what’s fun about doing this type of research … finding the unexpected. I’ll definitely give it some thought.

harry the ASIC guy

The Missing Lynx - The ASIC Cloud

Friday, April 3rd, 2009

My last blog post, entitled The Weakest Lynx, got a lot of attention from the Synopsys Lynx CAEs and Synopsys marketing. Please go see the comments on that post for a response from Chris Smith, the lead support person for Lynx at Synopsys. Meanwhile, the final part of this series … The Missing Lynx.

About 7 months ago, I wrote a blog post entitled Birth of an EDA Revolution in which I first shared my growing excitement over the potential for cloud computing and Software-as-a-Service (SaaS) to transform EDA. About a week later, Cadence announced a SaaS offering that provides their reference flows, their software, and their hardware for rent to projects on a short-term basis. About a week after that, I wrote a third post on this topic, asking WWSD (what will Synopsys do) in response to Cadence.

In that last post, I wrote the following:

Synopsys could probably go one better and offer a superior solution if it wanted to, combining their DesignSphere infrastructure and Pilot Design Environment. In fact, they have done this for select customers already, but not as a standard offering. There is some legwork that they’d need to do, but the real barrier is Synopsys itself. They’ve got to decide to go after this market and put together a standard offering like Cadence has … And while they are at it, if they host it on a secure cloud to make it universally accessible and scalable, and if they offer on-demand licensing, and if they make it truly open by allowing third party tools to plug into their flow, they can own the high ground in the upcoming revolution.

Although I wrote this over 6 months ago, I don’t think I could have written it better today. The only difference is that Pilot has now become Lynx. “The ASIC Cloud”, as I call it, would look something like this:

The ASIC Cloud

As I envision it, Synopsys Lynx will be the heart of The ASIC Cloud and will serve to provide the overall production design flow. The Runtime Manager will manage the resources, including provisioning of additional hardware (CPU and storage) and licenses, as needed. The Management Cockpit will provide real-time statistics on resource utilization so the number of CPUs and licenses can be scaled on-the-go. Since The ASIC Cloud is accessible through any web browser, this virtual design center is available to large corporate customers and to smaller startups and consultants. It can even be used from portable devices such as netbooks and smartphones.

If you think I’m insane, you may be right, I may be crazy. But it just might be a lunatic you’re looking for. To show you that this whole cloud computing thing is not just my fever (I have been sick this past week), take a look at what this one guy in Greece did with Xilinx tools. He basically pays < $1 per hour to access hardware to run Xilinx synthesis tools on the Amazon Elastic Compute Cloud. Now, this is nothing like running an entire RTL2GDSII design flow, but he IS running EDA tools on the cloud, taking advantage of pay-as-you-go CPU and storage resources, and taking advantage of multiple processors to speed up his turnaround time. The ASIC Cloud will be similar and on a much greater scale.

It may take some time for Synopsys to warm up to this idea, especially since it is a whole new business model for licensing software. But for a certain class of customers (startups, design services providers) it has definite immediate benefits. And many of these customers are also potential Lynx customers.

So, Synopsys, if you want to talk, you know where to find me.

__________

That wraps up my 5-part series on Synopsys Lynx. If you want to find the other 4 parts, here they are:

Part 1 - Synopsys Lynx Design System Debuts at SNUG

Part 2 - Lynx Design System? - It’s The Flow, Stupid!

Part 3 - Strongest Lynx

Part 4 - The Weakest Lynx

harry the ASIC guy

set_max_area 0

Friday, March 13th, 2009

I stopped by a lunchtime presentation yesterday given by the local Synopsys AC. He was updating my client on what was new in Design Compiler and other tools when he put up a slide that said something like this:

set_max_area 0 (now default setting)

For those who don’t know what this means, it tells the synthesis engine to try to make the design area as small as possible, which is obviously a desirable goal. Why would anyone ever want to set their area goal higher? If you’ve used Design Compiler before, you know that this has been somewhat of a running joke, a command that was in each and every synthesis script ever written, as follows:

set_max_area 0
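
For the uninitiated, here’s a minimal sketch of the kind of dc_shell Tcl script this line ritually appeared in. It’s illustrative only: the design name, file name, and clock period are hypothetical, not from any real project.

  # Hypothetical minimal Design Compiler (dc_shell-t) script
  read_verilog my_design.v                    ;# read the RTL (hypothetical file)
  current_design my_design
  create_clock -period 5.0 [get_ports clk]    ;# 5 ns clock, assuming ns time units
  set_max_area 0                              ;# the ritual line: as small as possible, please
  compile                                     ;# synthesize
  report_area                                 ;# see how close to "0" we got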

So it got me thinking. Were there any other artifacts of a bygone EDA era still hanging around, like a running joke, that had served their purpose and needed to be put to rest? Of course there were, or I would not be writing this post. Here are 3:

1) Perpetual licenses. As Paul McClellan points out on his excellent EDA Graffiti blog, in the early days of EDA “the business model was the same business model as most hardware was sold: you bought the hardware, digitizers, screens and so on… And you paid an annual maintenance contract for them to keep it all running which was about 15-20% of the hardware cost per year.” EDA companies loved perpetual licenses for 2 reasons.

  1. They got to recognize all the revenue for the purchase at the time of the sale, so they were able to show better numbers on the books quicker.
  2. Once you “bought” the software, you only paid a small fee each year for maintenance. If you wanted to switch to another competitor’s tool, you’d need to pay that up-front perpetual license cost again, which was a real disincentive to switch. Basically, they could lock you in (see the back-of-envelope sketch after this list).
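
Here’s that lock-in as a quick back-of-envelope sketch in Tcl (tclsh). The dollar figures are purely hypothetical, not actual EDA pricing:

  # Hypothetical figures only -- not real EDA pricing
  set perpetual 100000    ;# up-front perpetual license
  set maintenance 15000   ;# ~15% of purchase price, per year
  set years 5
  # Staying put: pay the license once, plus maintenance every year
  puts [expr {$perpetual + $maintenance * $years}]                 ;# 175000
  # Switching vendors: everything above, plus a new up-front license
  puts [expr {$perpetual + $maintenance * $years + $perpetual}]    ;# 275000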

Even though most EDA companies have gone to a subscription license model, some still license their software predominantly as perpetual. With the advent of short-term licensing like Cadence’s eDaCard and Synopsys’ e-licensing, the perpetual model is as outdated as Sarah Palin.

2) Large direct sales teams. I need to be really careful here, because I worked in various customer-facing roles at Synopsys for almost 15 years and I still have several friends who work in direct sales at various EDA companies. Many of them are very skilled and I don’t want to cause them to lose their jobs. But the fact is that all of us rely on “the Web” to get information on all things, including EDA tools, much more than we rely on salespeople. I’m part of the older generation (although I don’t feel or act that way), but the newer generation of customers views the internet in all its forms (static web pages, social networks, blogs, podcasts, twitter) like the air that they breathe. They can’t live without it. And if you think they are going to want to have to schedule a visit from a “salesperson” to get access to a tool they are interested in, then you don’t have a clue about what these people expect. They expect to go to a web page, log in (maybe), and be off and running. And if your tool ain’t accessible that way, sorry, they’re not interested. Of course, that sounds shortsighted, but that’s the way it is and will be, like it or not.

This does not mean that direct sales has no use or value. After all, a company is writing the check, not an individual with a credit card. And sophisticated customers will still (for now) want to install software and use it, so they will still need support. So there will still be a need for some direct sales and support, but much of the early stages of the sales process will move to the Web.

3) Closed tool suites and solutions. As I stated in a previous post, most EDA companies seek to fence customers in rather than provide streams to nourish them. With all due regard to folks like Karen Bartleson and Dennis Brophy, who have unselfishly worked to promote standards, we fall far short of the goals of the CAD Framework Initiative, which sought to enable true plug-n-play interoperability between EDA tools. It’s definitely getting better, due mostly to customer pressure. But we still have a long way to go before we have truly standard standards that enable collaboration between EDA suppliers. So, if you’re an EDA company, get with the standards.

That’s just 3. I’m sure there are more. Let me know if you come up with others.

harry the ASIC guy

Setting The Record Straight

Thursday, February 19th, 2009

Since conducting the Verification Methodology Poll and publishing the raw results last week, I’ve been planning to follow up with a post that digs a little deeper into the numbers. Things have gotten rather busy in the meantime, both at work and with organizing the SaaS and Cloud Computing EDA Roundtable for next week at DVCon. So I’ve let it slip a little.

Well, I noticed today that the verification methodology poll was referenced in a Cadence blog post by Adam Sherer. The results were somewhat misinterpreted (in my opinion), and that kicked my butt into gear to post my own interpretations and set the record straight. Says Adam:

According to the poll conducted by Harry Gries in his Harry the ASIC Guy blog, you should go “all in” on the OVM because it is the 2:1 favorite.

In fact, the raw results had VMM with 80 users and OVM with 125 users, a ratio of just over 1.5:1 (1.5625 to be exact). So the 2:1 ratio is not accurate. However, if you add in RVM/Vera users to the VMM numbers, and then add in AVM, eRM, and e users to the OVM numbers, that ratio is more like 1.8:1. Closer, but still not 2:1.
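
If you want to check the arithmetic yourself, here it is in a few lines of Tcl (tclsh), using the counts straight from the raw poll data:

  # Counts from the raw poll data
  set vmm 80
  set ovm 125
  puts [format "OVM : VMM = %.4f : 1" [expr {double($ovm) / $vmm}]]    ;# 1.5625 : 1
  # Grouping by camp: VMM + RVM + Vera = 85, OVM + AVM + eRM + e = 150
  puts [format "camp ratio = %.2f : 1" [expr {150.0 / 85}]]            ;# 1.76 : 1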

The post also indicates that my poll says that “you should go ‘all in’ on the OVM”. I never said that, nor does the poll say anything about what you “should do”. The data simply captures what people are planning on using next. If you are inclined to follow the majority, then perhaps OVM is the way to go. By contrast, there is nothing in the poll comparing the technical merits of the various methodologies. So, if you are inclined to make up your own mind, then you have some work to do and my poll won’t help you with that. You’re probably better off visiting JL Gray at Cool Verification.

No poll is perfect, and it will be interesting to compare these results to the DVCon and John Cooley polls to see if they are consistent. Here are a few other interesting stats that I pulled out of the poll results:

  • 91% of respondents are using some sort of SystemVerilog methodology
  • 10% are using both OVM and VMM (although I suspect many of these are consultants)
  • 27% are still using e or Vera (more e than Vera)
  • 4% are using ONLY VHDL or Verilog (this number may be low due to the skew of respondents towards advanced methodologies)

Again, I welcome you to download the raw data, which you can find in PDF format and as an Excel workbook, and draw your own conclusions.

harry the ASIC guy

SaaS & Cloud Computing EDA Roundtable @ DVCon

Tuesday, February 17th, 2009

I’ve been writing about Software-as-a-Service (SaaS) and Cloud Computing as they relate to EDA for some time now. Then back in January I made a New Year’s resolution to organize a SaaS EDA roundtable at the 2009 Design and Verification Conference (DVCon). About a month ago I asked for volunteers and several of you stepped up to help. Now, just a week before DVCon, I’d like to formally announce the event.

The SaaS and Cloud Computing Roundtable will be held from 6:30 - 8:00 pm on Wed Feb 25th in the Monterey/Carmel rooms at the San Jose Doubletree Hotel. This is immediately following the DVCon reception down the hall, so grab a drink and a bite and then wander on over.

SaaS and Cloud Computing are 2 of the hottest trends in the Information Technology and software industries. Some EDA companies have already put their toes in the water. This roundtable will explore the following question: Are they trailblazing the future of the industry or are they chasing an empty fad?

The format will consist of 5 brief (< 10 minute) presentations from people representing various perspectives on SaaS and cloud computing for EDA.

This will be followed by an open, and hopefully lively, discussion.

I’m greatly looking forward to this event, especially since I get to collaborate with such a high-powered team and I have no idea what to expect. I truly believe that this could be one of the more interesting events at DVCon this year.

I hope to see many of you there.

harry the ASIC guy

 

Verification Methodology Poll Results

Wednesday, February 11th, 2009

Last week I initiated a poll of verification methodologies being used for functional verification of ASICs. Unlike other polls or surveys, this one was done in a very “open” fashion using a website that allows everyone to view the raw data. In this way, anyone can analyze the data and draw the conclusions that make sense to them, and those conclusions can be challenged and debated based on the data.

What happened next was interesting. Within 48 hours, the poll had received almost 200 responses from all over the world. It garnered the attention of the big EDA vendors, who solicited their supporters to vote. And, as a result, it became a focal point for shenanigans from over-zealous VMM and OVM fans. I had several long nights digging through the data and now I am ready to present the results.

As promised, here is the raw data in PDF format and as an Excel workbook. The only change I have made is to remove the names of the individual 249 respondents.

In summary, the results are as follows:

RAW Results from Verification Methodology Poll


(Note: The total is more than the 249 respondents because one respondent could be using more than one methodology.)

Regarding the big 3 vendors, the data shows a remarkable consistency with Gary Smith’s market share data. There are 85 respondents planning to use the Synopsys methodologies (VMM, RVM, or Vera) and 150 respondents planning to use the Mentor or Cadence methodologies (OVM, AVM, eRM, e). That represents 36% for Synopsys and 64% for Mentor/Cadence. Gary’s data shows Synopsys with 34% market share, Mentor with 35%, and Cadence with 30%.
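
For anyone who wants to check my math, the split works out like this (Tcl again, counts from the poll):

  set snps 85      ;# VMM + RVM + Vera
  set others 150   ;# OVM + AVM + eRM + e
  set total [expr {$snps + $others}]                                          ;# 235
  puts [format "Synopsys:       %.0f%%" [expr {100.0 * $snps / $total}]]      ;# 36%
  puts [format "Mentor/Cadence: %.0f%%" [expr {100.0 * $others / $total}]]    ;# 64%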

Methodology Split

Gary Smith Market Share Data


I’ll share some more insights in upcoming posts. In the meantime, please feel free to offer any insights that you have through your comments. Remember, you too have access to the raw data. This invitation includes the EDA vendors. And feel free to challenge my conclusions … but back it up with data!

harry the ASIC guy