Archive for the ‘Verification’ Category

761 Days

Tuesday, March 29th, 2011

Clouds over San Francisco

761 days.

That’s 2 years, 1 month, and 3 days.

761 days ago, I hosted a small group of interested EDA folks, journalists, and bloggers in a small room at the Doubletree hotel on one of the evenings after DVCon.

Most of the discussion that year was around OVM and VMM: which methodology was going to win out, which was really open, and which simulator supported more of the SystemVerilog language. Well, all that is put to bed. This year at DVCon, 733 days later, we all sang Kumbaya as we sat around and our hearts were warmed by the UVM campfire.

But, back to that small group that I hosted 761 days ago. Those that attended this conclave had shrugged off all the OVM and VMM hoopla and decided to come hear this strange discussion about Cloud Computing and SaaS for EDA tools. Some, no doubt, thought there was going to be free booze served, and they were certainly disappointed. Those that stayed, however, heard a fiery discussion between individuals who were either visionaries or lunatics. For many, this was the first time they had heard the term cloud computing explained, and their heads spun as they tried to imagine what, if anything, would come of it for the EDA industry.

Over the 761 days since, the voices speaking of cloud computing for EDA, once very soft, grew slowly in volume. All the reasons that it would not work were thrown about like arrows, and those objections continue. But slowly, over time, the voices in support of this model have grown to the point where the question was no longer “if” but “when”.

761 days, that’s when.

Yesterday, to the shock of many at SNUG San Jose, including many in attendance from Synopsys, Aart de Geus personally answered the question asked 761 days earlier. Indeed, those individuals gathered in that small room at the Doubletree were visionaries, not lunatics.

There are many reasons why Synopsys should not be offering its tools on the cloud via SaaS:

  • Customers will never let their precious proprietary data off-site
  • It will cannibalize longer term license sales
  • The internet connection is too slow and unreliable
  • There’s too much data to transfer
  • The cloud is not secure
  • It’s more expensive
  • It just won’t work

But, as it turns out, there are better reasons to do it:

  • Customers want it

Sure, there are some other reasons. The opportunity to increase revenue by selling higher priced short-term pay-as-you-go licenses. Taking advantage of the parallelism inherent in the cloud. Serving a new customer base that has very peaky needs.

But in the end, Aart did what he does best. He put on his future vision goggles, gazed into the future, saw that the cloud was inevitable, and decided that Synopsys should lead and not follow.

761 days. Now the race is on.

Winners and Losers

Sunday, March 6th, 2011

Washington General and Harlem Globetrotter at LAX

Engineers tend to view the world in binary. There are the good guys and the bad guys. There’s the right way and the wrong way. There are rich folks and poor folks. Democrats and Republicans. You’re with us or against us.

And there are winners and losers.

This week, working the Agnisys booth at DVCon, I got to see all these types and all the shades in between. I got to see the good guys (me, of course, and anyone who was with me) and the bad guys (the competition). I saw people doing things the right way (telling the truth, or close to the truth) and the wrong way (pure fabrications). I saw rich folks (CEOs in expensive suits and shoes) and poor folks (the guys at the hotel tearing down after the show). Most of the people from Silicon Valley were Democrats, I suppose, and many of the others were Republicans. And, of course, for the Big 3 EDA vendors, it was all about who was with them (on the EDA360 passport) or against them (everyone else).

But, when you look a little closer, you see a lot of shades in between. Personally, I knew people at almost every booth with whom I had worked before. They’re not good or bad, right or wrong, rich or poor, democrats or republicans, or with me or against me. They’re just old friends working in an industry they love on technology they are psyched about.

I actually had some foreshadowing of this as I was flying up to the conference. As I was passing through the metal detectors at LAX, I noticed some tall gentlemen dressed in green warmup suits. Realizing it was a basketball team, I curiously glanced at their logo and saw the name “Generals”. Later, I was able to get a full view of the name “Washington Generals”.

If you are not familiar, the Washington Generals are the basketball team that travels with the Harlem Globetrotters. They are perennial losers, the foil and butt of countless Globetrotter jokes. According to Wikipedia, the Generals lost over 13,000 games to the Globetrotters between 1953 and 1995, and won only 6 times. That’s a winning percentage of about 0.0005! If anyone deserves the title of “Losers”, it’s the Washington Generals.
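That 0.0005 figure is easy to sanity-check with back-of-the-envelope arithmetic (using the 6-win and roughly 13,000-loss figures cited from Wikipedia above):

```python
# Sanity check of the Generals' winning percentage,
# using the figures cited above: 6 wins, roughly 13,000 losses.
wins = 6
losses = 13_000
winning_pct = wins / (wins + losses)
print(round(winning_pct, 4))  # 0.0005
```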

As I sat waiting for my flight, I noticed some other apparent basketball players dressed in red with white and blue trim. Could it be? Yes, they were the Globetrotters, winners of those same 13,000 games that the Generals had lost. If anyone deserves the title of “Winners”, it’s the Harlem Globetrotters.

What surprised me at the time was that these eternal rivals, Winners and Losers, were traveling together, joking and laughing like best friends. Although I know that they obviously travel together and know each other, for some reason I had expected them to be separated. The good guys and the bad guys. The Winners and the Losers.

Just as the Generals and Globetrotters are rivals on the court but friends off the court, these EDA veterans were rivals at the booths at DVCon but friends in the bar afterwards. The EDA industry is kind of like a professional sports league. Sure, the teams compete with each other. But players move between teams all the time and most of the players are friends off the field. In the end, what’s most important is that the league grows and is successful.

Hopefully, going forward, EDA will be more like the NBA than a failed league.

harry the ASIC guy

The Burning Platform

Monday, March 1st, 2010

The Burning Platform

Although I was unable to attend DVCon last week, and I missed Jim Hogan and Paul McLellan presenting “So you want to start an EDA Company? Here’s how”, I was at least able to sit in on an interesting webinar offered by RTM Consulting entitled “Achieving Breakthrough Customer Satisfaction through Project Excellence”.

As you may recall, I wrote a previous blog post about a Consulting Soft Skills training curriculum developed by RTM in conjunction with Mentor Graphics for their consulting organization. Since that time, I’ve spoken on and off with RTM CEO Randy Mysliviec. During a recent conversation he made me aware of this webinar and offered one of the slots for me to attend. I figured it would be a good refresher, at a minimum, and if I came out of it with at least one new nugget or perspective, I was ahead of the game. So I accepted.

I decided to “live tweet” the seminar. That is to say, I posted tweets of anything interesting that I heard during the webinar, all using the hashtag #RTMConsulting. If you want to view the tweets from that webinar, go here.

After 15 years in the consulting biz, I certainly had learned a lot, and the webinar was indeed a good refresher on some of the basics of managing customer satisfaction. There was a lot of material packed into the 2 hours that we had, with no real breaks, so the session was very dense. The only downside is that I wish there had been some more time for discussion or questions, but that’s really a minor nit to pick.

I did get a new insight out of the webinar, and so I guess I’m ahead of the game. I had never heard of the concept of the “burning platform” before, especially as it applies to projects. The story goes that there was an oil rig in the North Sea that caught fire and was bound to be destroyed. One of the workers had to decide whether to stay on the rig or jump into the freezing waters. The fall might kill him and he’d face hypothermia within minutes if not rescued, but he decided to jump anyway, since probable death was better than certain death. According to the story, the man survived and was rescued. Happy ending.

The instructor observed that many projects are like burning platforms, destined for destruction unless radically rethought. In thinking back, I immediately thought of 2 projects I’d been involved with that turned out to be burning platforms.

The first was a situation where a design team was trying to reverse engineer an asynchronously designed processor in order to port it to another process. The motivation was that the processor (I think it was an ADSP 21 something or other) was being retired by the manufacturer and this company wanted to continue to use it nonetheless. We were called in when the project was already in trouble, significantly over budget and schedule and with no clear end in sight. After a few weeks of looking at the situation, we decided that there was no way they would ever be able to verify the timing and functionality of the ported design. We recommended that they kill this approach and start over with a standard processor core that could do the job. There was a lot of resistance, especially from the engineer whose idea it was to reverse engineer the existing processor. But, eventually the customer made the right choice and redesigned using an ARM core.

Another group at the same company also had a burning platform. They were on their 4th version of a particular chip and were still finding functional bugs. Each time they developed a test plan and executed it, there were still more bugs that they had missed. Clearly their verification methodology was outdated and insufficient, depending on directed tests and FPGA prototypes rather than more current measurable methods. We tried to convince them to use assertions, functional coverage, constrained random testing, etc. But they were convinced that they just had to fix the few known bugs and they’d be OK. From their perspective, it wasn’t worth all the time and effort to develop and execute a new plan. They never did take our recommendations and I lost track of that project. I wonder if they ever finished.

As I think about these 2 examples, I realize that “burning platform” projects have some characteristics in common. And they align with the 3 key elements of a project. To tell if you have a “burning platform” on your hands, you might ask yourself the following 3 questions:

  1. Scope - Are you spending more and more time every week managing issues and risks? Is the list growing, rather than shrinking?
  2. Schedule - Are you on a treadmill with regards to schedule? Do you update the schedule every month only to realize that the end date has moved out by a month, or more?
  3. Resources - Are the people that you respect the most trying to jump off of the project? Are people afraid to join you?

If you answered yes to at least 2 of these, then you probably have a burning platform project on your hands. It’s time to jump in the water. That is, it’s time to scrap the plan and rethink your project from a fresh perspective and come up with a new plan. Of course, this is not a very scientific way of identifying an untenable project, but I think it’s a good rule-of-thumb.
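That rule of thumb is simple enough to sketch in a few lines (a hypothetical illustration; the function name and the three yes/no inputs are mine, not from the webinar):

```python
# Hypothetical sketch of the "burning platform" rule of thumb above:
# yes to at least 2 of the 3 questions means it's time to jump.
def is_burning_platform(scope_growing: bool,
                        schedule_slipping: bool,
                        people_fleeing: bool) -> bool:
    """True if at least 2 of the 3 warning signs are present."""
    warning_signs = [scope_growing, schedule_slipping, people_fleeing]
    return sum(warning_signs) >= 2

# Example: the issues list is growing and the end date keeps moving out,
# but the team is still intact -- that's 2 of 3, so jump.
print(is_burning_platform(True, True, False))  # True
```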

There are other insights that I had from the webinar, but I thought I’d share just the one. I don’t know if this particular webinar was recorded, but there are 2 more upcoming sessions that you can attend. If you do, please feel free to live tweet the event like I did, using the #RTMConsulting hashtag.

But please, no “flaming” :-)

harry the ASIC guy

So, you want to start an EDA company?

Tuesday, February 9th, 2010

Lightbulb (CC BY-NC 2.0)

In the almost 2 years since I started this blog, I’ve been paying pretty close attention to the EDA industry. And one of the themes I keep hearing goes something like this:

“There’s no more innovation in EDA.”

I hear it on blogs and on Twitter. I hear it from design engineers, from consultants, from old media, from new media, and even from EDA people.

One person I know, someone who has been an executive at an EDA company and a venture capitalist, says that EDA is persona non grata for VC folks. Maybe you can start a “lifestyle company” doing EDA, but don’t expect any more companies like Synopsys to come along.

And then, about a month ago, I get an email from someone out of the blue. He’s got an idea for a new EDA tool that would transform the industry. He’s been in the semiconductor business. He’s developed EDA tools. He knows everybody there is to know. And he’s not able to get anyone’s attention. As he puts it, nobody is working on anything “disruptive”. They are all doing “incremental improvements” that are “woefully inadequate”.

I spent about an hour talking to him on the phone. As I got off the phone, I was not sure what to make of the conversation. He was either insane or a visionary. He was either deluded or optimistic. He was either obsessed or determined. I’m still not sure which.

And that is what makes this industry so much frickin’ fun! You never know. That crazy idea of turning VHDL into gate-level netlists … who figured that would be the biggest innovation in design in decades?

Then, last week, I heard about this event/gathering/workshop happening during DVCon at the San Jose Doubletree. Presented by EDA veterans Jim Hogan and Paul McLellan. It’s called “So, you want to start an EDA Company. Here’s how …” And I immediately thought of my new friend with the idea about a new EDA company. This is exactly what he was looking for … an audience of people with open minds who were asking “why not” instead of “why”.

Maybe you also have a crazy idea. Maybe it really is crazy. Or maybe not.

I invited him, and I hope I can get there myself. If I make it, I think you might want to come too. You might just meet the founder of the next Synopsys. Here’s the skinny: the San Jose Doubletree on Feb 23, 6:30-7:30, in the Oak Ballroom.

I’ve also written a little prediction of what I expect to hear on the Xuropa Blog. Who knows? Maybe the naysayers are right and EDA is Dead. Then again, maybe not. I, for one, am dying to find out.

harry the ASIC guy

An ASIC Guy Visits An FPGA World - Part II

Monday, June 22nd, 2009

Altera FPGA

I mentioned a few weeks ago that I am wrapping up a project with one of my clients and beating the bushes for another project to take its place. As part of my search, I visited a former colleague who works at a small company in Southern California. This company designs a variety of products that utilize FPGAs exclusively (no ASICs), so I got a chance to understand a little bit more about the differences between ASIC and FPGA design. Here’s the follow-on then to my previous post An ASIC Guy Visits An FPGA World.

Recall that the first 4 observations from my previous visit to FPGA World were:

Observation #1 - FPGA people put their pants on one leg at a time, just like me.

Observation #2 - I thought that behavioral synthesis had died, but apparently it was just hibernating.

Observation #3 - Physical design of FPGAs is getting like ASICs.

Observation #4 - Verification of FPGAs is getting like ASICs.

Now for the new observations:

Observation #5 - Parts are damn cheap - According to the CTO of this company, Altera Cyclone parts can cost as little as $10-$20 each in sufficient quantities. A product that requires thousands or even tens of thousands of parts will still cost less than a 90nm mask set. For many non-consumer products with quantities in this range, FPGAs are compelling from a cost standpoint.

True, the high-end parts can cost thousands or even tens of thousands of dollars each (e.g. the latest Xilinx Virtex-6). But considering that a Virtex-6 part is 40nm and has the gate-count equivalent of almost 10M logic gates, what would an equivalent ASIC cost?

Observation # 6 - FPGA verification is different (at least for small to medium sized FPGAs) - Since it is so easy and fast and inexpensive (compared to ASIC) to synthesize and place and route an FPGA, much more of the functional verification is done in the lab on real hardware. Simulation is typically used to get a “warm and fuzzy” that the design is mostly functional, and then the rest is done in the lab with the actual FPGA. Tools like Xilinx ChipScope allow logic-analyzer-like access into the device, providing some, but not all, of the visibility that exists in a simulation. And once bugs are found, they can be fixed with an RTL change and reprogramming the FPGA.

One unique aspect of FPGA verification is that it can be done in phases or “spirals”. Perhaps only some of the requirements for the FPGA are complete or only part of the RTL is available. No problem. One can implement just that part of the design that is complete (for instance just the dataplane processing) and program the part. Since the same part can be used over and over, the cost to do this is basically $0. Once the rest of the RTL is available, the part can be reprogrammed again.

Observation # 7 - FPGA design tools are all free or dirt cheap - I think everybody knows this fact already, but it really hit home talking to this company. Almost all the tools they use for design are free or very inexpensive, yet they are more than capable of “getting the job done”. In fact, the company probably could not operate in the black if it had to make the kind of investment that ASIC design tools require.

Observation # 8 - Many tools and methods common in the ASIC world are still uncommon in this FPGA world - For this company, there is no such thing as logical equivalence checking. Formal verification tools (formal proof), SystemVerilog simulation, OVM, VMM… not used at all. Perhaps they’ll be used for the larger designs, but right now they are getting along fine without them.


FPGA verification is clearly the area that is the most controversial. In one camp are the “old skool” FPGA designers that want to get the part in the lab as soon as possible and eschew simulation. In the other camp are the high-level verification proponents who espouse the merits of coverage-driven and metric-driven verification and recommend achieving complete coverage in simulation. I think it would really be fun to host a panel discussion with representatives from both camps and have them debate these points. I think we’d learn a lot.


harry the ASIC guy

Interview with GateRocket Founder Chris Schalick

Wednesday, May 27th, 2009

A colleague of mine, Alvin Cheung, recently interviewed Chris Schalick of GateRocket regarding his experiences in founding a high-tech startup company. That interview is reposted below by permission of both parties.


Chris Schalick is VP of Engineering, CTO, and Founder of GateRocket, Inc. After working in the ASIC and FPGA industry for more than 15 years, Chris founded the company to solve one of the fundamental problems with FPGA design, the ability to simulate hardware FPGA behavior within the design verification environment. GateRocket partners with the three major Electronic Design Automation (EDA) providers, Mentor Graphics, Cadence, and Synopsys, to be able to “plug in” their hardware to the software simulation environment.

Alvin Cheung is currently a CAD Manager in the aerospace industry. Previously, Alvin worked at TI, Artisan Components and other companies doing ASIC and library development.


Alvin: Hi Chris. I want to start off by asking a couple of questions that are not necessarily related to FPGA technology but more towards a start-up company. I see that you founded the company in Oct. 2004. You were working for someone else before you decided to found your company. What made you want to start your own business?

Chris: Well, it is something that I always wanted to do as a kid. I love to build things. In the 15 years that I was an ASIC designer, I saw that when I went from ASIC to FPGA there were a lot of problems with debugging the FPGA. The parts that would work in simulation perfectly ended up not working in the lab at all. A lot of ASIC designers have the same issues going from ASIC to FPGA and there were not any tools out there to address this problem. I thought to myself, “There has to be a better way to do the debugging on the FPGAs.” That’s how I came up with the idea. I tested the idea with a couple of colleagues and founded the company. Our RocketDrive builds on the idea of using a logic analyzer in the lab and puts it at the fingertips of the designers doing functional verification with a simulator. You don’t have to reprogram the FPGA over and over again, troubleshoot, and recode your design.

Alvin: Did you find that you needed to adapt from your engineering skills to marketing or sales skills? Did you find that a challenge and a difficult transition?

Chris: In a small company you have to do a lot of things. In the beginning, I had to do everything from calling the customers, talking to the vendors, talking to partners and talking to investors. You’re right in that most engineers don’t have a lot of skills in those areas. I had to learn a lot by trial and error.

Alvin: Do you see a change in lifestyle since you started the company? Is it worthwhile?

Chris: I worked at several start-ups before starting GateRocket. I’m used to working long hours and with a small group of people. Over time, working focused hours with a small group can be more productive than larger groups with more resources. Our company has had its ups and downs; keeping a positive attitude and going back to do the right thing is what matters most.

Alvin: I see that you’ve secured your funding from venture and angel investors. Was it difficult to secure the Series A funding? Did the VC require you to change your plans? Were there a lot of obstacles?

Chris: Raising money is trial and error. The process is always lengthy but not necessarily an obstacle. There are always people who will say “No” and want you to address “one more” thing, but addressing it does not necessarily mean that they’ll invest or that you will succeed. All you really need is the one “Yes” from the right guys and you can’t be concerned about the “No’s”. As far as the obstacles, they always want more data, analysis, financial projections and references. Those things are not unreasonable and you do your best to provide them with the information. When I put my own money on the line, I ask for the same things.

Alvin: How long did it take you to develop the “RocketDrive”? Is it your first product?

Chris: Yes, the “RocketDrive” is our first product and our only product. It took me 18 months to build the first prototype, and since then we have dramatically enhanced the hardware. It took us 2 ½ years to ship our 1st production unit, and we worked closely with our customers and partners to develop the product. We are in our 5th year and shipping units. We are constantly improving the product and continue to develop RocketDrive as a platform, along with new software products that run on it. Stay tuned!

Alvin: What would you say are the top three skills needed to be a successful entrepreneur?

Chris: Hmm… I would say the ability to maintain focus. Things don’t always go the way you want. Many things that you don’t expect to happen will happen. You have to maintain focus and go back and look at problems from another angle. The second would be the ability to stay positive. You just keep your chin up and tell yourself you can do it. The third would be imagination. Sometimes the right answer is not obvious. There’s a saying that, “You have to think outside the box.” You really do to succeed. You need to see things from many different angles and sometimes the right answer is not the obvious one. Our first prototype was nothing like what we currently ship.

Alvin: What is your favorite aspect of being an entrepreneur?

Chris: You know the saying, “You have to play big to win big”? Well, that’s true. From the creative aspect, I’ve always liked to build things and starting a company provides a unique chance to build things that you might not otherwise be able to. Of course money is also a big factor. Although there’s no guarantee of financial success, it certainly is a motivator. I would say it is the combination of the two.

Alvin: Was there a lot of trial and error with your product? Do you find yourself in situations where there is already a competitor out there that has similar technology? If yes, how did you differentiate them from your product?

Chris: There was a huge amount of trial and error. Success is always a trial. Sometimes the answers are not obvious. You have to be persistent and look outside the box. You keep looking for the solution until you find the answers. Our current model looks nothing like our original prototype on the inside. On the outside with the simulator, it looks the same. But on the inside, everything has changed. We believe we are the first product in the market that does logic simulation directly with the FPGA. So, no, there is no direct competition. There are alternatives to develop FPGAs – build a prototype, program the FPGA, take it to the lab, connect it to the logic analyzer and hope everything works according to what you simulate with your RTL and testbench. What we are doing is changing the design flow and people’s concept of verifying the design. You can debug your design on actual silicon before you take it to the lab.

Alvin: How is your company adjusting to the current economic downturn? Did you have to downsize or change your priorities to adjust?

Chris: The environment looks bad on the surface, but people are still working on FPGAs. Some people have fewer dollars to spend, but we are still getting positive feedback with our product and we are selling more of them. We certainly have lots of activity lately with our product due to the growing size and number of FPGA designs.

Alvin: Actually in the current economic climate, would people choose FPGA over ASIC?

Chris: Yes, you are right. With the cost of the ASIC process, more and more people are looking to see how they can fit their designs in FPGAs instead of ASICs. With FPGAs getting larger and process geometries getting smaller, more and more designers are choosing FPGAs for their designs.

Alvin: I’m going to ask my last question of the interview and don’t want to take too much of your time. So I’m going to end with asking, what is your next step? Where do you see the company going from here?

Chris: Our company is still growing, and we are looking for ways to become a household name when it comes to FPGA development. We are demo’ing to customers and showing them the actual behavior of the simulator on silicon. We are working to craft the message and to expand our presence in the market. We are developing our online presence. We are going to DVCon, FPGA Summit, and DAC, and really our best marketing is from “word of mouth”. We want our customers to be successful and in turn we can become successful.

Alvin: Do you think you would IPO or get the company to be on a merger/acquisition deal anytime soon?

Chris: In this environment, I don’t think it is the right time for an IPO. We are focusing on our customers, enhancing the product and expanding our market presence.

Alvin: Ok. Well, thank you for letting me take a big chunk of your time from your busy schedule. Thank you so much for the interview.

EDA Is Only “Mostly Dead”

Wednesday, March 4th, 2009

Last Wednesday at DVCon, Peggy Aycinena MC’ed what used to be known as the Troublemakers Panel, formerly MC’ed by John Cooley. The topic, “EDA: Dead or Alive?” Well, having attended Aart’s Keynote address immediately preceding and having attended Peggy’s panel discussion, I can answer that question in the immortal words of Miracle Max, “EDA is only MOSTLY dead”. But first, some background.

Back in the mid 90s, I attended a Synopsys field conference where Aart delivered a keynote addressing the challenges of achieving higher and higher productivity in the face of increasing chip size. The solution, he predicted, would be design reuse in the form of intellectual property. Although most of us had only the faintest idea of what design reuse entailed and could barely fathom such a future, Aart’s prediction has indeed come true. Today, there is hardly a chip designed without some form of soft or hard IP and many chips are predominantly IP.

Some years later, he delivered a similar keynote preaching the coming future of embedded software. This was before the term SoC was coined to designate a chip with embedded processors running embedded software. Again, only a handful understood or could fathom this future, but Aart was correct again.

So, this year, immediately preceding Peggy’s Panel, Aart delivered another very entertaining and predictive keynote. After describing the current economic crisis in engineering terms using amplifiers and feedback loops, he moved to the real meat of the presentation which addressed the growing amount of software content in today’s SoCs. He described how project schedules are often paced by embedded software development and validation. How products are increasingly differentiated based on software, not hardware. And he predicted a day when chips would only have custom hardware to implement functions that could not be performed with programmable software. In essence, he described a future with little electronic design as we know it today, where hardware designers are largely replaced by programmers.

Immediately following Aart’s keynote was Peggy’s panel. (If you want to know exactly what occurred, there is no place better to go than Mike Demler’s blow-by-blow account.) Peggy did her best to challenge the EDA execs to defend why EDA would not die out. She kept coming back to that same question in different ways and the execs kept avoiding directly answering the question, choosing instead to offer philosophical logic such as: “If EDA is dead, then semiconductors are dead. If semiconductors are dead, then electronics are dead. And since electronics will never die, EDA will never die”.

On the surface, logic such as this is certainly comforting. After all, who can imagine a future without electronics? Upon closer inspection, however, and in light of Aart’s keynote, there is plenty of reason for skepticism.

Just as Aart was right about design reuse and IP…

Just as Aart was right about embedded software …

I believe that Aart is right about hardware design being replaced by software development.

As processors and co-processors become faster and more capable of handling tasks formerly delegated to hardware…

As time-to-market drives companies to sell products that can be upgraded or fixed later via software patches…

As fewer and fewer companies can afford the cost of chip design at 32nm and below…

More companies will move capabilities to software running on standard chips.

With that, what becomes of the current EDA industry? Will it adapt to embrace software as part of its charter? Or will it continue to focus on chip development?

Personally, I think Aart is right again. Hardware will increasingly become software. And an EDA industry focused on hardware will be increasingly “mostly dead”.

harry the ASIC guy

Mentor Graphics Displaced Worker Program

Thursday, February 26th, 2009

I’m still up at the Design Verification Conference (DVCon) and have not had a chance to summarize last evening’s Software-As-A-Service and Cloud Computing EDA Roundtable. I will do that over the weekend and have a complete rundown next week, including slides.

In the meantime, I wanted to pass on some information that was announced a week or so ago and which I became aware of just this week. Mentor Graphics has initiated a Displaced Worker Program to provide free training to customers who have lost their jobs in the last 6 months. Back last December I had issued a challenge to the EDA vendors to do just this. I don’t know if this challenge had any effect; hopefully they did this because they thought it was the right thing to do.

So far, Mentor is the only company that has done this, to my knowledge. I’ve personally had discussions with one other of the “Big 3”, so hopefully they will follow suit. Maybe Mentor’s offer will help prompt them.

What do you think? Should they do this?

harry the ASIC guy

Setting The Record Straight

Thursday, February 19th, 2009

Since conducting the Verification Methodology Poll and publishing the raw results last week, I’ve been planning to follow up with a post that digs a little deeper into the numbers. Things have gotten rather busy in the meantime, both at work and with organizing the SaaS and Cloud Computing EDA Roundtable for next week at DVCon. So I’ve let it slip a little.

Well, I noticed today that the verification methodology poll was referenced in a Cadence blog post by Adam Sherer. The results were somewhat misinterpreted (in my opinion), so that kicked my butt to post my own interpretation and set the record straight. Says Adam:

According to the poll conducted by Harry Gries in his Harry the ASIC Guy blog, you should go “all in” on the OVM because it is the 2:1 favorite.

In fact, the raw results had VMM with 80 users and OVM with 125 users, a ratio of just over 1.5:1 (1.5625 to be exact). So the 2:1 ratio is not accurate. However, if you add in RVM/Vera users to the VMM numbers, and then add in AVM, eRM, and e users to the OVM numbers, that ratio is more like 1.8:1. Closer, but still not 2:1.
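For anyone who wants to check the arithmetic, the raw ratio works out as follows. This is just a minimal sketch; only the 80 (VMM) and 125 (OVM) counts come from the poll itself, and the combined-camp totals behind the ~1.8:1 figure are not reproduced here because the per-methodology add-on counts are not listed above.

```python
# Raw counts quoted from the poll results above.
vmm_users = 80
ovm_users = 125

# The headline ratio: OVM to VMM users.
ratio = ovm_users / vmm_users
print(f"OVM:VMM = {ratio:.4f} : 1")  # 1.5625 : 1 -- just over 1.5:1, not 2:1
```

Running this confirms the 1.5625:1 figure: rounding to the nearest whole ratio gives "about 1.5:1", which is the basis for objecting to the 2:1 characterization.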

Adam’s post also indicates that my poll says “you should go ‘all in’ on the OVM”. I never said that, nor does the poll say anything about what you “should” do. The data simply captures what people are planning to use next. If you are inclined to follow the majority, then perhaps OVM is the way to go. By contrast, there is nothing in the poll comparing the technical merits of the various methodologies. So, if you are inclined to make up your own mind, then you have some work to do, and my poll won’t help you with that. You’re probably better off visiting JL Gray at Cool Verification.

No poll is perfect, and it will be interesting to compare these results with the DVCon and John Cooley polls to see if they are consistent. Here are a few other interesting stats that I pulled out of the poll results:

  • 91% of respondents are using some sort of SystemVerilog methodology
  • 10% are using both OVM and VMM (although I suspect many of these are consultants)
  • 27% are still using e or Vera (more e than Vera)
  • 4% are using ONLY VHDL or Verilog (this number may be low due to the skew of respondents towards advanced methodologies)

Again, I welcome you to download the raw data, which you can find in PDF format and as an Excel workbook, and draw your own conclusions.

harry the ASIC guy

SaaS & Cloud Computing EDA Roundtable @ DVCon

Tuesday, February 17th, 2009

I’ve been writing about Software-as-a-Service (SaaS) and Cloud Computing as it relates to EDA for some time now. Then, back in January, I made a New Year’s resolution to organize a SaaS EDA roundtable at the 2009 Design and Verification Conference (DVCon). About a month ago, I asked for volunteers and several of you stepped up to help. Now, just a week before DVCon, I’d like to formally announce the event.

The SaaS and Cloud Computing Roundtable will be held from 6:30 - 8:00 pm on Wed Feb 25th in the Monterey/Carmel rooms at the San Jose Doubletree Hotel. This is immediately following the DVCon reception down the hall, so grab a drink and a bite and then wander on over.

SaaS and Cloud Computing are 2 of the hottest trends in the Information Technology and software industries. Some EDA companies have already put their toes in the water. This roundtable will explore the following question: Are they trailblazing the future of the industry or are they chasing an empty fad?

The format will consist of 5 brief (< 10 minute) presentations from people involved in various perspectives on SaaS and cloud computing for EDA.

This will be followed by an open, and hopefully lively, discussion.

I’m greatly looking forward to this event, especially since I get to collaborate with such a high-powered team and I have no idea what to expect. I truly believe that this could be one of the more interesting events at DVCon this year.

I hope to see many of you there.

harry the ASIC guy