Posts Tagged ‘Xilinx’

An ASIC Guy Visits An FPGA World - Part II

Monday, June 22nd, 2009

Altera FPGA

I mentioned a few weeks ago that I am wrapping up a project with one of my clients and beating the bushes for another project to take its place. As part of my search, I visited a former colleague who works at a small company in Southern California. This company designs a variety of products that utilize FPGAs exclusively (no ASICs), so I got a chance to understand a little bit more about the differences between ASIC and FPGA design. Here’s the follow-on then to my previous post An ASIC Guy Visits An FPGA World.

Recall that the first 4 observations from my previous visit to FPGA World were:

Observation #1 - FPGA people put their pants on one leg at a time, just like me.

Observation #2 - I thought that behavioral synthesis had died, but apparently it was just hibernating.

Observation #3 - Physical design of FPGAs is getting like ASICs.

Observation #4 - Verification of FPGAs is getting like ASICs.

Now for the new observations:

Observation #5 - Parts are damn cheap - According to the CTO of this company, Altera Cyclone parts can cost as little as $10-$20 each in sufficient quantities. A product that requires thousands or even tens of thousands of parts will still cost less than a 90nm mask set alone. For many non-consumer products with quantities in this range, FPGAs are compelling from a cost standpoint.

True, the high-end parts can cost thousands or even tens of thousands each (e.g. for the latest Xilinx Virtex 6). But considering that a Virtex 6 part is 45nm and has the gate-count equivalent of almost 10M logic gates, what would an equivalent ASIC cost?
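The cost argument above boils down to simple break-even arithmetic. Here's a sketch; all figures (a $15 Cyclone-class part, a $3 ASIC unit cost, a $1M 90nm NRE) are illustrative assumptions, not quotes:

```python
# Break-even sketch: at what volume does an ASIC's non-recurring
# engineering (NRE) cost pay for itself versus buying FPGAs?

def total_cost(unit_cost, volume, nre=0.0):
    """Total program cost = NRE + per-unit cost * volume."""
    return nre + unit_cost * volume

def break_even_volume(fpga_unit, asic_unit, asic_nre):
    """Volume above which the ASIC becomes the cheaper option."""
    return asic_nre / (fpga_unit - asic_unit)

# Assumed figures: $15 FPGA vs. $3 ASIC unit cost with $1M of NRE.
volume = break_even_volume(fpga_unit=15.0, asic_unit=3.0, asic_nre=1_000_000)
print(f"Break-even at ~{volume:,.0f} units")
```

At these assumed numbers the crossover sits above 80,000 units, which is why runs in the thousands or tens of thousands favor the FPGA.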

Observation #6 - FPGA verification is different (at least for small to medium sized FPGAs) - Since it is so easy, fast, and inexpensive (compared to an ASIC) to synthesize and place and route an FPGA, much more of the functional verification is done in the lab on real hardware. Simulation is typically used to get a “warm and fuzzy” that the design is mostly functional, and the rest is done in the lab with the actual FPGA. Tools like Xilinx ChipScope allow logic-analyzer-like access into the device, providing some, but not all, of the visibility that exists in a simulation. And once bugs are found, they can be fixed with an RTL change and a quick reprogramming of the FPGA.

One unique aspect of FPGA verification is that it can be done in phases or “spirals”. Perhaps only some of the requirements for the FPGA are complete, or only part of the RTL is available. No problem. One can implement just the part of the design that is complete (for instance, just the dataplane processing) and program the part. Since the same part can be reprogrammed over and over, the incremental cost of doing this is basically $0. Once the rest of the RTL is available, the part is simply reprogrammed.

Observation #7 - FPGA design tools are all free or dirt cheap - I think everybody knows this fact already, but it really hit home talking to this company. Almost all the tools they use for design are free or very inexpensive, yet they are more than capable of getting the job done. In fact, the company probably could not operate in the black if it had to make the kind of investment that ASIC design tools require.

Observation #8 - Many tools and methods common in the ASIC world are still uncommon in this FPGA world - For this company, there is no such thing as logical equivalence checking. Formal property verification, SystemVerilog simulation, OVM, VMM … not used at all. Perhaps they’ll be needed for larger designs, but right now the company is getting along fine without them.


FPGA verification is clearly the most controversial area. In one camp are the “old skool” FPGA designers who want to get the part into the lab as soon as possible and eschew simulation. In the other camp are the high-level verification proponents who espouse the merits of coverage-driven and metric-driven verification and recommend achieving complete coverage in simulation. I think it would really be fun to host a panel discussion with representatives from both camps and have them debate these points. I think we’d learn a lot.


harry the ASIC guy

The Missing Lynx - The ASIC Cloud

Friday, April 3rd, 2009

My last blog post, entitled The Weakest Lynx, got a lot of attention from the Synopsys Lynx CAEs and Synopsys marketing. Please go see the comments on that post for a response from Chris Smith, the lead support person for Lynx at Synopsys. Meanwhile, the final part of this series … The Missing Lynx.

About 7 months ago, I wrote a blog post entitled Birth of an EDA Revolution in which I first shared my growing excitement over the potential for cloud computing and Software-as-a-Service (SaaS) to transform EDA. About a week later, Cadence announced a SaaS offering that provides their reference flows, their software, and their hardware for rent to projects on a short-term basis. About a week after that, I wrote a third post on this topic, asking WWSD (what will Synopsys do) in response to Cadence.

In that last post, I wrote the following:

Synopsys could probably go one better and offer a superior solution if it wanted to, combining their DesignSphere infrastructure and Pilot Design Environment. In fact, they have done this for select customers already, but not as a standard offering. There is some legwork that they’d need to do, but the real barrier is Synopsys itself. They’ve got to decide to go after this market and put together a standard offering like Cadence has … And while they are at it, if they host it on a secure cloud to make it universally accessible and scalable, and if they offer on-demand licensing, and if they make it truly open by allowing third party tools to plug into their flow, they can own the high ground in the upcoming revolution.

Although I wrote this over 6 months ago, I don’t think I could have written it better today. The only difference is that Pilot has now become Lynx. “The ASIC Cloud”, as I call it, would look something like this:

The ASIC Cloud

As I envision it, Synopsys Lynx will be the heart of The ASIC Cloud and will provide the overall production design flow. The Runtime Manager will manage resources, including provisioning of additional hardware (CPU and storage) and licenses as needed. The management cockpit will provide real-time statistics on resource utilization so the number of CPUs and licenses can be scaled on the fly. Since The ASIC Cloud is accessible through any web browser, this virtual design center is available to large corporate customers as well as to smaller startups and consultants. It can even be run from portable devices such as netbooks and smartphones.
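To make the provisioning idea concrete, here is a toy sketch of the kind of scaling rule such a Runtime Manager might apply. The function name, inputs, and cap are all invented for illustration; nothing here is part of the actual Lynx product:

```python
# Hypothetical auto-provisioning rule: keep one CPU/license pair per
# active or queued job, capped by what the project is willing to pay for.

def resources_to_provision(running_jobs, queued_jobs, cap):
    """Return how many CPU/license pairs to keep provisioned."""
    return min(running_jobs + queued_jobs, cap)

# 5 jobs running, 7 waiting, but the budget caps us at 10 licenses:
print(resources_to_provision(running_jobs=5, queued_jobs=7, cap=10))  # 10
```

The point of the cockpit statistics is exactly to feed a rule like this, so capacity follows demand instead of being bought up front.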

If you think I’m insane, you may be right, I may be crazy. But it just might be a lunatic you’re looking for. To show you that this whole cloud computing thing is not just my fever (I have been sick this past week), take a look at what this one guy in Greece did with Xilinx tools. He pays less than $1 per hour for hardware to run Xilinx synthesis tools on the Amazon Elastic Compute Cloud. Now, this is nothing like running an entire RTL2GDSII design flow, but he IS running EDA tools on the cloud, taking advantage of pay-as-you-go CPU and storage resources, and taking advantage of multiple processors to speed up his turnaround time. The ASIC Cloud would be similar, but on a much greater scale.
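The pay-as-you-go math is easy to sketch. The rate and run counts below are assumptions loosely based on that sub-$1/hour figure, not actual Amazon or Xilinx pricing:

```python
# Rough metered-cost estimate: hours per run * hourly rate * number of runs.

def cloud_cost(hours_per_run, rate_per_hour, runs):
    """Total pay-as-you-go cost for a batch of tool runs, no upfront license."""
    return hours_per_run * rate_per_hour * runs

# 100 two-hour synthesis runs at an assumed $0.80/hour:
print(f"${cloud_cost(2, 0.80, 100):.2f}")  # $160.00
```

A few hundred dollars of metered compute versus a perpetual license and a server room is the business-model shift the whole post is about.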

It may take some time for Synopsys to warm up to this idea, especially since it is a whole new business model for licensing software. But for a certain class of customers (startups, design services providers) it has definite immediate benefits. And many of these customers are also potential Lynx customers.

So, Synopsys, if you want to talk, you know where to find me.


That wraps up my 5-part series on Synopsys Lynx. If you want to find the other 4 parts, here they are:

Part 1 - Synopsys Lynx Design System Debuts at SNUG

Part 2 - Lynx Design System? - It’s The Flow, Stupid!

Part 3 - Strongest Lynx

Part 4 - The Weakest Lynx

harry the ASIC guy

The Revolution Will Not Be Televised!!!

Thursday, April 3rd, 2008

My friend Ron has a knack for recognizing revolutionary technologies before most of us. He was one of the first to appreciate the power of the browser and how it would transform the internet, previously used only by engineers and scientists. He was one of the first and best podcasters. And now he’s become a self-proclaimed New Media Evangelist, preaching the good news of Web 2.0 and making it accessible to “the rest of us”.

Most of us are familiar with mainstream Web 2.0 applications, whether we use them or our friends use them or our kids use them. Social and professional networks such as MySpace, Facebook, and LinkedIn. Podcasts in iTunes. Blogging sites on every topic. Virtual worlds such as Second Life. Collaboration tools such as Wikipedia. File sharing sites such as YouTube and Flickr. Social bookmarking sites such as Digg and Technorati. Open source publishing tools such as WordPress and Joomla. Using these technologies we’re having conversations, collaborating, and getting smarter in ways that were unimaginable just 5 years ago. Imagine, a rock climber in Oregon can share climbing techniques with a fellow climber in Alice Springs. And mostly for free, save for the cost of the internet connection.

When we think of Web 2.0, we tend to think of teenagers and young adults. But this technology was invented by us geeks and so it’s no surprise that the ASIC design world is also getting on-board. Here are some examples from the ASIC Design industry:

Social media is networking ASIC designer to ASIC designer, enabling us to get smarter faster. But that’s not all. Many forward-looking companies have recognized the opportunity to talk to their customers directly. About 6 months ago, Synopsys launched several blogs on its microsite. Xilinx also has a User Community and a blog. It’s great that this is happening, but does it really make much of a difference? Consider what I believe could be a watershed event:

A few months ago, JL Gray published a post on his Cool Verification blog entitled The Brewing Standards War - Verification Methodology. As expected, verification engineers chimed in and expressed their ardent opinions and viewpoints. What came next was not expected … stakeholders from Synopsys and Mentor joined the conversation. The chief VMM developer from Synopsys, Janick Bergeron, put forth information to refute certain statements that he felt were erroneous. A marketing manager from Mentor, Dennis Brophy, offered his views on why OVM was open and VMM was not. And Karen Bartleson, who participates in several standards committees for Synopsys, disclosed Synopsys’ plan to encourage a single standard by donating VMM to Accellera.

From what I’ve heard, this was one of the most viewed ASIC related blog postings ever (JL: Do you have any stats you can share?). But did it make a difference in changing the behavior of any of the protagonists? I think it did and here is why:

  • This week at the Synopsys Users Group meeting in San Jose, the VMM / OVM issues were the main topic of questioning for CEO Aart de Geus after his keynote address. And the questions picked up where they left off in the blog post … Will VMM ever be open and not just licensed? Is Synopsys trying to talk to Mentor and Cadence directly? If we have access to VMM, can we run it on other simulators besides VCS?
  • Speaking to several Synopsoids afterwards, I discovered that the verification marketing manager referenced this particular Cool Verification blog posting in an email to an internal Synopsys verification mailing list. It seems he approved of some of the comments and wanted to make others in Synopsys aware of these customer views. Evidently he sees these opinions as valuable and valid. Good for him.
  • Speaking to some at Synopsys who have a say in the future of VMM, I believe that Synopsys’ decision to donate VMM to Accellera has been influenced and pressured, at least in part, by the opinions expressed in the blog posting and the subsequent comments. Good for us.

I’d like to believe that the EDA companies and other suppliers are coming to recognize what mainstream companies have recognized … that the battle for customers is less and less being fought with advertisements, press releases, glossy brochures, and animated PowerPoint product pitches. Instead, as my friend Ron has pointed out, I am able to talk to “passionate content creators who know more about designing chips than any reporter could ever learn”, and find out what they think. Consider these paraphrased excerpts of The Cluetrain Manifesto: The End of Business as Usual:

  • The Internet is enabling conversations among human beings that were simply not possible in the era of mass media. As a result, markets are getting smarter, more informed, more organized.
  • People in networked markets have figured out that they get far better information and support from one another than from vendors.
  • There are no secrets. The networked market knows more than companies do about their own products. And whether the news is good or bad, they tell everyone.
  • Companies that don’t realize their markets are now networked person-to-person, getting smarter as a result and deeply joined in conversation are missing their best opportunity.
  • Companies can now communicate with their markets directly. If they blow it, it could be their last chance.

In short, this ASIC revolution will not be televised!!!

harry the ASIC guy