Posts Tagged ‘EDA’

Which Direction for EDA - 2D, 3D, or 360?

Sunday, May 23rd, 2010

A hiker comes to a fork in the road and doesn’t know which way to go to reach his destination. Two men are at the fork, one of whom always tells the truth while the other always lies. The hiker doesn’t know which is which. He may ask one of the men only one question to find his way.

Which man does he ask, and what is the question?


There’s been lots of discussion over the last month or two about the direction of EDA going forward. And I mean literally the “direction” of EDA. Many semiconductor industry folks and proponents have been telling us to hold off on that obituary for 2D scaling and Moore’s law. Others have been quietly innovating in the technologies needed for 3D die and wafer stacks. And Cadence has recently unveiled its holistic 360-degree vision for EDA that has us developing apps first and silicon last.

I’ll examine each of these orthogonal directions in the next few posts. In this post, I’ll first examine the problem that is forcing us to make these choices.

The Problem

One of the great things about writing this blog is that I know that you all are very knowledgeable about the industry and technology and I don’t need to start with the basics. So I’ll just summarize them here for clarity:

  • Smaller semiconductor process geometries are getting more and more difficult to achieve and are challenging the semiconductor manufacturing equipment, the EDA tools, and even the physics. No doubt there have been and always will be innovations and breakthroughs that will move us forward, but we can no longer see clearly the path to the next 3 or 4 process geometries down the road. Even if you are one of the people who feels there is no end to the road, you’d have to admit that it certainly is getting steeper.
  • The cost to create fabs for these process nodes is increasing drastically, forcing consolidation in the semiconductor manufacturing industry. Some predict there will be only 3 or 4 fabs in a few years. This cost is passed on to the cost of the semiconductor device. Net cost per gate may not be rising, but the cost to ante up with a set of masks at a new node certainly is.
  • From a device physics and circuit design perspective, we are hitting a knee in the curve where lower geometries are not able to deliver on the full speed increases and power reductions achieved at larger nodes without new “tricks” being employed.
  • Despite these challenges, ICs are still growing in complexity and so are the development costs, some say as high as $100M. Many of these ICs are complex SoCs with analog and digital content, multiple processor cores, and several 3rd party IP blocks. Designing analog and digital circuits in the same process technology is not easy. The presence of embedded processors means that software and hardware have intersected and need to be developed harmoniously … no more throwing the hardware over-the-wall to software. And all this 3rd party IP means that our success is increasingly dependent on the quality of work of others that we have never met.
  • FPGAs are eating away at ASIC market share because of all the factors above. The break-even quantity between ASIC and FPGA is increasing, which means more of the lower volume applications will choose FPGAs. Nonetheless, these FPGAs are still complex SoCs requiring much the same verification methods as ASICs, including concurrent hardware and software development.
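To make that break-even point concrete, here’s a back-of-the-envelope sketch in Python. All of the dollar figures are hypothetical, purely for illustration; the point is only that rising mask and NRE costs push the crossover quantity higher.

```python
# Rough ASIC-vs-FPGA break-even sketch (all numbers hypothetical).
# ASIC: high up-front NRE (masks, tools), low unit cost.
# FPGA: negligible NRE, higher unit cost.

def total_cost(nre, unit_cost, qty):
    """Total program cost for a given production quantity."""
    return nre + unit_cost * qty

def break_even_qty(asic_nre, asic_unit, fpga_nre, fpga_unit):
    """Quantity above which the ASIC becomes the cheaper choice overall."""
    # Solve: asic_nre + asic_unit*q = fpga_nre + fpga_unit*q
    return (asic_nre - fpga_nre) / (fpga_unit - asic_unit)

# Hypothetical circa-2010 numbers:
q = break_even_qty(asic_nre=5_000_000, asic_unit=10.0,
                   fpga_nre=50_000, fpga_unit=60.0)
print(round(q))  # prints 99000
```

As the ASIC NRE term grows with each new node while FPGA unit costs fall, that crossover quantity climbs, which is exactly why the lower volume applications are drifting to FPGAs.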

There are no doubt many other factors, but these are the critical ones in my mind. So, then, what does all this mean for semiconductor design and EDA?

At the risk of using a metaphor, many feel we are at a “fork in the road”. One path leads straight ahead, continuing 2D scaling with new process and circuit innovations. Another path leads straight up, moving Moore’s law into the third dimension with die stacks in order to cost-effectively manage increasing complexity. And one path turns us 180 degrees around, asking us to look at the applications and software stack first and the semiconductor last. Certainly, 3 separate directions.

Which is the best path? Is there another path to move in? Perhaps a combination of these paths?

I’ll try to examine these questions in the next few posts. Next Post: Is 2D Scaling Really Dead or Just Mostly Dead?


Answer to Riddle: Either man should be asked the following question: “If I were to ask you if this is the way I should go, would you say yes?” While asking the question, the hiker should be pointing at either of the directions going from the fork.

harry the ASIC guy

Small Gathering in Monterey

Sunday, April 4th, 2010

There’s going to be a small gathering of luminaries in Monterey this week. And no, it’s not the jellyfish at the Monterey Bay Aquarium.

Jellyfish at Monterey Bay Aquarium / CC BY-NC-SA 2.0

It’s the Electronic Design Processes Symposium Workshop. This will be my first time attending and it will be very interesting on several accounts.

First, everyone who has attended in the past has said that it’s truly unique in the industry. It’s not really a conference but a workshop, a small gathering of the best and brightest in EDA exchanging their research and ideas. With only 25-50 participants, it’s not uncommon for someone to stand up and challenge the presenter or for spontaneous discussions to break out during sessions. It’s not uncommon to meet people who have been thinking about the same problems you’ve considered or someone who has been thinking about problems you never thought existed. And it’s not uncommon to walk away with new insights and revelations and ideas.

Second, there probably are very few, if any, EDA customers attending. This is an EDA workshop for EDA people. There is no trade show. There are no booths to set up. There are no big press releases or sales guys walking up to shake your hand or schlocky giveaways. I imagine that DAC probably started out this way back in the 1960s, but DAC is now more of a trade show than a workshop. This will be back to the grass roots of EDA.

Third, I am going to be moderating one of the sessions, something I’ve not done before (except at my own round table). I’ll be moderating the session entitled “Moving to a Brave New World” which includes a presentation by James Colgan of Xuropa on cloud computing. Obviously, I’m a little biased on that topic, but I’ll try not to let that influence my moderation. I’m sure this and all the other sessions will be very interesting.

Last, I’m going to be traveling with my family. It’s spring break week and so we’re heading up the coast to visit Monterey for a few days, then visit some friends up in Danville. Fortunately there is plenty to do in Monterey and the hotel is right on the beach, so they’ll be fine. So, if you want to meet me or my family, come on over. Registration is still open.

We promise we won’t sting.

harry the ASIC guy

The Burning Platform

Monday, March 1st, 2010

Although I was unable to attend DVCon last week, and I missed Jim Hogan and Paul McLellan presenting “So you want to start an EDA Company? Here’s how”, I was at least able to sit in on an interesting webinar offered by RTM Consulting entitled Achieving Breakthrough Customer Satisfaction through Project Excellence.

As you may recall, I wrote a previous blog post about a Consulting Soft Skills training curriculum developed by RTM in conjunction with Mentor Graphics for their consulting organization. Since that time, I’ve spoken on and off with RTM CEO Randy Mysliviec. During a recent conversation he made me aware of this webinar and offered one of the slots for me to attend. I figured it would be a good refresher, at a minimum, and if I came out of it with at least one new nugget or perspective, I was ahead of the game. So I accepted.

I decided to “live tweet” the seminar. That is to say, I posted tweets of anything interesting that I heard during the webinar, all using the hash tag #RTMConsulting. If you want to view the tweets from that webinar, go here.

After 15 years in the consulting biz, I certainly had learned a lot, and the webinar was indeed a good refresher on some of the basics of managing customer satisfaction. There was a lot of material packed into the 2 hours that we had, with no real breaks, so the pace was dense. The only downside is that I wish there had been some more time for discussion or questions, but that’s really a minor nit to pick.

I did get a new insight out of the webinar, and so I guess I’m ahead of the game. I had never heard of the concept of the “burning platform” before, especially as applies to projects. The story goes that there was an oil rig in the North Sea that caught fire and was bound to be destroyed. One of the workers had to decide whether to stay on the rig or jump into the freezing waters. The fall might kill him and he’d face hypothermia within minutes if not rescued, but he decided to jump anyway, since probable death was better than certain death. According to the story, the man survived and was rescued. Happy ending.

The instructor observed that many projects are like burning platforms, destined for destruction unless radically rethought. In thinking back, I immediately thought of 2 projects I’d been involved with that turned out to be burning platforms.

The first was a situation where a design team was trying to reverse engineer an asynchronously designed processor in order to port it to another process. The motivation was that the processor (I think it was an ADSP 21 something or other) was being retired by the manufacturer and this company wanted to continue to use it nonetheless. We were called in when the project was already in trouble, significantly over budget and schedule and with no clear end in sight. After a few weeks of looking at the situation, we decided that there was no way they would ever be able to verify the timing and functionality of the ported design. We recommended that they kill this approach and start over with a standard processor core that could do the job. There was a lot of resistance, especially from the engineer whose idea it was to reverse engineer the existing processor. But, eventually the customer made the right choice and redesigned using an ARM core.

Another group at the same company also had a burning platform. They were on their 4th version of a particular chip and were still finding functional bugs. Each time they developed a test plan and executed it, there were still more bugs that they had missed. Clearly their verification methodology was outdated and insufficient, depending on directed tests and FPGA prototypes rather than more current measurable methods. We tried to convince them to use assertions, functional coverage, constrained random testing, etc. But they were convinced that they just had to fix the few known bugs and they’d be OK. From their perspective, it wasn’t worth all the time and effort to develop and execute a new plan. They never did take our recommendations and I lost track of that project. I wonder if they ever finished.

As I think about these 2 examples, I realize that “burning platform” projects have some characteristics in common. And they align with the 3 key elements of a project. To tell if you have a “burning platform” on your hands, you might ask yourself the following 3 questions:

  1. Scope - Are you spending more and more time every week managing issues and risks? Is the list growing, rather than shrinking?
  2. Schedule - Are you on a treadmill with regards to schedule? Do you update the schedule every month only to realize that the end date has moved out by a month, or more?
  3. Resources - Are the people that you respect the most trying to jump off of the project? Are people afraid to join you?

If you answered yes to at least 2 of these, then you probably have a burning platform project on your hands. It’s time to jump in the water. That is, it’s time to scrap the plan and rethink your project from a fresh perspective and come up with a new plan. Of course, this is not a very scientific way of identifying an untenable project, but I think it’s a good rule-of-thumb.
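Just for fun, that rule-of-thumb fits in a few lines of Python. The argument names are mine; the heuristic is simply “yes to at least 2 of the 3 questions”:

```python
# The three-question "burning platform" rule of thumb as a tiny sketch.
# Each argument answers one question: scope, schedule, resources.

def is_burning_platform(scope_growing, schedule_slipping, people_fleeing):
    """Return True if the project shows at least 2 of the 3 warning signs."""
    signs = [scope_growing, schedule_slipping, people_fleeing]
    return sum(signs) >= 2

# A project whose issue list keeps growing and whose end date keeps slipping:
print(is_burning_platform(True, True, False))  # prints True
```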

There are other insights that I had from the webinar, but I thought I’d only share just the one. I don’t know if this particular webinar was recorded, but there are 2 more upcoming that you can attend. If you do, please feel free to live tweet the event like I did, using the #RTMConsulting hash tag.

But please, no “flaming” :-)

harry the ASIC guy

So, you want to start an EDA company?

Tuesday, February 9th, 2010

In the almost 2 years since I started this blog, I’ve been paying pretty close attention to the EDA industry. And one of the themes I keep hearing goes something like this:

“There’s no more innovation in EDA”

I hear it on blogs and on Twitter. I hear it from design engineers, from consultants, from old media, from new media, and even from EDA people.

One person I know, someone who has been an executive at an EDA company and a venture capitalist, says that EDA is persona non grata for VC folks. Maybe you can start a “lifestyle company” doing EDA, but don’t expect any more companies like Synopsys to come along.

And then, about a month ago, I get an email from someone out of the blue. He’s got an idea for a new EDA tool that would transform the industry. He’s been in the semiconductor business. He’s developed EDA tools. He knows everybody there is to know. And he’s not able to get anyone’s attention. As he puts it, nobody is working on anything “disruptive”. They are all doing “incremental improvements” that are “woefully inadequate”.

I spent about an hour talking to him on the phone. As I got off the phone, I was not sure what to make of the conversation. He was either insane or a visionary. He was either deluded or optimistic. He was either obsessed or determined. I’m still not sure which.

And that is what makes this industry so much frickin’ fun! You never know. That crazy idea of turning VHDL into gate-level schematics … who figured that would be the biggest innovation in design in decades?

Then, last week, I heard about this event/gathering/workshop happening during DVCon at the San Jose Doubletree. Presented by EDA veterans Jim Hogan and Paul McLellan. It’s called “So, you want to start an EDA Company. Here’s how …” And I immediately thought of my new friend with the idea about a new EDA company. This is exactly what he was looking for … an audience of people with open minds who were asking “why not” instead of “why”.

Maybe you also have a crazy idea. Maybe it really is crazy. Or maybe not.

I invited him and I hope I can get there myself. If so, I think you might want to come too. You might just meet the founder of the next Synopsys. Here’s the skinny: San Jose Doubletree on Feb 23 at 6:30-7:30 in the Oak Ballroom.

I’ve also written a little prediction of what I expect to hear on the Xuropa Blog. Who knows? Maybe the naysayers are right and EDA is Dead. Then again, maybe not. I, for one, am dying to find out which.

harry the ASIC guy

My Obligatory TOP 10 for 2009

Thursday, December 31st, 2009

2009 To 2010 / CC BY 2.0

What’s a blog without some sort of obligatory year end TOP 10 list?

So, without further ado, here is my list of the TOP 10 events, happenings, occurrences, observations that I will remember from 2009. This is my list, from my perspective, of what I will remember. Here goes:

  1. Verification Survey - Last February, as DVCon was approaching, I thought it would be interesting to post a quickie survey to see what verification languages and methodologies were being used. Naively, I did not realize to what extent the fans of the various camps would go to rig the results in their favor. Nonetheless, the results ended up very interesting and I learned a valuable lesson on how NOT to do a survey.
  2. DVCon SaaS and Cloud Computing EDA Roundtable - One of the highlights of the year was definitely the impromptu panel that I assembled during DVCon to discuss Software-as-a-Service and Cloud Computing for EDA tools. My thanks to the panel guests, James Colgan (CEO @ Xuropa), Jean Brouwers (Consultant to Xuropa), Susan Peterson (Verification IP Marketing Manager @ Cadence), Jeremy Ralph (CEO @ PDTi), Bill Alexander (VP Marketing @ Blue Pearl Software), Bill Guthrie (VP Marketing @ Numetrics). Unfortunately, the audio recording of the event was not of high enough quality to post, but you can read about it from others at the following locations:

    > 3 separate blog posts from Joe Hupcey (1, 2, 3)

    > A nice mention from Peggy Aycinena

    > Numerous other articles and blog posts throughout the year that were set in motion, to some extent, by this roundtable

  3. Predictions to the contrary, Magma is NOT dead. Cadence was NOT sold. Oh, and EDA is NOT dead either.
  4. John Cooley IS Dead - OK, he’s NOT really dead. But this year was certainly a turning point for his influence in the EDA space. It started off with John’s desperate attempt at a Conversation Central session at DAC to tell bloggers that their blogs suck and convince them to just send him their thoughts. Those who took John up on his offer by sending their thoughts had to wait 4 months to see them finally posted in his December DAC trip report. I had a good discussion on this topic with John earlier this year, which he asked me to keep “off the record”. Let’s just say, he just doesn’t get it and doesn’t want to get it.
  5. The Rise of the EDA Bloggers.
  6. FPGA Taking Center Stage - It started back in March when Gartner issued a report stating that there were 30 FPGA design starts for every ASIC start. That number seemed very high to me and to others, but that did not stop this 30:1 ratio from being quoted as fact in all sorts of FPGA marketing materials throughout the year. On the technical side, it was a year where the issues of verification of large FPGAs came front-and-center and where a lot of ASIC people started transitioning to FPGA.
  7. Engineers Looking For Work - This was one of the more unfortunate trends that I will remember from 2009 and hopefully 2010 will be better. Personally, I had difficulty finding work between projects. DAC this year seemed to be as much about finding work as finding tools. A good friend of mine spent about 4 months looking for work until he finally accepted a job at 30% less pay and with a 1.5 hour commute because he “has to pay the bills”. A lot of my former EDA sales and AE colleagues have been laid off. Some have been looking for the right position for over a year. Let’s hope 2010 is a better year.
  8. SaaS and Cloud Computing for EDA - A former colleague of mine, now a VP of Sales at one of the small but growing EDA companies, came up to me in the bar during DAC one evening and stammered some thoughts regarding my predictions of SaaS and Cloud Computing for EDA. “It will never happen”. He may be right and I may be a bit biased, but this year I think we started to see some of the beginnings of these technologies moving into EDA. On a personal note, I’m involved in one of those efforts at Xuropa. Look for more developments in 2010.
  9. Talk of New EDA Business Models - For years, EDA has bemoaned the fact that the EDA industry captures so little of the value ($5B) of the much larger semiconductor industry ($250B) that it enables. At the DAC Keynote, Fu-Chieh Hsu of TSMC tried to convince everyone that the solution for EDA is to become part of some large TSMC ecosystem in which TSMC would reward the EDA industry like some sort of charitable tax deduction. Others talked about EDA companies having more skin in the game with their customers and being compensated based on their ultimate product success. And of course there is the SaaS business model I’ve been talking about. We’ll see if 2010 brings any of these to fruition.
  10. The People I Got to Meet and the People Who Wanted to Meet Me - One of the great things about having a blog is that I got to meet so many interesting people that I would never have had an opportunity to even talk to. I’ve had the opportunity to talk with executives at Synopsys, Cadence, Mentor, Springsoft, GateRocket, Oasys, Numetrics, and a dozen other EDA companies. I’ve even had the chance to interview some of them. And I’ve met so many fellow bloggers and come to realize how much they know. On the flip side, I’ve been approached by PR people, both independent and in-house. I was interviewed 3 separate times, once by email by Rick Jamison, once by Skype by Liz Massingill, and once live by Dee McCrorey. EETimes added my blog as a Trusted Source. For those who say that social media brings people together, I can certainly vouch for that.

harry the ASIC guy

EDA Trends for 2010???

Wednesday, December 9th, 2009

I’ve been asked by a fellow blogger to offer a prediction of the top trend in EDA in 2010 as a contribution to that blog. I have one in mind, but I think it would be interesting to hear from everyone else, since you all are a lot smarter than me.

So, what do you think will be the top trend in EDA in 2010?

harry the ASIC guy

Are Sales People Really Needed?

Monday, November 30th, 2009

My former-EDA-salesperson friend had just finished his lunch when he leaned back in his chair and said:

“Listen. You’ve been on both sides, in EDA and a customer. Lemme ask you a question. Do you think sales people are really needed?”

At first, I was really shocked to hear this question, especially from someone who had been in EDA sales for the last 10 years. After all, you don’t hear plumbers asking if plumbers are needed. Or doctors. Or auto mechanics. Even folks in professions that are experiencing job losses, such as journalism, hardly ever question the value they bring.

I let the question sink in for a few seconds, which seemed like minutes, and answered the only way I could. With another question, “how do you mean?”

As it turns out, my friend was not really having a deep identity crisis. He was just trying to understand why EDA companies, including his former employer, seem to view direct sales people, especially him, as expendable costs, easily replaced with inside sales, marketing campaigns, and online sales methods.

Put that way, it’s an interesting question to consider. Although I have never been a “bag carrying” sales person, I did spend the better part of 14 years on the EDA side in some sort of sales support or semi-sales role. And I still have many friends in sales or applications engineering roles. Were my friends and my old jobs becoming obsolete? Are new technologies, ones that connect customers with companies directly (blogs, forums, etc.), making sales people unnecessary?

On the other hand, I’ve spent the last 3 years of my career back on the other side of the fence, in the customer world. I’ve had the opportunity for many interactions with folks whose shoes I used to wear. Certainly, some of these folks do provide value, marshaling corporate resources to address a tool issue or providing methodology assistance for a new technology. There are also the dirty parts of the job. Without sales people’s efforts, many opportunities would die an early death in the hands of lawyers, accountants, and purchasing reps, or at least they would not occur as quickly as they do.

At the same time, we cannot deny that technology is replacing the need for sales people in many of our other daily purchases, especially consumer electronics. We do all of our research online. We compare product specs on web sites. We seek out product reviews by trusted tech gadget bloggers and ratings by actual customers. We compare prices online and make our purchases with a click. No sales person in the loop.

You’d be correct in pointing out that buying an EDA tool is not like buying a digital camera. Still, there are changes going on in EDA as well. This blog and those of many of my colleagues are now considered product research resources. The work I’ve been doing recently with Xuropa has been aimed at moving part of the sales process, specifically product evaluations, online. And forums such as TechBites are springing up to provide independent opinions. So maybe there is some cause for my friend’s concern.

As I’ve had time to consider this question since our lunch, I’ve come to feel that salespeople are still needed and will be for some time to come in EDA. Good salespeople know how to find customers, to manage sales campaigns, to manage complex issues, and to ultimately “close the deal”. However, many of their up-front functions will be taken over by other methods, driven by technology. As a result, the salesperson will increasingly encounter a more educated customer, one that knows he has alternatives, and one that feels more in control of the sales process than before. Salespeople will have to adapt to that type of customer.

We finished up our lunch and our discussion without reaching any definite conclusions. On the way to our cars I asked him, “mind if I blog about it?”


So, what do you think? Are sales people really needed?

harry the ASIC guy

Synopsys Synphony Synopsis

Monday, October 12th, 2009

I was contacted a few weeks ago by Synopsys’ PR agency to see if I’d be interested in covering an upcoming product announcement. I usually ignore these “opportunities” since the information provided is usually carefully wordsmithed marketing gobbledygook and not enough for me to really form an opinion. However, it turned out that this announcement was on a subject I know a little bit about, so I took them up on their offer.

The announcement was “embargoed”, that is, I was not to make it public until today. Embargoes are a vestige of the days when traditional journalism ruled the roost and when PR departments thought they could control the timing of their message. I don’t think embargoes benefit companies anymore since news is reported at light speed (literally) and people will write what they want when they want. Still, I consider it a sort of gentleman’s agreement, so I’m not writing about it until today.

I also waited a little bit until the “mainstream press” wrote their articles. That lets me point you to the best of them and conserve the space here for my own views, rather than regurgitating the press release and nuts and bolts.

(Update: Here is a very good description of the Synphony flow from Ron Wilson).

Today, Synopsys announced a new product called Synphony High Level Synthesis. You can read about this here. Basically, Synopsys is introducing a high level synthesis (aka behavioral synthesis) product that takes as its input Matlab M-Code and produces RTL Code, a cycle accurate C-model, and a testbench for simulation. Since I have not used the tool, I cannot comment on the capabilities or the quality of results or compare it to other tools on the market. However, I have had some past experience with tools like Matlab (specifically SPW) and Synphony (specifically Behavioral Compiler). So, here are my thoughts, observations, opinions that come to mind.

  1. Synopsys, once the leader in behavioral synthesis, is now the follower - When Synopsys introduced Behavioral Compiler over a decade ago they were the first to preach the gospel of high-level synthesis and all the associated benefits. Architectural optimization. Faster simulation. Bridging the gap between system design and ASIC design. Smaller and easier to understand code. Dogs and cats living together. The promises never fully materialized and Synopsys seemingly moved out of the market. Meanwhile, Mentor introduced Catapult C, Cadence introduced C-to-Silicon, and several others including Forte, Agility, Bluespec, Synfora, ChipVision, and AutoESL introduced their own high-level synthesis tools. Now, after acquiring Synplify DSP through Synplicity, Synopsys is finally re-entering the market (at least for ASIC design) with Synphony. The hunted have become the hunters.
  2. Synphony takes M-code from Matlab as its only source - Whereas most (but not all) other high-level synthesis tools input C like languages, Synopsys has chosen to input M-code only, at least for now. According to Chris Eddington, who is Director of Product Marketing for System-Level Products at Synopsys (according to his LinkedIn profile), approximately 60% of those who say they do “high-level design” are using M-code or some form of C (ANSI C, C++, System-C) for some portion of their design activities. Of those, slightly more use the C variants than M-code, which means that somewhere close to 25% of all ASIC designers could be a possible market for this tool.
  3. Synopsys can try to leverage the Matlab installed base - As mentioned above, Synopsys estimates that 25% of high-level designers could use the Synphony tool which is a pretty big market. By targeting mainly algorithmic design, not control logic, Synopsys can try to serve the Matlab installed base with a more narrowly targeted offering which should make it easier to support. It also allows Synopsys to avoid a bloody battle over C dominance and to pursue a blue ocean strategy with Matlab’s installed base. Interestingly though, there is no partnership with MathWorks implied by this announcement.
  4. Synphony leverages existing IP libraries - Libraries already exist for many common functions that were available for the Synplify DSP tool. The library elements are available as well for Synphony, allowing the designer to specify his functionality using this library or using M-code as the source.
  5. An FPGA tool is being adapted for ASIC - This is probably one of the first times that a tool initially developed for FPGAs (Synplify DSP) is being adapted for ASICs. It’s usually the other way around (e.g. FPGA Compiler grew out of Design Compiler). It should be interesting to see if the FPGA tool can “cut-it” in the ASIC world.
  6. Ties to implementation are seemingly tenuous - A tool that can take M-code as its input and produce RTL and C and do all the other things is all fine and good. But for Synphony to become more than an experimentation tool, it has to produce results (speed, area, power) as good or better than hand-coded RTL. However, the ties to the implementation tool (Design Compiler) are not as direct as even Behavioral Compiler’s were a decade ago. It seems that Synphony takes an approach where it pre-compiles and estimates timing for various blocks (kind of like building DesignWare libraries), but it assembles the design outside of Design Compiler without all the associated timing views and engines necessary for true design and timing closure. It’s hard to understand how this can reliably produce results that consistently meet timing, but perhaps there is something that I am not aware of?
  7. Focus on “algorithmic design”, not control - As mentioned above, Synopsys is going after the folks using Matlab. And those designers are developing algorithms, not state machines. In essence, Synphony can focus on the fairly straightforward problem of scheduling mathematical operations to hit throughput and latency goals and not deal with more complex control logic. Much simpler.
  8. Conversion from Floating Point to Fixed Point - Anyone who has designed a filter or any DSP function knows that the devil is in the details, specifically the details of fixed point bit width. One choice of bit width affects downstream choices. You have to decide whether to round or truncate and these decisions can introduce unexpected artifacts into your signal. Synphony converts the floating point Matlab model into a fixed point implementation. Supposedly, it then allows you to easily fiddle with the bit widths to tweak the performance. Some earlier Synopsys products did this (Cossap, System Studio) and it’s a nice feature that can save time. We’ll see how useful it really is over time.
  9. Synphony produces real RTL, as well as C-code and a testbench - One of the drawbacks of Behavioral Compiler was that it never produced a human-readable form of RTL code. This made it hard to simulate and debug the RTL. Synphony supplies readable RTL (or so I am told) as well as cycle-accurate C-code for system simulation and a testbench for block simulation. This should help facilitate full chip simulations for chip integration, since Synphony will probably only be used on blocks, not entire chips.
  10. Couldn’t Synopsys come up with a better reference than Toyon Research Corporation? - No offense to Toyon, but they are hardly a household name. It makes me wonder how many partners Synopsys has engaged in this development and how well tested this flow is. I’m not saying it isn’t well tested, just that Synopsys is making me wonder. Gimme a name I’ve heard of, please.
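The bit-width devil in item 8 is easy to see in a few lines of code. This is a generic illustrative sketch of my own (it has nothing to do with Synphony’s actual algorithm): quantizing the same coefficient with different fractional widths gives different precision, and truncating instead of rounding introduces a systematic downward bias that can show up as artifacts in a filter’s response.

```python
import math

def to_fixed(x, frac_bits, mode="round"):
    """Quantize a float onto a fixed-point grid with `frac_bits`
    fractional bits, returning the reconstructed float value."""
    scale = 1 << frac_bits
    if mode == "round":
        q = int(round(x * scale))    # round to the nearest grid step
    else:
        q = math.floor(x * scale)    # truncate: just drop the low bits
    return q / scale

coeff = 0.3                          # a hypothetical filter coefficient
print(to_fixed(coeff, 8))            # 8 frac bits  -> 0.30078125
print(to_fixed(coeff, 15))           # 15 frac bits -> 0.29998779296875
print(to_fixed(coeff, 8, "trunc"))   # truncated    -> 0.296875 (biased low)
```

Every added fractional bit roughly halves the quantization error, which is exactly the width-versus-area tradeoff a tool in this space has to let you explore.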

Only time will tell if Synphony is truly music to our ears, or if it is just SYNthesis that is PHONY.

harry the ASIC guy

DAC Theme #3 - “Increasing Clouds Over SF Bay”

Sunday, August 16th, 2009

It was easy to spot the big themes at DAC this year. This was the “Year of ESL” (again). The state of the economy and the future of EDA were a constant backdrop. Analog design was finally more than just Cadence Virtuoso. And social media challenged traditional media.

It was harder to spot the themes that were not front and center, that were not spotlighted by the industry beacons, that were not reported by press or bloggers. Still, there were important developments if you looked in the right places and noticed what was changing. At least one of those themes came across to me loud and clear. This was the year that the clouds started forming over EDA.

If you’ve read my blog for a while, you know I’m not talking about the weather or some metaphor for the health of the EDA industry. You know I am talking about cloud computing, which moved from being the crazy idea of deluded bloggers to solidly in the early adopter category. Though this technology is still “left of chasm”, many companies were talking about sticking their toes in the waters of cloud computing and some even had specific plans to jump in. Of note:

  • Univa UD - Offering a “hybrid cloud” approach to combine on premise hardware and public cloud resources. Many view this as the first step into the cloud since it is incremental to existing on premise hardware.
  • Imera Systems - Offering a product called EDA Remote Debug that enables an EDA company to place a debug version of their software on a customer’s site in order to debug a tool issue. This reduces the need to send an AE on site or to have the customer package up a testcase.
  • R Systems - A spinoff from the National Center for Supercomputing Applications (best known for Mosaic and NCSA Telnet), they were wandering the floor pitching their own high performance computing resources (which they steadfastly insisted were “not a cloud”), available remotely or brought to your site to increase your computing capacity.
  • Cadence - One of the first (after PDTi) to have an official Hosted Design Solutions offering, they host their software and your data in a secure datacenter and are looking at the cloud as well for the future.

And then there’s Xuropa.

Before I cover Xuropa, I need to take a brief digression. You see, July 27th was not just the first day of DAC. It was also my first official day working with Xuropa as one of my clients. I’ll be doing social media consulting (blogging, tweeting, other online social community stuff) and also helping their customers get their tools on the Xuropa platform. This is very exciting for me, something I’ll blog about specifically on the Xuropa Blog and also here. In the meantime, under full disclosure, you’ve now been told. You can factor in the appropriate amount of skepticism to what I have to say about cloud computing, hosted design, Software-as-a-Service and Xuropa.

  • Xuropa - Offering to EDA companies and IP providers the ability to create secure online labs in the cloud for current and prospective customers to test drive a tool, do tool training, etc. They also have plans to make the tools available for “real work”.

These companies and technologies are very exciting on their own. Still, the cloud computing market is very new and there is a lot of churn, so it is very difficult to know what will survive or become the standard. Perhaps something not even on this list will emerge.

Even though the technology side is cloudy (pun intended), the factors driving companies to consider using the cloud are very clear. They all seem to come down to one economic requirement: doing more with less. Whenever I speak to people about cloud computing (and I do that a lot), they always seem to “get it” when I speak in terms of doing more with less. Here are some examples:

  • I spoke to an IT person from a large fabless semiconductor company that is looking at cloud computing as a way to access more IT resources with less on premise datacenter hardware.
  • Cadence told me that their Hosted Design Solutions are specifically targeted at smaller companies that want to be able to access a complete EDA design environment (hardware, software, IT resources) without making any long-term commitment to the infrastructure.
  • EDA and IP companies of all sizes are looking to reduce the cost of customer support while providing more immediate and accessible service.
  • EDA and IP companies are looking to go global (e.g. US companies into Europe and Asia) without hiring a full-on sales and support team.
  • Everyone is trying to reduce their travel budgets.

Naysayers point out that we’ve seen this trend before. EDA companies tried to put their tools in datacenters. There were Application Service Providers trying to sell Software-as-a-Service. These attempts failed or the companies moved into other offerings. And so they ask (rightly) “what is different now?”

There is certainly a lot of new technology (as you see above) that helps make this all more secure and convenient than it was in the past. We live in a time of cheap computing and storage and ubiquitous internet access, which makes this all so much more affordable and accessible than before. And huge low-cost commodity hardware datacenters like those at Amazon and Google never existed before now. But just because the technology exists to do all this doesn’t mean it will be done.

What is different is the economic imperative to do more with less. That is why this will happen. If cloud computing did not exist, we’d have to invent it.

harry the ASIC guy

DAC Theme #2 - “Oasys Frappe”

Monday, August 10th, 2009

Sean Murphy has the best one sentence description of DAC that I have ever read:

The emotional ambience at DAC is what you get when you pour the excitement of a high school science fair, the sense of the recurring wheel of life from the movie Groundhog Day, and the auld lang syne of a high school reunion, and hit frappe.

That perfectly describes my visit with Oasys Design Systems at DAC.

Auld Lang Syne

When I joined Synopsys in June of 1992, the company had already gone public, but still felt like a startup. Logic synthesis was going mainstream, challenging schematic entry for market dominance. ASICs (they were actually called gate arrays back then) were heading towards 50K-gate capacity using 0.35 µm technology. And we were aiming to change the world by knocking off Joe Costello’s Cadence as the #1 EDA company.

As I walked through the Oasys booth at DAC, I recognized familiar faces. A former Synopsys sales manager, now a sales consultant for Oasys. A former Synopsys AE, now managing business development for Oasys. And not to be forgotten, Joe Costello, ever the Synopsys nemesis, now an Oasys board member. Even the company’s tag line “the chip synthesis company” is a takeoff on Synopsys’ original tag line “the synthesis company”. It seemed like 1992 all over again … only 17 years later.

Groundhog Day

In the movie Groundhog Day, Bill Murray portrays Phil, a smug, self-centered, yet popular TV reporter who is consigned by the spirits of Groundhog Day to relive Feb 2nd over and over. After many tries, Phil is finally able to live a “perfect day” that pleases the spirits and he is able to move on, as a better person, to Feb 3rd.

As I mentioned in a previous post, I’ve seen this movie before. In the synthesis market, there was Autologic on Groundhog Day #1. Then Ambit on Groundhog Day #2. Then Get2chip on Groundhog Day #3. Compass had a synthesis tool in there somewhere as well. (I’m sure Paul McLellan could tell me when that was.) None of these tools, some of which had significant initial performance advantages, were able to knock off Design Compiler as market leader. This Groundhog Day it’s Oasys’ turn. Will this be the day they finally “get it right”?

Science Fair

A good science fair project is part technology and part showmanship. Oasys had the showmanship covered with a pre-recorded 7-minute rock medley featuring “Bass ‘n’ Vocal Monster” Joe Costello, Sanjiv “Tropic Thunder” Kaul, and Paul “Van Halen” Besouw. Does anyone know if this has been posted on YouTube yet?

On the technology side, I had one main mission at the Oasys booth … to find out enough about the RealTime Designer product to make my own judgment whether it was “too good to be true”. In order to do that, I needed a better explanation of the algorithms working under the hood, which I was able to get from founder Paul van Besouw.

For the demo, Paul ran on a Dell laptop with a 2.2 GHz Core Duo processor, although he claims that only 1 CPU was used. The demo design was a 1.6M instance design based on multiple instantiations of the open source Sparc T1 processor. The target technology was the open source 45nm Nangate library. Parts of the design flow ran in real time as we spoke about the tool, but unfortunately we did not run through the entire chip synthesis on his laptop in the 30 minutes I was there, so I cannot confirm the actual performance of the tool. Bummer.

Paul did describe, though, in some detail, the methods that enable their tool to achieve such fast turnaround time and high capacity. For some context, you need to go back in time to the origins and evolution of logic synthesis.

At 0.35 µm, gate delays were 80%+ of the path delay and the relatively small wire delays could be estimated accurately enough using statistical wire load models. At 0.25 µm, wire delays grew as a percentage of the path delay. The Synopsys Floorplan Manager tool allowed front-end designers to create custom wire load models from an initial floorplan. This helped maintain some accuracy for a while, but eventually it, too, became too inaccurate. At 180 nm and 130 nm, Physical Compiler (now part of IC Compiler) came along to do actual cell placement and estimate wire lengths based on a global route. At 90 nm and 65 nm came DC-Topographic and DC-Graphical, further addressing the issues of wire delay accuracy and layout congestion.

These approaches seem to work well, but certain drawbacks are starting to appear:

  1. Much of the initial logic optimization takes place prior to placement, so the real delays (now heavily dependent on placement) are not available yet.
  2. The capacity is limited because the logic optimization problem scales faster than O(n). Although Synopsys has come out with methods to address the turnaround time issue, such as automatic chip synthesis, these approaches amount to little more than divide and conquer (i.e. budget and compile).
  3. The placement developed by the front-end synthesis tool (e.g. DC-Topographic) is not passed on to the place and route tool. As a result, once you place the design again in the place and route tool, the timing has changed.

According to Paul van Besouw, Oasys decided to take an approach they call “place first”. That is, rather than spend a lot of cycles in logic optimization before even getting to placement, they do an initial placement of the design as soon as possible so they are working with real interconnect delays from the start. Because of this approach, RealTime Designer can get to meaningful optimizations almost immediately in the first stage of optimization.

A second key strategy, according to van Besouw, is the RTL partitioning, which chops the design up into RTL blocks that are floorplanned and placed on the chip. The partitions are fluid, sometimes splitting apart, sometimes merging with other partitions during the optimization process as the design demands. The RTL can be revisited and restructured during the optimization as well. Since the RTL partitions are higher-level than gates, the number of design objects is much smaller, leading to faster runtime with a lower memory footprint, according to van Besouw. Exactly how Oasys does the RTL partitioning and optimization is the “secret sauce”, so don’t expect to hear a lot of detail.

Besides this initial RTL optimization and placement, there are 2 more phases of synthesis in which the design is further optimized and refined to a legal placement. That final placement can be taken into any place and route tool and give you better results than the starting point netlist from another tool, says van Besouw.

In summary, Oasys claims to achieve faster turnaround time and higher capacity by working at a higher level of abstraction (RTL vs. gates). They claim a better starting point for, and better timing correlation with, place and route because they use actual placement from the start and pass that placement on to the place and route tool. And place and route itself runs faster because the better placement converges faster.

What Does Harry Think?

Given the description that I got from Oasys at DAC, I am now convinced that it is “plausible” that Oasys can do what they claim. Although gory detail is still missing, the technical approach described above sounds exactly right, almost obvious when you think about it. Add to that the advantage of starting from scratch with modern coding languages and methods and not being tied to a 20 year old code base, and you can achieve quite a bit of improvement.

However, until I see the actual tool running for myself in a neutral environment on a variety of designs and able to demonstrate faster timing closure through the place and route flow, I remain a skeptic. I’m not saying it is not real, just that I need to see it.

There are several pieces of the solution that were not addressed adequately, in my opinion:

  1. Clock tree synthesis - How can you claim to have a netlist and placement optimized to meet timing until you have a clock tree with its unique slew and skew? CTS is not addressed in this solution. (To be fair, it’s not addressed directly in Design Compiler either.)
  2. A robust interface to the backend - Oasys has no backend tools in-house, which means that the work they have done integrating with 3rd party place and route has been at customer sites, either by them or by the customer. How robust can those flows be unless they bring the tools in-house (and join the respective partner programs)?
  3. Bells and whistles - RealTime Designer can support multi-voltage, but not multi-mode optimization. Support for low power design is not complete. What about UPF? CPF? All of these are important in a real flow and it is not clear what support Oasys has.
  4. Tapeouts - This is probably the key question. For as long as EDA has existed, tapeouts have been the gold standard by which to evaluate a tool and its adoption. When I asked Paul if there are any tapeouts to date, he said “probably”. That seems odd to me. He should know.

However, if Oasys can address these issues, this might actually be the game changer that gets us out of the Groundhog Day rut and onto a new day.

harry the ASIC guy