Archive for the ‘Intellectual Property’ Category

The Burning Platform

Monday, March 1st, 2010

Although I was unable to attend DVCon last week, and I missed Jim Hogan and Paul McLellan presenting “So you want to start an EDA Company? Here’s how”, I was at least able to sit in on an interesting webinar offered by RTM Consulting entitled “Achieving Breakthrough Customer Satisfaction through Project Excellence.”

As you may recall, I wrote a previous blog post about a Consulting Soft Skills training curriculum developed by RTM in conjunction with Mentor Graphics for their consulting organization. Since that time, I’ve spoken on and off with RTM CEO Randy Mysliviec. During a recent conversation he made me aware of this webinar and offered one of the slots for me to attend. I figured it would be a good refresher, at a minimum, and if I came out of it with at least one new nugget or perspective, I was ahead of the game. So I accepted.

I decided to “live tweet” the webinar. That is to say, I posted tweets of anything interesting that I heard, all using the hash tag #RTMConsulting. If you want to view the tweets from that webinar, go here.

After 15 years in the consulting biz, I had certainly learned a lot, and the webinar was indeed a good refresher on some of the basics of managing customer satisfaction. There was a lot of material for the 2 hours that we had, and with no real breaks it was a very dense session. The only downside is that I wish there had been more time for discussion or questions, but that’s really a minor nit to pick.

I did get a new insight out of the webinar, so I guess I’m ahead of the game. I had never heard of the concept of the “burning platform” before, especially as it applies to projects. The story goes that there was an oil rig in the North Sea that caught fire and was bound to be destroyed. One of the workers had to decide whether to stay on the rig or jump into the freezing waters. The fall might kill him, and he’d face hypothermia within minutes if not rescued, but he decided to jump anyway, since probable death was better than certain death. According to the story, the man survived and was rescued. Happy ending.

The instructor observed that many projects are like burning platforms, destined for destruction unless radically rethought. Thinking back, I immediately recalled 2 projects I’d been involved with that turned out to be burning platforms.

The first was a situation where a design team was trying to reverse engineer an asynchronously designed processor in order to port it to another process. The motivation was that the processor (I think it was an ADSP 21 something or other) was being retired by the manufacturer and this company wanted to continue using it nonetheless. We were called in when the project was already in trouble, significantly over budget and schedule and with no clear end in sight. After a few weeks of looking at the situation, we decided that there was no way they would ever be able to verify the timing and functionality of the ported design. We recommended that they kill this approach and start over with a standard processor core that could do the job. There was a lot of resistance, especially from the engineer whose idea it was to reverse engineer the existing processor. But eventually, the customer made the right choice and redesigned using an ARM core.

Another group at the same company also had a burning platform. They were on their 4th version of a particular chip and were still finding functional bugs. Each time they developed a test plan and executed it, there were still more bugs that they had missed. Clearly their verification methodology was outdated and insufficient, relying on directed tests and FPGA prototypes rather than more modern, measurable methods. We tried to convince them to use assertions, functional coverage, constrained random testing, etc. But they were convinced that they just had to fix the few known bugs and they’d be OK. From their perspective, it wasn’t worth all the time and effort to develop and execute a new plan. They never did take our recommendations and I lost track of that project. I wonder if they ever finished.
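
To make the contrast concrete, here is a minimal sketch of the idea behind constrained random testing with functional coverage. It’s in Python rather than a hardware verification language, and it’s entirely my own toy example, not the client’s environment: generate random-but-legal stimulus and measure which coverage bins get hit, instead of hand-writing one directed test per case.

```python
import random

# Hypothetical coverage model: a cross of operation type and burst size.
COVERAGE_BINS = {(op, size) for op in ("read", "write") for size in (1, 4, 8)}

def random_transaction():
    """Randomize a transaction within its legal constraints."""
    return {"op": random.choice(["read", "write"]),
            "size": random.choice([1, 4, 8]),
            "addr": random.randrange(0, 2**16, 4)}   # constraint: word-aligned

hit_bins = set()
for _ in range(1000):
    txn = random_transaction()
    hit_bins.add((txn["op"], txn["size"]))   # sample functional coverage
    # drive txn into the DUT here; assertions would check its behavior

print(f"functional coverage: {len(hit_bins)}/{len(COVERAGE_BINS)} bins hit")
```

The point of the measurement is the exit criterion: you stop when the coverage model is satisfied, not when a hand-picked list of tests happens to pass.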

As I think about these 2 examples, I realize that “burning platform” projects have some characteristics in common. And they align with the 3 key elements of a project. To tell if you have a “burning platform” on your hands, you might ask yourself the following 3 questions:

  1. Scope - Are you spending more and more time every week managing issues and risks? Is the list growing, rather than shrinking?
  2. Schedule - Are you on a treadmill with regard to schedule? Do you update the schedule every month only to realize that the end date has moved out by a month, or more?
  3. Resources - Are the people that you respect the most trying to jump off of the project? Are people afraid to join you?

If you answered yes to at least 2 of these, then you probably have a burning platform project on your hands. It’s time to jump in the water. That is, it’s time to scrap the plan, rethink your project from a fresh perspective, and come up with a new one. Of course, this is not a very scientific way of identifying an untenable project, but I think it’s a good rule of thumb.
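
For what it’s worth, the rule of thumb is simple enough to write down. A minimal sketch, where the two-of-three threshold comes straight from the paragraph above and everything else is my own framing:

```python
def is_burning_platform(scope_growing: bool,
                        schedule_slipping: bool,
                        people_fleeing: bool) -> bool:
    """Answer yes to at least 2 of the 3 questions and it's probably
    time to scrap the plan and start over."""
    return sum([scope_growing, schedule_slipping, people_fleeing]) >= 2

# Issue list growing and the end date slipping a month every month:
print(is_burning_platform(True, True, False))   # True -> jump in the water
```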

There are other insights that I took away from the webinar, but I thought I’d share just the one. I don’t know if this particular webinar was recorded, but there are 2 more upcoming that you can attend. If you do, please feel free to live tweet the event like I did, using the #RTMConsulting hash tag.

But please, no “flaming” :-)

harry the ASIC guy

Synopsys’ Digital to Analog Conversion

Tuesday, May 12th, 2009

Last Thursday, the same day that Synopsys announced its acquisition of MIPS’ Analog Business Group (ABG) for $22M in cash, I had a long overdue lunch with a former colleague of mine at Synopsys. We spent most of the time talking about family, how our jobs were going, the economy, and the industry in general.

At some point, the discussion got around to Aart de Geus and his leadership qualities. My friend, who plays bass guitar with Aart on occasion, shared with me his observations of Synopsys’ CEO outside of work. “He’s a born leader, even when he’s playing music,” my friend said as he related one story of how Aart led the band in an improvisational session with the same infectious enthusiasm he brings to Synopsys. Here’s a look.

While driving back from lunch, I recalled a field conference from the mid-1990s where Aart introduced the notion of “Synopsys 2”. Synopsys 2 was to be a new company (figuratively, not literally) that would obsolete Synopsys 1 and take a new leadership role in a transforming industry. At that time, Synopsys 1 was the original “synthesis company” along with some test and simulation tools. The industry challenge driving Synopsys 2 was the need for increased designer productivity to keep up with chip sizes growing under the inexorable and ubiquitous Moore’s Law.

Aart’s vision for this new EDA order was twofold. First, behavioral synthesis would allow designers to work at a higher, more efficient level of abstraction, thereby increasing their productivity. In fact, yours truly helped develop and deliver the very first DAC floor demo of Behavioral Compiler. I also developed a very simple but elegant presentation of the power of behavioral synthesis that was used throughout Synopsys, garnered the praise of Aart himself, and sits in my desk as a memento of my time at Synopsys. Unfortunately, behavioral synthesis never really caught on at the time. Oh well. So much for that.

The second part of Aart’s productivity vision was design reuse. Needless to say, that vision has come true in spades. I don’t have reliable numbers at my fingertips, but I would guess that there is hardly a chip designed without some sort of implementation or verification IP reuse. Some chips are almost entirely reusable IP, with the only custom logic being the glue that stitches it all together. I can’t imagine designing 100M gate chips without design reuse.

Design teams looking for digital IP were faced with a straightforward make vs. buy decision. On the one hand, most design teams could design the IP themselves given enough time and money. They could even prototype and verify the IP on an FPGA to make sure it would work. But could they do it faster and cheaper than buying the IP, and could they do it with a higher level of quality? The design teams that decided they could do a better, faster, cheaper job themselves did so. The others bought the IP.

But analog and mixed signal IP is very different. Whereas most design teams have the skills and ability to design digital IP, they usually do not have the expertise to design complex analog and mixed signal IP. Not only are analog designers more scarce, but the problem keeps getting harder at smaller geometries. Ask any analog designer you know how hard it is to design a PLL at 65 nm or 45 nm. What were 4 corner simulations at 90 nm become 16 corner or even Monte Carlo simulations at 45 nm and below. Not only is analog design difficult, but it often requires access to foundry specific information only available to close partners of the foundries. And even if you can get the info and design the IP, there is no quick FPGA prototype to prove it out. You need to fab a test chip (which takes several months), complete with digital noise sources to stress the IP in its eventual environment. The test chip can cost several million dollars (much more than an FPGA prototype for digital IP) and you’d better count on at least one respin to get it right.
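
The corner arithmetic alone tells the story. Here is a back-of-the-envelope sketch where the parameter names are my own illustration, not any foundry’s actual corner list: each independent process or operating parameter with two extremes doubles the number of corner simulations, so 2 parameters give 4 corners and 4 give 16.

```python
from itertools import product

# Hypothetical corner variables; a real PDK defines its own.
params_90nm = {"nmos": ("slow", "fast"), "pmos": ("slow", "fast")}
params_45nm = {**params_90nm,
               "voltage": ("low", "high"),
               "temperature": ("cold", "hot")}

def corners(params):
    """Enumerate every combination of parameter extremes."""
    names = list(params)
    return [dict(zip(names, combo)) for combo in product(*params.values())]

print(len(corners(params_90nm)))   # 4 corner simulations
print(len(corners(params_45nm)))   # 16 corner simulations
```

And once the variations stop being well-behaved enough to bound with extremes, you’re into Monte Carlo territory, sampling the parameter space instead of enumerating it.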

That is why Synopsys’ acquisition of the MIPS ABG IP is such a good move. The “value proposition” for analog IP is so much greater than for digital IP. It’s not a matter of whether the customer can design the IP faster, better, cheaper; it’s whether they can design it at all. By expanding its analog IP portfolio at a bargain price, Synopsys is well positioned to provide much of the analog and mixed signal IP at 65 nm and below. In addition, this acquisition gives Synopsys a real analog design team with which it can perform design services, something it has coveted but lacked for some time.

Once again, it looks like Aart is taking the leadership role. Look for other companies to follow the leader.

harry the ASIC guy

EDA Is Only “Mostly Dead”

Wednesday, March 4th, 2009

Last Wednesday at DVCon, Peggy Aycinena MC’ed what used to be known as the Troublemakers Panel, a role formerly filled by John Cooley. The topic: “EDA: Dead or Alive?” Having attended Aart’s keynote address immediately preceding it, as well as Peggy’s panel discussion, I can answer that question in the immortal words of Miracle Max: “EDA is only MOSTLY dead.” But first, some background.

Back in the mid 90s, I attended a Synopsys field conference where Aart delivered a keynote addressing the challenges of achieving higher and higher productivity in the face of increasing chip size. The solution, he predicted, would be design reuse in the form of intellectual property. Although most of us had only the faintest idea of what design reuse entailed and could barely fathom such a future, Aart’s prediction has indeed come true. Today, there is hardly a chip designed without some form of soft or hard IP and many chips are predominantly IP.

Some years later, he delivered a similar keynote preaching the coming future of embedded software. This was before the term SoC was coined to designate a chip with embedded processors running embedded software. Again, only a handful understood or could fathom this future, but Aart was correct again.

So, this year, immediately preceding Peggy’s Panel, Aart delivered another very entertaining and predictive keynote. After describing the current economic crisis in engineering terms using amplifiers and feedback loops, he moved to the real meat of the presentation which addressed the growing amount of software content in today’s SoCs. He described how project schedules are often paced by embedded software development and validation. How products are increasingly differentiated based on software, not hardware. And he predicted a day when chips would only have custom hardware to implement functions that could not be performed with programmable software. In essence, he described a future with little electronic design as we know it today, where hardware designers are largely replaced by programmers.

Immediately following Aart’s keynote was Peggy’s panel. (If you want to know exactly what occurred, there is no place better to go than Mike Demler’s blow-by-blow account.) Peggy did her best to challenge the EDA execs to defend why EDA would not die out. She kept coming back to that same question in different ways, and the execs kept avoiding answering it directly, choosing instead to offer philosophical logic such as: “If EDA is dead, then semiconductors are dead. If semiconductors are dead, then electronics are dead. And since electronics will never die, EDA will never die.”

On the surface, logic such as this is certainly comforting. After all, who can imagine a future without electronics? Upon closer inspection, however, and in light of Aart’s keynote, there is plenty of reason for skepticism.

Just as Aart was right about design reuse and IP…

Just as Aart was right about embedded software …

I believe that Aart is right about hardware design being replaced by software development.

As processors and co-processors become faster and more capable of handling tasks formerly delegated to hardware…

As time-to-market drives companies to sell products that can be upgraded or fixed later via software patches…

As fewer and fewer companies can afford the cost of chip design at 32nm and below…

More companies will move capabilities to software running on standard chips.

With that, what becomes of the current EDA industry? Will it adapt to embrace software as part of its charter? Or will it continue to focus on chip development?

Personally, I think Aart is right again. Hardware will increasingly become software. And an EDA industry focused on hardware will be increasingly “mostly dead.”

harry the ASIC guy

Big DAC Attack

Tuesday, May 20th, 2008

OK … I’m registered to go to DAC for at least one day, maybe two. I’ll definitely be there on Tuesday and probably Wednesday evening for a Blogging “Birds-of-a-Feather” session that JL Gray is setting up. Besides hitting the forums and other activities, I’ll have about half a day to attack the exhibit floor or the “suites” to look at some new technology. If you want to meet up, drop me an email and we can arrange something.

Cadence won’t be there and I already talk to Synopsys and Mentor on a regular basis, so I’m planning on focusing on smaller companies with new technology. Here’s what’s on my list so far…

Nusym - They have some new “Path Tracing” technology that finds correlations between a constrained random testbench and hard-to-hit functional coverage points. With this knowledge, they claim to be able to modify the constraints to guide the simulation to hit those coverage points. The main benefit is in closing the last few percent of functional coverage, which can be difficult with unguided constrained random patterns.
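
Nusym hasn’t published how the path tracing works, but the feedback loop it feeds is easy to caricature. Here is a toy sketch of the concept only, not their actual algorithm; the packet-size constraint and the coverage bin are invented for illustration: measure which bins stay cold under plain constrained random stimulus, then re-weight the constraint values that correlate with those bins.

```python
import random
from collections import Counter

def packet_size(jumbo_weight=1):
    """Pick a legal size in [64, 1500]; jumbo_weight biases toward 1500."""
    sizes = range(64, 1501)
    weights = [jumbo_weight if s == 1500 else 1 for s in sizes]
    return random.choices(sizes, weights)[0]

def run_sims(hits, n=5000, jumbo_weight=1):
    for _ in range(n):
        hits["jumbo" if packet_size(jumbo_weight) == 1500 else "normal"] += 1

plain = Counter()
run_sims(plain)                       # "jumbo" is a hard-to-hit coverage bin
boost = 1000 if plain["jumbo"] < 50 else 1   # feedback: boost the stimulus
guided = Counter()                           # value correlated with the cold
run_sims(guided, jumbo_weight=boost)         # bin (finding that correlation
print(plain["jumbo"], guided["jumbo"])       # automatically is their claim)
```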

Chip Estimate - Having been around for a few years, and recently acquired by Cadence, they are basically a portal where you can access 3rd party IP and use the information to do a rough chip floorplan. This allows you to estimate area, power, yield, etc. I’m really curious about their business model and why Cadence bought them. At a minimum, it should be entertaining to see the hyper-competitive IP vendors present back-to-back at half-hour intervals on the DAC floor.
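
Out of curiosity, here’s what that kind of estimate boils down to. This is a toy model with invented numbers, not Chip Estimate’s actual methodology: sum the published block areas, add your custom logic, and pad for floorplan utilization.

```python
# Hypothetical IP block areas (mm^2) as a portal might publish them.
ip_blocks_mm2 = {"cpu_core": 2.1, "usb_ctrl": 0.4, "ddr_phy": 3.5}
custom_logic_mm2 = 1.8     # your own glue logic, estimated separately
utilization = 0.70         # routing and whitespace overhead

die_area = (sum(ip_blocks_mm2.values()) + custom_logic_mm2) / utilization
print(f"estimated die area: {die_area:.1f} mm^2")   # ~11.1 mm^2
```

Power and yield presumably roll up the same way, per-block numbers plus a fudge factor, which is why the quality of the 3rd party data matters so much.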

I have a few others on my list, but there are so many small companies that it’s hard to go through them all and decide what to see. That’s where I need your help.

What would you recommend seeing and why?

Is IP a 4-letter Word ???

Friday, May 9th, 2008

As I’ve been thinking a lot about Intellectual Property (IP) lately, I recently recalled a consulting project that I had led several years ago … I think it was 2002. The client was designing a processor chip that had a PowerPC core and several peripherals. The core and some of the peripherals were purchased IP and our job was to help with the verification and synthesis of the chip.

Shaun was responsible for the verification. As he started to verify one of the interfaces, he began to uncover bugs in the associated peripheral, which was purchased IP. We contacted the IP provider and were assured that it had all been 100% verified and was silicon proven. But we kept finding bugs. Eventually, faced with undeniable proof of the poor quality of their IP, they finally fessed up. It seems the designer responsible for verifying the design had left the company halfway through the project. They never finished the verification. Ugh 1!

Meanwhile, Suzanne was helping with synthesis of the chip, including the PowerPC core. No matter what she did, she kept finding timing issues in the core. Eventually, she dug into the PowerPC core enough to figure out what was going on. Latches! They had used latches in order to meet timing. All well and good, but the timing constraints supplied with the design did not reflect any of that. Ugh 2!

About a week later, I was called to a meeting with Gus, who was the client’s project lead’s boss’s boss. As I walked into his office, he said something that I’ll never forget …

“I’m beginning to believe that IP is a 4-letter word”.

How true. Almost every IP I have ever encountered, be it a complex mixed-signal hard IP block, a synthesizable processor core, or an IO library … they all have issues. How can an industry survive when the majority of its products don’t work? Do you think the HDTV market would be around if more than half the TVs did not work? Or any other market, for that matter? Yet this is tolerated for IP.

That is not to say that some IP providers don’t take quality seriously. Synopsys learned its lesson many years ago when it came out with a PCI core that was a quality disaster. To their credit, they took the failure as a learning opportunity, developed a robust reuse methodology along with Mentor Graphics, and reintroduced a PCI core that is still in use today.

Still … no IP is 100% perfect out-of-the-box. IP providers need to have a relationship and business model with their customers that encourages open sharing of design flaws. This is a two-way street. The IP provider must notify its customers when it finds bugs, and the customer must inform the IP provider when it finds bugs. As an example, Synopsys and many other reputable IP providers will inform customers of any design issue immediately, a transparency that I could only have prayed for from the company providing IP to my client. In return, they need their customers’ support in reporting design issues back to them. Sounds simple, right?

Maybe not. I had another client who discovered during verification that there was a bug in a USB Host Controller IP. They had debugged and corrected the problem already, so I asked the project manager if they had informed the IP provider yet. He refused. The rationale? He wanted his competition to have the buggy design while he had the only fix!

We, as users, play a role too: we have a responsibility to report bugs for the good of everyone using the product. Karen Bartleson tells a similar story about her luggage provider, which encourages customers to send back their broken luggage to help the company improve its designs. The luggage gets better and better as a result.

So, besides reporting bugs and choosing IP carefully, what else can we as designers do to drive IP quality? I have one idea. One day, when I have some free time, I’d like to start an independent organization that would objectively assess and grade IP. We’d take it through all the tools and flows, look at all the views, logical and physical, and come out with an assessment. This type of open grading system would encourage vendors to improve their IP and would allow us to make more informed choices rather than playing Russian roulette.
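
To make the idea concrete, here is a sketch of how such a grading might be scored. The categories, weights, and letter-grade cutoffs are entirely my own invention, purely illustrative:

```python
# Hypothetical assessment categories and weights for grading an IP block.
WEIGHTS = {"verification": 0.30, "timing_constraints": 0.20,
           "physical_views": 0.20, "documentation": 0.15,
           "tool_flow_compat": 0.15}

def grade_ip(scores):
    """scores: category -> 0..10 rating from the independent reviewers."""
    total = sum(WEIGHTS[cat] * scores[cat] for cat in WEIGHTS)
    return "ABCDF"[min(4, int((10 - total) // 2))]  # A > 8, B > 6, ... else F

print(grade_ip({"verification": 9, "timing_constraints": 7,
                "physical_views": 8, "documentation": 6,
                "tool_flow_compat": 8}))   # weighted total 7.8 -> "B"
```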

I’m half inclined to start one today … anybody with me?

harry the ASIC guy

OSU - Open Source University

Monday, April 7th, 2008

Below is a video presentation that was given in 2006 by Rice University engineering professor Richard Baraniuk at the TED conference in Monterey, CA. Professor Baraniuk is the founder of Connexions, a free, open-source, global clearinghouse of course materials that allows teachers to quickly “create, rip, mix and burn” coursework — without fear of copyright violations. Think of it as Napster for education. I think this is well worth 18 minutes of our time, since we all have an interest in education for ourselves, our children, and our peers. When you’re done, a challenge.

[Embedded video: Richard Baraniuk’s 2006 TED talk on Connexions]

Now for the challenge … I’d like you to consider what this approach would do for the ASIC Intellectual Property industry if we could all collaborate to create, rip, mix, and burn design IP under user-friendly legal terms such as Creative Commons.

What do you think?

harry the ASIC guy