Archive for August, 2009

Who Ya Gonna Trust?

Thursday, August 27th, 2009

Joe Isuzu - You Have My Word On It

Our summer has been pretty hectic and full of uncertainty, so we put off planning a short vacation until just this past weekend. We usually go up to Big Bear and stay at one place that is dog friendly, has a pool for the kids, and is close to town. We’ve stayed there three times before and have always been very happy.

This past Saturday morning, I Googled the name of the resort to get the web address, and noticed that there was a TripAdvisor listing for the place. So I thought I’d check it out. Much to my surprise, there was a slew of negative reviews. I dug a little further and found that many of these reviews had been posted around the same date (since we had last been there) by people who had rated only this one place and who had very similar complaints. These reviews seemed suspicious, but who knew, maybe some had merit. They could be legit, or they could be posted on behalf of competing resorts to discredit a competitor.

As I surfed a little more, I found comments on some other pages indicating that this sort of negative posting on rating sites had become epidemic for Big Bear. Who knew that the lodging industry in this cozy little town in the mountains was so cutthroat? It’s a good example of a lose-lose strategy. Now I can’t trust any of the ratings!

In the end, we booked a different resort, mostly due to other factors, but admittedly also due in part to the FUD (fear, uncertainty, and doubt) caused by these reviews.

On Sunday morning I came across an article that describes how one PR firm allegedly hires interns “to trawl iTunes and other community forums posing as real users, and has them write positive reviews for their client’s applications.” Now, I knew that this sort of mischief happened, but I thought it was all amateurish behavior on the part of overzealous business owners and their fans. I did not realize it was an actual service one could buy from a PR firm. How brazen!

On the other hand, maybe the article I read was itself secretly sponsored by a competing PR firm in order to discredit the firm decried in the article. Whom to believe? Hmmm…..

Before you say that I am naïve about all this behavior, I’m not. The verification methodology survey I posted back in February was vandalized by VMM and OVM fans. And more than a year ago, someone copied a blog post of mine onto comp.lang.verilog for the sole purpose of posting a personal attack on my credibility in response. I’ve seen this stuff firsthand.

A big part of the problem is anonymity and impunity. When someone uses a fictitious name and email address to post a review like the ones described above, we never know who that person is, and he never suffers any consequences. After all, who is Vacationer287? However, suppose one could only leave a comment using a LinkedIn profile. I bet that would kill 99% of the issues right there.

(Actually, it would probably result in a proliferation of fictitious LinkedIn accounts, but then you could tell pretty well from those accounts that they are fakes since they’d be very bare. To some extent, like metastability, you can never totally get rid of the problem … you can only make it less likely.)

Most websites that accept reviews require registration. Although the hassle of registration deters some legitimate people from leaving legitimate comments, it also deters those with malicious intentions to a great degree. Almost all the online communities in EDA require some sort of registration, the Synopsys blogs being the only ones I can name that do not.

So, who ya gonna trust?

Personally, there are three types of people whom I trust on the internet:

  1. People I already know and trust - These are people whom I know personally. Maybe they are current or former colleagues, customers, suppliers, partners, or friends. I have reason to trust them because I know them.
  2. People I’ve come to trust - These are people whom I have come to know through the internet who have demonstrated over a period of time that they are trustworthy. Maybe it’s a blogger who has proven to be right most of the time. Or whose advice rings true. Or who provides me with valuable information and insight. Hopefully, I am one of those people for you.
  3. People I’ve been told to trust by others I trust - This is where social capital and influence come into play. If someone I trust links to someone else, then I gain trust in that person to whom he is linking. If he’s on his blogroll. If he’s a guest blogger. If he’s written a book that is referred to. Not that everyone that is referenced is automatically trustworthy, but it helps.

If you were to look at my Google Reader and see whom I subscribe to, they pretty much fall into the three categories above. That gives me plenty to read.

Unfortunately, this doesn’t help much with the situation I originally described, because Vacationer287 doesn’t fall into any of these categories. What do you do then? Ask yourself the following:

  1. Did he write anything else under this name, or did he just join to post this one review? If the former, he may be legit (you need to look at what he wrote). If the latter, that’s suspicious.
  2. Did he use a real name? Vandals often hide behind fictitious and non-descript names.
  3. Does it pass the smell test? I can smell bad milk without a lab test and you can too. Does it all make sense or does some of the writeup just seem too good or bad to be true?
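Just for fun, the three questions above can be sketched as a toy scoring function. This is purely illustrative, my own invention: the field names, thresholds, and superlative list are all hypothetical, and no real review site exposes data quite this way.

```python
def suspicion_score(review):
    """Return a 0-3 suspicion score for a review dict; higher = more likely fake."""
    score = 0
    # 1. Single-purpose account: did he join just to post this one review?
    if review["author_review_count"] <= 1:
        score += 1
    # 2. Non-descript handle, e.g. "Vacationer287": digits, no real name.
    name = review["author_name"]
    if any(ch.isdigit() for ch in name) and " " not in name:
        score += 1
    # 3. Smell test: extreme rating plus superlative-heavy text is a red flag.
    superlatives = ("worst", "best", "never", "always", "scam", "perfect")
    text = review["text"].lower()
    if review["rating"] in (1, 5) and sum(w in text for w in superlatives) >= 2:
        score += 1
    return score

review = {
    "author_review_count": 1,
    "author_name": "Vacationer287",
    "rating": 1,
    "text": "Worst resort ever, never again, total scam!",
}
print(suspicion_score(review))  # → 3
```

Of course, no formula replaces judgment; this just makes explicit what the three questions are weighing.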

I don’t know if this post helps you or confuses you more. Probably, it confuses you because now you have to consider why and how you come to trust some people and not others on the internet. That’s good. From reconciling confusion comes understanding.

Trust me, you have my word on it.

harry the ASIC guy

The Road Not Taken

Thursday, August 20th, 2009

Fork in Road

I’d like to offer you the opportunity to help someone out who needs to make a key decision in her life.

As I’ve written about and spoken about recently, the economic woes of the past year have impacted many of my peers and I’m sure yours as well. Especially hard hit seem to be those in the middle of their careers, a group that I count myself a part of. For those of us who have faced or are facing these uncertainties, I think it’s only natural to second guess the key decisions we made in our careers and wonder if we made the right choices. Some may have decided to take a chance on a new opportunity only to have it evaporate. Others may have passed on that opportunity only to see their current “safe” position turn out not to be so safe after all.

It’s with this in mind that I received an email from a young woman at a crossroads in her career, having to make just such a decision, one that she prays she will cherish but fears she will regret. She has the opportunity to move from her current “safe” position of many years to a new opportunity filled with uncertainty. In order to afford her the best possible insight and advice, I’d like to open this up to you (with her permission) since you all collectively have a ton more experience than I will ever have.

As you read her email, you’ll realize that she is facing several smaller decisions as part of this one big decision, namely:

  1. ASIC vs. FPGA
  2. ASIC Design vs. IP development
  3. Existing company that she knows vs. a new company that she has to learn
  4. Comfort zone vs. Temporary Incompetence
  5. Hands-on Technical Work vs. Management
  6. Expert vs. Generalist

Each of these decisions could be the basis for a debate on its own. Feel free to comment on any of these or all of these or on other aspects that you find important. If you can take some time to respond, I think this will not only serve to advise this woman, but will also be a great guide to anyone looking to make a career change.

__________

I am a Lead ASIC Designer with 13 years of experience in front-end ASIC design and have worked on multiple ASICs to date at a company in India. Everything is fine here, just that the work has been getting very repetitive lately. I have an offer from an IP development firm and need to decide soon. The following things come to my mind when I think about the offer:

1. The work would be mostly on FPGAs (no ASICs involved).

2. I won’t work with the Physical Design guys anymore.

3. I may get good exposure on different networking IPs.

4. I am currently leading a sizable design in a big ASIC. Though this position is glamorous and coveted by many, there is nothing new to learn since I have been doing it for the past several years.

I have the following queries:

  1. If I join the new company and start working on FPGAs, will it take away something from me, e.g., my “ASIC Gal” tag?
  2. Will taking up the manageress role and doing project management ‘formally’ be better than working as a Lead Engineer, from a long-term employability perspective? Or will it be detrimental?
  3. Will it be a one-way path with little chance to come back to ASICs without a compromise (after, say, 4-5 years)?
  4. I want to move towards system design/architecture in the future and am thinking that the more IPs I work on, the better it will be for me. Is this assumption correct?
  5. Overall, any other advice as to what I should consider and whether I should take this position.

I would appreciate your reply.

DAC Theme #3 - “Increasing Clouds Over SF Bay”

Sunday, August 16th, 2009

Clouds over San Francisco

It was easy to spot the big themes at DAC this year. This was the “Year of ESL” (again). The state of the economy and the future of EDA was a constant backdrop. Analog design was finally more than just Cadence Virtuoso. And social media challenged traditional media.

It was harder to spot the themes that were not front and center, that were not spotlighted by the industry beacons, that were not reported by press or bloggers. Still, there were important developments if you looked in the right places and noticed what was changing. At least one of those themes came across to me loud and clear. This was the year that the clouds started forming over EDA.

If you’ve read my blog for a while, you know I’m not talking about the weather or some metaphor for the health of the EDA industry. You know I am talking about cloud computing, which has moved from being the crazy idea of deluded bloggers to solidly in the early adopter category. Though this technology is still “left of chasm”, many companies were talking about sticking their toes in the waters of cloud computing, and some even had specific plans to jump in. Of note:

  • Univa UD - Offering a “hybrid cloud” approach to combine on premise hardware and public cloud resources. Many view this as the first step into the cloud since it is incremental to existing on premise hardware.
  • Imera Systems - Offering a product called EDA Remote Debug that enables an EDA company to place a debug version of their software on a customer’s site in order to debug a tool issue. This reduces the need to send an AE on site or to have the customer package up a testcase.
  • R Systems - A spinoff from the National Center for Supercomputing Applications (best known for Telnet and Mosaic), they were wandering the floor pitching their own high performance computing resources (that they steadfastly insisted were “not a cloud”) available remotely or brought to your site to increase your computing capacity.
  • Cadence - One of the first (after PDTi) to have an official Hosted Design Solutions offering, they host their software and your data in a secure datacenter and are looking at the cloud as well for the future.

And then there’s Xuropa.

Before I cover Xuropa, I need to take a brief digression. You see, July 27th was not just the first day of DAC. It was also my first official day working for Xuropa as one of my clients. I’ll be doing social media consulting (blogging, tweeting, other online social community stuff) and also helping their customers get their tools on the Xuropa platform. This is very exciting for me, something I’ll blog about specifically on the Xuropa Blog and also here. In the meantime, under full disclosure, you’ve now been told. You can factor in the appropriate amount of skepticism to what I have to say about cloud computing, hosted design, Software-as-a-Service and Xuropa.

  • Xuropa - Offering to EDA companies and IP providers the ability to create secure online labs in the cloud for current and prospective customers to test drive a tool, do tool training, etc. They also have plans to make the tools available for “real work”.

These companies and technologies are very exciting on their own. Still, the cloud computing market is very new and there is a lot of churn so it is very difficult to know what will survive or become the standard. Perhaps something not even on this list will emerge.

Even though the technology side is cloudy (pun intended), the factors driving companies to consider using the cloud are very clear. They all seem to come down to one economic requirement. Doing more with less. Whenever I speak to people about cloud computing (and I do that a lot) they always seem to “get it” when I speak in terms of doing more with less. Here are some examples:

  • I spoke to an IT person from a large fabless semiconductor company that is looking at cloud computing as a way to access more IT resources with less of an on premise hardware datacenter.
  • Cadence told me that their Hosted Design Solutions are specifically targeted at smaller companies that want to be able to access a complete EDA design environment (hardware, software, IT resources) without making any long-term commitment to the infrastructure.
  • EDA and IP companies of all sizes are looking to reduce the cost of customer support while providing more immediate and accessible service.
  • EDA and IP companies are looking to go global (e.g. US companies into Europe and Asia) without hiring a full on sales and support team.
  • Everyone is trying to reduce their travel budgets.

Naysayers point out that we’ve seen this trend before. EDA companies tried to put their tools in datacenters. There were Application Service Providers trying to sell Software-as-a-Service. These attempts failed or the companies moved into other offerings. And so they ask (rightly) “what is different now?”

There is certainly a lot of new technology (as you see above) that helps to make this all more secure and convenient than it was in the past. We live in a time of cheap computing and storage and ubiquitous internet access, which makes this all so much more affordable and accessible than before. And huge low-cost commodity hardware data centers like those at Amazon and Google never existed before now. But just because all this technology exists so that it can be done, doesn’t mean it will be done.

What is different is the economic imperative to do more with less. That is why this will happen. If cloud computing did not exist, we’d have to invent it.

harry the ASIC guy

DAC Theme #2 - “Oasys Frappe”

Monday, August 10th, 2009

Sean Murphy has the best one sentence description of DAC that I have ever read:

Frappe

The emotional ambience at DAC is what you get when you pour the excitement of a high school science fair, the sense of the recurring wheel of life from the movie Groundhog Day, and the auld lang syne of a high school reunion, and hit frappe.

That perfectly describes my visit with Oasys Design Systems at DAC.

Auld Lang Syne

When I joined Synopsys in June of 1992, the company had already gone public, but it still felt like a startup. Logic synthesis was going mainstream, challenging schematic entry for market dominance. ASICs (they were actually called gate arrays back then) were heading toward 50K-gate capacity using 0.35 µm technology. And we were aiming to change the world by knocking off Joe Costello’s Cadence as the #1 EDA company.

As I walked through the Oasys booth at DAC, I recognized familiar faces. A former Synopsys sales manager, now a sales consultant for Oasys. A former Synopsys AE, now managing business development for Oasys. And not to be forgotten, Joe Costello, ever the Synopsys nemesis, now an Oasys board member. Even the company’s tag line “the chip synthesis company” is a takeoff on Synopsys’ original tag line “the synthesis company”. It seemed like 1992 all over again … only 17 years later.

Groundhog Day

In the movie Groundhog Day, Bill Murray portrays Phil, a smug, self-centered, yet popular TV reporter who is consigned by the spirits of Groundhog Day to relive Feb 2nd over and over. After many tries, Phil is finally able to live a “perfect day” that pleases the spirits and he is able to move on, as a better person, to Feb 3rd.

As I mentioned in a previous post, I’ve seen this movie before. In the synthesis market, there was Autologic on Groundhog Day #1. Then Ambit on Groundhog Day #2. Then Get2chip on Groundhog Day #3. Compass had a synthesis tool in there somewhere as well. (I’m sure Paul McLellan could tell me when that was.) None of these tools, some of which had significant initial performance advantages, were able to knock off Design Compiler as market leader. This Groundhog Day it’s Oasys’ turn. Will this be the day they finally “get it right”?

Science Fair

A good science fair project is part technology and part showmanship. Oasys had the showmanship with a pre-recorded 7-minute rock medley featuring “Bass ‘n’ Vocal Monster” Joe Costello, Sanjiv “Tropic Thunder” Kaul, and Paul “Van Halen” Besouw. Does anyone know if this has been posted on YouTube yet?

On the technology side, I had one main mission at the Oasys booth … to find out enough about the RealTime Designer product to make my own judgment whether it was “too good to be true”. In order to do this, I needed a better explanation of the algorithms working under the hood, which I was able to get from founder Paul van Besouw.

For the demo, Paul ran on a Dell laptop with a 2.2 GHz Core Duo processor, although he claims that only 1 CPU was used. The demo design was a 1.6M instance design based on multiple instantiations of the open source Sparc T1 processor. The target technology was the open source 45nm Nangate library. Parts of the design flow ran in real time as we spoke about the tool, but unfortunately we did not run through the entire chip synthesis on his laptop in the 30 minutes I was there, so I cannot confirm the actual performance of the tool. Bummer.

Paul did describe, though, in some detail, the methods that enable their tool to achieve such fast turnaround time and high capacity. For some context, you need to go back in time to the origins and evolution of logic synthesis.

At 0.35 µm, gate delays were 80%+ of the path delay, and the relatively small wire delays could be estimated accurately enough using statistical wire load models. At 0.25 µm, wire delays grew as a percentage of the path delay. The Synopsys Floorplan Manager tool allowed front-end designers to create custom wire load models from an initial floorplan. This helped maintain some accuracy for a while, but eventually it too became inaccurate. At 180 nm and 130 nm, Physical Compiler (now part of IC Compiler) came along to do actual cell placement and estimate wire lengths based on a global route. At 90 nm and 65 nm came DC-Topographic and DC-Graphical, further addressing the issues of wire delay accuracy and layout congestion.

These approaches seem to work well, but certain drawbacks are starting to appear:

  1. Much of the initial logic optimization takes place prior to placement, so the real delays (now heavily dependent on placement) are not available yet.
  2. The capacity is limited because the logic optimization problem scales faster than O(n). Although Synopsys has come out with methods to address the turnaround-time issue, such as automatic chip synthesis, these approaches amount to little more than divide and conquer (i.e., budget and compile).
  3. The placement developed by the front-end synthesis tool (e.g. DC-Topographic) is not passed on to the place and route tool. As a result, once you place the design again in the place and route tool, the timing has changed.
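To see why divide and conquer helps with the super-linear scaling in drawback #2, here is a back-of-the-envelope toy model. It is entirely my own illustration, not anything from Synopsys or Oasys, and it assumes (hypothetically) that the optimization cost grows as O(n²) in gate count.

```python
def flat_cost(n_gates):
    """Hypothetical cost of optimizing the whole design flat, assuming O(n^2)."""
    return n_gates ** 2

def divide_and_conquer_cost(n_gates, n_blocks):
    """Cost of budgeting the design into n_blocks pieces and compiling each
    separately: k blocks of n/k gates cost k * (n/k)^2 = n^2 / k."""
    block_size = n_gates / n_blocks
    return n_blocks * block_size ** 2

n = 1_000_000  # a million-gate design
print(flat_cost(n) / divide_and_conquer_cost(n, 10))  # → 10.0
```

The speedup is real, but it comes from never optimizing across block boundaries, which is exactly why budget-and-compile leaves quality of results on the table.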

According to Paul van Besouw, Oasys decided to take an approach they call “place first”. That is, rather than spend a lot of cycles in logic optimization before even getting to placement, they do an initial placement of the design as soon as possible so they are working with real interconnect delays from the start. Because of this approach, RealTime Designer can get to meaningful optimizations almost immediately in the first stage of optimization.

A second key strategy according to van Besouw is the RTL partitioning, which chops the design up into RTL blocks that are floorplanned and placed on the chip. The partitions are fluid, sometimes splitting apart, sometimes merging with other partitions during the optimization process as the design demands. The RTL can be revisited and restructured during the optimization as well. Since the RTL partitions are at a higher level than gates, the number of design objects is much smaller, leading to faster runtime with a lower memory footprint, according to van Besouw. Exactly how Oasys does the RTL partitioning and optimization is the “secret sauce”, so don’t expect to hear a lot of detail.

Besides this initial RTL optimization and placement, there are two more phases of synthesis in which the design is further optimized and refined to a legal placement. That final placement can be taken into any place and route tool and gives you a better starting point than a netlist from another tool, says van Besouw.

In summary, Oasys claims that they achieve faster turnaround time and higher capacity by using a higher level of abstraction (RTL vs. gate). They claim that they can achieve a better starting point for and timing correlation with place and route because they use actual placement from the start and feed that placement on to the place and route tool. And the better placement also runs faster because it converges faster.
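To make the “place first” intuition concrete, here is a toy model, entirely my own construction and not Oasys’ actual algorithm: with a statistical wire load model, every net gets the same estimated wire delay, so the optimizer can chase the wrong path; with placement-derived delays available from the start, it works on the true critical path.

```python
# Two timing paths: total delay = gate delay + wire delay. The statistical
# model assumes an average wire delay everywhere; placement reveals reality.
paths = {
    "path_a": {"gate": 5.0, "wire_estimated": 2.0, "wire_placed": 2.0},
    "path_b": {"gate": 4.0, "wire_estimated": 2.0, "wire_placed": 6.0},
}

def critical_path(paths, wire_key):
    """Return the path name with the largest gate + wire delay."""
    return max(paths, key=lambda p: paths[p]["gate"] + paths[p][wire_key])

# Pre-placement, the optimizer would attack path_a (7.0 vs 6.0) ...
print(critical_path(paths, "wire_estimated"))  # → path_a
# ... but after placement, path_b is the real problem (10.0 vs 7.0).
print(critical_path(paths, "wire_placed"))     # → path_b
```

Numbers here are invented, but the failure mode is the one the traditional flow's drawback #1 describes: optimization effort spent before real delays are known can be effort spent on the wrong paths.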

What Does Harry Think?

Given the description that I got from Oasys at DAC, I am now convinced that it is “plausible” that Oasys can do what they claim. Although gory detail is still missing, the technical approach described above sounds exactly right, almost obvious when you think about it. Add to that the advantage of starting from scratch with modern coding languages and methods, not being tied to a 20-year-old code base, and you can achieve quite a bit of improvement.

However, until I see the actual tool running for myself in a neutral environment on a variety of designs and able to demonstrate faster timing closure through the place and route flow, I remain a skeptic. I’m not saying it is not real, just that I need to see it.

There are several pieces of the solution that were not addressed adequately, in my opinion:

  1. Clock tree synthesis - How can you claim to have a netlist and placement optimized to meet timing until you have a clock tree with its unique slew and skew? CTS is not addressed in this solution. (To be fair, it’s not addressed directly in Design Compiler either.)
  2. A robust interface to the backend - Oasys has no backend tools in-house, which means that the work they have done integrating with 3rd party place and route has been at customer sites, either by them or by the customer. How robust can those flows be unless they have the tools in-house (and join the respective partner programs)?
  3. Bells and whistles - RealTime Designer can support multi-voltage, but not multi-mode optimization. Support for low power design is not complete. What about UPF? CPF? All of these are important in a real flow, and it is not clear what support Oasys has.
  4. Tapeouts - This is probably the key question. For as long as EDA has existed, tapeouts have been the gold standards by which to evaluate a tool and its adoption. When I asked Paul if there are any tapeouts to date, he said “probably”. That seems odd to me. He should know.

However, if Oasys can address these issues, this might actually be the game changer that gets us out of the Groundhog Day rut and onto a new day.

harry the ASIC guy

DAC Theme #1 - “The Rise of the EDA Bloggers”

Sunday, August 2nd, 2009

Harry Gries at Conversation Central

(Photo courtesy of J.L. Gray)

Last year, at the Design Automation Conference, there were only a couple dozen individuals who would have merited the title of EDA blogger. Of those, perhaps a dozen or so wrote regularly and had any appreciable audience. In order to nurture this fledgling group, JL Gray (with the help of John Ford, Sean Murphy, and yours truly) scrounged a free room after-hours in the back corner of the Anaheim Convention Center in which to hold the first ever EDA Bloggers Birds-of-a-Feather session. At this event, attended by both bloggers and traditional journalists, as John Ford put it, us bloggers got our collective butts sniffed by the top dog journalists.

My, how things have changed in just one year.

This year at DAC, us EDA bloggers (numbering 233 according to Sean Murphy) and other new media practitioners took center stage:

  • Bloggers were literally on stage at the Denali party as part of an EDA’s Next Top Blogger competition.
  • Bloggers were literally center stage at the exhibits, in the centrally located Synopsys booth, engaging in lively conversation regarding new media.
  • Atrenta held a Blogfest.
  • There was a Pavilion Panel dedicated to tweeting and blogging.
  • And most conspicuously, there was the 14-foot Twitter Tower streaming DAC related tweets.

Meanwhile, the traditional journalists who were still covering DAC seemed to fall into 2 camps. There were those who embraced the bloggers as part of the media and those that didn’t. Those that did, like Brian Fuller, could be found in many of the sessions and venues I mentioned above. Those that did not could be found somewhere down the hall between the North and South halls of Moscone in their own back corner room. I know this because I was given access to the press room this year and I did indeed find that room to be very valuable … I was able to print out my boarding pass on their printer.

Here’s my recap of the new media events:

I had mixed feelings regarding the Denali Top Blogger competition as I know others did as well. JL, Karen, and I all felt it was kind of silly, parading like beauty queens to be judged. Especially since blogging is such a collaborative, rather than competitive, medium. So often we reference and riff off of each other’s blog posts. Still, I think it was good recognition and publicity for blogging in EDA and one could not argue with the legitimacy of the blogger representatives, all first-hand experts in the areas that they cover. Oh, by the way, congratulations to Karen Bartleson for winning the award.

Conversation Central, hosted by Synopsys, was my highlight of DAC. It was a little hard to find (they should have had a sign), located in a little frosted glass room on the left front corner of the Synopsys booth. But if you could find your way there, it was well worth the search. I’m a little biased, since I hosted conversations there Monday - Wednesday on “Job Search: How Social Media Can Help Job Seekers & Employers”. The sessions were a combination of specific advice and lively discussions and debates. I was fortunate to have a recruiter show up one day and a hiring manager another day to add their unique perspectives. I think that was the real power of this very intimate, kitchen-table-style format. Everybody felt allowed, even encouraged, to participate and add their views to the discussions. This is very different from a formal presentation or even a panel discussion.

Unfortunately, I was not able to clone myself in order to attend all the sessions there, many of which I heard about afterwards from others or in online writeups. I did attend the session by Ron Ploof entitled “Objectivity is Overrated: Corporate Bloggers Aren’t Journalists, & Why They Shouldn’t Even Try”. Interestingly enough, no journalists showed up to the session. Still, it was a lively discussion, the key point being that bloggers don’t just talk the talk, they walk the walk, and therefore bring to the table a deeper understanding and experience with EDA and design than a journalist, even one that was previously a designer.

I also attended Rick Jamison’s session on “Competitors in Cyberspace: Why Be Friends?” which attracted several Cadence folks (Joe Hupcey, Adam Sherer, Bob Dwyer) and some Mentor folks. Although competitors for their respective companies, there was a sense of fraternity, and a lot of the discussion concerned what is “fair play” with regards to blog posting and commenting. The consensus was that advocacy was acceptable and even expected from the partisans, as long as it could be backed up by fact and kept within the bounds of decorum (i.e. no personal attacks). EDA corporate bloggers have been very fair in this regard, in contrast to some rather vitriolic “discussions” in other industries.

The Atrenta Blogfest sounded very interesting and I was very disappointed that I could not attend because it conflicted with my Conversation Central discussion. Mike Demler has a brief summary on his blog as does Daniel Nenni on his blog.

Late Wednesday, Michael Sanie hosted a DAC Pavilion Panel entitled “Tweet, Blog or News: How Do I Stay Current?” Panelists Ron Wilson (Practical Chip Design in EDN), John Busco (John’s Semi-Blog) and Sean Murphy (his blog) shared insights into the ways they use social media to stay current with events in the industry, avoid information overload, and separate fact from fiction. Ron Wilson commented that social networks are taking the place of the socialization that engineers used to get by attending conferences and the shared experience of reading the same traditional media news. John Busco, the recognized first EDA blogger, shared how he keeps his private life and his job at NVIDIA separate from his blogging life. And Sean Murphy gave perspective on how blogging has grown within EDA and will continue to grow, to his projection of 500 EDA bloggers in 2011.

Last, but not least, there was the Twitter Tower, located next to the Synopsys booth. Previous conferences, such as DVCon, attempted to use hashtags (#DVCon) to aggregate conference-related tweets. The success was limited, attracting perhaps a few dozen tweets at most. This time, Karen Bartleson had a better idea. Appeal to people’s vanity. The Twitter Tower displayed a realtime snapshot of all tweets containing “#46DAC“, the hashtag designated for the 46th DAC. If one stood in front of the tower and tweeted with this hashtag, the tweet would show up within seconds on the tower. How cool is that? Sure it was a little gimmicky, but it made everyone who passed by aware of this new standard. As I write this, there have been over 1500 tweets using the #46DAC hashtag.

If you want to read more, Sean Murphy has done the not-so-glamorous but oh-so-valuable legwork of compiling a pretty comprehensive roundup of the DAC coverage by bloggers and traditional press. (Thanks Sean!)

harry the ASIC guy