Archive for the ‘New Media / New Tech’ Category

Dear H. Gries

Thursday, March 17th, 2011

Below is the response I received 2 days after my original email to Verizon. As you can see, no change on my end at this point. I’m not too happy, but what do you think?

Verizon

Dear H. Gries,

Thank you for choosing Verizon. I have received your email dated 3/14/11 regarding your request to handle your concerns over a DSL technical issue that you were trying to report when an order was placed to remove your DSL and add Fios to your home. My name is Janine, and I will be happy to assist you.

We apologize for the delay in our response and regret any inconvenience to you.

I understand how important it is to be treated with respect and handle your concern efficiently.

We always welcome feedback from our customers and we appreciate your comments. We apologize for any difficulties you have experienced.

We constantly review our processes and procedures to determine where we can improve upon the Verizon customer experience. Customer feedback is vital to our business. Thank you for taking the time to offer your comments.

I am researching your online issue immediately. I have contacted our DSL escalation party to see if she can run your service back in immediately. Once I hear back from her, I will contact you back with her answer.

Although additional follow-up is needed, it has been my goal today to address your concerns related to the problems you have experienced. I hope I have succeeded in meeting that goal. In the meantime, if you have any other questions, please let us know. We look forward to serving you.

Thank you for using Verizon. We appreciate your business.

Sincerely,
Janine
Verizon eCenter

*****Simplify your life. Cut the clutter and help the environment with paperless billing!*****

Enroll today at: http://www.verizon.com/gogreen

Original Message Excluded:
————————-

Altium Looking to Gain Altitude in the Cloud

Sunday, January 30th, 2011

(Image: Altium Enterprise Vault System)

Over the holiday break, I came across an interview with Altium CIO Alan Perkins that caught my eye. Sramana Mitra has been focusing on interesting cloud-based businesses, and this interview looked at how this EDA company was planning to move into the cloud. I wasn't able to talk to Alan Perkins directly, but I was able to find out more through their folks in the US (the company is based in Australia). It was interesting enough to warrant a post.

I knew very little about Altium before seeing this interview, and maybe you don't know much about them either, so here is a little background. Based in Australia, Altium is a small (~$50M) EDA company focused primarily on the design of printed circuit boards with FPGAs and embedded software. They were formed from a company called Protel about 10 years ago and most recently gained attention when they acquired Morfik, a company that offers an IDE for developing web apps (more on that later). According to some data I saw and from what they told me, they added 1,700 new customers (companies, not seats) in 2010 just in the US! So, they may be the best kept secret in a long while. (Ironically, the next day at work after I spoke to Altium, I spoke to someone at another company that was using Altium to design a PC board for us.)

According to Altium, their big differentiator is that they have a database-centric offering, as compared to tool-flow-centric offerings like Cadence's OrCAD and Allegro or Mentor's Board Station and Expedition and related tools. I'm not an EDA developer, so I won't pretend to understand the nuances of one versus the other. However, when I think of "database-centric", I think of "frameworks". I know it's been almost 20 years since those days, and things have changed, so maybe database-centric makes a lot of sense now. OpenAccess is certainly a good thing for the industry, but that is because it's an "open standard" while Altium's database is not. Anyway, enough on this matter because, as I said, I'm not an EDA developer and don't want to get in too deep here.

A few years ago, I wrote a blog post entitled "Is IP a 4-Letter Word?". The main thrust of that post was that IP quality is rather poor in general and there needs to be some sort of centralized authority to grade IP quality and to certify its use. So, when Altium told me they plan to enable a marketplace for design IP by creating "design vaults" in the cloud, my first question was "who is going to make sure this IP is any good?" Is this going to be the iPhone app model, where Apple vets and approves every app? Or is it going to be the Android model: caveat emptor?

To Altium's credit, they have similar concerns, which is why they are planning to move slowly. With the introduction of Altium Designer 10, Altium will first provide its own vetted IP in the cloud. In the past, this IP was distributed to tool users on their site, but having it in the cloud will make it easier to distribute (pull, instead of push) and will also allow for asynchronous releases and updates. The tools will automatically detect if you are using an IP that has been revved, and ask you if you want to download the new version.
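
Just to make that pull model concrete, here is a minimal sketch of what such a version check boils down to. To be clear, this is my own illustration in Python, with made-up IP names, version numbers, and manifest format; it is not Altium's actual interface.

    # Hypothetical sketch of a pull-style IP update check -- not Altium's real API.
    # The tool compares the IP versions used in a design against the versions
    # published in a cloud "vault" manifest and offers to download anything newer.

    vault_manifest = {          # IP name -> latest version published in the vault
        "spi_controller": "2.3",
        "i2c_master":     "1.7",
    }

    design_ip = {               # IP name -> version currently used in the design
        "spi_controller": "2.1",
        "i2c_master":     "1.7",
    }

    def outdated_ip(design, vault):
        """Return the IP blocks whose vault version is newer than the design's copy."""
        def as_tuple(version):
            return tuple(int(part) for part in version.split("."))
        return [name for name, version in design.items()
                if name in vault and as_tuple(vault[name]) > as_tuple(version)]

    for name in outdated_ip(design_ip, vault_manifest):
        print(f"{name}: version {vault_manifest[name]} is available -- download it? [y/n]")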

Once they have this model understood, Altium then plans to open the model up to 3rd-party IP, which could be offered for free, or licensed, or maybe even traded for credits (like Linden dollars in Second Life). It's an interesting idea which requires some pretty significant shifts in personal and corporate cultures. I think that sharing of small "jelly bean" type IP is achievable because none of it is very differentiated. But once you get to IP that required significant time to design, why share it unless IP is your primary business? The semiconductor industry is still fiercely competitive and I think that will be a significant barrier. Not to mention that it takes something like 4x-5x as much effort to create an IP that is easily reusable as compared to creating it to be used just once.

Being a tool for the design of FPGAs is an advantage for Altium, since the cost of repairing an FPGA bug is so much less than for an SoC or ASIC. For FPGAs, the rewards may be greater than the risks, especially for companies that are doing ASICs for the first time. And this is the market that Altium is aiming for … the thousands of companies that will have to design their products to work on the internet-of-things. Companies that design toasters that have never had any digital electronics and now have to throw something together. They will be the ones that will want to reuse these designs because they don't have the ability to design them in-house.

Which brings us to Morfik, the company Altium acquired that makes an IDE for web apps. It's those same companies designing internet-enabled toasters that will also need to build a web app for their customers to access the toaster. So if Altium sells the web app and the IP that lets the toaster talk to the web app, then Altium provides significant value to the toaster company. That's the plan.

Still, the cloud aspect is what interests me the most. Even if designers are reluctant to enter this market, the idea of having this type of central repository is best enabled by the cloud. The cloud can enable collaboration and sharing much better than any hosted environment. And it can scale as large and as quickly as needed. It allows a safe sort of DMZ where IP can be evaluated by a customer while still protecting the IP from theft.

This is not by any means a new idea either. OpenCores has been around for more than a decade offering a repository for designers to share and access free IP. I spoke with them a few years ago and at the time the site was used mainly by universities and smaller companies, but their OpenRISC processor has seen some good usage, so it’s a model that can work.

I’m anxious to see what happens over time with this concept. Eventually, I think this sort of sharing will have to happen and it will be interesting to see how this evolves.

harry the ASIC guy

It Shrinks?

Sunday, January 2nd, 2011

As we enter a new year, it is comforting to know that we all are just a little bit dumber than we were last year.

According to an article in Discover Magazine, human brains have shrunk approximately 10% since Cro-Magnon man walked the earth 20,000 years ago. Although there is no certain relationship between brain size and intelligence, this still seems to be rather alarming and goes against what we all grew up believing. After all, don’t all those aliens have small bodies and big heads?

There are, of course, theories to explain this shrinkage.

One theory is that our brains have become more efficient and hence can do the same or better job with less mass. That’s the theory I’d like to believe.

Another theory, described quite well in this clip from the movie Idiocracy, is that intelligence is no longer an asset for survival and procreation, and may even be a liability. That's the theory I fear is true whenever I channel surf.

An interesting observation made by one anthropologist is that a smaller brain seems to be a way of naturally selecting against aggression and for tolerance and collaboration. Whereas early man had to be self-reliant, independent, and aggressive toward his fellow man to survive, modern man benefits from the community, which requires him to be tolerant and to collaborate.

By that reasoning, social media and social networking, which require a large amount of collaboration, are just the next stage in the evolution of the species.

So, as we start 2011, I’d like to propose a toast to us small-minded folks in social media. Smaller is better :)

harry the ASIC guy

EDA: The Next Big Things

Sunday, October 10th, 2010

As most of you know, I’ve been a big advocate for using technology to do more and more online. As an example, back in April, when the volcano in Iceland was causing havoc with air travel in Europe, I wrote a post on the Xuropa blog entitled “What’s in Your Volcano Kit?” In that post, I urged EDA companies to develop a kit of online tools to communicate and collaborate with current and prospective customers and the industry in general.

Well, it’s good to know that people are reading my blog and following my advice! ;)

One such tool that has become very popular in the last year is the virtual conference: an event sponsored either by media companies or by the EDA companies themselves, with several sessions throughout the day on a variety of topics. For us designers, they allow us to "drop in" on an event without leaving our desks or investing additional time or cost in traveling to and from the event. Certainly, it is not as rich an experience as being there live, but it's more complete than the standard single-topic, disguised-product-pitch webinar.

Since my advocacy was so fundamental in bringing these events about, I am very excited to be taking part in one of these upcoming virtual conferences. I will be moderating a session entitled “System-on-Chip: Designing Faster and Faster” at the upcoming “EDA Virtual Conference- EDA: The Next Big Things” on October 14. Here is a brief overview of my session, which will include presentations by Synopsys, Sonics, and Magma.

High speed digital design presents three important challenges: creating functional IP that performs well, combining IP blocks quickly to form a system, and being sure the system performs as expected with no surprises. EDA is allowing designers to create, simulate, connect, and deliver SoCs in new and exciting ways by combining and verifying IP blocks faster than ever. Very fast digital IP, with clock speeds as high as 2 GHz, is uncovering new issues that EDA and IP teams are working together to solve.

This session looks at the trends in digital IP, interconnect technology, issues in maintaining signal integrity, on-chip instrumentation, and more ideas to create sophisticated SoC designs and get chips to market quickly. Experts will discuss what they are seeing as clock speeds increase, tools capable of identifying issues, and ways to make sure a high speed SoC functions right the first time.

There are also 4 other 1-hour sessions during the day.

You can register for the event here. I hope you can make it.

harry the ASIC guy

Scott Clark on EDA Clouds

Sunday, August 8th, 2010

Although I had heard his name mentioned quite often, it wasn't until DAC this year that I finally met Scott Clark for the first time. Scott was describing how, as Director of Engineering Infrastructure at Broadcom, he led a project to virtualize Broadcom's internal data center in order to transform it into a private cloud. It was a great discussion. We had lunch a few weeks later to talk about his new business, Deopli, a company he founded to help other semiconductor and EDA companies improve their compute infrastructure operations in similar fashion.

So, when I saw Dan Nenni’s blog post on cloud computing and some of the responses, I thought I’d contact Scott. You see, as opposed to most of those commenting on Dan’s post, Scott has actually taken EDA tools and moved them to the cloud, so he knows what he’s talking about. Scott was kind enough to contribute a blog post on the subject, so please enjoy.

__________

Harry the ASIC Guy pointed me to Dan Nenni’s Silicon Valley Blog to take a look at this post regarding Daniel Suarez’s books Daemon and Freedom. His post intrigued me enough to download the first book to my iPad to get a feel for the style and atmosphere. That was good enough that I plan to read both. You can read Dan’s post to see his overview of the books, but at the end of his post, he poses a question that seemed to spark lots of conversation and varying opinions. His question was “Who can be trusted to secure Darknet (Cloud Computing)?”

I think Dan was making reference to concepts in the book where all data in the world becomes controlled by a finite set of service providers, and therefore creates an exposure based on the singularity of the solution. His references hit pretty close to home with Apple, Microsoft, and Google, but that did not seem to be the focus of the responses. Because Dan's background (and blog) is primarily in the EDA/semiconductor space, the responses seemed to fall into the category of "Should semiconductor companies use cloud computing?" and the array of opinions seemed to align at the two ends of the spectrum. There were a few respondents who felt that EDA would never ever move into the cloud, or who gave somewhat skewed definitions of "cloud" to say "it's impossible," but for the most part it was refreshing to see some open-minded views of what was possible and how things could work. I was particularly intrigued by Dan's comment that he felt foundries would venture into the cloud hosting space. Given the history of the fabless semiconductor space, how can that not make perfect sense? The lead-up to the creation of foundries was that internal manufacturing was growing in capacity and complexity to the point that it made more sense to have it done externally. The same dynamics are happening in the datacenter space for chip design today.

Some of the comments rang very true to my experience, so I'll highlight a few (please read the blog for specifics so I don't misquote). Daniel Payne made the observation that semiconductor companies will start by creating their own private clouds, and that is exactly where we are today (compute clusters really are private clouds). James Colgan injected sanity throughout and made some very astute observations about the functional dynamics and the applicability of cloud to certain parts of a design flow. I can't say how much I agree with Kevin Cameron's comments on security; cloud has the potential to be a huge boost in security for the industry. Tom Anderson indicated that he is already doing chip design using Amazon EC2 resources, and I think there are many more like Tom out there. One of the last postings to date is by Lou Covey, and his opinion is that cloud for the industry is inevitable - I happen to agree with that. It's not that we "have to," but more that "this is the right answer for the business, and we should do the right thing."

One of the missing concepts in that discussion is that the blog is looking at generic cloud solutions, not industry-specific solutions. You will see the development of EDA-specific cloud solutions that are very focused on EDA customers, and in the beginning they will be private clouds with technology added for elastic expansion. That said, looking at cloud for the EDA industry, there are still going to be several roadblocks to adoption that will need to be addressed:

  • Ego – getting around the perception that IT is a core competency of chip design companies. The core competency of a chip design company should be … chip design.
  • Cost – getting around the expectation that cloud should cost ½ as much as what I am currently paying. There are many economies of scale and efficiencies that cloud brings. Cloud is an opportunity for cost avoidance as time goes forward, not a refund policy.
  • Trust – letting go of what is a critical function / resource and having confidence that you can still get the results necessary. This industry has a very powerful model to refer to here: how the fabs were let go and successful partnerships were formed.
  • Control – how to let go of a critical resource, and still maintain control over the resources, costs, schedules, and dynamics of capacity / priority decisions.
  • Security – probably the most wielded blade in the “you can’t do it” arsenal, but also probably the most misunderstood.
  • Performance – the final roadblock, which is the one with the most technical merit, is performance. There are many different facets to performance, but it will primarily fall into “internal cluster performance” and “display performance”.

From my perspective, the ego part we can get around. Current conversations with many EDA companies indicate they are already leaning in this direction, which is a good sign.

The cost issue is far more ambiguous. There are as many expectations of cloud as there are definitions, but invariably the expectations are rooted in economics. Given that, the only answer seems to be to create a realistic model for cost, present the data, and let nature take its course. There really is a cost benefit, so companies will want to realize it.

Trust seems like it should be the easy part for this industry, but it is proving to be more stubborn than that. I think that is mostly because of the implied threat to job security for the people who are currently performing the tasks (who are usually the people receiving the presentation about outsourcing their job). EDA companies should examine their own history to see what to do and how to do it.

The control front falls into the same category as trust. The same way that fabless semiconductor companies created internal organizations and positions to manage outsourcing to the foundries, that model should be applied to the outsourcing of compute infrastructure. That is not to say there will not be contention issues around capacity and priority. The cloud suppliers will need to make sure they have enough resources to provide sufficient capacity to their customers, or they will not be the supplier for long. Again, the foundries are a great model to look at for this.

On the security front, cloud will at a minimum give us data points to show how weak internal security has been historically. Applying best security practices in a consistent manner should actually help evolve an industry-specific cloud security solution that better addresses security issues. And for the time being, we can simply avoid the multi-tenant aspects of security by maintaining isolation – private clouds with shared dynamic resources.

And finally, given that we are talking about EDA-specific clouds, they will be specifically designed to have "internal cluster performance" appropriate for EDA. It will be designed exactly the way we would design that cluster for a company's private datacenter. The tricky part will be addressing display performance for functions like custom layout and board design, where network latency directly impacts the engineer's working style.

So really this boils down to proper execution by the EDA cloud providers, plus the one technical hurdle of display latency, which can be addressed in many ways. There is a lot of money and attention being aimed at these issues and this industry, and no real reason why it will not succeed. There might be some companies that choose to adopt at a slower rate than others, but I believe this will become the direction everyone goes eventually. Thanks Dan for a great read, and thanks Harry for pointing me at it.

__________

Scott Clark has been an infrastructure solution provider in the EDA/semiconductor industry for the last 20 years, working for companies like Western Digital, Conexant, and Broadcom. He holds a bachelor of science in applied mathematics from San Diego State University and is currently President and CEO of Deopli Corporation. You can follow Scott on his blog at HPC in the Clouds.

Is 2D Scaling Dead? Looking at Transistor Design

Wednesday, June 23rd, 2010

(Part 3 in the series Which Direction For EDA: 2D, 3D, or 360?)

(Image: Replica of the first transistor)

In the last blog post, I started to examine the question "is 2D scaling really dead or just mostly dead?" I looked at the most challenging issue for 2D scaling, lithography. But even if we can somehow draw the device patterns on the wafer at smaller and smaller geometries, that does not necessarily mean that the circuits will deliver the performance (speed, area, power) improvements that Moore's Law has delivered in the past. Indeed, as transistors get smaller (gate length and width), their gate oxides also get thinner. There are limits to the improvements we can gain in power and speed. We'll talk about those next.

Transistor Design

First, consider what has made 2D scaling effective to date. The move to smaller geometries has allowed us to produce transistors that have shorter channels, operate at lower supply voltages, and switch less current. The shorter channel results in lower gate capacitance and higher drive which means faster devices. And the lower supply voltage and lower current result in lower dynamic power. All good.

At the same time, these shorter channels have higher sub-threshold and source-drain leakage currents, and the thinner gate oxide results in greater gate leakage. At the start of Moore's Law, leakage was small, so exponential increases were not a big deal. But at current and future geometries, leakage power is on par with, and will soon exceed, dynamic power. And we care more today about static power, due to the proliferation of portable devices that spend most of their time in standby mode.

(Figure: leakage power trend)

The reduction in dynamic power is also reaching a limit. Most of the dynamic power reduction of the last decade was due to voltage scaling. Since dynamic power scales with the square of the supply voltage, scaling from 3.3V to 1.0V reduces power by roughly 10x on its own. But reductions below 0.8V are problematic due to the inherent drop across a transistor and device threshold voltages. Noise margins are fast eroding, and that will cause new problems.
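
To put some numbers behind that, here is a quick back-of-the-envelope check in Python. It just applies the standard dynamic power relation (power proportional to the square of the supply voltage, everything else held equal) to the voltages mentioned above.

    # Dynamic power scales roughly as P ~ alpha * C * V^2 * f, so with everything
    # else fixed, the savings from voltage scaling go as (V_old / V_new)^2.

    def dynamic_power_ratio(v_old, v_new):
        """Factor by which dynamic power drops when supply scales from v_old to v_new."""
        return (v_old / v_new) ** 2

    print(dynamic_power_ratio(3.3, 1.0))   # ~10.9x -- roughly the 10x quoted above
    print(dynamic_power_ratio(1.0, 0.8))   # only ~1.6x left going from 1.0V down to 0.8V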

Still, as with lithography, we haven’t thrown in the towel yet.

Strained silicon is a technique that has been in use since the 90nm and 65nm nodes. It involves stretching apart the silicon atoms to allow better electron mobility and hence faster devices at lower power consumption, up to 35% faster.

Hi-k dielectrics (k being the dielectric constant of the gate oxide) can reduce leakage current. The silicon dioxide is replaced with a material such as hafnium dioxide that has a larger dielectric constant, thereby reducing leakage for an equivalent capacitance. This technique is often implemented along with another modification: replacing the polysilicon gate with a lower-resistance metal gate, which increases speed. Together, the use of hi-k dielectrics with metal gates is often referred to by the acronym HKMG and is common at 45nm and beyond.
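
Here is a rough worked example of why the higher k helps: gate capacitance per unit area goes as k divided by the physical thickness, so a higher-k material can be made physically thicker while presenting the same capacitance, and gate tunneling leakage falls off steeply with physical thickness. The dielectric constants and the 1.2nm starting thickness below are approximate, illustrative numbers, not data from any particular process.

    # Gate capacitance per unit area: C = k * eps0 / t_phys. For the same C, the
    # physical thickness scales with k, so a high-k film can be much thicker than
    # the SiO2 it replaces -- which is what cuts the tunneling leakage.

    K_SIO2 = 3.9    # silicon dioxide (approximate)
    K_HFO2 = 25.0   # hafnium dioxide (approximate; reported values vary)

    def matching_thickness_nm(t_sio2_nm, k_new, k_sio2=K_SIO2):
        """Physical thickness of the new dielectric giving the same capacitance
        as a t_sio2_nm-thick layer of SiO2."""
        return t_sio2_nm * (k_new / k_sio2)

    # A 1.2nm SiO2 gate oxide could be replaced by roughly 7.7nm of HfO2 at the
    # same capacitance -- far thicker, hence far less direct tunneling.
    print(matching_thickness_nm(1.2, K_HFO2))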

A set of techniques commonly referred to as FinFET or Multi-gate FET (MuGFET) breaks the gate of a single transistor into several gates in a single device. How? Basically by flipping the transistor on its side. The net effect is a reduction in effective channel width and device threshold with the same leakage current; i.e., faster devices with lower dynamic power at the same leakage power. But this technique is not a simple "tweak"; it's a fundamental change in the way we build devices. To quote Bernard Meyerson of IBM, "to go away from a planar device and deal with a non-planar one introduces unimaginable complexities." Don't expect this to be easy or cheap.

(Figure: Multigate FET / tri-gate structure)

A more mainstream technology that has been around a while, Silicon-on-Insulator (SOI), is also an attractive option for very high performance ICs such as those found in game consoles. In SOI ICs, a thick layer of an insulator (usually silicon dioxide) lies below the devices instead of silicon as in normal bulk CMOS. This reduces device capacitance and results in a speed-power improvement of 2x-4x, although with more expensive processing and a slightly more complex design process. You can find a ton of good information at the SOI Consortium website.

In summary, we are running into a brick wall in transistor design. Although there are new design techniques that can get us over the wall, none of them is easy and all of them are expensive. And the new materials used in these processes create new kinds of defects, hence reducing yield. With some work, the techniques above may get us to 16nm or maybe a little bit further. Beyond that, people are talking about graphene transistors and carbon nanotubes - pretty far-out stuff.

In my next post, I’ll look at some of the other considerations regarding 2D scaling, not the least of which is the extraordinary cost.

harry the ASIC guy

Brian Bailey on Unconventional Blogging

Tuesday, June 15th, 2010

(Photo of Brian Bailey, courtesy Ron Ploof)

I had the pleasure yesterday of interviewing Brian Bailey in the Synopsys Conversation Central Stage at DAC. We discussed his roots in verification working with the initial developers of digital simulation tools and his blogging experiences these past few years. There are, of course, even a few comments on the difference between journalists and bloggers ;)

You can listen to this half hour interview at the Synopsys Blog Talk Radio site. I’d be interested in your comments on the show and the format as well. It was pretty fun, especially in front of a live audience.

At 12:30 PDT today, I’ll be doing another interview on Security Standards for the Cloud. You can tune in live on your computer or mobile device by going to the main Synopsys Blog Talk Radio Page. So, even if you’re not here at DAC, you can still partake.

harry the ASIC guy

Where in the DAC is harry the ASIC guy?

Friday, June 11th, 2010

Last year's Design Automation Conference was kind of quiet and dull, muted by the impact of the global recession, with low attendance and not a lot of really interesting new developments. This year looks very different; I'm actually having to make some tough choices about which sessions to attend. And with all the recent acquisitions by Cadence and Synopsys, the landscape is changing all around, which will make for some interesting discussion.

I’ll be at the conference Monday through Wednesday. As a rule, I try to keep half of my schedule open for meeting up with friends and colleagues and for the unexpected. So if you want to chat, hopefully we can find some time. Here are the public events that I have lined up:

Monday

10:30 - 11:00 My good friend Ron Ploof will be interviewing Peggy Aycinena on the Synopsys Conversation Central stage, so I can't miss that. They both ask tough questions, so that one may get chippy. (Or you can participate remotely live here.)

11:30 - 12:00 I’ll be on that same Synopsys Conversation Central stage interviewing Verification Consultant and Blogger Extraordinaire Brian Bailey. Audience questions are encouraged, so please come and participate. (Or you can participate remotely live here)

3:00 - 4:00 I’ll be at the Atrenta 3D Blogfest at their booth. It should be an interesting interactive discussion and a good chance to learn about one of the 3 directions EDA is moving in.

6:00 - Cadence is having a Beer for Bloggers event but I’m not sure where. For the record, beer does not necessarily mean I’ll write good things. (This event was canceled since there is the Denali party that night).

Tuesday

8:30 - 10:15 For the 2nd straight year, a large fab, Global Foundries (last year it was TSMC), will be presenting its ideas on how the semiconductor design ecosystem should change, in From Contract to Collaboration: Delivering a New Approach to Foundry.

10:30 - 12:00 I’ll be at a panel discussion on EDA Challenges and Options: Investing for the Future. Wally Rhines is the lead panelist so it should be interesting as well.

12:30 - 1:00 I’ll be back at the Synopsys Conversation Central stage interviewing James Wendorf (IEEE) and Jeff Green (McAfee) about standards for cloud computing security, one of the hot topics.

Wednesday

10:30 - 11:30 I’ll be at the Starbucks outside the convention floor with Xuropa and Sigasi. We’ll be giving out Belgian Chocolate and invitations to use the Sigasi-Xilinx lab on Xuropa.

2:00 - 4:00 James Colgan, CEO of Xuropa, and representatives from Amazon, Synopsys, Cadence, Berkeley, and Altera will be on a panel discussion, Does IC Design Have a Future in the Cloud? You know what I think!

This is my plan. Things might change. I hope I run into some of you there.

harry the ASIC guy

DAC Yesterday, Today, and Tomorrow

Friday, May 28th, 2010

About a week ago, I got an email from someone I know doing a story on how the Design Automation Conference has changed with respect to bloggers since the first EDA Bloggers Birds-of-a-Feather Session 2 years ago. I gave a thoughtful response and some of it ended up in the story, but I thought it would be nice to share my original full response with you.

Has your perception of the differences between bloggers and press changed since the first BOF?

Forget my perception; many of the press are now bloggers! I don’t mean that in a mean way and I understand that people losing their jobs is never a good thing. But I think the lines have blurred because we all find ourselves in similar positions now. It’s not just in EDA … many, if not most, journalists also have a blog that they write on the side.

Ultimately, I think either the traditional "press" or a blog is just a channel between someone with knowledge and people who want information they can trust. What determines trust is the reliability of the source. In the past, the trust was endowed by the reputation of the publication. Now, we all have to earn that trust.

As for traditional investigative journalism (à la All the President's Men) and reporting the facts (the 5 Ws), I think there is still a role for that, but most readers are looking for insight, not just the facts.

What do you think of DAC’s latest attempts to address these differences, e.g. Blog-sphere on the show floor, press room in the usual location?

Frankly, I’m not sure exactly what DAC is doing along these lines this year. Last year bloggers had very similar access as journalists to the press room and other facilities. It was nice to be able to find a quiet place to sit, but since most bloggers are not under deadline to file stories it is not as critical. Wireless technology is making a lot of this obsolete since we can pretty much work from anywhere. Still, having the snacks is nice :)

What does the future hold for blogging at DAC?

Two years ago, blogging was the "new thing" at DAC. Last year, blogging was mainstream and Twitter was the new thing. This year blogging will probably be old skool and there will be another "new thing". For instance, I think we're all aware of, and even involved in, Synopsys' radio show. This stuff moves so fast. So, I think the future at DAC is not so much for blogging as it is for multiple channels of all kinds, controlled not only by "the media", but also by the vendors, independents, etc. Someone attending DAC will be able to use a wireless device to tap into many channels, some in real time.

Next year, I predict that personalized and location aware services will be a bigger deal. When you come near a booth, you may get an invitation for a free demo or latte if your profile indicates you are a prospective customer. You’ll be able to hold up your device and see a “google goggles” like view of the show floor. You may even be able to tell who among your contacts is at the show and where they are. Who knows? It will be interesting.

harry the ASIC guy

Harry’s SEO Homework

Wednesday, February 17th, 2010

(Image: William Shakespeare)

As I've mentioned before, I live in California, the state with the 46th best elementary school system in the country. Thank you, California Lottery! So keep that in mind as you read the rest of this post.

One of the more challenging homework assignments my 3rd grade daughter receives regularly is to write a short story using a list of the week’s dozen or so spelling words. For instance, this is one that she received not so long ago:

Write about a time when you worked very hard to learn something. Tell what the experience was like. Use spelling words from the list.

And the list was:

coach    blow    float    hold    sew    though

sold    soap    row    own    both    most

She wrote about the time she learned to play the piano at summer camp. I won’t embarrass her by posting the story here, but suffice it to say that it was pretty forced. Don’t even think about asking how she got the word “soap” into the story!

So, this evening, whilst walking the dog, I was listening to this week's episode of Leo Laporte's This Week in Tech podcast (aka TWiT). On the podcast, someone mentioned a site called Wordstream. On this site, you can enter a keyword and it will tell you the most common search terms that include that keyword. The idea is that, if you want to improve your SEO (search engine optimization), you should use the words that appear most often in searches, and the search engines will send people to you.

I immediately thought of my daughter’s homework assignment. The users of this site must feel like her, trying to weave the words generated by this site into their prose. I wondered how odd that would be. So, I decided to try it, just so I could get a taste of what my daughter went through. And also, because I thought it would be kinda fun.

Being "the ASIC guy", what word other than "ASIC" could I have entered? After entering my keyword and my email address, I received an email with the 10,000 most common search terms that include "ASIC". I decided to focus on the top 50 search terms, so I separated them out into individual words and listed them on a sheet of paper.
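
(If you would rather not do the tallying on paper, here is the same prep step as a few lines of Python. The search terms below are made-up stand-ins, since I'm not reproducing Wordstream's actual list.)

    # Split a list of search terms into individual words and tally the most common
    # ones -- the same exercise as my sheet of paper, just automated.
    from collections import Counter

    top_search_terms = [            # hypothetical stand-ins for the real top-50 list
        "asics running shoes", "asic gel kayano", "asic design flow",
        "asic verification jobs", "onitsuka tiger asics", "fpga vs asic",
    ]

    word_counts = Counter(word for term in top_search_terms for word in term.split())
    print(word_counts.most_common(10))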

Now, without further ado, here is Harry's SEO Homework:

 __________

The alarm rang.

I lurched up out of bed, already in a panic, staring at the clock to see what time it was.

11:00am. Damn!

I took care of the basic biological necessities, then threw on my jeans, a T-shirt, and my brand new ASIC Gel-Kayano running chaussures. At least the company I worked for didn’t have a dress code and they didn’t care what shoes I wore. Designing ASICs and FPGAs is much easier when I’m comfortable.

I had been assigned to the verification team. My job was to search for bugs and to wrestle them down. Thankfully, I was able to use Verilog and System-Verilog for this project. Not like those VLSI design days, when I, and so many of my fellow engineers, had to wear a tie to impress the boss and had to use VHDL because they made us. A language by any other name is better than VHDL. Sure, VHDL is more structured. But Verilog is a whole lot easier to use.

I'd been searching in some DCT4 code for one particular bug that had eluded me for 15 days. It should have been implemented in analog, but some Einstein decided digital logic was easier to design, so here I was. It was me vs. the bug. And the bug was winning!

Then it hit me. I was looking at the wrong register!

I felt a surge of power as I unlocked and modified my testbench. The combination of sleeplessness and Mountain Dew made me delirious. For a moment, I thought I was wearing a women’s dress and Onitsuka ASICs while playing volleyball in a prison cell. Gotta stop hanging out with those guys from the UK who watch Monty Python all the time.

I acted quickly, changing an “lt” to a “gt“, invoking the recompile flow on the new code, and kicking off the regression sim.

The simulation worked and I breathed a sigh of relief. My boss had threatened to bring in some hotshot design services company that he’d found on a website if I couldn’t find this bug. The nimbus that had been floating over my head for weeks was gone.

Now I could keep my job.

And now it was time for the layout guys to sweat!

__________

Phew! That was a lot harder than I thought. (Especially since those ASICS running shoes apparently get a lot more hits than the ASICs I usually write about.) But now that I've written and published that story, I expect I'll be #1 on Google Search in the morning :-)

To be fair, I think there is certainly some value in understanding how people find this blog through various search terms. It helps me to understand what kind of information they are looking for and that helps me choose better topics to write about. But, taken to the extreme, if I write content for the search engines instead of all of you (my readers), then I’m in trouble. You may find me, but you won’t like what you find. And that would be much worse.

If anyone else wants to give this a try just for grins, just go to Wordstream and try it out. Just let me know where to find your “masterpiece”.

harry the ASIC guy