The Myth of Corporate Genius

Today, I want to talk about revolutionary ideas and how some corporate experts still believe they are the ones who invented them. I am sure you can find plenty of counterexamples depending on what you consider a great idea; the world is full of surprises. But I am writing this post in response to the many reactions I get when I talk about an idea that was originally developed outside today’s tech industry. I usually get pushback from well-seasoned experts who think the industry has done it all. Yet from Nikola Tesla to the PageRank algorithm and now to Large Language Models (LLMs), companies keep making big claims, and sometimes they genuinely make great strides, but, to be honest, the great ideas often happen elsewhere.

Consider how, roughly a century ago, Nikola Tesla’s ideas drove a major technological advance. Certainly, many companies and people at the time (Edison, for example) paved the way, but Tesla’s foresight reached much further, as did his contributions, which Westinghouse later adopted.

Now consider the present: the main technology drivers appear to be a handful of companies, and it seems as though ideas are developed within these entities. Still, this is an illusion. Look closely and you will see that most ideas are first developed outside any of these companies, which later mature them or sometimes destroy them. There is certainly great technological advancement when things happen at “scale” (as one of them likes to say), but in essence these companies refine and engineer; the true innovation often starts elsewhere.

While Tesla may not have crafted the ultimate generator to electrify the world, today’s power systems are built upon the foundation he laid. Similarly, the origins of LLMs, PageRank, and many other groundbreaking innovations can be traced back to visionary thinkers outside the mainstream. It’s a testament to the fact that truly revolutionary ideas often don’t sprout in conventional corporate environments.

My Experience

A few years ago, right after getting my Ph.D. in computer architecture, I tried my luck with a startup. They claimed their CPU designs were better than those of the big names like Intel, AMD, and ARM. Their first reaction? “Why did you apply?” My research wasn’t all about CPU microarchitecture, but I had dug into a bunch of related topics, from how caching works to what microcode does.

In the interview, I answered their tough questions, covering everything from out-of-order execution to checkpointing. The hiring manager, who had more than 15 years of big-corporate experience, looked pretty shocked and asked, “How do you know all this?!” I told him I’d read about a lot of these techniques in research papers. Sure, there are some tiny details you only pick up after working on the same thing for ages, but the big picture is the same.

The whole hiring process was a marathon. I had five interviews, chatted with everyone on the team, and even had a talk with the CEO. At the end of it all, they went with someone who had just two years of work experience. Later, they had me go through another five interviews for a different team that worked on SoC design. I did get an offer after all that. But the whole experience made me think: What matters more to companies, what you know, the time you’ve spent in the industry, or something else? If I were them, I would bet on something else: how a person learns, and their character.

Note: My focus is on computer science technologies, though this trend appears consistent across other tech areas. While companies are pivotal, true innovation often starts with individual visionaries. This post reflects my current personal perspective on today’s tech landscape.