Yes, designing chips is hard; it takes a lot of knowledge, the same way medical doctors need to go through all that schooling... designing a tiny chip with ever more transistors, running software that does amazing things, is very difficult.
My Ph.D. is in computer engineering, specifically VLSI and chip design. This was a few years ago. I _probably_ should have gone into industry; after all, it is what I went to school for and wanted to do. However, the starting salary for a chip designer (Intel / AMD / HP / IBM) was literally less than I was making at my side job as an IT sysadmin (I worked my way through my Ph.D.). Not only that, people I knew well who had graduated before me would call me up and tell me it was worse than hell itself. 80-hour weeks? Completely normal, not counting the 2 hours of commute time. Barely make rent because you live in California? Check. Pages / calls at all hours of the day outside of work? Check. 80 hours? You mean 100 hours a week leading up to a release, right? Check.
Looking back on it, it seems this was "the challenge," and if you made it past it (something like 5 years in), things calmed down for a chip designer and you moved into a more "modest" 60-80-hour-a-week role with less pressure and somewhat of a pay increase.
Yes, how do you attract talent under those conditions? It is not flashy work, it takes a lot of schooling, and the rewards are low. At least medical doctors can kind of look forward to "well, I can make _real_ money doing this," and have the satisfaction of "I helped a lot of people."
The job is too niche. This means there are very few employers worldwide that can make use of your experience. That actually puts downward pressure on your salary (the employer is a quasi-monopsony), not to mention that you are tied to a few locations.
I'd be surprised if a chip expert weren't very employable at various downstream businesses.
The reason software pays so well is that there's a lot of demand from a lot of companies for the best in a global market, AND enough of the best can counter those offers by going to VCs to fund their own company. The EE market is not as global and is much more difficult to fund. The only ones paying the biggest salaries are the big companies.
"Employable by multiple downstream businesses that are able and willing to pay high prices, in the location you want to live" is the full requirement.
The chip industry obviously isn't struggling to attract the next generation, as can be seen from the fact that the salaries in the article are about in line with everyone else's. I've worked in industries that struggled to attract people; it looked like 2x, 3x, 4x, and 5x salaries for what a given skill set would reasonably be valued at, and they still sometimes had to hire unqualified people. And train them, shock and horror.
These articles crop up from time to time, and I really don't think the framing hits a high standard of thoughtfulness. Or it is a surreptitious attempt to influence government policy, in which case, fair enough. The problem with the chip industry is that Asia has a comparative advantage at it, probably because that is where all the advanced industrial capital investment is happening. It has nothing to do with people; it is usual for these matters to turn out to be a regulatory problem once the superficial issues are peeled back.
> The problem with the chip industry is that Asia has a comparative advantage at it, probably because that is where all the advanced industrial capital investment is happening
Asia has a comparative advantage because labor is much cheaper there. PhDs working at TSMC in Taiwan make less than fresh grads working in software in the US.
> The Hardware vs. Software Compensation Myth
And then it goes on to show that a SW engineer makes 30% more. Where's the myth? Especially given that EE requires a lot more work, so that 30% gap is actually worse.
No, they don't.
"It is hard to make any sweeping conclusions that software pays more than hardware, or vice versa. The myth of software always being more lucrative may be unfounded, but the sentiment does exist among young professionals looking to choose career paths."
I.e., there surely exist hardware positions that pay better than some software positions.
One has to be careful with statistics: you can't start from the aggregate and predict the individual case.
> Theory first
This is THE limiting factor in many fields. Bioscience, etc.; even car mechanics, for that matter.
It all starts to click when you have a practical and physical application. A garage is readily available and not expensive.
A lab, on the other hand… IMO there should be more low-cost options for producing chips. There are a few projects trying to do that. Some universities have cooperation agreements with fabs for dead/spare space.
You are too tied to the whims of capital to risk going down such a specialized path, where you need multibillion-dollar machinery to get started, I would say.
I don't understand why they don't make certain courses of study cheaper or free for students who qualify, or even give scholarships. It's very important to nudge more students towards politically or economically vital studies.
> politically or economically vital studies.
If it's economically vital, why doesn't it pay as well as the industries competing for similar graduates? (Not just programming; finance also sucks up a lot of mathematically inclined people.)
I'm reminded of COVID, where the most "essential" workers inevitably meant the most expendable.
What other industries?
People make value arguments, but these are incredibly naive. Look at the companies with the top market caps. They'd tumble if TSMC or ASML went under.
So there's something else to how price gets defined. Be careful about thinking the status quo is "rational". Sure, there's an explanation, but an explanation is not necessarily a good explanation. Usually it's a simple explanation for an incredibly complex topic that no one can really explain.
> Look at the companies with the top market caps. They'd tumble if TSMC or ASML went under.
No, they wouldn't, because their competitors would be in the same situation. If nobody has advanced chips, nobody has a competitive advantage.
Because capitalism doesn't work like that?
At least in the UK, it just doesn't pay enough vs. the effort required, and there also seemed to be a lot fewer jobs going compared to software.
The article shows some pay figures, but those are American, where everything pays insanely well compared to here, so I'm not sure how relevant they are.
It's odd that it doesn't discuss the size of the industry, though. I always thought of it as a small, relatively niche industry compared to software dev, and while there is probably less competition for that smaller number of jobs, there's still a smaller number of jobs.
I think it's a mistake to draw a hard line between "systems" software engineers and hardware engineers. E.g., I would see myself as a software engineer, but I'm pretty comfortable reading HDL, or even modifying it, even though my primary job was on the driver side.
And as a Brit who got a green card in the US a couple of years ago, I'm not sure the UK is a great comparison point. Brexit decimated that level of engineering in the UK; the sort of companies that need that expertise are at the multinational level or have fabs, none of which is really true for a UK isolated from Europe. Every company I worked for/with in the UK has either moved offshore or completely closed down since. And low demand causes low wages.
Hmmm, what is a simple "hello world" project in chip design?
In computer science courses, it's as simple as a println().
In machine learning courses, it's training on the MNIST dataset to do character recognition.
In electrical engineering, it's buying a Raspberry Pi to blink an LED.
In chip design... ChatGPT says to design a 1-bit full adder in Verilog?
...
I understand why the article thinks the market is looking for graduate education. Designing a simple chip requires an *initial investment* (as with all hardware startups, really). This is different from software, where one can simply launch a web app with a container hosted on your preferred cloud provider...
... That said, with the rise of LLMs lowering the barrier to entry for software even further (e.g., vibe coding), might we see more hardware startups/innovations?
FPGA dev boards are cheap nowadays, and you can start coding in a hardware description language with a simulator. The ChatGPT answer of a 1-bit full adder as "hello world" makes sense.
You are obviously not going to etch silicon at home, but the design part is rather accessible as far as hardware goes.
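For reference, here is roughly what that "hello world" looks like: a sketch of a 1-bit full adder, assuming Verilog-2001 syntax (the module and port names are my own):

    // 1-bit full adder: purely combinational, no clock needed.
    module full_adder (
        input  wire a,     // first operand bit
        input  wire b,     // second operand bit
        input  wire cin,   // carry in
        output wire sum,   // sum bit
        output wire cout   // carry out
    );
        assign sum  = a ^ b ^ cin;                // sum is the XOR of all three inputs
        assign cout = (a & b) | (cin & (a ^ b));  // carry when two or more inputs are high
    endmodule

It runs in any simulator, including the open-source Icarus Verilog, so the on-ramp really is just a text editor and a free tool.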
> rise of LLMs lowering the barrier of entry of software even lower
Getting to your first wafer costs something like $250k and up in fab costs alone, depending on what process you're using. Hence much of chip design effort is already spent on verification; it's probably over 50% by now. This is the exact opposite of vibes, because mistakes are expensive.
Business-wise, it's quite a tough B2B sale, because you're selling into other people's product development pipelines. They need to trust you, because you can sink their project, way over and above the cost of the actual parts.
Edit: I cannot emphasise enough how much more conservative the culture is in chip design and EE more broadly. It belongs to a world not just before "vibe coding" but before "web 2.0". It's full of weird, closed-source, very expensive tooling, and it is built on a graveyard of expensive mistakes. You've got to get the product 100% right on the first go.
Well, maybe the second go; production silicon is usually a "B" rev. But that's it. Economics dictate that you then need to be able to sell that run for a few years before replacing it with an upgraded product line.
The rule of thumb I use for chip design is that verification takes at least 2/3 of development. Sometimes more. 50% would be nice, but I think it's optimistic.
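To put that in perspective, even the toy full adder sketched above would ship with a self-checking testbench. A minimal sketch (my example), exhaustively comparing the design against a behavioural reference model:

    module full_adder_tb;
        reg  a, b, cin;
        wire sum, cout;
        integer i;

        // Instantiate the design under test (the full_adder sketched earlier).
        full_adder dut (.a(a), .b(b), .cin(cin), .sum(sum), .cout(cout));

        initial begin
            // Drive all 8 input combinations and compare against a + b + cin.
            for (i = 0; i < 8; i = i + 1) begin
                {a, b, cin} = i[2:0];
                #1;  // let the combinational logic settle
                if ({cout, sum} !== a + b + cin)
                    $display("FAIL: a=%b b=%b cin=%b -> cout=%b sum=%b",
                             a, b, cin, cout, sum);
            end
            $display("done");
            $finish;
        end
    endmodule

Eight cases are trivial; on a real block the input space is astronomically larger, which is why verification dominates the schedule.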
I think you would just buy a cheap FPGA board and use that, wouldn't you? No need to do a full chip until you know what you are doing. That would be like building a server farm just to run your software hello world.
I guess open something like simulator.io and start designing?
The one from Sebastian Lague is great too: https://sebastian.itch.io/digital-logic-sim
In chip design, it's using languages like Verilog that don't mean what they appear to mean and just confuse everybody.
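A classic instance of that (my example, not the parent's): blocking vs. non-blocking assignment. Two always blocks that look nearly identical to a software eye describe different hardware:

    // Non-blocking (<=): q1 and q2 both update from their pre-edge values,
    // so this is a 2-stage shift register: d reaches q2 after two clocks.
    module shift2 (input wire clk, input wire d, output reg q2);
        reg q1;
        always @(posedge clk) begin
            q1 <= d;
            q2 <= q1;
        end
    endmodule

    // Blocking (=): q2 reads the freshly updated q1 within the same edge,
    // so d reaches q2 after one clock: a single register of delay.
    module shift1 (input wire clk, input wire d, output reg q2);
        reg q1;
        always @(posedge clk) begin
            q1 = d;
            q2 = q1;
        end
    endmodule

To a programmer, both read as "assign twice, in order"; to a synthesizer they are different circuits.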
Summary of six key reasons including an insistence on requiring graduate education.
In the software field, people with no education, or an education from a country with low standards, are normally a pain in the ass to work with.
And in hardware, mistakes are more costly, while in software most developers work on completely useless projects that are doomed to disappear soon.
In this article, we will discuss 6 reasons why the chip industry is struggling to attract new talent.
Theory-first education: In an effort to build from fundamentals, there is too much emphasis on theory rather than a focus on applications.
Compensation myth: There is a feeling that software pays more than hardware. Reality is not so cut and dry.
Graduate degrees: A lot more employers ask for graduate-level degrees to enter chip design, creating bottlenecks in talent supply.
Early specialization: Highly niche skillsets are less marketable and career-limiting.
Documentation shortages: Hardware design is entirely tribal knowledge and hard to self-learn.
Chip design culture: Hardware companies have a retro feel to them, deadlines are tight, and mistakes are deadly.
Agree a lot with the article, especially points 1 and 3.
EE education has absolute dog-crap didacticism.
EEs also have an awful "holier-than-thou" attitude in Engineering.
And then you only make it worse by requiring "Masters or above". Well, guess why: because your undergrad was spent wandering through material that goes from nowhere to nowhere else.