To set the stage: I’ve heard the recent news about layoffs at Intel. Before that I read from their new CEO, “On training, I think it is too late for us.” Lastly, there have been some offhand comments (from LTT) that they’re preparing to sell the company.
Yet while I have no doubt that they are behind, their revenue has been about $55 billion since 2023, down from the pandemic high of roughly $78–80 billion, but about the same as the 2015–2019 plateau leading up to the pandemic.
Maybe I’m naive about the way businesses work, but if you’re still profitable, and you know you need to “catch up”, why lay off people and close sites? Maybe that works for a consumer goods company: if your overhead is too high and you’re not making a profit, slim down.
However, for a company where R&D is really where the value is, like Intel, it just doesn’t seem to make sense; you’re not going to get better designs and processes by reducing your experienced staff and letting them go work for the competition. Maybe some restructuring (in the engineering sense, not the euphemism for layoffs).
It’s easier to get rich killing a company than doing good business
Intel is best thought of as two businesses, where their historical dominance in one (actually fabricating semiconductors) protected their dominance in another (designing logic chips), despite not actually being the best at that.
Intel’s fabs represented the cutting edge in semiconductor manufacturing, and their superiority in that business almost killed AMD, who just couldn’t keep up. Eventually, AMD decided they wouldn’t try to keep up with cutting-edge semiconductor manufacturing, and spun off their fabs as an independent company called GlobalFoundries in 2009.
But Intel hit a wall in semiconductor manufacturing, making very slow progress with a newer type of transistor known as the finFET, with lots of roadblocks and challenges. The biggest delays came around Intel’s 10nm process, where they never got yields quite to where they should have been, while other foundries like Samsung and TSMC passed them up. And so their actual CPU business suffered, because AMD, now a fabless chip designer, could go all in on TSMC’s more advanced processes. Plus, because they were fabless, AMD pioneered advanced packaging for “chiplet” designs: different pieces of silicon connected so that they act like a single chip, but where each component is small enough that imperfections don’t hurt yield as badly, and where cheap and expensive processes can be mixed and matched, reserving the expensive ones for the parts of the “chip” that actually need the performance and precision.
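To make the yield point concrete, here’s a rough back-of-the-envelope sketch using the simple Poisson yield model (yield ≈ e^(−defect density × die area)). The defect density and die areas are made-up illustrative numbers, not actual foundry data:

```python
# Rough sketch of why chiplets tolerate defects better than one big die,
# using the simple Poisson yield model: yield = exp(-defect_density * area).
# All numbers below are illustrative assumptions, not real foundry data.
import math

defect_density = 0.1   # assumed defects per cm^2
big_die_area = 6.0     # cm^2 for a hypothetical monolithic design
n_chiplets = 4         # same silicon split into 4 smaller chiplets

monolithic_yield = math.exp(-defect_density * big_die_area)

chiplet_area = big_die_area / n_chiplets
chiplet_yield = math.exp(-defect_density * chiplet_area)

# Chiplets are tested before packaging, so a defective one is discarded
# on its own instead of scrapping the whole large die.
silicon_per_good_monolith = big_die_area / monolithic_yield
silicon_per_good_chip = n_chiplets * (chiplet_area / chiplet_yield)

print(f"Monolithic die yield:                {monolithic_yield:.1%}")
print(f"Per-chiplet yield:                   {chiplet_yield:.1%}")
print(f"Silicon per good monolithic chip:    {silicon_per_good_monolith:.2f} cm^2")
print(f"Silicon per good chiplet-based chip: {silicon_per_good_chip:.2f} cm^2")
```

With those made-up numbers, the monolithic die yields about 55% while each small chiplet yields about 86%, so the chiplet approach wastes far less silicon per good part (packaging costs, and the fact that a finished chip needs all of its chiplets assembled, are ignored here for simplicity).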
Meanwhile, Apple was competing with Qualcomm and Samsung in the mobile System on a Chip (SoC) market for phones, and developed its own silicon expertise. Eventually, they were able to scale up performance (with TSMC’s help) to make a competitive laptop chip based on the principles of their mobile chip designs (and then eventually desktop chips). That allowed them to stop buying Intel chips and switch to their own designs, manufactured by TSMC. Qualcomm is also attempting to get into the laptop/small PC market by scaling up their mobile chip designs, also manufactured by TSMC.
Intel can get things right if it catches up with or surpasses TSMC in the next paradigm of semiconductor manufacturing. Transistors are changing from finFET (where TSMC has utter dominance) to GAAFET (where Intel, TSMC, and Samsung are all jockeying for position), and the foundries are also trying out backside power delivery (where the transistors are powered from underneath the wafer rather than from the cluttered top side). Intel has basically gone all in on their 18A process, and in a sense it’s a bit of a clean slate in their competition with TSMC (and to a lesser degree Samsung, and a new company out of Japan named Rapidus), and possibly even with Chinese companies like SMIC.
But there are negative external signs. Intel has acknowledged that they don’t have a lot of outside customers signing up for foundry services, so they’re not exactly poaching clients from TSMC. And if that’s happening while TSMC is making absurd profits, it suggests that the potential clients who have seen Intel’s tech under NDA believe Intel is falling further behind TSMC. At that point, Intel will struggle to compete on logic chips (CPUs against AMD and Apple and maybe Qualcomm, discrete GPUs against AMD and NVIDIA) if they’re all just paying TSMC to make the chips for them.
So I don’t think all of their layoffs make a ton of sense, but I understand that they’re really trying to retake the lead on fabrication, with everything else a lesser priority.
Thank you for this excellent summary!
Great explanation. Thanks.
They sat on their monopoly and didn’t innovate, got overtaken by AMD and still didn’t innovate, and instead of letting engineers do what engineers do, C-suite morons and execs took a bad situation and ran the company into the ground, all in an effort to keep shareholders happy.
As soon as your engineering company starts taking advice from business instead of engineers, you’ve lost. See also - Boeing.
Are you kidding?
From the perspective of capitalists, Boeing is the fucking dream. They can innovate fuck-all, they can bury inconvenient data AND the people who know it with impunity, and since they’re too big to fail in literally the croniest capitalist industry on Earth, American War, the government not only won’t lift a finger, but will actively print money to give to them to let them keep doing all of the above in perpetuity.
Boeing is the capiteeliest of capitalist success stories. You didn’t think modern corporations actually believed their own propaganda about free markets deciding profit and success based on herp derp honest competition? That’s just the bullshit they drive into kids’ minds when they can’t yet muster questions or concerns to ruin their lives and forge them into wage-zombie husks. Capitalists want to win, preferably at the gunpoint of their captured governments, because actually producing products/services that people actually want is for suckers.
Well, here’s a lecture by Peter Thiel basically saying “competition might be good for society, but being a monopoly is good for business”. The ghoul says it openly.
Well, in the short term that’s true, because they won’t be allowed to fail. You have to think either the government will get much more involved or all their civilian customers will go away if this continues on, though.
You are bouncing on Boeing’s dick so hard I can hear the “boeing” from here. Jesus christ.
If you want to believe me calling them the peak of capitalism by capitalism’s own inhuman metrics is a compliment, knock yourself out.
;)
Patrick Gelsinger was an engineer.
Nvidia is run by a former engineer. I don’t know if that’s an argument for or against the idea.
Intel’s core product became big primarily due to decisions they had nothing to do with.
Roll back the clock to IBM developing their first PC. They usually developed hardware in-house, but they were late to the game and needed something fast. They chose to use off-the-shelf hardware, with the BIOS being the one thing that was proprietary.
For the CPU, they could have gone with Motorola or MOS or Texas Instruments. They chose Intel. Why? Because Intel had fulfilled a memory contract on another project, so sure, use their 8088.
Compaq then reverse-engineers the BIOS, and the whole thing passes legal muster. Now anybody can make a compatible, but they have to use the same CPU that IBM did.
Microsoft does its thing with DOS and Windows. Everyone is writing software against that. Now everyone starts getting locked in.
After that, all Intel has to do is keep x86 going well enough that nobody wants to make the effort to switch. Yes, AMD and Cyrix are out there, but at this point, they’re both the cheap alternative that isn’t as good.
They fucked up the 64-bit transition with Itanium; AMD did the x86-64 version we all use now.
Intel’s entire success is based on making good on a memory contract for IBM decades ago. That’s it. It hinges primarily on decisions they did not make themselves. They weren’t even in the room at the time.
Also, they did that NetBurst stuff that was a bad idea, and let AMD beat them to the punch on the integrated memory controller… So many scenarios where Intel flubbed the “big change”.
However, they have frequently been good on the little stuff. For a long time they had a fab advantage (and have messed that up for a decade now). They are decent at dotting the “I”s and crossing the “T”s, including broader software and firmware enablement.
Nowadays the only software people care about is CUDA, and AMD actually has the cash to take care of a lot of the little things. Intel kept trying to make side efforts happen: a rack of memory and processors that could be freely re-associated in ways no one ever asked for; phase-change memory, which, while cool, was an awkward in-between of NAND and DRAM; various FPGA efforts whose actual point they never really figured out; putting an InfiniBand variant in the processor package in an awkward way that just made things worse; and a ton of weird accelerator architectures they abandoned within a couple of years each time…
They are decent at dotting the “I”s and crossing the “T”s, including broader software and firmware enablement.
Yes, and that really puts the Raptor Lake failures in perspective. It undermined this whole argument that Intel stuff is rock solid even if it’s not the best in other ways. Xeon stuff was affected, too, and given how lucrative the server market is, that may have been the biggest mistake Intel has made.
From the videos I’ve watched, the big issue is them losing their market position. They took a big hit when Apple ditched them and made their own chips. Now they’re losing to AMD and Nvidia in the server space. Their newest desktop chips are under-performing. The consumer market is getting more competitive with Qualcomm joining the space and Nvidia/AMD preparing ARM chips. They built a lot of factories for producing chips, but it sounds like they’re struggling to lock in a major buyer. Now they’re ejecting tens of thousands of employees in the next few months because they’re hemorrhaging money.
TL;DR they’re getting screwed from every front and either it will take them a long time to recover or they’re going to be left behind.
Apple probably didn’t move the needle, at least in any market Intel was actually in.
Intel’s deep woes began around 2016, when TSMC got ahead of them fab-wise while Intel stuck with in-house manufacturing. Not a little ahead, either: years ahead, with Intel’s node names becoming mostly a branding exercise to assert equivalence (“Intel 7” was just 10nm rebranded, and on the current 3nm front, TSMC’s 3nm is over 50% denser than Intel’s claimed “Intel 3”).
At roughly the same time, AMD shipped Zen, coming out of a long stretch of bad microarchitectural designs.
Intel basically invested in trying to branch out in unproven directions rather than focusing on actually salvaging their core business. Intel partners were given huge budgets to try Intel’s wacky ideas no one asked for, and Intel CPUs were burdened with trying to have a built-in FPGA or HPC fabric or phase-change memory sticks. They thought that if they could make a rack of CPU sockets, memory, and I/O that could be freely re-associated, they would have a gold mine, despite no one really wanting that (software does fine with the traditional setup).
Then, to utterly drive things home, NVIDIA comes along and every IT budget is busy throwing every last dollar at GPUs, with as little as possible spared for supporting components like CPUs.
I hope they cling on and make somewhat of a comeback, or carve out a niche market, but I don’t feel sorry for them at all. They are guilty of shady monopolistic tactics.
As a nerdy consumer, I wouldn’t count Intel out. I remember when their Pentium 4s ran hot and AMD started eating their lunch; then they launched the Core lineup and were back on top. They get lazy when they’re not challenged.
That being said, historically they haven’t done very well pivoting from their main business. Their GPU lineup seems kind of OK, but their attempt to make mobile chips went nowhere.
Companies seem to have realized there’s real benefit to using ARM processors in laptops for the performance and battery life, which is a direct threat to Intel’s business.
So it’s Intel’s ability to create when pressure is applied vs. their inability to create products outside of their comfort zone.
I don’t count them out but it’s a steep climb.
I’ve got my eye on their stock just in case this looks like it might turn into something like Apple in the 90s.
It’s going to take a long while for them to come back especially since they plan on laying off a huge amount of their engineers.
From what I’ve heard, the main thing modern Intel really excels at is hardware video encoding (Quick Sync).
Sounds about right. I was a die-hard fan of Intel for years. I upgraded my PC this year and picked AMD for the first time in my life. Looking at the scathing reviews and performance tables, picking Intel would have been an insane choice.
The hit wasn’t Apple leaving them. That was a small part of their business. The failure was not getting in on mobile when they had the chance. They could have diversified and they didn’t, so when AMD came to eat their lunch, they had no fallback and no way to catch up.
Semiconductor fabrication is an industry where it takes years and tons of cash to stand up a new product line, so a failure can really set you back. Intel has had a series of false starts and outright failures competing with the entire industry. They can’t match TSMC on fabs, they can’t match AMD on x86 CPUs, they got stomped by ARM in portables/edge, and they can’t seem to make a dent in the GPU market. The only place where they have a small market lead is in data center CPUs, but they are at serious risk of falling behind that curve if AMD moves to a smaller node, or if server-grade ARM finally takes off.
Intel got rich on vertical integration, and now they are struggling on both the fab and the IP side, which has really broken their traditional business model.
I don’t think they have a datacenter lead anymore; EPYC really cooked them and they haven’t been able to catch back up. It’s been a mess for Intel since AMD’s Rome generation.
Plus, they took a bath when they basically had to admit two entire generations of processors had a fatal flaw.
Plus, the king of computation right now is Amazon, and Amazon cooks its own CPUs (Graviton); while they still also offer Intel and AMD, AWS plans to eventually reduce that offering.
They aren’t profitable, and the great push by the previous CEO towards manufacturing was apparently not successful enough (so far I don’t think they’ve managed to land proper clients).
Maybe I’m naive about the way businesses work, but if you’re still profitable, and you know you need to “catch up”, why lay off people and close sites?
Precisely because you’re queuing the company up for M&A. You’re going to let the next guy do the investing and remodeling. In the meantime, Intel needs to look like a blank slate (paid-down debts, no long-term project costs, lots of potential for leveraged buyouts) so private equity can come in and do its thing.
Whether that means a Berkshire-style business rewrite or a Bain Capital-style gutting and scrapping for parts, the important thing is that the investors get to project their vision of the future onto you. They’re not burdened by whatever plans the prior managers had on the table.
However, for a company where R&D is really where the value is, like Intel, it just doesn’t seem to make sense
When you’re on the bleeding edge, R&D is a value-add because it keeps the like-minded customers shopping with you. But when you’re lagging by a decade or more, it may be more efficient to simply strip-mine your assets and narrow the focus of the business to the most profitable sectors. Intel could easily go the way of Nokia, making low-cost substitutes for manufacturers who don’t want or need the high-end chipsets. At the 7nm mark, Intel can just… keep making chips forever with comparatively little overhead. They don’t ever need to do better, because dishwashers and coffee machines don’t need the same kind of hardware as AI data farms or high-end graphics cards.
The market for low end SoCs and embedded is highly competitive, with lots of Chinese companies like Allwinner and established niche leaders like Renesas or NXP. No way an outsider is going to beat them on price or features out of the blue.
Intel only has a chance to stay afloat in the high-ish end sector, whether they like it or not.
LTT isn’t a reliable news source.
And why is that?
If you want reliable sources you should look into War Thunder’s forum leaks.
Hey, we aren’t discussing military hardware here!
Why would you trust someone who uses his platform to purposely harm the reputation of companies competing with the one he is invested in?
I am curious about your view. Can you point to anything specific? We all know he made a big investment into Framework and he was a fanboi of that company for a while.
There are some very real constraints around how LTT can review laptops now. Any promotional work (reviews, status updates, etc.) that LTT does for Framework is easily framed as such based on video context.
I am genuinely curious about this and your point of view. Why? I am not a huge fan of deception or otherwise shady practices that would illegally harm competitors.
What?
I hope they pull through. Not because of any loyalty. I’ll buy whatever is providing the best value for performance at the time. But I don’t want to see a monopoly in the x86 market.
More likely that this will just kill the x86 market and ARM will fully take over because of those chips’ reduced power consumption in comparison.
Nvidia ate their lunch and they don’t see a way back from it.
AMD too. EPYC is better than Xeon. Intel survives on the goodwill of integrators.
Closures and layoffs are a quick way to get the books back in the black.
I think this is not them being pessimistic. It’s one of the first cases of a corporation being realistic, rather than pushing the propaganda of success until it shuts down.
They have been number 1 for a good decade or two. That means the only place they can go is down.
Maybe I’m naive about the way businesses work, but if you’re still profitable, and you know you need to “catch up”, why lay off people and close sites?
I don’t have any internal knowledge of Intel but I can make some guesses.
There is a 1-to-2-year process pipeline that goes from ideation, to design, to prototyping, to production readiness, to recurring production. If Intel has determined that the chips in the design and prototyping stages aren’t market viable, there’s no reason to pass them to the next steps. This means that the teams that follow (production readiness and recurring production) won’t have work for potentially years. So why employ the extremely expensive staff that do those steps for years when they have nothing to do and you just burn money for no output?
Yet while I have no doubt that they are behind, their revenue has been about $55 billion since 2023, down from the pandemic high of roughly $78–80 billion
Businesses have ways of moving profit and debt around. One way is corporate bonds (or commercial paper). These can give cash infusions up front to build out infrastructure or finance today’s design costs, knowing that you’ll be able to take the profits from the sales of those completed products at a later date and pay off the debt. It’s possible that Intel has taken on this kind of debt, and because they’re dumping products currently in development, they won’t have the profits to pay it off. I don’t know if Intel has any of these, but they are not uncommon at large companies.
However, for a company where R&D is really where the value is, like Intel, it just doesn’t seem to make sense; you’re not going to get better designs and processes by reducing your experienced staff and letting them go work for the competition.
Sure, but maybe not on all product lines. If you have 10 product lines, and 8 of them are producing products that are barely profitable (or perhaps not profitable at all), you might trim those lines, reducing your headcount to provide more R&D resources to the 2 remaining promising product lines.
So why employ the extremely expensive staff that do those steps for years when they have nothing to do and you just burn money for no output?
Because in an industry as specialized as semiconductors, most of those “expensive staff” are people with 12 to 25 years of industry experience and company-specific institutional knowledge.
Once they’re gone, it’s impossible to replace that knowledge. New hires will never know the same details and tricks, and the old staff are unlikely to come back after being screwed (except for insanely high compensation). In specialized industries you have to retain the knowledge base through thin times to have any hope of being successful in thick times.
It’s a shortsighted move by bean counters looking to make it to the next quarter so they can merge or sell off, and nothing more.