71 Comments
ballsystemlord - Thursday, December 23, 2021 - link
That might be for AMD's Motherboard chipsets.
ballsystemlord - Thursday, December 23, 2021 - link
It could also be for their PRO line of CPUs which have a guaranteed timeline of availability. Their current offerings were supposed to have a 7nm IO die, but it has remained on 14nm.
nandnandnand - Friday, December 24, 2021 - link
If they can make Monet quad-core Zen 3 on 12LP+, that would sell in cheap laptops or mini PCs.
Xajel - Sunday, December 26, 2021 - link
Doing Zen 3 on 12LP+ will mean a redesign for the 12LP+ process, almost a complete redesign to optimise it for this specific node. That will require a big sum of money, which doesn't make sense if they're targeting cheap laptops.
nandnandnand - Monday, December 27, 2021 - link
I didn't make up the rumor. If they put something on 12LP+, they get to sell more products than they can if they are on TSMC only. And Monet as described could sell for a long time because it would be sufficient for cheap laptops or mini PCs. It could replace the likes of the A6-9220C which is a 2019 Excavator product on 28nm.
Josh128 - Monday, December 27, 2021 - link
It would make more sense to produce more RX580 and Vega 64 GPUs. Someone will buy them, for sure.
Alexvrb - Sunday, January 2, 2022 - link
Hmm... supply for leading processes will remain strapped, so it does make some sense to consider cranking out a fresh quad core APU on 12LP+ for the global market. But given the greater severity of the GPU shortage, it probably does make more sense to focus on GPUs for any excess supply not gobbled up by chipsets and older Pro chips.
That being the case, they should crank out RX 580/590 or an optimized "585" in the short term. Other than that, possibly retool Vega to use Polaris' memory controller for a more cost-effective 12LP GDDR GPU with some substantial grunt.
dersteffeneilers - Wednesday, December 29, 2021 - link
It especially doesn't make sense since 7nm is of course a better, if more expensive, node. The additional cost might be compensated by a thinner battery and chassis, which also make for a more attractive device. My guess is that IO dies are going to stay at 12nm (or a modification perhaps) for the foreseeable future. But in that timeframe, maybe GF is going to license another node.
meacupla - Friday, December 24, 2021 - link
I sure hope not. X570 ran way too hot for what it did.
X670 is rumored to be a dual B650 chipset that is unlikely to fit on mITX boards, and if they make this stuff on 12nm/14nm, I don't think many users would be pleased with how hot that chip would get.
flashmozzg - Friday, December 24, 2021 - link
Wasn't X570 on 14nm? Not like there is a big difference between 14nm and 12nm+++ but still...
StevoLincolnite - Saturday, December 25, 2021 - link
The X570 chipset is just a repurposed Matisse IO die built at 14nm.
Not being optimized for its specific product segment is likely why its TDP was a lot higher than X470's.
The 300 and 400 series chipsets were all 55nm...
Qasar - Saturday, December 25, 2021 - link
" X570 ran way too hot for what it did. "i have never heard the chipset fan on my X570 based board, i even took the cover off of it, and put a piece of paper in it just to make sure it worked, it does, but very rarely spins. so i dont know what you would consider " too hot "
meacupla - Saturday, December 25, 2021 - link
That it requires a fan at all?
Also, just because you can't hear it, doesn't mean someone else can't.
jeremyshaw - Saturday, December 25, 2021 - link
My ASUS TUF X570's chipset fan was not controllable and easily the loudest fan in my system. Of course, in previous posts here and elsewhere, I'm just told the ASUS TUF X570 isn't a popular board, so who cares, right?
StevoLincolnite - Sunday, December 26, 2021 - link
I have the Asus TUF x570 and I can control the fan and it is quiet.
Dizoja86 - Tuesday, December 28, 2021 - link
I also have the Asus Tuf x570 and my MB fan is effectively silent (and I'm someone who uses seven low rpm case fans with an analog fan controller to keep my system as quiet as possible).
dotjaz - Saturday, December 25, 2021 - link
It doesn't "require a fan"; a bigger heatsink would do, sure. Also, it's on 14nm, not 12nm, and not designed as a chipset. So what's your point? 55nm chipsets worked just fine, so why would you expect 12nm chipsets designed as chipsets to be any worse?
meacupla - Sunday, December 26, 2021 - link
Uh, because X570 was a hot chip that required a fan?
How is this so hard to understand?
Do you know what other chipsets also use GloFo 14nm? X370 and X470.
Those ran on passive heatsinks just fine.
And do you know what the major difference is between X470 and X570?
It's that X570 has PCIe Gen4 and USB 3.2.
bananaforscale - Thursday, December 30, 2021 - link
Having a fan doesn't mean requiring a fan in most cases (literally). You basically need the fan in airflow-limited cases under heavy PCIe 4 load. That's it. Hell, X570 is still cooler than some old chipsets from the Athlon XP era. Those suckers were too hot to touch. They also were passively cooled with relatively large heatsinks.
No, X570 *as such* doesn't require a fan. Technically nothing does. They need heat dissipation, which can also be solved with bigger heatsinks, no fan required. What you're talking about is a specific case where low profile is wanted.
dotjaz - Sunday, January 2, 2022 - link
So what? You think 55nm to 12nm isn't enough to compensate for that minor difference? 28nm would have been enough.
dotjaz - Sunday, January 2, 2022 - link
Maybe you are too dumb to do some research; if you did, you would know Z690 also has PCIe 4 and USB 3.2 Gen 2, built on Intel 14nm. Even if Samsung/GloFo 14LPP is 30% worse, 12LP+ would be more than enough to make up the difference.
Qasar - Sunday, December 26, 2021 - link
meacupla, I put a piece of paper in it, like kids would do with their bike tires and cards back when. The only time I heard it was when I first booted; after that, it barely went on, if at all.
"So I don't know what you would consider 'too hot'": if the fan barely comes on at all, to me that isn't hot; the heatsink alone is enough to cool it.
twotwotwo - Thursday, December 23, 2021 - link
Did not expect to see it extended that far out, and amending it after less than a year is also surprising. I wonder what that means for I/O features like DDR5 and PCIe 5 support--if they think they can pull those off on a 12nm IOD or if some products are going to lack one or both.
jeremyshaw - Thursday, December 23, 2021 - link
Might even do a crazier setup, with 12/14nm IOD and 7nm PHYs.
kpb321 - Thursday, December 23, 2021 - link
Could this be for their stacked cache that is due out soonish? They don't want some hugely old process for that but still don't need cutting edge either.
TheinsanegamerN - Thursday, December 23, 2021 - link
Highly unlikely; a large cache on 14nm would be a power hog.
pugster - Thursday, December 23, 2021 - link
Don't understand why AMD couldn't source low-end Ryzen 2000 series CPUs from them, considering that AMD is not selling any low-end CPUs.
TheinsanegamerN - Thursday, December 23, 2021 - link
Because AMD has no interest in selling low margin chips when they can sell overpriced high margin products and the community defends them at every turn.
They're in this to make money and right now have no incentive to undercut their $300 6 core.
Wereweeb - Friday, December 24, 2021 - link
G*mer discovers corporations only care about making money, more news at 11.
StevoLincolnite - Friday, December 24, 2021 - link
At the moment low-end, mid-range and high-end parts are ALL flying off shelves.
AMD NOT having money on the table is less cash in their pockets.
StevoLincolnite - Friday, December 24, 2021 - link
AMD not having low-end chips on the table is less cash in their pockets.*
Need an edit button. Maybe next century.
Spunjji - Friday, December 24, 2021 - link
Not really, no. In a hypothetical scenario where they could increase production to sell more low-end chips then yes, they'd make more money. In their current situation, where they are making everything they can and selling all of it, it makes sense to sell what they can make into the highest possible market segment.
It sucks for all of us who like bargain chips, for sure. The only way out is if Intel start a price war or if they suddenly get the opportunity to produce a LOT more chips.
Oxford Guy - Friday, December 24, 2021 - link
The joys of duopoly. ‘Too bad’ that our alleged capitalism is defined by inadequate competition.
meacupla - Saturday, December 25, 2021 - link
Yeah, it's not like there's an abundance of second hand Ryzen 1000/2000/3000 and Intel 2/3/4/5/6/7/8/9/10th gen parts on the used market right now.
Oh, wait, there is.
Spunjji - Friday, December 24, 2021 - link
"when they can sell overpriced high margin products"Are they really overpriced? High margin sure, but given they can sell everything they build and they're still lower than Intel's historic price points, I'm not sure how that makes any sense in any context other than that AMD now charge more than they did when they were desperately clawing for second place in the market. To which I'd say sure, no surprise there, and it sucks for those of us who enjoyed riding that gravy train but it wasn't doing their finances any good.
Arbie - Friday, December 24, 2021 - link
If AMD can sell high-margin chips - great! And the more the better. They need every nickel they can make now to survive Intel's eventual comeback. Without them the PC desktop would be a quad-core wasteland, as Intel proved to us for *ten years*. We need to think less about the cost of buying AMD chips, and more about the cost of not buying them. Go AMD!
Kangal - Saturday, December 25, 2021 - link
It's weird you typed your comment twice, but you are not entirely correct.
Capitalism used to work well during the early 1970s and earlier, because people were buying the highest quality product for the lowest amount of money. This pushed the industry to work harder, to be more innovative, and to provide better for their customers than the competition. But this whole system stopped working effectively when companies started paying millions of dollars, and getting the best psychologists around the world, making deals with their networks, and crafting the biggest marketing campaigns in history. They realised they didn't need to make the best product or offer the cheapest prices; they could merely "life hack" their business into producing the greatest profits in their history just by clever marketing. Marketing was always around, and it started getting more traction in the late 1950s, but they really kicked it into high gear starting in the 1970s and onwards.
So for AMD to survive is meaningless. Both companies will cooperate eventually to maximise their profits. Alternatively, Intel can acquire AMD and form a monopoly. Whichever way, businesses keep the end-result profits since they had the upfront risk (well, not anymore with government bailouts). So in this system, it is always the consumers who pay in the end. This is the natural result of a free-market capitalist system. Because of these imperfections, there have been many laws written to "control" the market somewhat, such as Antitrust.
We have laws in place, in most countries, to make sure people don't get ripped off from buying certain goods, such as milk. These ensure farmers stay healthy, and consumers don't get price-gouged. Now I don't know if PC chips should be examined for similar roles, because the digitisation of our industry and society is heavily influenced by them. Surely it's not quite relevant at the entry-level (Intel Atom), nor at the expensive luxury segment (RTX 3090). Yet, the mainstream area (Core i5, RX 6600) is still a lucrative market for Intel, AMD, and Nvidia.
So the real question that needs to be asked is, can we make fair laws and enforce them, that will protect the consumers and the market, from the greed of these corporations?
mode_13h - Sunday, January 2, 2022 - link
> for AMD to survive is meaningless.
Yeah, don't believe your lying eyes. Intel suddenly scrambling to add cores to their desktop chips had *nothing* to do with competition from AMD.
> Alternatively, Intel can acquire AMD and form a monopoly.
No, they can't. There's no way that would be approved by the FTC.
> well, not anymore with government bailouts
Specifically what bailouts?
> Because of these imperfections, there have been many laws
> written to "control" the market somewhat, such as Antitrust.
Anti-trust laws exist specifically to keep the free-market functioning.
Seriously, dude, you're being cynical to the point of stupidity.
> We have laws in place, in most countries, to make sure people don't
> get ripped off from buying certain goods, such as milk. These ensure
> farmers stay healthy, and consumers don't get price-gouged.
Price controls on agricultural goods primarily exist to keep farmers from getting wiped out when prices drop too low due to oversupply caused by things like weather. Because, you know what happens when too many farmers go out of business? Food prices spike and people go hungry during the following years.
> can we make fair laws and enforce them, that will protect the
> consumers and the market, from the greed of these corporations?
Speaking specifically of the US government, what they did was to subsidize semiconductor manufacturing like many other governments already do. That's a reasonable first step, IMO. To use your analogy of agricultural products, the next step would probably be for the government to provide some underwriting support for US-based fabs, so that if demand suddenly craters (as has happened many times before, in the semiconductor business), these fabs don't suddenly go out of business.
mode_13h - Sunday, January 2, 2022 - link
> Yeah, don't believe your lying eyes. Intel suddenly scrambling
> to add cores to their desktop chips had *nothing* to do with competition from AMD.
Not to mention anything about power, which really spiraled out of control starting with Kaby Lake (which launched around the same time as the first-gen Ryzen).
mode_13h - Sunday, January 2, 2022 - link
> this whole system stopped working effectively when companies started
> paying millions of dollars, and getting the best psychologists around the world
Also, did you ever hear about this thing called "the cloud"? It's one of the fastest-growing markets and cloud operators care about the bottom line, at the end of the day. If there's a more cost-effective solution, that's where they go, which is one reason ARM server CPUs are taking off.
Arbie - Friday, December 24, 2021 - link
It's great for AMD to be selling as many high-margin chips as they can. They will need every nickel to even stay in business when Intel eventually recovers, and without AMD we'd be condemned to eternal sub-mediocrity. Intel proved that with ten years of quad cores! Think less about the cost of buying AMD chips and more about the cost of not buying them.
Tams80 - Saturday, December 25, 2021 - link
And why should they?
CPU-wise, AMD aren't ripping anyone off. Yes, they raised their prices a little, but the old low prices were a legacy from when they had to compete with Intel on price. That wasn't a good long-term strategy if you end up being able to command higher prices.
It's great being a customer during times when companies use lower prices to compete. But you need to be aware that it is not healthy for those companies in many cases. They can only lower prices so much, and low margin products are a luxury for them.
This ignorance and sense of entitlement seems to be very strong amongst the gaming community. Just look at Steam. Too many gamers are only prepared to spend very small amounts of money on games. But for most developers, especially small ones, that's simply not sustainable.
Nintendo get stick from them for not lowering prices much or often. But really they just aren't devaluing their work to the point of self-harm.
mode_13h - Sunday, January 2, 2022 - link
> Too many gamers are only prepared to spend very small amounts of money on games.
> But for most developers, especially small ones, that's simply not sustainable.
They can always switch to a different line of work. Games are entertainment. The entertainment industry can be fickle and ruthless. Game developers ignore this at their peril.
melgross - Friday, December 24, 2021 - link
It's supply and demand. Through most of its life, AMD has been forced to sell low margin chips just to get them sold. Nobody wanted their chips. They had a few short peaks, but were then back in the valley. Now they're peaking again. For how long? We don't know.
But between Apple's M series of chips and Microsoft trying ARM, we're seeing a possible long term shift away from x86, if other vendors such as Qualcomm and others can even come close to Apple's work. So AMD should make as much money as they can while they can. Low end chips don't make much profit. If AMD is going to have rainy days ahead, every penny they make on their higher end products will come in handy while they work on a shift.
Yeah yeah, we don't know if that will ever happen, but then, people laughed when Apple came out with the dual-core 64-bit A7. The industry is going through an upheaval right now, so it pays to put money away for later. And even now, AMD can't command the prices Intel still can, so overpriced? Not really.
Josh128 - Monday, December 27, 2021 - link
Microsoft is trying to acquire ARM? Since when? It's Nvidia only, AFAIK.
mode_13h - Sunday, January 2, 2022 - link
I think "Microsoft trying ARM" refers to a rumored ARM-based CPU Microsoft has been working on.
Intel999 - Friday, December 24, 2021 - link
Didn't AMD announce plans for a 12nm Chromebook chip recently? Or was that just a rumor?
A 12nm wafer would be yielding at max quantities at this point, and would be priced at a quarter of the price of a 7nm wafer. Chromebooks do not require cutting edge silicon to sell.
Cheap wafers should go a long way towards minimizing negative margin impact while also addressing Intel's free run with Pentium and Celeron in this segment of the market.
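As a back-of-envelope illustration of the wafer-cost argument above, here is a minimal sketch; every number in it (wafer prices, die areas, yield) is a hypothetical assumption for illustration, not a sourced figure.

```python
# Rough die-cost comparison between a cheap mature-node wafer and a leading-edge wafer.
# ALL inputs below are hypothetical assumptions, not sourced figures.
import math

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> float:
    """Standard gross-die approximation: wafer area over die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2
    return (math.pi * radius**2) / die_area_mm2 - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

# Hypothetical inputs (assumptions for illustration only)
scenarios = {
    "7nm":  {"wafer_cost_usd": 10_000, "die_area_mm2": 90},   # small APU on the leading node
    "12nm": {"wafer_cost_usd": 2_500,  "die_area_mm2": 180},  # "quarter of the price", assume ~2x die area
}
yield_fraction = 0.90  # assume comparable yields on both mature nodes

for node, s in scenarios.items():
    good_dies = gross_dies_per_wafer(300, s["die_area_mm2"]) * yield_fraction
    print(f'{node}: ~{good_dies:.0f} good dies per wafer, ~${s["wafer_cost_usd"] / good_dies:.2f} per die')
```

Under those made-up numbers the 12nm die still comes out cheaper per unit even though it is assumed to be twice as large, which is the gist of the economic argument in the comment above.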
Spunjji - Friday, December 24, 2021 - link
AFAIK it was a rumour, but it would make a lot of sense. The rumoured chip is called Monet, supposedly a quad-core Zen 3 design with 4 CUs of RDNA 2 graphics. That ought to do nicely for things more powerful than Chromebooks - performance would likely be comparable to Intel's 10th gen quad-core mobile devices.
nandnandnand - Thursday, December 30, 2021 - link
I think you could make a case for not needing anything greater than Monet in certain segments. Good enough for office work, web browsing, light gaming, with 4K H.265/AV1 playback for HTPC. I guess the TDP would be 15 W, maybe less if 12LP+ is good.
spaceship9876 - Friday, December 24, 2021 - link
I wonder whether GlobalFoundries will ever make a 7nm process. They cancelled it due to cost, but much of the third-party equipment needed to make 7nm is already on the market. I'm sure it would be cheaper for them to make 7nm now. As there is a chip shortage, I'm sure they could get a lot of long-term customers to sign up for it.
haukionkannel - Friday, December 24, 2021 - link
GF will not go for 7nm because it is too expensive. If GF can buy used 7nm capacity on the cheap, they may. But that is many years from now.
meacupla - Friday, December 24, 2021 - link
I don't think it's that simple. Even if they buy 7nm equipment, they would have to completely retool their fabs. This process would take many months, perhaps even more than a year, to fully complete and tune.
melgross - Friday, December 24, 2021 - link
And GF made it very clear that they were not going to do that. The guy who bought the company said at least a couple of years ago that it was time to make some money after all that was poured into the firm, and that was why they were abandoning leading edge nodes such as 10nm and smaller. To go back on that now would be very difficult, as they'd be years behind.
Zoolook - Saturday, December 25, 2021 - link
The market has changed quite a bit since, and it's not impossible that they might revisit that. Like before, they could go to Samsung, so it could be possible to set up something like Samsung "8nm" for a relatively modest sum: no EUV needed and a mature process.
nandnandnand - Thursday, December 30, 2021 - link
Eventually, the node shrinks will stop, there will be no new lithography techniques on the horizon, and costs will decline. Then companies like GF could skip right to the end with new fabs, after paying some licensing fees.
mode_13h - Sunday, January 2, 2022 - link
> Eventually, the node shrinks will stop
According to Jim Keller, his team at Intel found something like 50x worth of density improvements (probably relative to their 14 nm node?) that seem plausible. So, not for a while.
> Then companies like GF could skip right to the end
...if they're not long since out of business, by then. These fabs need an ongoing revenue stream. They can't just sit back and do nothing. And they generally have to build capacity *ahead* of where the demand is going to be, rather than waiting for the demand to appear first.
qlum - Friday, December 24, 2021 - link
If the price is right, it could at least be used to supply the low end with chips on older nodes for some time.
haukionkannel - Friday, December 24, 2021 - link
Two reasons. Wafers are more expensive than before, so you get the same amount of wafers at a higher price! The second is that AMD needs more IO chips for their CPUs.
Spunjji - Friday, December 24, 2021 - link
"First Amendment to the Amended and Restated Seventh Amendment to the Wafer Supply Agreement"Amazing
melgross - Friday, December 24, 2021 - link
My wife, who is a corporate attorney, said that it was normal for agreements that are basically the same, but with minor amendments and extensions. It also gives those looking for the agreements a way to find the exact one they need.
mode_13h - Sunday, January 2, 2022 - link
I'm convinced English is a rubbish language for contracts.
TristanSDX - Friday, December 24, 2021 - link
Such an old and cheap process is great for stuff that does not need high density and does not scale well on the newest processes, so it may be memory and PCIe PHYs, display controllers, RAMDACs, USB, Thunderbolt, etc. This is a result of the increasing use of multiple different ASICs rather than a single monolithic ASIC.
bananaforscale - Thursday, December 30, 2021 - link
GPUs don't use RAMDACs anymore when they only have digital outputs. The term is a misnomer these days; pixel clock would be more accurate. And anyway, even for things that *do* have analog video outputs, they are integrated and have been for decades.
It absolutely WILL NOT be used for memory. It's easier and cheaper to just buy it rather than have your own production chain, which would make little sense.
BTW, if anything, things are getting more integrated into single chips. Less packaging, less space required, easier board design, cheaper... And the chips aren't really monolithic in the way you seem to think. At the very least there are several versions of basically everything.
mode_13h - Sunday, January 2, 2022 - link
PCIe's PAM4 encoding needs something like a DAC/ADC. Not sure how many other interconnects use something like a modulation scheme.
Anyway, I think the point was that older process nodes can be used for *chiplets* containing some of these interface blocks.
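For readers unfamiliar with PAM4: it packs two bits into each transmitted symbol by using four amplitude levels instead of two. Below is a minimal sketch assuming an illustrative Gray-coded mapping; the actual levels and coding used by a real PHY are defined by the relevant spec, not by this example.

```python
# Minimal PAM4 illustration: two bits per symbol, four amplitude levels.
# The Gray-coded mapping below is an illustrative assumption, not taken from any particular spec.
GRAY_PAM4 = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_encode(bits):
    """Group a bit sequence into pairs and map each pair to one of four signal levels."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(pam4_encode([0, 0, 0, 1, 1, 1, 1, 0]))  # -> [-3, -1, 1, 3]
```

Generating and resolving those intermediate levels is analog work, which ties back to the earlier point in this thread that PHY blocks don't shrink as gracefully as logic on newer nodes.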
Oxford Guy - Friday, December 24, 2021 - link
Is the 32nm SOI still being made by GF?
Oxford Guy - Monday, December 27, 2021 - link
Did GF ever put the 22nm SOI from IBM into production?
KennethAlmquist - Wednesday, December 29, 2021 - link
They currently have a 22nm SOI process (marketed as 22FDX). I assume, but don't know for sure, that that's essentially the IBM process. They've added support for MRAM (a nonvolatile memory technology) and RF (radio frequency, allowing the process to be used for things like automotive radar).
Oxford Guy - Friday, December 31, 2021 - link
Thanks for that info. What happened to the 32nm SOI machines?
Oxford Guy - Friday, December 31, 2021 - link
I did a quick Google search and the 22nm process is from IBM. The latest interesting article I found about it was posted here in 2018. Apparently it's targeted at chips 150 sq mm and smaller, due to wire capacitance making the process non-competitive against finFET with anything larger.
Curious about the 32nm equipment since everything old is new again due to the chip crunch.
Tams80 - Saturday, December 25, 2021 - link
They have long term hardware to support, so it'll be that.
Embedded solutions (a lot of which use these nodes) get five years of support, plus an option for two more, which means that embedded solutions bought as far back as 2018 (one year after Ryzen released) will need to be supported, with an option for new (not new new) hardware. Bear in mind that these solutions also probably don't need more advanced nodes.
And hey, AMD might use some of the capacity to release some low end consumer CPUs/APUs.
Sivar - Monday, December 27, 2021 - link
> Officially classified as the "First Amendment to the Amended and Restated Seventh Amendment to the Wafer Supply Agreement"
Attorneys can be funny, too.