Apple has less than 10% market share worldwide; they can't help Intel. But see my post above: all Intel has to do is come up with more powerful CPUs. Then programmers could build smarter software (or Microsoft could build a smarter Windows) and everyone would be interested in the PC again. Current software runs just fine on an i7-2600K, so why would people look into newer hardware?
Apple isn't even at 10%; it's about half that in the real world. Windows 10 is wildly popular. But there really has been no need for hardware upgrades for ten years now. You state that current software runs just fine on an i7-2600K; I'll go further: my 2007-era Core 2 Duo laptop from Dell runs Win10 just fine, and all basic software. Even low-end games. The only upgrade it needed was an SSD.
This is a major problem for Intel. It's becoming a problem for Qualcomm as well. The phone upgrade treadmill has slowed tremendously, and Qualcomm's latest results reflect that fact. We first reached a point of 'good enough' with the hardware. After that, it was inevitable that we'd reach a point of saturation, since the hardware itself lasts a long time. The only real growth remaining is in one of two areas:
1) Expansion into new markets (China, India, Africa, etc.) 2) Creation of new device categories that consume CPUs. Think HoloLens.
Your personal lack of adaptability to the changes is informing your opinion.
In reality, consumers don't care at all about that W7 limitation and have upgraded to W10 already, and most I see aren't complaining about it.
Corporate is different, with the support issues, but the fact is that my i5-750 works just fine and the only bottleneck I have is the GPU, so I'm sure office computers have even less need to upgrade.
49 Comments
knightspawn1138 - Tuesday, April 19, 2016 - link
Don't think of it as downsizing. Think of it as a process node reduction.
jwcalla - Tuesday, April 19, 2016 - link
It'll be interesting to see how much R&D is impacted.
webdoctors - Tuesday, April 19, 2016 - link
Sounds like almost all the layoffs will be in the USA.
Based on the numbers, $1.4B/yr and 12K layoffs, the expectation is a savings of ~$116K per employee per year. If we assume 90% of those are in the USA, that would bump up the savings to ~$130K/yr, assuming the foreign workers are free (but then they wouldn't get laid off if they were free, so that is not a good assumption).
It's unlikely the cuts are to fabs, since those don't run themselves; it's probably R&D and chip engineering. Is this to make room for an acquisition, perhaps? Maybe buy Salesforce? That's more layoffs than AMD has employees, so it would have to be a big acquisition if it was one.
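The back-of-the-envelope math above is easy to double-check. Here's a quick sketch; note the 90% US split and the "foreign workers are free" bound are the commenter's own assumptions, not reported figures:

```python
# Rough check of the savings-per-layoff arithmetic in the comment above.
annual_savings = 1.4e9  # Intel's projected savings: $1.4B per year
layoffs = 12_000        # announced headcount reduction

per_head = annual_savings / layoffs
print(f"${per_head:,.0f} per employee per year")  # → $116,667

# Upper bound if 90% of cuts are US-based and the other 10% saved nothing
# (unrealistic, as the comment itself notes):
us_only = annual_savings / (layoffs * 0.90)
print(f"${us_only:,.0f} per US employee per year")  # → $129,630
```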
extide - Tuesday, April 19, 2016 - link
Where do you come up with them buying Salesforce? Is that an existing rumor or something? It just seems like an oddly random thing to speculate about.
snowmyr - Wednesday, April 20, 2016 - link
Just have a lot of Salesforce stock :/
Ej24 - Tuesday, April 19, 2016 - link
It sounds like a lot, but think about it like this: companies, on average, experience ~5% employee turnover per year. This isn't THAT big. It's 2 years of turnover, probably in the same jobs that experience high turnover.
Intel isn't all fabs and R&D; there are janitors, secretaries, accountants, HR, and so on. I doubt they'll really be shrinking PC fab and R&D, probably just the support staff around those departments needed to maintain rapid growth, such as hiring, payroll, and training. If you think about it, this is more like a hiring freeze than a full-blown workforce reduction.
jwcalla - Tuesday, April 19, 2016 - link
That's a good spin on it.
webdoctors - Tuesday, April 19, 2016 - link
But the problem is Intel already said last year that they would reduce their workforce by ~5% through a freeze and attrition. They had a program offering incentives for folks to quit so they could reduce headcount. This announcement is in addition to that, so I don't think the folks who were already going to leave are still there; they would've left last year when Intel first introduced the program with favorable quitting incentives (like 2-3 weeks of severance for every year accrued, plus some cash).
Krysto - Wednesday, April 20, 2016 - link
Turnover is not the same as layoffs. Turnover is the replacement of employees. Are you saying Intel is just going to hire some OTHER 12,000 employees over the next 2 years?
Murloc - Wednesday, April 20, 2016 - link
No, he's saying that half of that reduction will happen simply through a turnover freeze, so it's not as massive as it seems.
They've said as much too: voluntary layoff = freezing turnover and providing good packages for people who leave.
Gigaplex - Tuesday, April 19, 2016 - link
"Intel expects the bulk of the layoffs to occur within the next 60 days, with the entire process stretching into mid-2017."
Knowing you've got roughly a 1-in-10 chance of getting laid off in a drawn-out process spanning a year or so can't be good for morale.
bji - Tuesday, April 19, 2016 - link
Employees who pull their own weight don't worry.
name99 - Wednesday, April 20, 2016 - link
Hmm. How well have companies that implemented your Darwinist theory of management worked out?
Microsoft and Dell used stack ranking, to notoriously bad effect in both cases. Enron went all-in, to the extent that the company essentially became a criminal enterprise.
Murloc - Wednesday, April 20, 2016 - link
Stack ranking targets individuals and destroys teamwork.
You can apply Darwinist management by talking to bosses and seeing who works best all things considered, including team leadership. The problem is that it's not easy to measure.
Krysto - Wednesday, April 20, 2016 - link
Except this looks like it might be division-based, not performance-based. So if your entire division is wiped out, it won't matter how good you are.
MrPoletski - Wednesday, April 20, 2016 - link
*hic* More ale anyone? *hic*
BurntMyBacon - Thursday, April 21, 2016 - link
@Gigaplex: "Knowing you've got roughly a 1 in 10 chance of getting laid off in a drawn out process spanning a year or so can't be good for morale."
The article addresses that for you (in the update, perhaps).
@Article: "While the majority of the notices to employees will go out in 60 days, the projection is that only about half of the layoffs will be completed by the end of this year"
Looks like they'll know pretty quickly. Also, assuming a 5% turnover rate, and given that only half of the 11% reduction will take place this year, this is starting to look a lot more like a targeted turnover freeze plus the ~1% of employees who aren't getting the job done. That said, I've seen voluntary layoffs backfire on companies before. They invariably end up losing someone they would rather have kept around.
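The arithmetic behind that reading works out roughly as claimed, treating ~5% as a typical large-company attrition rate (an industry rule of thumb, not an Intel figure):

```python
# How much of this year's reduction could normal attrition absorb?
total_reduction = 0.11           # announced 11% workforce cut
this_year = total_reduction / 2  # only ~half completed by year's end
attrition = 0.05                 # assumed annual turnover rate

print(f"{this_year:.1%} due this year")                 # → 5.5% due this year
print(f"{this_year - attrition:.1%} beyond attrition")  # → 0.5% beyond attrition
```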
phatboye - Tuesday, April 19, 2016 - link
Didn't Intel have a 20,000-employee layoff about 7 or 8 years ago? Is this going to be a repeating cycle where Intel hires a bunch of people, then lays them off before they hit retirement age?
Ryan Smith - Tuesday, April 19, 2016 - link
The 2006 layoff was roughly 10K people, IIRC.
Samus - Tuesday, April 19, 2016 - link
The thing is, at Intel, like HP, layoffs don't affect the "lifers" the way they do at most other companies. I know people who work at each company in Oregon and Texas who've been there for 10-15 years, and they know people who have been there for 30+ years. The boards at these companies are still very old school and actually value keeping the aging population around, because in a lot of ways experience, especially in physics and engineering, is priceless.
I also think it's fascinating that both companies offer PAID paternity leave for fathers. I'd never even heard of that before my buddy at Intel had his first child.
willis936 - Wednesday, April 20, 2016 - link
Yeah, experience is important, and if you don't invest in newer generations gaining experience, you'll be screwed when your old guys retire.
SirMaster - Wednesday, April 20, 2016 - link
Really? I work for a small, ~120-employee manufacturing company, and even I get the option of paid paternity leave.
Murloc - Wednesday, April 20, 2016 - link
That's standard operating procedure when it comes to engineers. They start costing too much after 50.
SirMaster - Wednesday, April 20, 2016 - link
As an engineer, it's a good thing I'm planning my savings and investments around retiring at about 50, then.
Ananke - Wednesday, April 20, 2016 - link
Agreed. I'm in finance. Perks and salaries are high to retain engineering talent when it is needed, but they quickly become a burden when revenue isn't growing. Q1 revenues were not good in any of their businesses, and Q2 and second-half estimates are kind of gloomy, so new capital and R&D expenditures are not going to materialize; hence the most expensive staff are the ones to go. Accountants are 1) not that expensive, 2) more like a fixed expense (you have the same 5 accountants whether there are 200 or 500 engineers and technicians), and 3) mostly temps and contractors anyway.
When a company takes a $1.4B pretax charge, in Silicon Valley that means tech staff laid off and entire divisions shut down, i.e. people on full-time employment with benefits and large 401(k) plans.
name99 - Tuesday, April 19, 2016 - link
This sounds as clueless as Microsoft's version of the same thing a year ago --- a dinosaur flailing around without a real clue.
Intel's main business is CPUs, right? So where do they ACTUALLY plan to cut back?
Xeons? God no, that's where the money is.
Core-m? That's where the other money is.
mobile i3, i5, i7? Not if they care about 2-in-1s.
Atom and Quark? I'd get rid of them, but apparently Intel still has fantasies that they can make money with the IoT.
Their various software groups? icc, OpenMP, the Linux group, numerics, Cilk, and other parallel initiatives are probably cheap and pay their way strategically.
LTE? Not if they have those grand IoT plans.
etc etc
I'm mocking them because I have a low tolerance for BS. You don't claim "Our results demonstrate a strategy that’s working and a solid foundation for growth" at the same time that you are LAYING OFF 11% of your workers. You don't claim "this requires some difficult decisions" and then refuse to actually give a single decision as to which groups are going to be affected.
The one group I can think of that they might reasonably toss is SSDs. They have no particular advantage in this space, and the battle has been won --- you don't need to convince anyone now to buy SSDs.
3D-XPoint might POSSIBLY be the same --- I expect Intel had grand dreams of using it as a strategic advantage to force the purchase of Xeons, during whatever window exists while 3D-XPoint is the only viable RAM-bus-attached NVM, but now that it's become clear how large a job integrating this sort of persistent memory is (mainly because of the OS work involved), they may be wondering whether any OS will be ready before alternatives to 3D-XPoint exist, like the various MRAM solutions or alternative ReRAM solutions. That outcome would severely limit 3D-XPoint's strategic value, so once again: why is Intel doing a job that other vendors can do just as well, a job that will be driven by standards and commoditization?
The last possible candidate is Xeon Phi. My guess is that there's no way it has paid for itself, and there's no explosive growth path ahead of it. BUT I suspect Intel management also fears that it's a strategic requirement. If they want various future supercomputer contracts, they need something like Xeon Phi (i.e. a throughput engine), and they don't want the humiliation of using nV cards in an Intel supercomputer. And THAT is how great empires destroy themselves --- through emotional concerns over "humiliation" rather than rational thought. (God knows, this is the story of 90% of the stupid things the US has done since at least getting involved in Vietnam.)
Meanwhile, kids, what's happening with the competition? Oh, TSMC, fresh from telling us that they've been in 10nm "risk" production for some months now (probably too soon for the A10, but likely ready for the A11 in 2017), said to shareholders today that they're starting 7nm risk production in 2H2017, with mass production hopefully by 2018. (So MAYBE soon enough to hit the A12...) Let the weeping and wailing begin about how TSMC sucks because their nanometers aren't the same as Intel nanometers, how Intel fabs are superior because reasons, etc. etc.
beginner99 - Wednesday, April 20, 2016 - link
Have you ever worked in a large corporation? If yes, you should know that laying off 10% could actually make the company perform better (more efficiently) if it's the correct 10% of people being laid off, like bureaucrats, HR, or other "employee-controlling positions".
hero4hire - Wednesday, April 20, 2016 - link
Except those "controlling positions" never go away. A large corp is political, and layoffs are the most political action of all. They clear-cut divisions, and the best go with the worst. If you're connected, you see it coming and transfer out. It's a fantasy that a company would perform better. It may become more efficient by having people work 2 or 3 roles for a time. Layoffs are not merit-based; they are cost-based. If you wanted to increase performance, you'd fire the low performers.
Kutark - Thursday, April 21, 2016 - link
You're seriously going to suggest that no major corporation has had layoffs and then growth afterwards? Obviously it's not a guarantee, but it does happen, and frequently. Sometimes companies get bloated over time and hire way more people than they need, or venture into markets they're not successful in, and eliminating divisions that are costing them money, or employees that may not be strictly necessary, can absolutely produce growth and profit.
Too many business owners and operators don't understand basic economics/business. One of my favorite examples was how much people hammered Carly Fiorina (not that I was a huge fan, but I give credit where it's due) by trying to say how "badly" HP was doing under her reign. What they don't realize is she took a company that was hemorrhaging $800M/yr and brought that down to only hemorrhaging $400M/yr. That's a GOOD thing. Could it have been better? Certainly.
Michael Bay - Wednesday, April 20, 2016 - link
>TSMC
>telling us
ABR - Wednesday, April 20, 2016 - link
A number of interesting points. On IoT, though, the numbers from their quarterly report back them up. Maybe not big numbers yet, but it's their fastest-growing area. They don't need so many employees to stay ahead of AMD anymore, so they're reloading to go after Qualcomm. And as for Xeon Phi, it's not just pride; they need to stay in the specialized-computing race that Nvidia is so focused on, which could deliver significant growth depending on how things go.
MrSpadge - Wednesday, April 20, 2016 - link
> Xeons? Core-m? mobile i3, i5, i7?
If you design the core once, it's comparatively easy to construct the derivatives. I'm sure they're not considering cutting any of them, unless they become very niche products.
Achtung_BG - Wednesday, April 20, 2016 - link
Intel is cutting more employees than AMD's entire staff...
bug77 - Wednesday, April 20, 2016 - link
Again with the "PC market is in decline" argument. Well, what do you expect if you've offered basically the same performance since 2011? Who's gonna part with $150-200 just to get a more capable IGP?
Krysto - Wednesday, April 20, 2016 - link
And that's the problem, isn't it? Intel can't justify its high prices anymore. AMD already beats it on price/performance against all Celerons and Pentiums, and even Core i3 if you count multi-threaded performance.
And that's without mentioning that the latest $30 high-end ARM chips are basically within 2x of the performance of Intel's $200 Core-M chips.
Krysto - Wednesday, April 20, 2016 - link
Err, meant to say within 2x of Core i5 or even Core i7. They're almost equal to Core M.
bug77 - Wednesday, April 20, 2016 - link
$200 for a mainstream chip doesn't sound like an awful lot to me. Plus, AMD is in the red, so their prices clearly don't cover their costs.
Kutark - Thursday, April 21, 2016 - link
"Basically the same performance."
So, let's see: I JUST upgraded from an i7-2600K build from Jan 2011 that I had OC'd to 4.3GHz, to an i7-6700K build that I have clocked at 4.4GHz, and my Firestrike scores on the exact same video card went from ~13.4k to ~16.7k... That's roughly a 25% increase, almost clock for clock. I haven't finished my OC'ing, but I'm already at a stable 4.6 and will work on 4.7 tonight. So there's that also.
Same video card.
I get what you're saying, that it's not like the past, where one generation could provide a 30-50% or more performance boost. However, saying it's "basically" the same performance is really just being intellectually dishonest.
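For what it's worth, here is the exact uplift implied by the quoted Firestrike scores (the scores themselves are the commenter's approximate figures):

```python
# Generational uplift implied by the Firestrike scores quoted above.
sandy_bridge = 13_400  # i7-2600K @ 4.3GHz (approximate score)
skylake = 16_700       # i7-6700K @ 4.4GHz (approximate score)

uplift = (skylake - sandy_bridge) / sandy_bridge
print(f"{uplift:.1%}")  # → 24.6%
```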
Shadowmaster625 - Wednesday, April 20, 2016 - link
They need to fire all of their integrated graphics group. And stop wasting silicon on it.
SirMaster - Wednesday, April 20, 2016 - link
What? Then how will all the tablets and most laptops get video output?
A discrete GPU from Nvidia or AMD? Sounds like extra complexity and additional cost to me.
knightspawn1138 - Wednesday, April 20, 2016 - link
If an integrated video chipset on the CPU were such a great advantage, then AMD would be in the majority of Windows tablets and all-in-ones. But they're not, mostly because Intel chips can get the job done with a CPU/GPU that generates about 5W of heat. If AMD could put out a chip with decent performance at sub-10W, and match all of the features of Intel's chipsets (Thunderbolt, DDR4, huge number of PCIe lanes, etc.), then they would be competitive again. AMD needs a miracle, a successful die shrink, and a huge injection of cash in order to catch a glimpse of a chance of getting back on even footing with Intel. Right now, AMD has to be sweating bullets that ARM may move into the PC desktop space and ruin them at the low end.
AMD has Intel's rocket boots in its face in the high-end market, and ARM's fist ready to be planted firmly where the sun don't shine in the low end of the market. If it weren't for GPUs and game consoles, AMD wouldn't have anything to protect it. Ever seen a smartphone or tablet running an AMD chip? Me neither.
Dug - Thursday, April 21, 2016 - link
I agree. I can't think of anyone who won't spend a few more bucks for efficiency, battery life, low heat, etc. Not sure how AMD is relevant anymore except in video cards. We certainly don't go out of our way to save $200 on a $7500 server to go with AMD, with its lower performance, less driver support, fewer features, and higher heat output.
willis936 - Wednesday, April 20, 2016 - link
Guess I won't be applying to Intel after I finish this masters.
JoeyJoJo123 - Wednesday, April 20, 2016 - link
I'd keep it in mind, but yeah, no matter how good you are, chances are it's going to be incredibly difficult to get your foot in the door (particularly since they're downsizing) unless you have a number of internal connections in the company (and if that's the case, then I and many people hate you for approaching the job hiring process on the basis of who you know rather than what you know). In any case, I feel Intel and x86 are dying giants right now. The performance gains still to be wrung out of x86 processors with additional node shrinks and optimizations look bleak, and chances are Intel will fall behind once a new company with a new direction on future computing steps into the ring with advanced stuff like quantum computing, while Intel is a bit too busy and invested in x86 capital to really make strides into a different kind of computing paradigm.
LorinT - Wednesday, April 20, 2016 - link
This is no fault of Intel's; it's the result of fumbles by the bigger fish just up the food chain: the folks in Redmond. Four tiresome years of tiles, with the hope that the world would somehow think Microsoft's tablet was useful. I find both releases downright confusing. Total wreck of an operating system. The latest atrocity was to limit Windows 7 to run only on older hardware. Of course Intel can't sell new Skylake chips if Microsoft's only useful OS is crippled to run only on Broadwell and older. Let's hope that Apple's new product line gains some popularity, enough to let Intel recover a little from Ballmer's completely botched ideas.
bug77 - Wednesday, April 20, 2016 - link
Apple has less than 10% market share worldwide; they can't help Intel. But see my post above: all Intel has to do is come up with more powerful CPUs. Then programmers could build smarter software (or Microsoft could build a smarter Windows) and everyone would be interested in the PC again. Current software runs just fine on an i7-2600K, so why would people look into newer hardware?
Reflex - Wednesday, April 20, 2016 - link
Apple isn't even at 10%; it's about half that in the real world. Windows 10 is wildly popular. But there really has been no need for hardware upgrades in ten years now. You state that current software runs just fine on an i7-2600K; I'll go further: my 2007-era Core 2 Duo laptop from Dell runs Win10 just fine, along with all basic software. Even low-end games. The only upgrade it needed was an SSD. This is a major problem for Intel. It's becoming a problem for Qualcomm as well. The phone upgrade treadmill has slowed tremendously, and Qualcomm's latest results reflect that fact. We first reached a point of "good enough" with the hardware. After that it was inevitable that we'd reach a point of saturation, since the hardware itself lasts a long time. The only real growth remaining is in one of two areas:
1) Expansion into new markets (China, India, Africa, etc)
2) Creation of new device categories that consume CPUs. Think HoloLens.
Murloc - Wednesday, April 20, 2016 - link
Your personal lack of adaptability to the changes is informing your opinion. In reality, consumers don't care at all about that W7 limitation and have upgraded to W10 already, and most I see aren't complaining about it.
Corporate is different, given the support issues, but the fact is that my i5 750 works just fine and the only bottleneck I have is the GPU, so I'm sure office computers have even less need to upgrade.
svan1971 - Tuesday, May 3, 2016 - link
Fire the American workers to make room for those H-1B hires...