
Are Intel's days numbered?

Posted: Sat Dec 22, 2018 11:22 am
by Dazbobaby
Last month Amazon announced its intention to move away from Intel's x86 to ARM for its AWS servers. Amazon's servers are a big deal, but they're still only a small player in the greater scheme of things. But it IS a growing trend.
https://www.theregister.co.uk/2018/11/2 ... ton_specs/

Now ARM servers are nothing new really; they've been around for a few years, mostly as very niche products: low power, low compute, but high core and thread counts. The really big deal with ARM is the power consumption. Compared to Intel, the CPU draws around 75% less power, roughly 7 W to 15 W versus Intel's 30 W to 60 W. For a data centre the energy cost saving will be millions of £/$ per year: not only do you immediately need less electricity to power a server, you also need less energy to cool all those beautiful servers.
The main reason you see ARM CPUs and GPUs in mobile devices is the power saving. It simply means your battery lasts longer and the device generates less heat.
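
To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. The wattages are the figures above; the fleet size, electricity price and cooling overhead (PUE) are just assumptions I've picked for illustration, not real AWS numbers:

[code]
# Back-of-the-envelope energy cost comparison. The per-CPU wattages are the
# figures quoted above; the fleet size, electricity price and cooling
# overhead (PUE) are assumptions picked purely for illustration.

HOURS_PER_YEAR = 24 * 365

def annual_cost(cpu_watts, servers, price_per_kwh=0.12, pue=1.5):
    """Yearly electricity cost for a fleet, including cooling overhead (PUE)."""
    kwh = cpu_watts / 1000 * HOURS_PER_YEAR * servers * pue
    return kwh * price_per_kwh

servers = 100_000                     # hypothetical fleet size
x86 = annual_cost(45, servers)        # mid-point of the 30-60 W figure above
arm = annual_cost(11, servers)        # mid-point of the 7-15 W figure above

print(f"x86 fleet: ${x86:,.0f}/year")
print(f"ARM fleet: ${arm:,.0f}/year")
print(f"Saving:    ${x86 - arm:,.0f}/year")
[/code]

Even with made-up inputs like these, the gap works out at several million per year, which is exactly the scale of saving a big operator cares about.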

But now there are even more compelling reasons to switch. Over the years Intel has simply brute-forced its way forward by creating smaller transistors, packing more of them into the same space, and ever increasing clock speeds. The one thing overlooked is the scalability of x86. Simply put, it scales very poorly compared to ARM. The practical limit for Intel CPUs on a single motherboard is around 4; once you start to add more, performance starts to go backwards. This is one reason why most Intel server motherboards support a maximum of 2 CPUs: performance per watt takes a serious nosedive after 2, and 3 or 4 CPUs only bring marginal speed improvements while power consumption quadruples.
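
To show the shape of that curve, here's a toy model of performance per watt as you add sockets. The throughput numbers are assumptions I've made up to illustrate diminishing returns, not benchmarks:

[code]
# Toy model of multi-socket scaling. The throughput figures are illustrative
# assumptions (each extra socket adds less due to interconnect and cache
# coherency overhead), not measurements; power is assumed to grow linearly.

SOCKET_POWER_W = 150                                 # assumed power draw per socket
THROUGHPUT = {1: 1.00, 2: 1.85, 3: 2.30, 4: 2.55}    # assumed relative performance

for sockets, perf in THROUGHPUT.items():
    power = sockets * SOCKET_POWER_W
    print(f"{sockets} socket(s): perf={perf:.2f}x, power={power} W, "
          f"perf per watt={perf / power:.4f}")
[/code]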

ARM, on the other hand, can scale massively, with a theoretical limit of 65,535 CPUs on a single board. Of course that would be fecking enormous and impossible to build, but that's the limit per board, and just a little bit above the x86 limit of 4! Stack boards together and the upper limit is only matched by your imagination. Of course this all depends on the application: some tasks aren't parallelised and only run on a single core. But this is the age of AI, and like a human brain, parallelisation isn't just recommended, it's an absolute must, and the more cores/threads the better. Which is why some AI developers, Google for example, will use entire data centres just to teach and train these emerging artificial intelligences. And AI isn't the only thing that needs this kind of scalability: so does weather prediction, fluid dynamics, astronomy and space research, pharmaceutical work such as cancer research, and a whole hell of a lot more. The more cores and threads you throw at these problems, the faster they get solved.
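
The maths behind the "not everything parallelises" point is Amdahl's law, and a quick sketch shows why the parallel fraction of a workload matters so much. The fractions below are arbitrary examples, not measurements of any real workload:

[code]
# Amdahl's law: speedup on N cores = 1 / ((1 - p) + p / N), where p is the
# fraction of the work that can run in parallel. The values of p below are
# arbitrary examples.

def amdahl_speedup(p, cores):
    return 1 / ((1 - p) + p / cores)

for p in (0.50, 0.95, 0.999):
    for cores in (4, 64, 1024, 65536):
        print(f"p={p:.3f}, cores={cores:>6}: "
              f"speedup={amdahl_speedup(p, cores):,.1f}x")
[/code]

A half-serial workload tops out at about 2x no matter how many cores you throw at it, while a 99.9% parallel one keeps scaling into the hundreds, which is why the AI and simulation crowd want as many cores as they can get.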

This is all fine for servers, researchers and data centres, but useless for desktop applications.

For that we have AMD, and the new generation of desktop CPUs due out in February 2019 aims to pummel Intel further. While Intel languishes in 14 nm hell and struggles with 10 nm and 7 nm, AMD has already broken past these barriers, and we'll see for real what happens in just two months.
To a gamer, an x86 CPU is an essential part of a system; hell, for anyone on a desktop or laptop, Apple or PC, you need an x86. But even this traditional landscape is changing, and changing very quickly.
In the next iteration of its laptops and desktops, Apple is aiming to switch CPU architectures again, this time to ARM. Apple has used ARM in its mobile devices for years, and the Mac has changed architectures before, moving from PowerPC to Intel around 12 years ago. Now they're switching again: partly for cost, as ARM is significantly cheaper than Intel, but also for reduced power consumption (great for laptops) and reduced thermal output (great for system stability and longevity; heat kills electronics like nothing else). And it's not just Apple changing dance partners, so is Microsoft: their line of tablets and laptops will also embrace ARM. While on the surface it may seem like a good idea, it complicates matters, because whole systems have to be reprogrammed to work with a new CPU architecture. Even Windows has to be reworked to run on ARM. These are problems that can be solved, and Intel's woes with 10 nm and 7 nm designs can be overcome, but the speed restrictions mean Intel will no longer be able to brute-force its way past x86's performance limits. Moore's Law came to an end some years ago.

And you can bet that Apple will have the same problem switching from x86 to ARM, just like Microsoft. But it's not just these two behemoths. It's also the independent programmer, large teams, massive teams and even game developers who will have to re-learn the skills required to code for both ARM and x86 during the next few transition years. Once ARM takes hold in the server, desktop, mobile and embedded markets, making software for any device and making it cross-compatible will be much easier than it is now.
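
As a rough illustration of where that per-architecture work actually lives: interpreted or managed code mostly runs anywhere, but anything native has to be built and shipped separately for each CPU. Here's a small Python sketch; the library names are hypothetical placeholders, not real files:

[code]
# Sketch of why the transition still needs per-architecture work: the Python
# itself runs on either CPU, but any native component has to be built and
# shipped separately for each architecture. Library names are hypothetical.
import ctypes
import platform

NATIVE_LIBS = {                          # hypothetical per-architecture builds
    "x86_64": "libfastmath-x86_64.so",
    "aarch64": "libfastmath-aarch64.so",
}

def load_native_library():
    arch = platform.machine()            # e.g. "x86_64" on Intel/AMD, "aarch64" on ARM
    lib_name = NATIVE_LIBS.get(arch)
    if lib_name is None:
        raise RuntimeError(f"No native build available for {arch}")
    return ctypes.CDLL(lib_name)
[/code]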

The only benefit to sticking with x86 is the simple fact that while ARM can emulate x86, it's very, very poor at doing so. So backwards-compatibility problems will still need x86... but eventually even this will become less of a problem, and the need to pluck up the courage and switch will come one day.

But these problems aren't just Intel's problems, they're also AMD's. While AMD is mitigating the scalability issue with chiplets, the underlying problem is and always will be there. Still, 64 cores and 128 threads will help solve some of these issues.

There's a new ARM server out, and apparently it's awesome. It's still new and does suffer in some speed tests, but those can be improved with tweaking.


The source of my curiosity: #1


Adored TV: #2

Re: Are Intel's days numbered?

Posted: Sun Dec 23, 2018 12:29 pm
by SyM

Nah - too big lol