Sunday, May 21st 2023

Intel Exploring x86S Architecture, Envisions an Unadulterated 64-bit Future

Intel has published a detailed and extensive whitepaper on streamlining its CPU architecture, most notably by focusing on a purely 64-bit specification and consequently dropping legacy 32-bit (as well as 16-bit!) operating modes. Team Blue's key proposal states: "This whitepaper details the architectural enhancements and modifications that Intel is currently investigating for a 64-bit mode-only architecture referred to as x86S (for simplification). Intel is publishing this paper to solicit feedback from the ecosystem while exploring the benefits of extending the ISA transition to a 64-bit mode-only solution."

The paper provides a bit of background context: "Since its introduction over 20 years ago, the Intel 64 architecture became the dominant operating mode. As an example of this evolution, Microsoft stopped shipping the 32-bit version of their Windows 11 operating system. Intel firmware no longer supports non UEFI64 operating systems natively. 64-bit operating systems are the de facto standard today. They retain the ability to run 32-bit applications but have stopped supporting 16-bit applications natively. With this evolution, Intel believes there are opportunities for simplification in our hardware and software ecosystem."

The paper introduces a small flow diagram: "Certain legacy modes have little utility in modern operating systems besides bootstrapping the CPU into the 64-bit mode. It is worth asking the question, "Could these seldom used elements of the architecture be removed to simplify a 64-bit mode-only architecture?" The architecture proposed in this whitepaper completes the transition to a 64-bit architecture, removing some legacy modes."
Envisioning a Simplified Intel Architecture

How Would a 64-Bit Mode-Only Architecture Work?
Intel 64 architecture designs come out of reset in the same state as the original 8086 and require a series of code transitions to enter 64-bit mode. Once running, these modes are not used in modern applications or operating systems.

An exclusively 64-bit mode architecture will require 64-bit equivalents of technologies that currently run in either real mode or protected mode. For example:
  • Booting CPUs (SIPI) starts in real-address mode today and needs a 64-bit replacement. A direct 64-bit reset state eliminates the several stages of trampoline code to enter 64-bit operation.
  • Today, using 5-level pages requires disabling paging, which requires going back to unpaged legacy mode. In the proposed architecture, it is possible to switch to 5-level paging without leaving a paged mode.
These modifications can be implemented with straightforward enhancements to the system architecture affecting the operating system only.
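To make the boot-path change concrete, here is a minimal illustrative sketch (not real firmware code, and the mode names are simplified from the whitepaper's description): it models the chain of operating modes a current Intel 64 part walks through at reset versus the direct-to-64-bit reset state x86S proposes.

```python
# Toy model of the boot "trampoline" the x86S whitepaper wants to remove.
# A current Intel 64 CPU comes out of reset 8086-compatible and must be
# walked up through several modes; x86S would reset directly into 64-bit mode.

LEGACY_BOOT_PATH = [
    "real-address mode (16-bit, 8086-compatible reset state)",
    "protected mode (32-bit, paging off)",
    "long mode / 64-bit mode (paging on)",
]

X86S_BOOT_PATH = [
    "64-bit mode (paged, directly from reset)",
]

def trampoline_stages(path):
    """Number of mode transitions firmware performs before the OS runs in 64-bit mode."""
    return len(path) - 1

print(trampoline_stages(LEGACY_BOOT_PATH))  # 2 transitions today
print(trampoline_stages(X86S_BOOT_PATH))    # 0 under the proposal
```

The same shape applies to the 5-level-paging point above: today, changing paging depth means dropping all the way back to an unpaged legacy mode, whereas x86S would allow the switch without ever leaving a paged mode.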

What Would Be the Benefits of a 64-bit Mode-Only Architecture?
A 64-bit mode-only architecture removes some older appendages of the architecture, reducing the overall complexity of the software and hardware architecture. By exploring a 64-bit mode-only architecture, other changes that are aligned with modern software deployment could be made. These changes include:
  • Using the simplified segmentation model of 64-bit for segmentation support for 32-bit applications, matching what modern operating systems already use.
  • Removing ring 1 and 2 (which are unused by modern software) and obsolete segmentation features like gates.
  • Removing 16-bit addressing support.
  • Eliminating support for ring 3 I/O port accesses.
  • Eliminating string port I/O, which supported an obsolete CPU-driven I/O model.
  • Limiting local interrupt controller (APIC) use to X2APIC and removing legacy 8259 support.
  • Removing some unused operating system mode bits.
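The ring-removal item in the list above can be pictured with a toy model (an assumption-laden sketch, not Intel's specification): classic x86 defines privilege rings 0 through 3, but modern operating systems only use ring 0 (kernel) and ring 3 (user), which is what x86S would keep.

```python
# Toy model of x86 privilege rings before and after the proposed simplification.
X86_RINGS  = {0, 1, 2, 3}   # classic protected-mode privilege levels
X86S_RINGS = {0, 3}         # kernel and user only, per the whitepaper

def may_execute_privileged(cpl, rings):
    """True if code at current privilege level `cpl` may run privileged instructions."""
    if cpl not in rings:
        raise ValueError(f"ring {cpl} does not exist on this architecture")
    return cpl == min(rings)  # only the most-privileged ring qualifies

assert may_execute_privileged(0, X86S_RINGS)         # kernel still privileged
assert not may_execute_privileged(3, X86S_RINGS)     # user still unprivileged
assert not may_execute_privileged(1, X86_RINGS)      # ring 1 exists today, unprivileged

try:
    may_execute_privileged(1, X86S_RINGS)
except ValueError:
    pass  # ring 1 is simply gone under the proposal
```

Nothing modern software relies on changes in this model, which is exactly Intel's argument for removing rings 1 and 2.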
Legacy Operating Systems on 64-Bit Mode-Only Architecture
While running a legacy 64-bit operating system on top of a 64-bit mode-only CPU is not an explicit goal of this effort, the Intel architecture software ecosystem has matured sufficiently that virtualization-based software could use virtualization hardware (VMX) to emulate the features required to boot legacy operating systems.

Detailed Proposal for a 64-Bit Mode-Only Architecture
A proposal for a 64-bit mode-only architecture is available. It embodies the ideas outlined in this white paper. Intel is publishing this specification for the ecosystem to evaluate potential impacts to software.
The webpage introduction only serves as a simple primer on the topic of x86S - more technically-minded folks can take a look at the big whitepaper document (PDF) here.
Sources: Intel Articles, Phoronix

41 Comments on Intel Exploring x86S Architecture, Envisions an Unadulterated 64-bit Future

#1
Daven
I’m all for getting rid of legacy bloat and simplifying hardware and software implementations.
Posted on Reply
#2
TheLostSwede
News Editor
Makes sense for most things consumer at least, since none of us would be reliant on 16-bit or 32-bit legacy operating modes. 32-bit software would still run fine in emulated mode in a 64-bit OS. This should hopefully lead to quicker boot times as well, as fewer things need to be initialized by the system.
Posted on Reply
#3
Beermotor
This didn't go well for them the last few times they tried this.
  • They came out with a 32-bit processor in the early 80s with a different instruction set than the x86 chips. Intel thought people would adopt it because it said "Intel" on the tin. Nobody did, and everyone kept using x86 because memory was still very expensive and boards for the 8088 (an 8086 with an 8-bit external bus) were far cheaper.
  • Intel released a 32-bit backwards-compatible x86 processor in 1985 with all kinds of advanced (for the time) features like virtual memory, a flat(ter) memory model, and hardware virtualization of 8086 code. People bought it and ran DOS on it.
  • In '95 Intel released the Pentium Pro and neglected 16-bit performance since they thought 32-bit was the future. Nobody bought it and it ended up relegated to workstations and servers, which was a much smaller market back then. They fixed this in the Pentium II, and they're still using that basic architecture 25 years later.
  • Around the same time they came up with a scheme to switch to 64 bits and drop x86 again. People would buy it because it said "Intel" on the heat-spreader. It took them years longer than anticipated, and by the time it came out it ran x86 code (in emulation) drastically slower than actual x86 chips on the market. AMD came out with amd64, which was backwards compatible but cleaned up the x86 architecture in 64-bit long mode. It completely killed the Itanic and forced Intel to start producing chips using amd64. They still claimed for years that this was just temporary and that Itanium performed better. It didn't.
I'm fully behind getting rid of tech debt, but Intel has a long history of failing to read the room and getting the timing right. I'm thinking the timing might be right in the late 2040s-early 2050s.
Posted on Reply
#4
theGryphon
BeermotorThis didn't go well for them the last few times they tried this. I'm thinking the timing might be right in the late 2040s-early 2050s.
What else do you see happening late 2040s-early 2050s? Just so I plan ahead...
Posted on Reply
#5
Tomorrow
BeermotorI'm fully behind getting rid of tech debt, but Intel has a long history of failing to read the room and getting the timing right.
Most recent example is ATX12VO. A needlessly complex scheme that would split up one simple 24pin connector to multiple ones that take more space and in the process makes motherboards more expensive and easier to break due to added 3v and 5v processing that is currently done on PSU's. It was proposed years ago and aside from some prototypes, has gone nowhere even on OEM systems.
Posted on Reply
#6
Xajel
I like it, 32-bit has become rare to the point that only limited hardware still uses such systems (embedded CPUs and OSes).

But they'll have to work with at least AMD, IBM, Microsoft and the Linux community for this to work properly. This is not a simple x86 extension anymore like SSE or AVX, and to be "simple" as they call it, Intel and AMD must at least agree on the basic paths, especially 16-bit and 32-bit virtualization; it won't be simple anymore if it requires separate code paths for AMD and Intel systems.

x86 is old, and has so many legacy features that it is slow and inefficient compared to modern alternatives (Arm, RISC-V).
Posted on Reply
#7
Denver
Xajelx86 is old, and has so many legacy features that it is slow and inefficient compared to modern alternatives (Arm, RISC-V).
Elaborate for us... How are the world's fastest CPUs "slow and inefficient"? It will hardly bring any advantage beyond making a CPU design a little bit leaner.

It's a similar situation to Apple, Samsung and others removing the P2 connector, charger, etc. It simplifies things and saves space and a few cents.
Posted on Reply
#8
mechtech
Makes sense (at least in consumer space and maybe more).............probably made sense almost 10 years ago.

Would be nice if all software was good for 4 thread minimum and 64-bit as well...........

Maybe in my lifetime.....................hopefully..............
Posted on Reply
#9
_Flare
Reminds me of this 2016 rumor: www.bitsandchips.it/english-news/rumor-even-intel-is-studying-a-new-x86-uarch
Facts and data were sparse 6.5 years ago, and it is true that the first SSE extensions were not designed with a 64-bit OS as their base.
So that rumor didn't have the same specifics as what Intel has now released, but it looks related to me nevertheless.
Additionally, over time Tiger Lake, Alder Lake, etc. seem related to one another in an evolutionary way, maybe until the Royal cores are released, where Jim Keller had his hands in the game.
And Jim Keller imho stands for revolution instead of tiny-stepped evolution.
Posted on Reply
#10
R-T-B
DenverElaborate for us... How are the world's fastest CPUs "slow and inefficient"?
The ISA is, not the CPUs. They could even be faster, is the way to look at it. These old features waste die space that could be used for better things.
Posted on Reply
#11
TheoneandonlyMrK
Sounds like a side angle to leverage Pluton-like security nonsense to me, I dunno.
Plus the obligatory forced-upgrade choke slam.
Posted on Reply
#12
LabRat 891
Get a load of this wankery:
Since its introduction over 20 years ago, the Intel 64 architecture became the dominant operating mode.
EM64T (and descended) are Intel-implementations of AMD64; because Itanium64 was a spectacular garbage fire.

(To the point that the mere association of Itanium's reputation and "64-bit" actually harmed A64 sales. It was a common misconception that "64-bit" processors were "not compatible" or "no performance benefit", thanks to Itanium)

Their statement is factually true, when dissected.
However, it makes it sound like an Intel-created technology is market-dominant. It's not saying that at all; all it says is
"the majority of Intel processors operate in 64-bit mode"

Well, no sh*t, Sherlock!
Posted on Reply
#13
trsttte
TomorrowMost recent example is ATX12VO. A needlessly complex scheme
ATX12VO is actually much simpler than regular ATX and I think it was a good change that will be a hard requirement to meet power efficiency targets eventually. It's just that other partners need to get onboard and, well, they didn't. Another example of failing to read the room I guess.

Yes, it does move complexity from the PSU to the Motherboard but that's not a bad thing, the motherboard can better manage power requirements relative to how many devices are/aren't connected and gets a better opportunity to offer nice to have features like higher power USB ports.

The PSU gets to be more of what it already is, a simple, dumb and reliable brick.
Posted on Reply
#14
TumbleGeorge
In the case of Itanium, Intel wanted to carve out a niche market with a specific proprietary architecture to milk from. In the case of the present work, I am not sure that there is a coincidence of intentions. I think that not only Intel would benefit from the change. Avoiding unnecessary steps in the organization of processes of jumping between modes will be a gain for users as well. Otherwise, the savings in the area of the logic in the cores is so insignificant that it hardly makes sense to list it as a motive.
Posted on Reply
#15
chrcoluk
As long as emulated mode has no notable performance or compatibility costs then it's fine, it's progress. But if it does, it will be controversial, as 32-bit is still commonly used to compile applications, and of course legacy stuff wouldn't be updated, such as older games.

16-bit is already emulated successfully, I believe.
Posted on Reply
#16
Vayra86
Single digit % benefits is where Intel is desperately looking to gain x86 leverage. Is it even more than 2-3% I wonder.

It can't be more, I mean, can anyone give any insight on the actual cost on a die/ISA for keeping this intact?
TheoneandonlyMrKSounds like a side angle to leverage pluto like security nonesense to me I dunno.
Plus the obligatory upgrade force choke slam.
Yeah
Intel didn't do this in the ten plus years they had a tech leadership position, curiously.
Posted on Reply
#17
Denver
Vayra86It can't be more, I mean, can anyone give any insight on the actual cost on a die/ISA for keeping this intact?
Considering that legacy 32-bit and 16-bit instructions can make up about 10% to 20% of the total x86 instruction set, removing these instructions could potentially save about 1-2% of space on a modern CPU. Yeah, game changer...
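As a back-of-the-envelope sketch of that estimate (the input percentages are the assumptions stated above, not measured figures):

```python
# Rough arithmetic behind the "1-2% of die area" estimate. Both inputs are
# assumptions: the legacy share of the ISA and the share of the die devoted
# to decode/legacy support are taken from the discussion, not from Intel data.
legacy_share_of_isa = 0.15    # assume ~10-20% of instructions are legacy-only
decoder_share_of_die = 0.10   # assume decode/legacy support is ~10% of the die

die_area_saved = legacy_share_of_isa * decoder_share_of_die
print(f"{die_area_saved:.1%}")  # 1.5%
```

Even with generous inputs, the product stays in the low single digits, which is the point being made.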
Posted on Reply
#18
Wirko
Intel seems unable to describe in simple terms what they're up to, so it's too easy for us to make wrong assumptions. The 32-bit mode is not going anywhere, it's staying. Here's the big change: an Intel processor will no longer be able to boot a 32-bit OS except in a VM. But 32-bit application code will continue to execute natively, no virtualisation needed, let alone emulation.

My guess is that Intel found security vulnerabilities in 32-bit protected mode and related stuff, or may be expecting to find them in the future, so they will remove functionality that's no longer really needed, just in case.
Posted on Reply
#19
nageme
Haven't read the paper, but from the overview section titled "What Would Be the Benefits of a 64-bit Mode-Only Architecture?" the benefits are unclear. What's listed sounds totally insignificant as far as end-users are concerned, so I do wonder.
Denverlegacy 32-bit and 16-bit instructions can make up about 10% to 20% of the total x86 instruction set
Where's the info from?
removing these instructions could potentially save about 1-2% of space on a modern CPU. Yeah, game changer...
No idea, but if I had to guess I'd say much less than that. I'd guess simplifying the instruction decoder isn't the main benefit. Even if it were, the decoder doesn't take up much room. For example, it's difficult to find in this Intel 13th-gen floorplan (annotated by someone random, not sure how authoritative or accurate).
Posted on Reply
#20
The_Enigma
Since its introduction over 20 years ago, the Intel 64 architecture became the dominant operating mode.
Lol. Ok Intel. I guess if you say you invented it then it must be true huh? It's too bad people still remember your actual 64-bit attempt called Itanium.
Posted on Reply
#21
Denver
nagemeWhere's the info from?
I found it around:

"So, it was estimated that the Pentium used 30% of its transistors to support the x86 ISA. Since the x86 ISA and support hardware remained relatively constant, by the Pentium 4 era, x86 support was estimated to account for 10% of the transistor count.

Ars Technica's Jon Stokes touches on this x86-to-RISC decoding cost in The Pentium: An Architectural History of the World's Most Famous Desktop Processor."
Posted on Reply
#23
Minus Infinity
The Itanic has been refloated.

Would be great to get rid of legacy bloat. I'm still reading plenty of whinging from people complaining that they have 25-year-old programs that simply wouldn't run if things went pure 64-bit, who seriously think those should even be a concern of CPU makers in 2023.
Posted on Reply
#24
Dr. Dro
I made a discussion thread about it a couple of days ago, it got buried on the other posts with seemingly little interest:

www.techpowerup.com/forums/threads/intel-proposes-x86-s-a-redux-of-the-x86-architecture.308873/

Some people seem to have voted negatively on my poll, but without an explanation as to why in the comments. I'm still quite interested in the subject, so feel free to pitch in either on my thread or over here. It seems like one of the biggest changes in how traditional desktop processors work in quite some time.
nagemeNo idea, but if I had to guess I'd say much less than that. I'd guess simplifying the instruction decoder isn't the main benefit. Even if it were, the decoder doesn't take up much room. For example, it's difficult to find in this Intel 13th-gen floorplan (annotated by someone random, not sure how authoritative or accurate).
Yeah, I agree. But it's cleaning house so to speak, this is one of ARM's biggest advantages right now.
Posted on Reply
#25
lexluthermiester
Xajelx86 is old, and has so many legacy features that it is slow and inefficient compared to modern alternatives
That is nonsense. Legacy instructions work fine.
Posted on Reply