
Researchers Propose New Density Metric for Semiconductor Technology

Raevenlord

News Editor
In today's world, the fabrication process identifiers of semiconductor technology have become little more than marketing fluff. Whereas not that long ago fabrication processes could (mostly) be directly compared on the basis of transistor density (i.e., 300 nm, 32 nm, 14 nm, and now 7 nm), recent advances in manufacturing technology have drained the naming terminology of any significance when it comes to actually describing how dense a given process is. The way manufacturers measured semiconductor density used to refer directly to the minimum gate length of transistors fabricated in a given process - that is, in 300 nm, the minimum achievable gate length was 300 nanometers; in 32 nm, 32 nanometers; and so on. That is no longer the case - which is why we've got Intel saying that its 10 nm fabrication process will be comparable to TSMC's current 7 nm process.

This makes it difficult for interested parties to glean any meaningful information from this particular semiconductor metric alone. Now, a team of researchers has tackled the problem by suggesting a different way to express semiconductor manufacturing capability. Their intention is to allow observers to "gauge advances in future generations of semiconductor technologies in a holistic way, by accounting for the progress in logic, memory, and packaging/integration technologies simultaneously." As such, their proposed density metric follows a [DL, DM, DC] philosophy, where DL is the density of logic transistors (in #/mm²), DM is the bit density of main memory (currently the off-chip DRAM density, in #/mm²), and DC is the density of connections between the main memory and logic (in #/mm²). The researchers say that the current top semiconductor technology density available would be described by this system as [38M, 383M, 12K].
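For illustration only, here is a minimal sketch of how the proposed triplet could be represented in code. The DensityMetric type, its field names, and the _si formatting helper are our own inventions, not anything from the paper; the example values are the researchers' [38M, 383M, 12K] figure for current top technology.

```python
from dataclasses import dataclass


def _si(value: float) -> str:
    """Format a density with the K/M suffixes used in the article."""
    for threshold, suffix in ((1e9, "G"), (1e6, "M"), (1e3, "K")):
        if value >= threshold:
            return f"{value / threshold:.0f}{suffix}"
    return f"{value:.0f}"


@dataclass
class DensityMetric:
    """Hypothetical container for the proposed [DL, DM, DC] triplet (all per mm^2)."""
    d_logic: float   # DL: logic transistor density
    d_memory: float  # DM: main-memory (off-chip DRAM) bit density
    d_conn: float    # DC: logic-to-memory connection density

    def __str__(self) -> str:
        return f"[{_si(self.d_logic)}, {_si(self.d_memory)}, {_si(self.d_conn)}]"


# The researchers' figure for current top technology: [38M, 383M, 12K]
current = DensityMetric(d_logic=38e6, d_memory=383e6, d_conn=12e3)
print(current)  # -> [38M, 383M, 12K]
```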

The researchers include metrics such as DM and DC because current computer systems depend heavily on off-chip memory density, and I/O and actual logic/memory throughput are increasingly important in all computing scenarios (especially as computing performance increases). Furthermore, the researchers have noted comparable historical increases in logic transistor density and DRAM bit density, which lends weight to this new density metric. Remember that these values are meant to represent the maximum effective capability of a given manufacturing process. This means that for a next-generation process, Intel would report: the maximum transistor density, the result of an equation as simple as number of transistors divided by die area; the maximum available DRAM bit density of memory that can be paired with chips on this fabrication process (we expect this number to change across product portfolios in a given generation, should compatible DRAM density advancements occur within a given logic process's manufacturing lifetime); as well as the density of logic-to-memory connectivity.
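As a worked instance of that "number of transistors divided by die area" arithmetic, the sketch below computes DL for a made-up die. The function name and the transistor count/die area figures are hypothetical, chosen only so the result lands on the 38M/mm² value from the example triplet above.

```python
def logic_density(transistor_count: int, die_area_mm2: float) -> float:
    """DL as described above: logic transistors divided by die area (#/mm^2)."""
    return transistor_count / die_area_mm2


# Hypothetical die: 3.8 billion logic transistors on 100 mm^2
dl = logic_density(3_800_000_000, 100.0)
print(f"DL = {dl / 1e6:.0f}M transistors/mm^2")  # -> DL = 38M transistors/mm^2
```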



Considering the current naming-convention landscape for manufacturing process density is... filled with as many plot holes as some superhero franchises, this seems like a sensible way of actually forcing a level playing field between manufacturers. They keep their leeway in marketing terms to describe their processes as they want, but must also provide these density metrics for their process technologies. And since the metric refers to the maximum densities a process can support (when it comes to logic transistor and memory-connection densities), it ensures that consumers, institutions, and companies can actually look at the semiconductor landscape with a clear picture.

 
Yes, something like this is needed.

IEEE should handle this.
 
Definitely needed if processes are not comparable.
 
Well, actually, Intel is still the only one following that rule; it's the others who are not.
 
Definitely needed if processes are not comparable.


They aren't. I did just a bit of research on the topic and found that they probably can't describe their process tech with a single number in any case.

To give an example: you can have (relatively) low-power / low-performance builds - like phone SoCs use - and get one density, then use the same process tech with different 'toolkits' for high-power / high-performance designs and wind up with very different densities. The type of design also matters a lot: you can get very high density on something like SRAM or NAND and then wind up with poor density doing logic circuits.

And it gets a whole lot more complex than that. They probably need two different ratings with a category qualifier: one for simple repetitive designs like RAM/NAND, and another for complex gates. So you might have a rating that looks like 7N/16N/HP for a high-power/performance toolkit on a node that can achieve 7 nm with RAM/NAND and 16 nm with gate logic (a minimal sketch of that encoding follows below). This is what I suspect TSMC's first-gen 7 nm would look like, as the actual achieved transistor density of their '7 nm' is not much better than Intel's 14 nm.

Once you know that, it goes a long way toward understanding why AMD's products didn't match up well against Intel's 14 nm until 7 nm. TSMC's N7P (7+) does look to be better than Intel's 14 nm(++++). Point being, the current published process node numbers are 70% marketing.
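(A minimal sketch of how the commenter's suggested dual rating could be encoded; the NodeRating type and its field names are hypothetical, invented here for illustration, not anything the foundries publish.)

```python
from typing import NamedTuple


class NodeRating(NamedTuple):
    """Hypothetical encoding of the dual rating suggested above."""
    dense_nm: int  # equivalent node for simple repetitive structures (RAM/NAND)
    logic_nm: int  # equivalent node for complex gate logic
    library: str   # toolkit qualifier, e.g. "HP" for high power/performance

    def __str__(self) -> str:
        return f"{self.dense_nm}N/{self.logic_nm}N/{self.library}"


# The commenter's guess at what TSMC's first-gen 7 nm might look like:
print(NodeRating(dense_nm=7, logic_nm=16, library="HP"))  # -> 7N/16N/HP
```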
 
Let’s standardize power consumption and thermal metrics while we’re at it, too.
 
It took several researchers to figure this out? I thought it was common sense.
 
I think there was a power-to-performance metric in an explanation of cache comparison benchmarks. I forget the reference, but provided that you have the density to establish the same hit rate with a slower cache level, you get a win-win situation, as you can both increase density and power efficiency. Weird, but it holds true, since faster SRAM is less efficient.
 
I think there was a power-to-performance metric in an explanation of cache comparison benchmarks. I forget the reference, but provided that you have the density to establish the same hit rate with a slower cache level, you get a win-win situation, as you can both increase density and power efficiency. Weird, but it holds true, since faster SRAM is less efficient.
I don't think that's true. You may have the same hit rate, but the slower cache is still slower. You may be more power efficient, but you're also slower at the same time.
 
I don't think that's true. You may have the same hit rate, but the slower cache is still slower. You may be more power efficient, but you're also slower at the same time.
Slower caches might have more associativity? I don't know, got to find it.
 
It took several researchers to figure this out? I thought it was common sense.

The problem is readily apparent; the solution is not... as the not-so-consumer-friendly three-number metric illustrates.
 
The problem is readily apparent; the solution is not.
What do you mean? Just measure how many things* you can squeeze onto a square millimeter. Done.

*things = whole, working transistors
 
What do you mean? Just measure how many things* you can squeeze onto a square millimeter. Done.

*things = whole, working transistors

What about when the same process can be configured in different ways with different densities, sometimes even within the same die, as has been happening for a while now?

Yeah, raw density is one thing, but that measures the chip, not the node.
 
What about when the same process can be configured in different ways with different densities, sometimes even within the same die, as has been happening for a while now?
Who cares about process? I was talking about the finished product.
 
Who cares about process? I was talking about the finished product.

People making finished products, generally.
 
Pointless exercise. Companies will always find a new way to overinflate the numbers, and now what, we're going to watch three numbers get inflated? NO thanks.

This stuff belongs in tech sheets, not the press.
 