# Cubic boron arsenide semiconductors



## KapiteinKoek007 (Aug 3, 2022)

I just read an article on Tom's Hardware about researchers finding a promising new kind of semiconductor, composition in the title. It promises a big improvement: lower temperatures, better electrical conductivity, and improved performance. Silicon just has really bad thermal conductivity; that's why nodes keep getting smaller and smaller but with way more transistors than the previous-gen nodes, thus improved performance with lower power required, thus lower thermals.
But I assume everything has a limit. The smallest node currently is 18A, or 1.8nm; I think Samsung has the current lead in terms of the smallest node.
The article does go on to say it will probably take decades to replace silicon as the industry standard. Then there's quantum computing, which could outclass any classical binary computing system by orders of magnitude in raw computing power.
But nothing in our current digital world is ready for quantum computing, security being the biggest problem. If a quantum computer can crack 2048-bit satellite encryption in seconds, that's a big, big problem. But I digress.

Do you guys think this new type of semiconductor will be the next silicon for classical computing systems? And if so, would it really take decades for it to be commercially available?


----------



## lexluthermiester (Aug 3, 2022)

KapiteinKoek007 said:


> Then there's quantum computing, which could outclass any classical binary computing system by orders of magnitude in raw computing power.


That is a myth. Quantum computers do some kinds of work very fast, but they will never replace traditional computers. Even if someone comes up with a room-temperature qubit IC, it still will not replace transistors. They don't work the same way.



KapiteinKoek007 said:


> Do you guys think this new type of semiconductor will be the next silicon for classical computing systems?


Who knows. Some thought Gallium Arsenide would be the silicon replacement, then Gallium Nitride, and so on. We need something, though. Silicon is reaching its limits very quickly.


----------



## KapiteinKoek007 (Aug 3, 2022)

lexluthermiester said:


> That is a myth. Quantum computers do some kinds of work very fast, but they will never replace traditional computers. Even if someone comes up with a room-temperature qubit IC, it still will not replace transistors. They don't work the same way.


They don't work the same way because no one has written a foundational algorithm yet. BMW, for example, used its own hybrid D-Wave quantum system to calculate the best sensor locations in its vehicles, a task that reportedly would have taken about 70 times longer on a classical supercomputer. But if IBM already has a 127-qubit processor (albeit not at room temperature), they are going to replace classical systems. China's Zuchongzhi 2.1 performs quantum computations 10 million times faster than Google's Sycamore, and Sycamore did a task in 2019 in *200 seconds* that Google claimed would take a conventional binary system *10,000* years to complete.
Yes, there is no practical use for quantum computing yet, so all that computing power is useless for the most part right now. But how much practical use was there for binary computing back in 1945 when the ENIAC was created? I'm guessing not a lot. It's a matter of time for quantum computing. They will replace classical systems; they just have to find a way to make them work at room temperature.


lexluthermiester said:


> Who knows. Some thought Gallium Arsenide would be the silicon replacement, then Gallium Nitride, and so on. We need something, though. Silicon is reaching its limits very quickly.


Yeah, I assume TSMC, Intel, Qualcomm, Micron, Samsung, and all the major chip makers have R&D departments that probably have dozens more semiconductor candidates lined up.


----------



## joemama (Aug 3, 2022)

Just reading from Wikipedia: "Chemical synthesis of cubic BAs is very challenging and its single crystal forms usually have defects". Looks like the defects are a big blocker for it.
With large numbers of defects, integrated circuits at small dimensions won't be possible; this is also the issue for gallium arsenide semiconductors.
Unless we some day develop a new process to synthesize BAs with minimal defects, it won't be able to replace silicon.


----------



## KapiteinKoek007 (Aug 3, 2022)

joemama said:


> Just reading from Wikipedia: "Chemical synthesis of cubic BAs is very challenging and its single crystal forms usually have defects". Looks like the defects are a big blocker for it.
> With large numbers of defects, integrated circuits at small dimensions won't be possible; this is also the issue for gallium arsenide semiconductors.
> Unless we some day develop a new process to synthesize BAs with minimal defects, it won't be able to replace silicon.


Of course, but you're describing production problems and the practical use of the semiconductor. Those are challenges that can be overcome; silicon was far from perfect when the Micral N launched back in the '70s. I'm guessing the increasing silicon shortages will probably motivate the private sector to innovate.


----------



## Wirko (Aug 3, 2022)

KapiteinKoek007 said:


> I just read an article on Tom's Hardware about researchers finding a promising new kind of semiconductor, composition in the title. It promises a big improvement: lower temperatures, better electrical conductivity, and improved performance. Silicon just has really bad thermal conductivity; that's why nodes keep getting smaller and smaller but with way more transistors than the previous-gen nodes, thus improved performance with lower power required, thus lower thermals.


The thermal conductivity of silicon sure is an issue, but I don't think it's a limiting factor yet. Even if it _conducted_ heat with zero thermal resistance, transistors would still _generate_ as much heat and therefore consume as much electrical power. We'd still need monstrous cooling systems and equally monstrous power delivery - just think about the Cerebras whole-wafer computer. We'd still be dealing with the fact that some power is lost even before it reaches the transistors - in the PCB, in the tracks on the substrate, and in the metal layers on the chip. A significant part of the power, I believe, but I can't find any hard data.


KapiteinKoek007 said:


> But I assume everything has a limit. The smallest node currently is 18A, or 1.8nm; I think Samsung has the current lead in terms of the smallest node.


You understand that if you use this kind of nanometers consistently then Mr. Pat Gelsinger is about 8 inches tall, right?


KapiteinKoek007 said:


> Do you guys think this new type of semiconductor will be the next silicon for classical computing systems?


I think smart people will find materials and structures that are a little better. Or significantly better. Not fundamentally better. BAs is supposed to have much higher electron mobility than Si; that's good. Hole mobility equal to electron mobility is good too - it's a real technical issue in Si transistors, since you can't build perfectly symmetrical CMOS pairs because of that mismatch.


KapiteinKoek007 said:


> And if so, would it really take decades for it to be commercially available?


Yes. It doesn't seem like something that can bring prices down in, like, a decade. No matter what materials we use, and even if we employ nanotubes and nanosheets and whatnot, we only have one method that can mass-produce transistors: lithography, deposition, and chemical processing, each repeated many times over. That's how we manage to make a ~1 n$ (nanodollar) transistor, and I see no indication that it's going to become much cheaper; the price of materials is probably a small fraction of the total manufacturing cost.
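Quick napkin check of that nanodollar figure - every number below is a rough assumption of mine, not vendor data:

```python
# Back-of-envelope check of the "~1 nanodollar per transistor" figure.
# All inputs are rough, assumed ballpark values, not vendor data.
wafer_cost_usd = 15_000      # assumed price of one leading-edge 300 mm wafer
good_dies_per_wafer = 600    # assumed yielded dies of ~100 mm^2 each
transistors_per_die = 15e9   # assumed transistor count per die

transistors_per_wafer = good_dies_per_wafer * transistors_per_die
cost_per_transistor = wafer_cost_usd / transistors_per_wafer
print(f"~{cost_per_transistor * 1e9:.1f} nanodollars per transistor")
```

With these assumptions it lands around 1.7 nanodollars, so the order of magnitude holds, and the raw material cost barely registers at that scale.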


----------



## KapiteinKoek007 (Aug 3, 2022)

Wirko said:


> The thermal conductivity of silicon sure is an issue, but I don't think it's a limiting factor yet. Even if it _conducted_ heat with zero thermal resistance, transistors would still _generate_ as much heat and therefore consume as much electrical power. We'd still need monstrous cooling systems and equally monstrous power delivery - just think about the Cerebras whole-wafer computer. We'd still be dealing with the fact that some power is lost even before it reaches the transistors - in the PCB, in the tracks on the substrate, and in the metal layers on the chip. A significant part of the power, I believe, but I can't find any hard data.
> 
> You understand that if you use this kind of nanometers consistently then Mr. Pat Gelsinger is about 8 inches tall, right?
> 
> ...


Solid answer! Thanks. But I disagree with you saying that smaller nodes produce just as much heat per MHz as previous generations. It's a fact that the voltage needed to achieve the same frequency (and often with more cores) has dropped with each new generation. The fact that most CPUs today still generate massive amounts of heat is mainly because of the heat sinks, which on Intel haven't really changed a lot over the past few generations. Looking at AMD, their Zen 2/3 and now Zen 4 chips have heat sinks that evolved with the rest of the chip; Intel is lagging behind on this.


----------



## Wirko (Aug 3, 2022)

KapiteinKoek007 said:


> Yes, there is no practical use for quantum computing yet. So all that computing power is useless for the most part right now.


Even today it's pretty clear that quantum computers will be exceedingly good at simulating physical processes, which is what supercomputers and HPC clusters are often used for. That may include machine learning.


KapiteinKoek007 said:


> But how much practical use was there for binary computing back in 1945 when the ENIAC was created? I'm guessing not a lot. It's a matter of time for quantum computing.


Hm. That's not a good analogy. Mechanical and electromechanical digital computers were a common occurrence before electronic digital computers appeared. The theory and usability were well known and probably not horribly hard to grasp, even for an engineering or mathematics student at the time. Digital communication and storage were even more common - think Morse code, the telegraph, and music boxes. Quantum communications? Maybe I'm ignorant, but I'm not aware of even an attempt to make a quantum computer exchange quantum data (whatever that is) with another quantum computer. Yes, there are digital optical communications secured by quantum effects, but that's not the same.


----------



## TheoneandonlyMrK (Aug 3, 2022)

KapiteinKoek007 said:


> Solid answer! Thanks. But I disagree with you saying that smaller nodes produce just as much heat per MHz as previous generations. It's a fact that the voltage needed to achieve the same frequency (and often with more cores) has dropped with each new generation. The fact that most CPUs today still generate massive amounts of heat is mainly because of the heat sinks, which on Intel haven't really changed a lot over the past few generations. Looking at AMD, their Zen 2/3 and now Zen 4 chips have heat sinks that evolved with the rest of the chip; Intel is lagging behind on this.


The attached cooler has little influence on a chip's capacity to generate heat.
New, smaller nodes give off more heat IN the circuit simply because smaller transistors are more tightly arranged in an ever smaller package, usually with significantly more transistors each generation.
Dropping the voltage might help, but it cannot fully mitigate the heat-saturation issues that abound now.

There's just too much happening in too small a space.
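The standard dynamic-power relation, P ≈ α·N·C·V²·f, shows the tradeoff. Both "nodes" below are made-up illustrative numbers, not any real process:

```python
# Dynamic switching power: P = activity * N * C * V^2 * f.
# Both "nodes" below are invented for illustration, not real processes.
def total_power_w(n_transistors, activity, cap_f, volts, freq_hz):
    """Total switching power in watts for n_transistors."""
    return activity * n_transistors * cap_f * volts**2 * freq_hz

# Older node: fewer transistors, higher voltage, bigger die (150 mm^2).
old_p = total_power_w(2e9, 0.01, 1e-15, 1.2, 4e9)    # 115.2 W
old_density = old_p / 150

# Newer node: lower C and V per transistor, but twice the
# transistors packed into a smaller die (100 mm^2).
new_p = total_power_w(4e9, 0.01, 0.6e-15, 1.0, 4e9)  # 96.0 W
new_density = new_p / 100

print(f"old: {old_p:.0f} W, {old_density:.2f} W/mm^2")
print(f"new: {new_p:.0f} W, {new_density:.2f} W/mm^2")
```

Note that total power actually drops here, yet power per square millimetre still rises - that's the heat-saturation point: more transistors, less space = more heat per area.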


----------



## bug (Aug 3, 2022)

When you design transistors 5nm big*, that's like 25 Si atoms side by side. When you're looking at compounds, there are even fewer of them. So the alternative needs to be a good deal better than Si if it is to work with fewer atoms.
Also keep in mind that, for 100 things that work in a lab, there's maybe 1 product out there built on an industrial scale.

*Good thing transistors aren't actually 5nm yet; only some of their features are.
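Napkin math, using the textbook Si-Si bond length:

```python
# How many silicon atoms fit side by side across a 5 nm feature?
si_bond_length_nm = 0.235   # Si-Si nearest-neighbour distance (textbook value)
feature_nm = 5.0

atoms_across = feature_nm / si_bond_length_nm
print(f"~{atoms_across:.0f} atoms across {feature_nm} nm")   # ~21
```

That lands in the low twenties - same ballpark as the ~25 above. Either way, there are frighteningly few atoms to work with.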


----------



## qubit (Aug 3, 2022)

KapiteinKoek007 said:


> But nothing in our current digital world is ready for quantum computing, security being the biggest problem


Oh, I am.


----------



## Shrek (Aug 3, 2022)

KapiteinKoek007 said:


> Then there's quantum computing, which could outclass any classical binary computing system by orders of magnitude in raw computing power.



There are very few things a quantum computer can do exponentially faster, and even then, without error correction it will not scale.

Grover's search algorithm is generic, but not exponentially faster.
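To put numbers on it (idealized oracle-query counts, ignoring error correction entirely):

```python
import math

# Unstructured search over N items:
#   classical: ~N/2 expected oracle queries (linear scan)
#   Grover:    ~(pi/4) * sqrt(N) queries -- a quadratic, not exponential, speedup
def classical_queries(n):
    return n / 2

def grover_queries(n):
    return (math.pi / 4) * math.sqrt(n)

n = 2**40  # roughly a trillion items
print(f"classical: ~{classical_queries(n):.2e} queries")
print(f"Grover:    ~{grover_queries(n):.2e} queries")
```

A huge win, but doubling the search space still doubles the classical cost while only multiplying Grover's by √2. An exponential speedup (like Shor's algorithm for factoring) shrinks the exponent itself, and there are very few problems known to admit that.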


----------



## qubit (Aug 3, 2022)

Shrek said:


> There are very few things a quantum computer can do faster, and even then, without error correction it will not scale.


The technology is still in its infancy; have patience and give it time. It really will live up to the promises when it matures.

I read an article only the other day about research scientists making great advances on the decoherence problem, essentially extending coherence from something like half a second to the full 5.5-second duration of the experiment. If I find the article I'll post it here.


----------



## Shrek (Aug 3, 2022)

Very impressive, but I remain skeptical.

Here is why: there is an error-correcting scheme by Shor, but it presupposes that the supporting qubits are error-free. I find this a mathematical contradiction: to admit errors, but not in all qubits involved.


----------



## qubit (Aug 3, 2022)

Shrek said:


> Very impressive, but I remain skeptical.
> 
> Here is why: there is an error-correcting scheme by Shor, but it presupposes that the supporting qubits are error-free. I find this a mathematical contradiction: to admit errors, but not in all qubits involved.


You have to read the article to understand how. It's new research and hard to remember the details off the top of my head, but I seem to remember it used extra "sacrificial" qubits to increase the coherence time. It certainly didn't violate any laws of physics.


----------



## Shrek (Aug 3, 2022)

I should have been more careful: I am not skeptical about the increased coherence time, but rather about what it is aimed at, namely error-free calculation. There is a reason analog computing is all but dead (error accumulation).

But this is off topic; perhaps there should be a thread on quantum computing?


----------



## TheoneandonlyMrK (Aug 3, 2022)

GaN found its place and perfect commercial applications, and it is in use today. Hopefully they are quicker with this one, though.


----------



## Shrek (Aug 3, 2022)

Did it? In the sense that it is good for power ICs and blue LEDs, but not so much for high-density integrated circuits.

Correct me if I am wrong.


----------



## TheoneandonlyMrK (Aug 3, 2022)

Shrek said:


> Did it? In the sense that it is good for power ICs and blue LEDs, but not so much for high-density integrated circuits.
> 
> Correct me if I am wrong.


I said it has its place, not that it took over the world and is competing with silicon.

And it does, in the right use case it makes silicon look bad.


----------



## DeathtoGnomes (Aug 3, 2022)

lexluthermiester said:


> Silicon is reaching its limits very quickly.


including supply.



Shrek said:


> ?


This article talks about silicon output dropping significantly, causing a potential shortage. There are other sources talking about this; any shortage will be because of greater demand versus mining production.






Silicon Shortage May Intensify in 2022 amid Low Supply - SMM | Shanghai Non-ferrous Metals (news.metal.com):

> From January to September of 2021, the output of silicon metal increased by 39% year-on-year, and the annual output is expected to reach a record high of 2.78 million mt.

----------



## Shrek (Aug 3, 2022)

Ah, due to a shortage of energy to process it; it is after all the main component of sand.


----------



## DeathtoGnomes (Aug 3, 2022)

Shrek said:


> Ah, due to a shortage of energy to process it; it is after all the main component of sand.


Yeah, and its costs will escalate high enough to make it necessary to put more effort into mass-producing other materials, as was mentioned already.


----------



## lexluthermiester (Aug 4, 2022)

KapiteinKoek007 said:


> Yeah, I assume TSMC, Intel, Qualcomm, Micron, Samsung, and all the major chip makers have R&D departments that probably have dozens more semiconductor candidates lined up.


Most do, but it's slow going because most do not see the writing on the wall yet. My guess is that whatever it is will have some form of Arsenide as a substrate material.



DeathtoGnomes said:


> This article about Silicone output dropping significantly causing a potential shortage.


You meant Silicon..


----------



## DeathtoGnomes (Aug 4, 2022)

lexluthermiester said:


> You meant Silicon..


Fixed. Gee, wonder where my mind was at...


----------



## lexluthermiester (Aug 4, 2022)

DeathtoGnomes said:


> fixed. Gee wonder where my mind was at...


It's all good, no worries.


----------



## Wirko (Aug 4, 2022)

KapiteinKoek007 said:


> Yeah, I assume TSMC, Intel, Qualcomm, Micron, Samsung, and all the major chip makers have R&D departments that probably have dozens more semiconductor candidates lined up.


That mix of companies probably looks a little different: think TSMC, Intel, Samsung, IBM, imec (woohoo, a European one), Lam Research, Applied Materials. So, chip makers and manufacturing-equipment makers, but not fabless chip designers. I also suspect other companies on this list got at least some patents from doing semiconductor research.


----------



## KapiteinKoek007 (Aug 4, 2022)

TheoneandonlyMrK said:


> The cooler attached has little influence in the capacity of a chip to heat it.
> New smaller nodes give off more heat IN the circuit simply because of smaller transistors being more tightly arranged in a ever smaller package, and usually significantly more transistors each generation.
> Dropping the volts used might help, but it cannot fully mitigate the heat saturation issues that abound now.
> 
> There's just too much happening in too small a space.


I wasn't talking about the heatsink of a cooler, but the integrated heat spreader of the CPU itself. That design hasn't changed a lot on Intel, especially in the last few years when they were pushing everything out of that 14nm node: increased voltages on a chip that hasn't had any real architectural changes in years, and the same goes for the IHS, because really, an 11th-gen CPU is basically Broadwell on steroids. Broadwell released in 2014, and only some i3s (Cannon Lake, for example) had a 10nm node. The rest of the i5/i7/i9 lineup from 2014 up until about 2020 saw no real architectural changes (IHS included) since the 5th-gen Broadwell release.


----------



## TheoneandonlyMrK (Aug 4, 2022)

KapiteinKoek007 said:


> I wasn't talking about the heatsink of a cooler, but the integrated heat spreader of the CPU itself. That design hasn't changed a lot on Intel, especially in the last few years when they were pushing everything out of that 14nm node: increased voltages on a chip that hasn't had any real architectural changes in years, and the same goes for the IHS, because really, an 11th-gen CPU is basically Broadwell on steroids. Broadwell released in 2014, and only some i3s (Cannon Lake, for example) had a 10nm node. The rest of the i5/i7/i9 lineup from 2014 up until about 2020 saw no real architectural changes (heatsink included) since the 5th-gen Broadwell release.


So, that's still a thermal-dissipation device, and it is irrelevant to why modern chips are hotter.
My point stands: more transistors, less space = more heat.

What you're talking about might be true - no change - but it's not responsible for the core heat issues.

Yes, it might mitigate the issues, but it didn't cause them.


----------



## KapiteinKoek007 (Aug 4, 2022)

TheoneandonlyMrK said:


> So, that's still a thermal-dissipation device, and it is irrelevant to why modern chips are hotter.
> My point stands: more transistors, less space = more heat.
> 
> What you're talking about might be true - no change - but it's not responsible for the core heat issues.
> ...


Thermal dissipation is completely relevant for the IHS. If they were originally designed for 4 cores at 4 GHz and are now at 8 cores and 5.3 GHz (11th-gen i9), with a big bump in voltage as well, on basically the same chip and IHS - that's why Intel had to make the switch to 10nm for Alder Lake, because they saw they had already pushed everything out of that outdated 14nm node.
More transistors on the die means it requires less voltage to do the same job, which translates to less heat, but they kept adding more cores and kept upping the default clock speeds and voltages on a chip that wasn't originally designed for those speeds and core counts.


----------



## TheoneandonlyMrK (Aug 4, 2022)

KapiteinKoek007 said:


> Thermal dissipation is completely relevant for the IHS. If they were originally designed for 4 cores at 4 GHz and are now at 8 cores and 5.3 GHz (11th-gen i9), with a big bump in voltage as well, on basically the same chip and IHS - that's why Intel had to make the switch to 10nm for Alder Lake, because they saw they had already pushed everything out of that outdated 14nm node.
> More transistors on the die means it requires less voltage to do the same job, which translates to less heat, but they kept adding more cores and kept upping the default clock speeds and voltages on a chip that wasn't originally designed for those speeds and core counts.


You believe what you want, but nothing you've said counters what I said so far.

The IHS has also changed, as has the socket, since the quad-core days - just not enough for you, apparently - but it still isn't the cause.

Look at delidded chips on water blocks: while temperature is reduced, it's still hard to keep them cool because the die is denser with transistors. Simple.

But like I said, you believe what you want.


----------



## Assimilator (Aug 6, 2022)

All of the current alternatives to silicon are of little more than scientific interest, because silicon has been around for so long, is so well understood, manufacturing and using it in microprocessors is such a massive industry, and it's still currently Good Enough. Transitioning to something like cubic boron arsenide, or carbon nanotubes, or something else will require an incredible paradigm shift in both microprocessor design and fabrication. Guaranteed there are research departments in universities and companies figuring all this out, but much like alternatives to lithium-ion batteries, we won't see them anytime soon.


----------



## Shrek (Aug 6, 2022)

I believe Germanium (which I seem to recall came before Silicon) is still of use for high-speed applications:

Germanium Can Take Transistors Where Silicon Can’t - IEEE Spectrum


----------



## DeathtoGnomes (Aug 7, 2022)

Shrek said:


> I believe Germanium (which I seem to recall came before Silicon) is still of use for high-speed applications:
> 
> Germanium Can Take Transistors Where Silicon Can’t - IEEE Spectrum

New Germanium-Based Material Could Replace Silicon for Electronics (spectrum.ieee.org):

> A half-century after it was supplanted by silicon, germanium may push silicon—and graphene—aside

It talks about better stability than it had prior to 1960.


----------



## lexluthermiester (Aug 7, 2022)

DeathtoGnomes said:


> New Germanium-Based Material Could Replace Silicon for Electronics
> 
> A half-century after it was supplanted by silicon, germanium may push silicon—and graphene—aside
> ...


The compound in question, germanane, is the actual material proposed as a replacement for silicon. It looks promising, but there are challenges to production. For one, Germanium is not as common as Silicon, which makes mining and manufacturing less than ideal.


----------



## R-T-B (Aug 7, 2022)

lexluthermiester said:


> That is a myth. Quantum computers do some kinds of work very fast, but they will never replace traditional computers. Even if someone comes up with a room-temperature qubit IC, it still will not replace transistors. They don't work the same way.


That, and you can generate quantum hardened encryption (that quantum computers can't deal with) on a classical computer too.


----------



## Shrek (Aug 7, 2022)

R-T-B said:


> That, and you can generate quantum hardened encryption (that quantum computers can't deal with) on a classical computer too.



How so? Quantum encryption is commercially available, but it needs more than a classical computer.

But now I'm off topic.


----------

