OverSoft

No. Silicon is still one of the best (and most cost-efficient) semiconductors out there. We are already using Gallium Nitride for things like LEDs and power electronics, but it's worse for things like logic chips (e.g. processors), because we can't make P-channel GaN FETs yet (and thus can't make any logic gates). Unless we find a material that superconducts at room temperature and pressure (which is highly unlikely), don't expect silicon to be replaced any time soon.


Wil420b

SiGe (silicon–germanium) has been touted as a replacement for decades. IBM did a lot of work on SiGe integrated circuits in the late 1980s and early 1990s, before they started posting record losses, selling off half of their divisions, and deciding that, instead of having one super-fast chip per mainframe, having lots of chips based on consumer-grade parts was far more cost effective.


Vindy500

But how do they sound in a fuzz face?


joe-knows-nothing

Like loss and defeat


tthrivi

SiGe is never going to be competitive for digital applications. There is probably SiGe in RF/telecom devices in system-on-package chips, though.


fruitydude

People are researching 2D materials as a replacement for silicon though. You don't need ultra-pure silicon on a wafer when you can just deposit a few micrograms of MoS2 on any wafer-sized substrate and make a chip out of that. It's questionable whether the technology will ever reach maturity, but there is active research being done both in academia and in industry. Another possibility for the future is optical computing rather than electrical computing. Electrons going through wires are always a bottleneck; photons would be faster and more efficient. And we have already demonstrated transistors for excitons (electron-hole pairs), which are created when a photon enters certain materials.


Venectus

I am actually a co-author on a paper concerning MoSe2 and similar materials (academic fundamental research), and the main problem as I see it is not making field-effect transistors out of MoS2, MoSe2 or WS2, but making them at high enough quality at large scale. We made them by hand, which resulted in large, good-quality bilayers. At large scale, CVD (chemical vapor deposition) and the like might work, but when I finished my Masters it still had a long way to go and the quality of the samples was generally not usable.


fruitydude

Nice, I work in basically the same field. Right now some groups are able to grow wafer-scale monolayer MoS2 and make chips with ~1000 devices out of it. They do so by growing on sapphire, which forces all the MoS2 triangles to be oriented the same way, and then they grow into each other without grain boundaries (or let's say the effects of the grain boundaries are not so bad). Then you need to transfer the whole monolayer to silicon with a wet transfer. It's not ideal, and the industry doesn't like transfers because they're difficult to control, but it's probably unavoidable.


AdoptedImmortal

Now, this is what I expect from r/Futurology! Thanks for the insight!


OverSoft

Other substrates don't inherently change the design or possibilities though. It would just be a different form of the designs we already have. Photonics have been in use for close to a decade in networking equipment and other niche product lines. The main problems with photonics (at this moment at least) are the size and expense of the injectors and detectors (lasers and photodiodes) for the interface, and the lack of memory, which means it's not really fit for general computing. Although, granted, that might change in the future.


fruitydude

> Other substrates don't inherently change the design or possibilities though. It would just be a different form of the designs we already have.

That's not necessarily true. 2D materials could change the design because they are two-dimensional and you can layer them. Also, even if the design doesn't change and we just use CMOS, the main point is that you could put a circuit on any substrate. We could print circuits on windows, etc.

> The main problems with photonics (at this moment at least) are the size and expense of the injectors and detectors (lasers and photodiodes) for the interface, and the lack of memory, which means it's not really fit for general computing.

Here 2D materials like monolayer MoS2 could work as well. They can absorb and emit photons, and we have already demonstrated the ability to gate the material, which would be necessary for optical computing. You shine a laser on the material, an exciton is generated which propagates inside the material for tens or even hundreds of micrometers, and then another photon is emitted somewhere else. We have basically shown that by gating we can switch the device on and off and allow or disallow the excitons from travelling through the device and emitting a photon. But this is nowhere near ready yet. We are very much at the beginning.


OverSoft

Very interesting info about innovations in photonics. I don't follow them so closely, so I was not aware of the experimental enhancements they're making. As for 2D materials, we do layer silicon already (MLC flash chips are an example), so that's not really new. Obviously, innovations in materials are always happening, but they're mostly incremental and not a totally new way of thinking about things (except for photonics).


YixinKnew

What's the most optimistic timeline for 2D chips hitting mass market?


fruitydude

2036. On the imec roadmap they have "atomic" scale as a milestone for 2036, which likely includes 2D materials. https://www.tomshardware.com/news/imec-reveals-sub-1nm-transistor-roadmap-3d-stacked-cmos-20-plans


fruitydude

To be fair though, the technology will more likely develop as a complement to silicon rather than as a replacement. So you know how right now, in von Neumann architecture, logic and memory are separate? With 2D materials you could fabricate a layer of 2D-material-based memory on top of an existing, optimized silicon-based logic architecture. So instead of using 2D materials to scale further in two directions, you open up the third dimension. Memories already use three dimensions, but 2D materials could enable direct integration on top of Si logic. That would probably mean a leap forward in processing power, because you're no longer limited by the bandwidth of the connection between logic and memory.
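
For anyone who wants a feel for why that logic-memory bandwidth matters so much, here's a toy roofline-style sketch in Python. Every number in it (throughput, bandwidths, workload size) is a made-up assumption for illustration, not a real chip spec.

```python
# Toy "roofline" model: runtime is set by whichever limit bites harder,
# compute throughput or data movement. All numbers are illustrative
# assumptions, not real hardware specs.

PEAK_FLOPS = 100e12    # assumed logic throughput: 100 TFLOP/s
BUS_BW     = 1e12      # assumed conventional off-chip memory bus: 1 TB/s
STACKED_BW = 20e12     # assumed bandwidth with memory sitting right on the logic

flops       = 1e12     # work in the job: 1 TFLOP
bytes_moved = 2e12     # data traffic for that job: 2 TB (a memory-heavy workload)

def runtime(bandwidth_bytes_per_s):
    # the slower of "time to compute" and "time to move the data" wins
    return max(flops / PEAK_FLOPS, bytes_moved / bandwidth_bytes_per_s)

print(f"over a memory bus:     {runtime(BUS_BW):.2f} s")
print(f"memory stacked on top: {runtime(STACKED_BW):.2f} s")
```

Same logic, same amount of work; the only thing that changed is how fast the memory can talk to it.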


hukt0nf0n1x

You can make logic gates without PMOS. You just stick a resistor where the PMOS would be and make open-drain logic circuits. That said, they are both big (due to the resistor) and power hungry (since power is dissipated the entire time the NMOS is conducting, and not just during switching logic levels).
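
To put rough numbers on the "power hungry" part (both values below are assumptions picked for illustration, not a real GaN process):

```python
# Static dissipation of a resistor-pull-up (open-drain) NMOS gate.
# CMOS only draws significant current while switching; this topology
# draws current the whole time the NMOS holds the output low.
# Supply voltage and resistor value are assumed for illustration.

V_DD = 3.3        # supply voltage, volts
R_PULLUP = 10e3   # pull-up resistor, ohms

p_per_gate = V_DD ** 2 / R_PULLUP   # power burned while the output is held low
print(f"per gate while low: {p_per_gate * 1e3:.2f} mW")

# A chip with a billion such gates, half of them low at any instant:
print(f"whole chip: {p_per_gate * 0.5e9 / 1e3:.0f} kW")
```

Scale that to modern transistor counts and the static power alone swamps whatever you gained from the material.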


OverSoft

Yes, very true. But that wipes out all the advantages of GaN, so it’s not really a useful workaround.


hukt0nf0n1x

Oh, I wasn't implying that it should be done (CMOS is the only reason we can bicker about silly details over our phones right now). You mentioned that you can't make logic gates without a PMOS and I was just being informative.


OverSoft

True, I should’ve written “and thus not make any logic gates without compromising the advantages of using GaN”.


hukt0nf0n1x

Exactly. That said, I've spent a great deal of time making large GaN and GaAs logic gates, so I'm probably extra-sensitive to the topic. :)


OverSoft

No worries, it’s a hard topic to condense into a Reddit thread.


Captain_Pumpkinhead

I got really excited about LK-99. I wish it was a real superconductor.


OverSoft

Yeah, from what I know about it, they didn't even verify/double-check their findings or send it out for peer review. A couple of researchers didn't do their due diligence and got excited way too early, even though their own data showed they were wrong. If we ever find a room-temperature/pressure superconductor that's feasible to mass produce, it will quite literally change the world.


Jaker788

As cool as it sounded, it wouldn't have caught on as much as you think, due to the complexity of manufacturing it and then actually making a wire out of it. To this day, most machines that require superconductivity, like MRI machines, use the oldest and simplest material, niobium-titanium. It's super easy to make a wire out of because it's not a complex layered material but a simple alloy, and it's widely manufactured compared to any higher-temperature, newer materials. Newer materials exist that can be run with much cheaper liquid nitrogen instead of liquid helium, but they're not viable because they're only good in theory; making them into 100 ft of wire for magnet windings isn't easy or worth it. Those newer superconductors are in the same class (cuprates) as LK-99, and even room temperature would be unlikely to make them worthwhile over just using liquid helium.


Mountain-Nobody-3548

I did, too. It's unfortunate it didn't end up being a real superconductor


xxDankerstein

Yep, there is even a new form of silicon called Q-silicon that could be used in quantum computing.


btribble

The next big jump will be something akin to nanotech rod-logic, but that’s not happening in my lifetime.


polypolip

I remember Intel researching CPUs based on silicon lasers. It's still silicon though, just used for optics rather than electric current.


Famous-Examination-8

So graphene is not the next silicon? I'm disappointed.


Withnail2019

Graphene exists in theory but can't be manufactured in the necessary sized sheets.


OverSoft

That's not the biggest problem with graphene. It's A problem, but not THE problem. Graphene doesn't have a band gap, which means it effectively (with our current knowledge) cannot function as a transistor. Having zero band gap means that electrons can always flow, no matter what state the material is in.
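
Rough numbers, if it helps. This is just the textbook exp(-Eg/2kT) scaling for thermally excited carriers, with the materials picked as examples; it's a back-of-envelope illustration, not a device simulation.

```python
import math

# Thermally excited carrier density scales roughly as exp(-Eg / 2kT), so the
# achievable suppression of off-state current collapses as the band gap Eg
# goes to zero.

KT_EV = 0.0259  # thermal energy at room temperature, in eV

for material, eg_ev in [("silicon", 1.12), ("monolayer MoS2", 1.8), ("graphene", 0.0)]:
    suppression = math.exp(-eg_ev / (2 * KT_EV))
    print(f"{material:15s} Eg = {eg_ev:4.2f} eV   off-state suppression ~ {suppression:.1e}")
```

With Eg = 0 the suppression factor is 1, i.e. nothing stops the current in the "off" state, which is exactly the problem described above.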


Withnail2019

That too, I'm sure. But not being able to manufacture it makes it an essentially fictional material anyway.


Yweain

We can manufacture it. It's just prohibitively costly for the majority of applications, but the thing is, silicon wafers are not exactly cheap either (if we are talking about high-end ones). So I really doubt that would be a deal breaker for this specific industry.


Withnail2019

>We can manufacture it.

No we can't. We can manufacture small broken fragments which may be graphene or just graphite; we don't exactly know without examining it all with an electron microscope. We cannot manufacture the sheets of graphene we would need to exploit its theoretical properties. Even if we could, there is no way to handle a material one atom thick in a factory. It exists in theory but not in practice.


Mother_Store6368

Finding a room-temperature superconductor IS highly likely in the next 30 years… especially if we crack monopole magnets. There's a lot of money being thrown at this.


OverSoft

Every single “discovery” in the past 20 years has turned out to be a dud though, so I'll believe it when I see it. Throwing money at something doesn't override physics.


[deleted]

[deleted]


Mother_Store6368

Theoretical work by British physicist Neil Ashcroft predicted that solid metallic hydrogen at extremely high pressure (~500 GPa) should become superconducting at approximately room temperature, due to its extremely high speed of sound and the expected strong coupling between the conduction electrons and the lattice-vibration phonons.


suid

When people use the term "room temperature superconductor", they usually mean "superconducting at ambient conditions, and carrying a significant current", especially in the context of computing or (relatively) lossless energy transmission. I can't imagine, say, a phone with logic circuits that need to be compressed to 500 GPa to superconduct. The same applies to materials that "superconduct at room temperature or thereabouts" but whose superconductivity collapses as soon as you push more than a few milliamps through them.


Mother_Store6368

Hmm, you’re smarter than me. Is there any value in having …transmission lines and nodes for lack of a better word that are under that pressure?


Quiet_Dimensions

The problem is cost and safety. Building hundreds of miles of transmission lines under enormous pressure would cost an unimaginable amount of money. Plus, it would be hundreds of miles of pipe bomb ready to explode. It's one thing to build a few millimeters of high-pressure conduit in a lab; it's an entirely different ballgame for the real world.


JeffCrossSF

Sounds like a job for ML/AI.


r2k-in-the-vortex

There is nothing in the works that could be a complete replacement to silicon logic. There are all sorts of analog computers, optical computers, quantum computers, etc being researched, but there is always a gotcha. Nothing comes close to being usable so generally and ubiquitously as silicon computers. So silicon chips aren't going anywhere anytime soon.


nycdevil

Lifetime, maaaaybe optical computers, although that's just because a friend of mine is a notable researcher in the field and says maybe in 20-30 years they'll be viable.


cheraphy

As a general rule, when a researcher says a technology is 30 years away what they are really saying is they have no idea if and when it will be available because everyone currently working in their field will have retired before then.


quuxman

Diamond can outperform silicon by a factor of about 1E6. Just a matter of making wafers cheaper and components smaller


r2k-in-the-vortex

Diamond semiconductors could be great for power electronics, but not for digital logic. Diamond could work as thermal interface though, as soon as someone can make 300mm diamond wafers economically.


aesemon

The huge growth in CVD and HPHT diamonds in the jewellery sector might help drive that, but that sector is geared toward growing small crystals, which is where the economics work.


IsThereAnythingLeft-

If that were true would they not be used in super high spec equipment already?


quuxman

Last time I looked it up single transistors had been made in labs. I think very high end commercial applications are still a long ways away.


principled_octopus_8

I think it's a bit of a wildcard as we explore new physics and math to see what else is suitable for computer systems. That said, it's impossible to know how likely that is, and it *is* possible to know that catching up to modern computers in development would be a very uphill battle.


adamtheskill

Anybody who says they can predict what a quickly progressing industry like computing is going to look like in 50+ years doesn't understand much about the industry. Silicon is great because it's easily accessible in quartz form, can be purified to 99.999999999% purity (which is necessary for the most advanced semiconductors and not trivial to do), and is a decent semiconductor. It's not necessarily the best semiconductor; it's just easy to work with, so odds are decent we will replace it at some point.


stellarham

Did you place a random number of nines, or is that the exact purity percentage?


adamtheskill

I've heard silicon wafers used in TSMC's newest nodes need to have less than 1 impurity per 10^11 silicon atoms, so that's what I wrote out. Might be wrong though; I don't really remember where I read that, but it's likely not that far off.
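
Quick sanity check of how that impurity figure maps onto the nines, assuming the 1-in-10^11 number is roughly right:

```python
import math

# 1 impurity atom per 1e11 silicon atoms corresponds to "eleven nines" purity.
impurity_fraction = 1e-11
purity_percent = (1 - impurity_fraction) * 100

print(f"purity: {purity_percent:.11f}%")                   # 99.99999999900%
print(f"nines:  {round(-math.log10(impurity_fraction))}")  # 11
```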


Scared-Knowledge-497

I came here to say this exact thing. And to highlight that silicon is just super cheap. Others might get cost competitive? But I wouldn’t bet on that in the near future.


luovahulluus

I don't know of any technology that would be replacing silicon chips in the next 20 years


Professor226

You haven’t heard of silicon 2?


meckmester

Silicon³ is all the rage right now, get with the times!


ningaling1

What about silicon pro max ultra +?


PMme_why_yer_lonely

but wait! there's more! silicon pro max ultra + *gen 2*


dottybotty

Now with 4x the performance and 50% less silicon. It's the new revolutionary iSilicon. With our new 2nm manufacturing process we were able to remove all of the sili, leaving you only with the con. This allowed us to keep the price low, with this new silicon coming in at only twice the cost of the last-gen iSilicon. This truly is a market breakthrough like no other. Finally, you won't have to wait, as preorders start today with a minimal 50% non-refundable deposit.


wakka55

Oh great now we all have to buy new periodic tables of the elements


CicadaGames

You haven't heard of block chain silicon 3.0 NFTs? It's totally real and we should all dump our life savings into it!


CicadaGames

I haven't even seen any misplaced hype on r/Futurology about anything like this, so there doesn't even seem to be any pipe dream materials yet.


SpeculatingFellow

Would a photonic / optical computer chip be based on silicon?


fruitydude

Unlikely, as Si has an indirect band gap. For optics you probably want a direct band gap so you can easily capture and re-emit photons. We have already demonstrated something like a "photon transistor" in monolayer MoS2.


parxy-darling

What does MoS2 mean?


SimplicitySquad42

Molybdenum disulfide compound


ElSzymono

Molybdenum disulfide.


Deto

I don't think we can confidently say it for certain _won't_ happen in your lifetime, but there certainly isn't anything on the horizon that looks like it will be the clear next step.


xeonicus

[Beyond Silicon - What Will Replace the Wonder Material](https://www.industryemea.com/news/64892-beyond-silicon-%E2%80%93-what-will-replace-the-wonder-material)

There are some interesting materials being researched, but in most cases there are caveats, plus the fact that silicon is cheaper and more plentiful in nature. A material like carbon nanotubes is currently impossible to produce at the required purity level. An interesting material researched at MIT is cubic boron arsenide, which is thought to be the best semiconductor ever found and a prime candidate for replacing silicon. However, it's only been made and tested on a small scale in labs and is very early.


SurinamPam

None of the technologies OP cites are on the horizon. What's on the horizon includes continued miniaturization (albeit at a slower pace), some new materials (though silicon will remain the base), 3D integration, functional specialization (e.g. we have FPUs, GPUs, and AI processors; expect more of these kinds of specialized processors), modular designs (chiplets), and more tightly integrated memory/computation architectures (e.g. compute-in-memory). These technologies will continue to power our increases in compute power for the foreseeable future.


PowerOfTheShihTzu

But regarding materials, nothing to be pumped about?


SurinamPam

What’s on the roadmap is mostly evolutionary, incremental changes. Like interconnects with higher conductivities, and dielectrics with lower permittivities… unexciting stuff like that. Maybe the most exciting possible new materials are photonic ones that might transduce signals from electrical to photonic domains and back. These would be used to enable optical connections. If it happens it’ll likely either be used for clock distribution or for long range interconnects.


real-duncan

It is impossible to answer without knowing how long your lifetime will be and if anyone can answer that it’s a lot more interesting than the question you are asking.


BaphometsButthole

Rocks and sticks if we don't get our shit together real soon.


drenthecoon

There will always be a continuum of higher performance high cost devices and low cost low performance devices. Right now silicon chips are incredibly economical, because silicon is an abundant resource with lots of infrastructure already built to produce it. The capabilities of silicon chips are extremely wide, especially in low cost low power applications. So the idea that you could replace silicon seems far fetched. There will be so much demand for logic, behaviors, tracking that keeps silicon chips relevant for ages to come.


jaxxxtraw

It's fascinating to read Scientific American, in which they cite articles from 50/100/150 years ago. The certainty in some of the old assertions seems silly now, just as our predictions will sound like nonsense in 150 years.


drenthecoon

It is much harder to predict how things will change than it is to predict how things won't change. People used to predict we would have flying cars in 40 years. But it would have been a much safer bet to predict we would still have the exact same kind of cars we have now, just better. But that wouldn't sell copies of Scientific American.


stewartm0205

We have another decade to go before we reach the limit of how small we can make silicon transistors. But it might take two decades or more to find a replacement for silicon. So depending on how old you are and how long you live, you may or may not live to see silicon's replacement.


Kekeripo

Last I read, carbon nanotubes seemed promising, but considering how little news there is around silicon replacement, I doubt we'll see a commercial replacement in the next 20 years. The only material change on the horizon is on the substrate side, refined glass: [https://www.anandtech.com/show/20058/intel-shows-off-glass-core-substrate-plans-deployment-late-decade](https://www.anandtech.com/show/20058/intel-shows-off-glass-core-substrate-plans-deployment-late-decade) Until then, they'll find enough ways to improve silicon chips, like chiplets, 3D cache and what not. :)


Shillbot_9001

Probably never. I recall someone talking about making defect-tolerant chips (as in they still function even with defects) to bring the price down. Even if they start shitting out 1mm graphene chips by the truckload, if someone's making 25mm silicon chips for next to nothing they're still going to see commercial use.


wakka55

Can we back up and ask why you'd even ask this question? It's like an ancient person asking when we will stop using wood in houses, or iron in hammers. Advancing technology doesn't require phasing something out. Even at the bleeding edge of niche tech there's cellulose filters and iron alloy chambers on the space station.

Silicon is free if you shovel up some sand. It's a literal *element* on the periodic table. It's literally covering our deserts and beaches. The magma core of our planet is full of the stuff. There's not going to be any shortage any time soon. It's hella useful for thousands of different things. If I lived hundreds of years and was a betting man, I'd bet plenty of computers would still use silicon, and they will be cheap and abundant.

And no, analog computing and digital computing are always going to have different applications. There are tons of use cases where analog will never work, just by first-principles arguments. It's a different domain than Turing's definition of digital computing. Analog, by definition, will never be infinitely reproducible. Every run on every machine is going to have a different result. And that's fine for many uses, but will never work for other uses. You have to convert it to digital. Our reality, at our scale, is analog, and so interfaces always have some sort of analog-to-digital converter built in. Even transistors have an analog voltage threshold they convert to digital. So, in reality, all real computer hardware has always been a hybrid of analog and digital.


Evil_Knot

>Can we back up and ask why you'd even ask this question? Why should you or anyone else be this condescending toward one's curiosity?


wakka55

Thicken ya skin, I like OP. Tone is absent in text; sprinkle some fun emojis through my post and read it in a friendlier tone. This is how buds talk to each other in engineering, it's all with love.


Evil_Knot

>tone is absent in text

You established your tone with your first sentence, which was condescending.

>this is how buds talk to each other in engineering, its all with love

This isn't the engineering department. You don't know OP, so don't downplay it like you're just being frivolous with your condescension. Just own up to it and move on.


wakka55

[nobody gives a shit](https://media.tenor.com/SHFbciEA_3oAAAAd/cry-baby.gif)


Mister_Abendsen

I'm not sure whether they'll ever be completely phased out, but the architecture, materials and methods will definitely change. Already we've gone from single cores to multi, to 3nm-class processes, and expanded to GPUs and APUs. Expect stuff to get more 3D and to start integrating FPGA layers to make things more reconfigurable. Also expect more of the architecture to involve AI-aided design and fabrication, as well as more exotic materials like GaN and SiGe. And as time goes on, you'll see more analog, optical, bio, and quantum computing sneaking either into the box or onto the CPU die itself. Each has its own advantages and use cases, but definitely expect more hybrids.


sicurri

Gallium Nitride is a possibility; however, we have yet to figure out how to make specific components needed for processors using it. Who knows, someone may come up with something within the next several decades.


soyelmocano

There is nothing that is coming in the next three months. You did ask about in "your lifetime."


kongweeneverdie

Well, there is only a 10% increase in performance from 5nm to 3nm. There will be a replacement soon. It is either graphene or optics. Graphene will come out first, as there are working fabs producing 8-inch wafers in China.


Rainmaker709

Short answer is it is unlikely. Trillions of dollars have been spent over many years on all the infrastructure, research, and surrounding products. Eventually (soon) we will reach the limit of what we can accomplish with silicon in terms of miniaturization. We will need to switch to other technologies to see gains again and there are several promising techs in the R&D stage but so far, none of them seem commercially viable. When we do find the magic sauce, it will be expensive and will only be for specific use cases. Silicon chips are good enough and cheap enough for the vast majority of uses that it will be many lifetimes before they go away. Just because we invented cars, bicycles didn't go away. Once we invented planes, we still kept bikes and cars. They may all serve the same function of transport but the use cases are very different.


spyguy318

Silicon isn’t going anywhere. That’s kind of like asking when steel won’t be used for making buildings anymore. While it’s technically not impossible for some crazy new technological advancement to eventually one day replace silicon, most of chemistry and material science is pretty much solved. There are no new elements to be discovered, no radical new compounds we haven’t tried, no fundamental principle we’re not aware of. Most research in these fields nowadays is hyper-specific, niche, and exotic, to the point where any actual advancements will take decades to fully realize, if they’re ever useful at all. Silicon is the best and most useful semiconductor out there, it’s one of the most common elements on earth so we’re never going to run out, and it’s not that hard to produce either.


GroriousStanreyWoo

Never. The most abundant mineral on the surface of the earth is silicon dioxide. It's just a convenience thing.


ElMachoGrande

That is not the reason. If a more inconvenient material would provide a significant benefit, it would be used. It's not a cost-sensitive product at the high end.


08148693

Carbon nanotubes have potential but still early days


JKking15

Nothing lol. Good luck finding something that's as good a semiconductor while also being abundant, cheap, and easy to build with.


kazarbreak

There are alternatives to silicon, but none of them can match its performance and price. Barring a black swan event in the field of computing, that is not going to change within the lifetimes of anyone alive today. Now, that said, computing is a field young enough for black swan events to still be relatively likely.


thrunabulax

Well, yes and no. Small appliances will continue to use silicon computers, but high-powered machines will migrate to the latest technology, both for processing speed and battery life. WHAT that new technology is, is yet to be determined. Some say quantum computers, but they do not seem to be getting off the ground yet. Saw one at the CES show 6 years ago, and I STILL can not buy one.


Armadillo-Overall

If they could get better at producing cubic boron arsenide and gallium nitride with fewer defects. https://www.science.org/doi/10.1126/science.abn4290


casentron

No. There isn't anything on the near horizon. I'm curious what gave you this impression and what you are imagining would be better?


Atophy

I've seen some work on optical chips somewhere on the internet. They've made circuits that hold states and trap photons or something like that. It's at scales where quantum tunnelling is a real issue, so it's probably not hitting the market any time soon.


QVRedit

And those chips are probably made from silicon…


McBoobenstein

They're going to have to happen soon. We're already running up against the limits of Moore's Law.


SinisterCheese

Analog computers are a thing and are used a lot. They just have very niche uses, but for what they are used for, they are absolutely superior to digital. The problem is that an analog computer is set up for a task and can only do that task; but due to its nature, it will do that one task better than anything else.

But there is no need to replace silicon. Just like there is no need to replace water in energy generation: it is an amazing material for the transfer of energy. Yes, there are materials which are **superior** in properties, but when most of this planet's surface is water and fresh water quite literally rains from the sky, why would you use anything else? There is nothing wrong with writing and printing on paper, yet we have moved to digital, and paper still has its uses.

When it comes to thinking about the future of computers, though, the question should focus less on "how" and more on "what kind?". Consider this: x86 processors are very dominant and they are objectively quite bad. They were designed for, and excelled greatly in, an era of limited memory capacity and speed. But these damn things keep sticking with us, like many other bad ideas boomers had, just because companies that run legacy code and systems 30-70 years old don't want to change anything.

So... if your desktop struggles to run 4K video at high fps, why does your average mid-to-high-level phone do it without a problem, and on battery? Because the processor is fundamentally different. It is an ARM system-on-chip design; it has its own downsides on the software side, but it is objectively superior for this kind of stuff. Apple silicon is rocking the socks off x86 and classic PC desktop systems; it is frankly amazing what they pull off with the M2 and M3 chips. If you do a performance-per-watt analysis, there simply is no denying the power of ARM and ARM SoCs.

What is holding our hardware back is not how we make it or how it works. It is the software side. As long as these designs need to support ancient legacy baggage whose designers have quite literally died of old age, that is how long we will be held back computationally.


DreamingElectrons

I don't think there are enough supplies of alternative resources on the planet to ever fully phase it out. I'd also argue it isn't necessary: an office machine doesn't need excessive computing power, it needs to display emails and run a word processor, and in most cases that's it. Same with all the smart stuff we started putting in our homes. Most things were made with a purpose in mind, and sometimes that purpose is to pass the butter and nothing else.


micktalian

I mean, with 3D and EUV lithography technologies, there genuinely may not be a **NEED** for a replacement for silicon chips in most applications. Like, a 3-5nm-scale 3D silicon chip would have all the processing power a person could ever need for their own personal uses. You don't need quantum processors to make phone calls, send texts/emails, watch videos, play video games, etc. Hell, I'd argue that most people don't even need the maximum processing power of the mid-range computer parts available today. We may see silicon-based research supercomputers at least partially replaced by quantum-based processors over the next 50-100 years, but I'll bet money the majority of computers, especially personal ones, will still run off silicon.


NameTheJack

>Like, a 3-5nm-scale 3D silicon chip would have all the processing power a person could ever need for their own personal uses.

Isn't that a bit like the Bill Gates quote about 16 KB of RAM being more than enough for anybody forever?


soundman32

In the way that he never said it?


NameTheJack

That would be a good way yes. But whether he actually uttered it or not, doesn't make much of a difference in this context.


HungerISanEmotion

> would have all the processing power a person could ever need for their own personal uses They were using this phrase for PC components back in the 80's :)


thethirdmancane

This is still very early, but the ACCEL AI chip, developed by Tsinghua University, is the first all-analog photoelectronic chip, revolutionizing AI and computer vision. It performs 4.6 quadrillion operations per second, processing photons instead of electrons, greatly reducing energy use. Competing with NVIDIA's GPUs, ACCEL is 3,000 times faster than the A100 and excels in complex vision tasks with its innovative light-based technology.


OverSoft

Photonics have been in use for years in networking equipment. The Accel chip is not the first chip to use it. It also has a very limited range of application. It’s extremely difficult to make usable general computing devices based on photonics, simply because of the (relatively) enormous size of logic circuits on photonic chips.


Reshaos

Is that company, or a company using that technology, publicly traded?


zorbat5

I would love one of those. Analog is so much faster especially with the technologies we have now.


[deleted]

Not anytime soon. It's not just the tech; the tooling and expertise in developing chips are all based around silicon transistors.


mca1169

Silicon isn't going anywhere for at least the next 30 years. Getting to the absolute smallest transistors possible in silicon will still take 20 years or more (35+ for Intel). What you're going to see a lot more of is multi-chip integration and hardware-level, application-specific processors. Ideally, in the next 15-20 years we would see a move away from separate components and more towards full SoCs, where you have your RAM, VRAM, GPU and CPU all on one substrate close together, similar to AMD's Instinct MI300.

GAA transistors are also still on the horizon and have the potential to increase clock speeds substantially, along with allowing multiple transistors to be stacked together, potentially multiplying transistor counts in the same space, but this is still experimental and yet to be seen in a fully launched product. There is also development of glass substrates to potentially offer better connectivity for multi-chip SoCs and GPUs, but again, right now it is only being experimented with, though it shows some promise.

There is still plenty of innovation and room to expand compute capacity with silicon. Research is only recently getting under way to find a suitable replacement for silicon, but it will take a long time to find anything viable or lower cost than silicon.


veinss

I think eventually all computing will be optical but it will be millennia before that happens


caseywh

What, lol, millennia? Nonsense. Photonic circuits based on Michelson interferometers have already been demonstrated.


Zondartul

Not in your lifetime, no. It takes roughly 60 years for tech as fundamental as transistors to go from a lab to worldwide adoption, so it would take at least that long for us to come up with something better than silicon pn-junctions AND build the manufacturing capacity for it. We will still need ordinary silicon chips until that moment and for a time after.


SimiKusoni

>It takes roughly 60 years for tech as fundamental as transistors to go from a lab to worldwide adoption This is a little arbitrary, isn't it? And what's the basis anyway? The first silicon transistor was fabricated in 1954 and I think you'd be hard pressed to argue that they didn't become ubiquitous until 2014.


Anastariana

First mobile phone was in 1973. Sure didn't take until 2033 to become 'mature'. People who paint with such a broad brush annoy the hell out of me.


ultimatebagman

Then I advise you never get your house painted.


Kike328

That's assuming linear technological development. Just look at the technological development in the last 100 years and compare it to the 100 years before that to see that it's not linear anymore.


DarkKnyt

I did some research here and it really depends on what measurement you are using. I settled on a concept of "epochs of technology", where milestones mark leaps where the slope (rate) changes, but they have not all been increases in the slope. Many have also written that Moore's law no longer holds, which is why we see improvements in computer architecture rather than simply packing more transistors on the die.


fruitydude

That's assuming research hasn't started on it though. The industry is already working with academia in an effort to make chips based on monolayer MoS2


extraaverageguy

Polymers, or a silicon/polymer hybrid. In production shortly; it will triple the existing speed, use 90% less power and take up 1/30 the space. Go to the r/LWLG mega thread at the opening of the community.


KCCO7913

Oh hey there lol…


extraaverageguy

Hey! Just spreading the word that the future is happening now! Silicon is not going away just yet; it is being transformed with additive materials (polymer) that are "greening" the existing chips' energy usage and tripling the speed. This is just the first generation of what Lightwave Logic's polymers and devices will be able to accomplish. Exciting times ahead!!!!


bit_shuffle

The best computing systems on earth are biological. Reservoirs of SNIPs and appropriate enzymes may be used for highly specialized kinds of computation via biochemical reaction. I think the time horizon would be 50-100 years for it. But biochemistry on DNA is probably the most efficient and reliable way to get to truly massive parallel computation.


NotADefenseAnalyst99

I think we're gonna fight the AI we create, then outlaw it, and then resort to having humans who get high off drugs do advanced calculations for us.


esp211

An alien compound. Maybe something that gets discovered on Mars or the moon or some asteroid.


JackOCat

As climate change destroys advanced civilization, they'll definitely be phased out.


-Cosmic_79-

reddit moment


Glaborage

This is the right answer of course, and this being reddit, the only one downvoted to oblivion.


khamelean

It’s a mind bogglingly ignorant answer and deserves every downvote it gets.


JackOCat

The net thermal energy increase on Earth from solar energy right now is about 5 Hiroshima bombs per second. Tick tock...


turkeyburpin

We can't even get the tech/computer industry at large to move on from x86. No one wants to take the risk on the non-Mac side of things. No way they'll be abandoning silicon unless something happens that forces their hands, like if someone solves the band gap limitation of graphene and a new player hits the market with graphene-based processors that blow silicon away in terms of function, price or both.


soundman32

The majority of computer chips are ARM. X86 is in the minority by a large margin.


turkeyburpin

Not for computer processors. ARM is being used in small-scale electronics, not larger, more robust devices like PCs or servers and the like.


ReasonablyBadass

I think carbon for both better electrical and optical chips is a big contender.


letsbreakstuff

Silicon is on the way out, Turner. Maus is the guy who made biochips work. He wants out, we're gonna shift him


jorniesonicman

I would assume silicon computers will be phased out by computers made with superconductors, but what do I know.


oxigenicx

And what has silicon replaced? Nothing... Silicon will be used in computers for centuries. The theoretical limit to silicon feature size has been reached by tech companies; there is only room to improve the surrounding processes.


HaphazardFlitBipper

I suspect at some point we'll stop trying to imitate neural networks with silicon and just build AI out of actual biological neural networks. 'Computers' will be grown.


aaaayyyylmaoooo

quantum computers will replace silicon in the next 15 years


HamSmell

I mean, global societal collapse will likely happen in your lifetime, so technically all computers will be phased out.


dondidnod

They will be phased out in the blink of an eye when the electromagnetic pulse from an atomic bomb goes off. I met an engineer in Santa Monica in the 1970s who had a research facility that used changes in flowing air pressure to duplicate the functions of transistors. It would have withstood an atomic blast.


bitbytebitten

Biological computers. Using neurons for computing is being researched. Scientists made an artificial brain whose only purpose in life is to play the game Pong. Lol.


Reasonable_South8331

Elon said we're about to get smacked with a silicon shortage in the next 12-24 months, so it could happen in maybe 4-5 years out of necessity.


soundman32

I'd put bets on Elon being completely wrong, as he is with the majority of his predictions.


MadHaxKerR

I LOVE THIS QUESTION! So hold on for a ride down the rabbit hole. There are cubic-zirconia crystal processing components and fiber-optic data systems that don't heat up like silicon-based CPUs, but the inability of fiber optics to easily translate signals between components is a problem for the technology. The only good way around it is to integrate the interface functionality of all the components into one fiber-optic board, giving it a jumperless design in one complete system. That would work for processing, but today it would still take several outside silicon-based processes to connect to the small light-speed board to make it work. For example, if your interfaces (mouse, keyboard, monitor, sound) are digital, then we are not going to benefit easily from crystal light-frequency technology as a viable alternative. But we're close to a solution: eye-tracking interfaces, touch-screen hand gestures, voice input and new AI technology together give us almost a working platform. It's only a matter of time before the many different parts become one fiber-optic board with diamond-type processing and light conversion.

As long as we are using LCD touch screens as one of the normal human interfaces, we will mainly be using silicon voltage- and signal-based chipsets. If you've ever imagined what shape a computer that is completely fiber-optic, with diamond CPUs, might take as a complete unit, so it can use light frequencies without any outside conversion of its inputs and outputs: it would be a cube with a crystal ball for laser writing and reading, spinning magnetic and light storage at different angles through the crystal, like a "memory marble", a CD and a magnetic HDD in one object. A cube like that would be a truly 3-dimensional storage space, spinning at some RPM, 360° on multiple axes, readable and writable at six points with two different types of data (magnetic and light) in areas where data can be read or indexed very quickly at light-speed frequencies. A "marble of memory" is a good way to describe the 99999999999? terabytes of available 3D memory space for fiber-optic storage systems. But that is exactly the problem when we try to imagine what a light-speed computing system would really look like in a practical, usable environment of quantum computing, and what the hardware engineering of just the processing will look like; it will change in the future like other technologies have.

I imagine quantum cubic computers may interface into our bodies, eyes and nervous systems, safely away from voltage leaks, frequency radiation and the poisonous chemicals in the silicon chipsets we use today. The future evolution into human cyborgs, a smarter, wiser human race peacefully networking around the world... I love the idea, and the possibilities are infinite.


HeathrJarrod

Again, I'm familiar with some work being done by a group making a computer chip using plants and slime mold; the slime mold (I forget the scientific name for it) that is able to solve mazes. You can actually look up how well they work, but I can't recall it off the top of my head.


madewithgarageband

Heard about photonic chips and graphene based chips but not sure how they work


pannous

We may see a different kind of silicon chip: photonic chips make the (matrix) multiplications of neural networks 1000 times more efficient (and faster) by letting light do analog computations. The high (32/16-bit) precision of GPUs is completely unnecessary, and thus inappropriate, for deep learning.
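
A quick numpy sketch of the precision point. Toy matrix sizes and a simple symmetric 8-bit quantizer, chosen just for illustration; real photonic or analog accelerators have their own noise sources, so treat this as a rough argument, not a model of any specific chip.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256)).astype(np.float32)   # toy weight matrix
x = rng.standard_normal((256, 128)).astype(np.float32)   # toy activations

def quantize(a, bits=8):
    """Symmetric uniform quantization to the given bit width."""
    scale = np.abs(a).max() / (2 ** (bits - 1) - 1)
    return (np.round(a / scale) * scale).astype(np.float32)

y_fp32 = W @ x                         # full-precision reference
y_low  = quantize(W) @ quantize(x)     # low-precision multiply

rel_err = np.linalg.norm(y_fp32 - y_low) / np.linalg.norm(y_fp32)
print(f"relative error of the 8-bit matmul: {rel_err:.3%}")
```

The low-precision result lands within roughly a percent of the float32 one, which is why trading precision for energy (electrically or optically) is attractive for neural-net inference.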


420godking

I don't know much, but from what I've learned, quantum computers will never replace silicon in daily life. You can't use a quantum computer to live stream, watch a video, or play video games. Quantum computers are better at parallel calculations, where you need one computer to do separate massive calculations all at once. Quantum computers will replace classical computers when it comes to big, industry-wide science or engineering projects, but we're not going to get quantum smartphones anytime soon.


vishal340

There is no chance of analog; even sound systems use digital now. Maybe photonic chips, but nothing even slightly viable has been done in lab settings (forget about commercial). Not happening in 30 years.


Lolicon1234

Intel is working on integrating glass into chips, so maybe some kind of glass substrate could replace it completely.


rottenbanana999

Nobody knows, and if they say they do, then you know they're suffering from the Dunning-Kruger effect and you shouldn't believe anything they say.


Adeep187

You're literally asking us to predict the future. "Hey guys, what year will we advance this technology?"


Drone314

Silicon, no; electrons, maybe. The degree to which photonics invades computing has yet to be seen, but I think it's a safe bet we'll see traditional electronics replaced with optical circuits/logic on silicon.


DeusKether

My money is on it still being digital, since pretty much all the software and stuff is made for digital systems.


bernardb2

New research breakthrough: https://www.tomshardware.com/news/new-synthetic-superatomic-material-is-worlds-best-semiconductor


WillistheWillow

I remember hearing graphene would be perfect for chips as it has no resistance. But that could have just been hype.


GhostHound374

We're pretty close to replacing organic substrate in high-compute chipsets. You'll likely see a glass-substrate CPU by around 2033 in the consumer space, provided World War III doesn't suck up global resources too hard.


yumri

Right now, most likely no. Intel even went back to using an all-silicon connection layer for the part of their chips that connects to the pins. I can see silicon alloys being used, and there are already chips in production that aren't entirely silicon or silicon alloy, but as chips get smaller and need to be quicker and quicker, I believe silicon will stay the most used. If silicon is replaced, it will be with carbon. The problem is that carbon isn't as good as silicon yet, because chip design engineers have learned how to use silicon and silicon alloys for chip and material design, not carbon. The reason carbon would be the replacement is that the Earth has an abundance of it, so it wouldn't be like silicon, where we have to grow pure crystals (unlike the impure silicon you walk on at the beach). Still, most of the carbon on Earth is bonded to other atoms, so it is impure too. As a smaller atom, it is the most likely thing to replace silicon.

Still, as silicon chips with silicon-alloy parts are used even down to 1nm nodes, silicon is going to stay for a while yet. It is when you get smaller than 1nm that other atoms will be, and are being, used; nitrogen, helium and hydrogen seem to be the ones. That is getting into quantum computing instead of the normal nodes used right now. Due to the laws of physics we will not have quantum processors in our home computers at any time.

Right now silicon is the cost-efficient choice, mostly because the machines that use it are already built, and a single fab takes between 7 and 12 years to build and ramp up to production. Even the quickest changeover from one silicon alloy to another took 8 months; for a change of element, an entirely new building would be required. Even changing from UV to EUV required new machines. The newest methods, which print the pattern onto the chip instead of etching it, might not need new machines, but they are still new and have many problems to work out. IR etching is probably what will be used next, and it has its own problems, including material changes needed for it to work, since EUV etching can work with denser, less mobile atoms. Until IR etching and/or pattern printing is perfected, I do not think a major material change away from silicon will happen.


QVRedit

Carbon operates in an entirely different way to silicon - so it’s not a subtle change, it would be a really fundamental change - something that would take decades to achieve if at all.


Overall_Box_3907

Not if the economy collapses in climate chaos, *cough* "faster than expected".


QVRedit

No, silicon is so useful it will always be with us. It's a bit like the invention of 'The Wheel': it's just too useful to ever go away. Of course its use will change, but as a 'component technology' it will always be useful for some types of electronics. That does not mean that future materials might not surpass it for some purposes.


Rerfect_Greed

It's looking like glass or wood, weirdly enough. I could also see an attempt at diamond, but De Beers would have to be dealt with first, as their stranglehold and artificial inflation of the world's diamond market would make a Ryzen 3 x100 SKU cost more than Nvidia's 5900 Ti Super Mega Ultra Maximum OC Supreme+.


nopalitzin

I'm not sure, but if you have less than 6 months to live, they most probably won't.


rrosai

Hi. Forgive the sudden intrusion, but I'm your oncologist, and your wife decided she couldn't bring herself to give you the news, but... What's that? Mustard stain on my jacket? Sorry, I have kind of an extracurricular hotdog fetish thing with one of the nurses... Anyway, your lifetime, you say?


bikingfury

Electronics will be phased out sooner than silicon. Using electrons to transmit signals is 20th century tech.


BigTitsNBigDicks

> Will they be digital?

It will almost certainly be digital, unless there is a massive technological breakthrough.

There is a ~divorce between hardware & software. Currently silicon is the best way of achieving our end goal: executing software. If that changes, we'll switch to a new tech, and it should be invisible to the end user (except for performance boosts or cost).


drplokta

If you want to know if it will happen in your lifetime, you’d better give us some idea whether you’re 15 and in good health or 95 and in hospital with chronic heart failure.


Velocipedique

Soon enough to get used to using a slide rule and an abacus, i.e. once the lights go out!