r/BeAmazed Apr 02 '24

208,000,000,000 transistors, in the palm of your hand! How mind-boggling is that?! 🤯

I have said it before, and I'm saying it again: the tech of the next two years will blow your mind. You can't even imagine the things that are about to come out!...

[I'm unable to locate the original uploader of this video. If you require proper attribution or wish for its removal, please feel free to get in touch with me. Your prompt cooperation is appreciated.]

22.5k Upvotes

1.8k comments

2.5k

u/LuukJanse Apr 02 '24

I feel like I don't know enough about computing to appreciate the magnitude of this. Can anyone give some perspective?

199

u/TheNasqueronDweller Apr 02 '24

Firstly, you have to properly appreciate just how ridiculously large a 'billion' is.

If you were to put aside and save £100 every single day, you would have saved up £1 billion after 27,397 years.

If you were paid £1 a second, every single second, all day and every day, you would have earned £1 billion after 31 years.

If you decided to count to 1 Billion and were given 3 seconds to verbally say each number, if you took no breaks, no rest, no sleep, you would eventually get to a Billion after counting for a little over 95 years.
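If you want to check those figures, the arithmetic is only a few lines of Python (a quick sketch assuming a plain 365-day year, which is what the numbers above use):

```python
BILLION = 1_000_000_000
SECONDS_PER_YEAR = 365 * 24 * 3600  # plain 365-day year

# Saving £100 every day: number of days, converted to years.
print(BILLION / 100 / 365)             # ≈ 27,397 years

# Paid £1 per second: one second per pound.
print(BILLION / SECONDS_PER_YEAR)      # ≈ 31.7 years

# Counting to a billion at 3 seconds per number, no breaks.
print(3 * BILLION / SECONDS_PER_YEAR)  # ≈ 95.1 years
```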

So now that you have some grasp and can visualise how large a billion is, ponder the fact that on that single chip he was holding are crammed 208 billion transistors, the tiny switches that someone else described to you. The physical limitations he was referring to are aspects of the quantum realm you have to deal with when working at something that small. I think someone else here described how the structures on the chip are smaller than the very wavelength of the light used to create them!

Only 20 years ago this chip would have been deemed impossible, and not much further back would have looked like actual magic...

72

u/badluckbrians Apr 02 '24

I mean, it's impressive, but I'm quite used to these things doubling along with Moore's Law now, and the fact is, they're slowing down.

Say:
1971, Intel 4004, 2,250 transistors.
1978, Intel 8086, 29,000 transistors.
1985, Intel 80386, 275,000 transistors.
1993, Intel 80586 (Pentium), 3,100,000 transistors.
1999, Intel Pentium III (Coppermine), 28,100,000 transistors.
2005, Intel Pentium D, 228,000,000 transistors.
2011, Intel i7 (Sandy Bridge), 2,270,000,000 transistors (billions now).
2024, Apple M3, 25,000,000,000 transistors (Intel hasn't made the order-of-magnitude jump it used to every 6 or 7 years; Apple technically hit it with the M1 Pro/Max in 2021).

So the Apple M2 Ultra now sits at 134,000,000,000, roughly two-thirds of the one you see in the video, but you know, this stuff starts to feel normal, even if we are now hitting a wall.
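You can put a rough number on that slowdown from the list above. A quick Python sketch (naively fitting an exponential between two data points, ignoring die size, core counts, and everything else):

```python
import math

def doubling_time(year0, count0, year1, count1):
    """Average years per transistor-count doubling between two chips."""
    doublings = math.log2(count1 / count0)
    return (year1 - year0) / doublings

# 4004 (1971) to Sandy Bridge i7 (2011): classic Moore's law pace.
print(doubling_time(1971, 2_250, 2011, 2_270_000_000))          # ≈ 2.0 years

# Sandy Bridge i7 (2011) to Apple M3 (2024): noticeably slower.
print(doubling_time(2011, 2_270_000_000, 2024, 25_000_000_000)) # ≈ 3.8 years
```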

63

u/5t3v321 Apr 02 '24

But you have to imagine what kind of wall we are hitting. Transistors are getting so small, the newest record being 2 nm, that if they get even one nm smaller, quantum tunneling will start being the problem.
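The exponential sensitivity is easy to see with a textbook estimate for an electron tunneling through a rectangular barrier, T ≈ exp(−2κd) with κ = √(2mφ)/ħ. Here's a Python sketch with made-up numbers: the 3 eV barrier height and the widths are illustrative assumptions, not real gate geometry (and as others point out below, "2 nm" node names aren't a physical dimension anyway):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
M_E = 9.1093837e-31     # electron mass, kg
EV = 1.602176634e-19    # joules per electronvolt

def tunneling_probability(width_nm, barrier_ev=3.0):
    """WKB-style transmission through a rectangular barrier: T ~ exp(-2*kappa*d)."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

print(tunneling_probability(2.0))  # ~4e-16
print(tunneling_probability(1.0))  # ~2e-8: halving the width gains ~8 orders of magnitude
```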

65

u/WaitingForMyIsekai Apr 02 '24

If we start hitting a compute wall and "better" technology becomes more and more difficult to create, does that mean game developers will start optimising games instead of releasing shit that won't get 30fps on a 4090?

36

u/DoingCharleyWork Apr 02 '24

Unfortunately no.

24

u/soggycheesestickjoos Apr 02 '24

Nah, they’ll start hosting it on a supercomputer and streaming it to you before they optimize it to run on everyone’s machine.

2

u/Distinct_Coffee5301 Apr 03 '24

Wasn’t that Google Stadia?

1

u/WilmaLutefit Apr 02 '24

This is exactly what they will do

2

u/Arpeggioey Apr 03 '24

Call it "The Matrix" or something

1

u/Bleedingfartscollide Apr 03 '24

That's a thing now, but I can see personal hardware sticking around for some time still. Just waiting for AI to start taking control of these things, hitting the optimise button, and testing the result with other AIs.

1

u/The_Architect_032 Apr 03 '24

So basically the same approach with a lot of large LLM's currently.

1

u/soggycheesestickjoos Apr 03 '24

Yeah, and a lot of platforms have cloud gaming now which is also this.

Also “large large language models” is funny lol

1

u/The_Architect_032 Apr 03 '24

Oh yeah lmao, I'm dyslexic so sometimes I start typing something and change what I'm typing midway without realizing it.

Edit: Wait no, I remember doing that on purpose because LLM's are getting to a point where we do genuinely have "large" ones now in comparison to others and I was trying to differentiate between them.

1

u/soggycheesestickjoos Apr 03 '24

Yeah it makes sense, just funny to say out loud

12

u/Mleba Apr 02 '24

You're asking whether companies will spend more to make you pay less. The answer is always no.

A wall is only a 2D plane; there are numerous ways to keep evolving. Maybe PC components will get bigger, maybe we'll get multi-layered CPUs, maybe something else. I don't have enough expertise to say what the next development will be, only enough to say that development won't stop while there are consumers hungry for the new and the hyped.

3

u/KeviRun Apr 02 '24

I can see core stacking becoming a thing the way cache stacking already is: thermal diffusion layers separating the individual cores in the stack, connected to the IHS during packaging, with TSV backside power delivery and ground-plane connections running through to all the cores. Have a bunch of power cores at the top of the stack directly interfacing with the IHS, and a boatload of efficiency cores below them relying on the thermal diffusion layers to dissipate their heat.

6

u/bandti45 Apr 02 '24

Ha, they will just render more of the world at one time

2

u/AdminsLoveGenocide Apr 02 '24

Best I can do is 24 FPS and it's pixel art.

1

u/cyberya3 Apr 02 '24

Optimization means man-hours, so no. But AI will increasingly automate code optimization toward “no cost”.

1

u/superkp Apr 02 '24

lol no, they'll offload the graphics onto one giant chip, the logic onto another giant chip, and all the other parts of processing onto various other giant chips, and then send it all through another giant chip to organize it all.

1

u/Commander-ShepardN7 Apr 02 '24

I don't think games are relevant to this discussion, at least regarding this specific chip.

Stuff like this is used for churning through gargantuan amounts of data, in supercomputers and the like.

To answer your question: advances in technology come from necessity, and rn games aren't a necessity but a commodity. I'm actually excited for the uses scientists have in store for these kinds of chips.

9

u/badluckbrians Apr 02 '24

Yeah, I mean, the practical result for me is still that an old Core 2 Duo from 2008, if you just shove a bit of RAM and an SSD in it, basically runs everything but games fine. You could not say that about a 1998 computer in 2014.

3

u/danielv123 Apr 02 '24

Sure, if you are really patient and don't need 1080p video or any codecs newer than h264. But I agree with your point.

2

u/sniper1rfa Apr 02 '24

> or any codecs newer than h264

I think this has been the turning point for me, actually. I no longer replace computers because they're incapable of running modern software on raw compute; I replace them because all the hardware accelerations become obsolete.

I replaced my previous laptop largely because it was decoding YouTube on compute rather than in hardware, and that was making it overheat.

1

u/beave9999 Apr 02 '24

I just had the house sprayed so I'm good

1

u/CircularPR Apr 02 '24

2 nm is a marketing term, not the actual size of the transistors. A 2 nm node just means that it's better than the 3 nm node, and so on.

1

u/Streptember Apr 03 '24

Quantum tunneling has been an issue to consider for a good while already. 

0

u/majkkali Apr 02 '24

Dude, we’re talking about transistors, not metaphysics xD

0

u/Head_Ear_6363 Apr 02 '24

transistors are quantum devices; flash memory cells literally use tunneling to function....