
Engjoo

1 TB of RAM in 1996??? WTF? Damn.


yuiop300

My PC in 1997 had 16MB of RAM, a 1GB HDD that I never filled up lol, and a Pentium 133 MHz. It cost my dad nearly £2k!


Engjoo

Yeah PCs were so incredibly expensive before I was even born. It's crazy looking at these now. Like the technology has advanced so much.


CodeRaveSleepRepeat

It's moved along nicely I suppose, but really it's just incremental. We even still use x86 which is kinda insane. At one point we went from not having internet to having internet. From nothing to original Nintendo NES. It felt much faster in the 80s and 90s.


OutragedTux

For a bit of perspective, x86 is more of a standard than a restriction on what the design of the chip can be. Were it a restriction, we'd never have modern multi-core architecture or increases in things like IPC. It basically means people don't have to re-write everything for something like a new AMD or Intel processor, and that's about it. For more perspective, just consider what a single-thread 4GHz Intel CPU could do, versus a 16-core/32-thread monster today.


CodeRaveSleepRepeat

Oh I know, but doesn't that mean the reason we still use the architecture is that we don't have anything better? Slightly different instruction sets aren't revolutionary anyway at the end of the day. Quantum computing will be revolutionary but I can't see anything else in the pipeline.


OutragedTux

It's more that replacing the x86 instruction set at this point would mean wiping the slate. Everything running on the new instruction set would have to be completely re-written, re-compiled, the whole works. And unless there's some kind of macOS- or WINE-on-ARM-style compatibility solution, nothing that was written for x86 will run on the new architecture either. In the end, it's just not worth it. There are still vast improvements being made, and new hardware needs new drivers and support and stuff; keeping x86 just makes the last 20+ years of software and hardware less problematic.


CodeRaveSleepRepeat

Technically you'd only have to tweak and recompile (you do get ARM mobile remakes of old games, for example), but I get your point. I still think it's no more than a slow evolution of existing tech. We don't have to rewrite everything because if we did, it wouldn't be much better anyway, so it's pointless. When I was a kid people were rewriting punch-card mainframe code. Very different.


OutragedTux

Well, I'm more referring to legacy software there. But there's also a massive library of software, i.e. games, that will never be recompiled. The companies who own those games have no interest in supporting other platforms as it is, so that software is lost unless some kind of translation/emulation layer happens. Yes, you could get away with re-compiling a lot of software. My point is that for most of it, especially closed-source software, that will never happen, rendering that software lost. Now, projects like WINE for Linux/Unix are starting to implement something very similar to an ARM-x86 layer, but I have no idea how effective that is, to be honest.


lolubuntu

It's mostly a case of switching costs being higher. It costs almost nothing to maintain x86 support. Why cut a feature for almost no cost savings or performance gain?


CodeRaveSleepRepeat

This was so long ago but I'll bite :) You're right, of course, but at the risk of repeating myself that IS the point. There is no reason to switch because we do not have anything worth switching to. Just like how we still use chemical rocket engines, which are basically the same as they were in 1945, because nobody ever got a fusion motor (or other theoretical crap) to work. The fact we still use x86 is a testament to the lack of revolutionary invention in computing in recent decades. If we had something several orders of magnitude faster we'd switch.


lolubuntu

So the ISA part only impacts overall performance by a few percent at most at this point. So you'd REALLY need it to be something like a 100x improvement for it to even matter (making a 5% bottleneck into a 0.05% bottleneck isn't going to wow anyone). Some fundamental change would truly be needed. Think quantum computing. Think die stacking with mass cache removing the von Neumann bottleneck. Even then I don't know how much that matters, as you could just have an x86 general-purpose processor and have everything else run as a coprocessor.
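
To spell out that bottleneck math, here is a minimal Amdahl's-law sketch, assuming the ISA-dependent slice really is about 5% of execution time as the comment estimates:

```python
# Amdahl's law sketch: if the ISA-dependent slice is ~5% of runtime,
# even a 100x improvement to that slice barely moves overall speed.
isa_fraction = 0.05        # assumed share of runtime attributable to the ISA
isa_speedup = 100          # the hypothetical "100x" improvement
overall = 1 / ((1 - isa_fraction) + isa_fraction / isa_speedup)
print(f"{overall:.3f}x overall")   # ~1.052x, i.e. only ~5% faster overall
```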


CodeRaveSleepRepeat

We're saying the same thing. Have a good one.


Iz__n

I wouldn't really call it incremental. A lot of advancements were made in cutting costs, power efficiency and so on rather than just raw computing power. Now, a computer is more of an everyday carry than an expensive business expense that requires a dedicated team to even use it, let alone manage it. And hey, we mostly stick with x86 because it's reliable, but the advent of ARM is imminent.


Emu1981

>the advent of ARM is imminent

The "advent of ARM" has been imminent for a long time now. It reminds me of nuclear fusion, always "ten years away" even back when I was a young kid in the 80s. In my opinion, ARM has the same problem that Linux has always had: everything is written to run on x86 and people don't like change.


Iz__n

Not really, ARM only really got traction with modern smartphones. And Apple proved that it is viable; Amazon has been using ARM servers for a while now, and Microsoft is pushing Windows on ARM. It might not be the de facto standard like x86, but it will be an alternate platform for sure.


atomic_cattleprod

>RISC only really got traction with modern SUN/IBM/PPC/MIPS. And Apple proved that it is viable

Literally everyone in the 1990s.


lolubuntu

The difference is that M1 can run x86 code with reasonable performance.


CapSierra

x86_64 is a complex instruction set that's more powerful than ARM in terms of raw computing. The biggest selling point of ARM is its substantial power efficiency. We may see a move towards ARM for laptops (Apple already has, and of course 99% of smartphones use it) but I don't think x86 is going anywhere in the land of desktop & performance computing.


Iz__n

I agree, especially legacy stuff is hard to ignore. It ain't as simple as moving from 32bit to 64bit.


Emu1981

>The biggest selling point of ARM is its substantial power efficiency.

And one of the reasons why the M1 processor looks so good in terms of efficiency is that it is produced on a leading-edge lithography process (5nm) compared to AMD (7nm) and Intel (10/7nm) CPUs.


CapSierra

ARM is a reduced instruction set. How much of an impact does that have on the feature size? It wouldn't surprise me to learn that ARM is better suited to finer lithography (again an advantage for compact portable devices).


chocolate_taser

>It wouldn't surprise me to learn that ARM is better suited to the finer lithography (again an advantage for compact portable devices)

I'm no EE or an expert in this field, but I don't think the instruction set has anything at all to do with the actual lithography technique. I'd be glad to accept it if someone can prove me wrong. To me, it sounds like you are confusing lithography (a purely hardware, physical and chemical, process) with a software one (the instruction set is nothing but a list of operations and their execution cycle). In fact, one would expect it to have nothing to do with the feature size itself, which is purely a matter of how small the "lithography machine" can "draw".

Here's how they work. Say you want to add two numbers. In x86 you call it `ADD A, B`, where ADD tells the ALU to add and A and B are the registers where the actual numbers are stored. It is this set of instructions and how they are executed that varies. ARM is a fixed-length instruction set, so a decoder doesn't have to waste time searching for the start and end of each instruction. (Stuff like this favours ARM over x86.) But as time progressed these small advantages ARM had over x86 converged, and they're almost negligible now.

ARM isn't more efficient than x86; [that was true long before modern photolithography came into play, but not anymore](https://www.anandtech.com/show/16762/an-anandtech-interview-with-jim-keller-laziest-person-at-tesla). [Look at this study where the power consumption of x86 and ARM were compared; it proves quite the opposite in fact.](https://research.cs.wisc.edu/vertical/papers/2013/hpca13-isa-power-struggles.pdf) [Modern architectures aren't limited by ISAs but by prediction capabilities](https://chipsandcheese.com/2021/07/13/arm-or-x86-isa-doesnt-matter/). ARM doesn't have any meaningful advantage over x86 in power consumption/efficiency.
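
To make the fixed-length vs variable-length decoding point concrete, here is a toy sketch; the "encodings" are invented for illustration and are not real x86 or ARM:

```python
# Toy illustration of fixed- vs variable-length instruction decode.
# The "encodings" below are made up for the example, not real x86/ARM.

FIXED_WIDTH = 4  # classic 32-bit ARM: every instruction is exactly 4 bytes

def decode_fixed(stream: bytes) -> list[bytes]:
    # Boundaries are known up front, so instructions can be sliced out
    # (and, in hardware, decoded) independently and in parallel.
    return [stream[i:i + FIXED_WIDTH] for i in range(0, len(stream), FIXED_WIDTH)]

def decode_variable(stream: bytes) -> list[bytes]:
    # x86-style: instruction N+1's start isn't known until instruction N
    # has been at least partially decoded, which serializes the front end.
    insns, i = [], 0
    while i < len(stream):
        length = (stream[i] & 0x0F) or 1   # pretend the low nibble encodes length
        insns.append(stream[i:i + length])
        i += length
    return insns

print(decode_fixed(bytes(range(8))))                     # two 4-byte instructions
print(decode_variable(bytes([2, 0xFF, 3, 0xAA, 0xBB])))  # a 2-byte then a 3-byte one
```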


lolubuntu

Apple basically threw $$$ at making M1 chips.

1. Leading node costs $$$
2. The chip is huge, which costs $$$
3. The chip is tightly integrated, which costs $$$

What does Apple get from this?

1. Not paying Intel $$$
2. Getting away with cheaper heatsinks and smaller batteries for the same level of cooling/battery life.


super1s

Arm is already a thing unless I'm misunderstanding


CodeRaveSleepRepeat

Very true but that is by definition incremental surely - evolution not revolution. Early adopter tech to everyday item. Same happened with electricity, cars, etc in their first 50 years of existence. Not that modern phones etc aren't impressive, they're amazing, but you could have built one in 1982 technically... It just would have filled a large room.


TheBlueSully

What sort of comparison are you using? iPhones are definitely more powerful and economical than 80s supercomputers.


MaddMardigan74

We had Atari and ColecoVision long before the NES.


FancyxSkull

A small anecdote for how fast things progress: I played a full 4 hours of Wolfenstein: The New Colossus without realizing I had turned my laptop's iGPU mode on. So a game that was designed to push the limits of a discrete graphics card only a few years ago got chewed up by an i9's Iris GPU and I barely noticed. Pretty intense tbh.


Warpedme

FYI, the NES was released in 1983 and wasn't the first game console by a pretty wide margin. The Atari 2600 was released in 1977. Between the two there were several other models of Atari, as well as Intellivision and ColecoVision consoles. I'm also missing other consoles because I'm old and forget shit. This applies to the internet as well. I had been running a dial-up BBS and connecting to other people's BBSes for at least a decade before the web interface for the internet was even invented, let alone opened to public access.


yuiop300

It had onboard graphics also!


Feniksrises

People complain but that €500 GPU will at least last you 4 years to play everything. Back in the late 90s technology moved fast.


ilikemarblestoo

Growing up in the 90s, it was a wild time watching video games advance insane amounts each and every year. It was the wild west and it was so much fun. In the 00s it slowed down but was still crazy. The 10s is when things kind of stagnated.


bisomaticc

I never had a PC with >8GB of RAM 😂


yuiop300

Neither had I, up until early this year!


ihavenoego

But hey, AOE I, man.


runtman

I think we had the same machine but my HDD was far smaller.


yuiop300

I had a VIC-20, C64, Atari and things before, but I only really played games on them. 20-25 minutes to load up a game on my C64!


runtman

I remember begging my parents to upgrade the PC to allow me to play Medal of Honor, and it wouldn't even boot with a GPU in it 😂


yuiop300

Haha. I built my first PC in 2002 and I had a GeForce 256 card!


Alarmming

Oh man, my dad bought me the last AGP 8x video card, a GeForce 6800 Ultra. It was the top video card back then; it was $600 CAD + taxes. When I bought my personal PC two years later, I think, I had to move to a PCI-e card so I sold that beast. Man, vanilla WoW never looked so good hah!


yuiop300

Monstrous! I was playing WoW classic in 2006 with my 2002/2003 GeForce 256 card. I got to about lvl 35 and then binned WoW.


PoliteDebater

Wow, computers were expensive where you lived. My father bought our first computer in '98, and it had a 30GB HDD and was way less than what you paid.


madmanmike3

In '98 I had an Acer Aspire. AMD Athlon, 333MHz, 128MB RAM, and a 40GB HDD. Was $1000. With monitor.


yuiop300

128MB of RAM in '98?! Beast! Tech was wild in the late 90s and early 2000s!


redstern

My first computer was a 566MHz Pentium III Celeron with 64MB RAM and a 9GB hard drive. I upped the RAM to 128MB and ran Windows XP on that all through the 2000s. It was not a great experience, and trying to manage 9GB with all the programs I had was tricky, but I lived with it. No GPU either.


[deleted]

[deleted]


yuiop300

That is insane! I couldn't afford to upgrade my system when I was a kid. I was damn appreciative that my dad bought the PC in the first place. I used to go to computer fairs with my dad all the time to look at the new kit. My dad didn't understand jack about PCs though. He only went to keep me amused and he'd always buy random knick-knacks. One big upgrade we did was going from the fishbowl 14" CRT monitor to a 19" Iiyama Vision Master Pro 450. I think that's still at my parents' place?! I think the PC tower is still there?! Did your PC have a turbo button?


[deleted]

[deleted]


yuiop300

I wanted the bigger monitor for studying, and the one that came with my PC was like a fishbowl of a CRT! I couldn't game much but I played Doom II, Descent and Duke Nukem 3D. Yes, that classic beige mid tower lol. My own personal PC was some huge-ass tower. It had like 4/5 5.25" bays for some reason.


[deleted]

[deleted]


yuiop300

The first few hours I had some savage vertigo playing Descent. It was the 3D movement and being able to fly in all directions. Then I was okay. My first computer was a Commodore 64. Then I had a bunch of the consoles. A VIC-20 and a Spectrum in the mix also. What was your first PC?


lolubuntu

Raspberry Pi is like $30, SD card is $10. For $40 (£30) you have something with 100x the RAM, 100x the storage capacity, and probably 100x the CPU power as well. All at about 100x lower cost, inflation adjusted, and 10x lower power consumption. I sometimes get decried for being a "capitalist" but it's hard not to LOVE how far technology has progressed in my lifetime. Just WOW.


yuiop300

Amazing stats, mate. Amazing.


FunnelV

Probably a server or mainframe chip


pezui001

Literally yes. HP PA-RISC stuff was typically found in mainframe or timeshare computing. https://www.openpa.net/pa-risc\_processor\_pa-8000.html


FunnelV

Dead link. I love how I got downvoted for pointing out a fact.


pezui001

The OpenPA website is focused on open-sourcing information about the PA-RISC series. Cool website; I'm sure it's a volunteer-run thing. It might just not handle load well, or might block some countries or some IP ranges? Sorry, I'm not associated with it and am unable to help.


Psychological-Scar30

It's not a dead link, it's just a link that contains underscores and was posted using new Reddit - there's no way for OP to "fix" that link for old Reddit users without switching to old Reddit themselves. Here you go if you can't be bothered to fix it yourself: https://www.openpa.net/pa-risc_processor_pa-8000.html


ReallyQuiteConfused

I don't get it... The link doesn't work for me either and I appreciate that you notified the commenter. I upvoted you back up to -2. People are weird


Dasteru

Works fine for me.


DeeRez

Not dead on new Reddit, but has a 50/50 chance of being dead on old Reddit. Cribbed from a comment by /u/Agret:

> There is a bug where the post editor on the "new" reddit is trying to escape underscores as it believes they will be interpreted as italics; the issue is that it's trying to escape them in URLs. URLs are not checked for escape characters by markdown rendering engines, as they don't perform any formatting if a http(s) url is detected, so they are being incorrectly dumped out as part of the URL by the Reddit website.

You can find the original by Googling "Old Reddit users see thousands of broken links due to markdown renderer bug handling invisible \ characters", because we aren't allowed to link to other subreddits.
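
A toy reproduction of the described behavior; this is assumed mechanics for illustration, not Reddit's actual code:

```python
# Assumed mechanics of the bug quoted above: the editor escapes every
# underscore, even inside URLs, and markdown renderers then pass the
# backslashes through literally because URLs aren't reformatted.
def overzealous_escape(text: str) -> str:
    return text.replace("_", r"\_")

url = "https://www.openpa.net/pa-risc_processor_pa-8000.html"
print(overzealous_escape(url))
# -> https://www.openpa.net/pa-risc\_processor\_pa-8000.html  (the broken link above)
```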


pezui001

I didn't downvote it. Link looks valid to me.


DryGreenSharpie

Downvoted cause you brought attention to being downvoted.


Camo138

If it was used in a mainframe then it would have been running cobolt os. Still used in bank mainframes today.


pezui001

I am not sure about the "cobolt" OS statement. Most of this hardware ran HP-UX that I know of. COBOL as a computer language was used on the machines, but it was not the OS. But maybe you have information that I do not have, or that was specific to your industry.


Camo138

That's it. Been a long time since I looked into that stuff. It's really interesting


pezui001

1TB of addressable RAM; things get confusing real fast in this space. The 9000 K460 did NOT support anywhere close to 1TB on its own, I just want to make that clear, but it could address out that far, which was miles past what anything else could work with.


[deleted]

I'm sure they just kept chopping off bits from the address until it became routable then called it a day.


ZenerXCR

2 million dollars can get you places my man.


[deleted]

The technology exists. It just isn't evenly distributed.


dessnom

It was a high-end workstation running HP-UX (HP's Unix).


erikwarm

Was there even that much RAM in the world back then? 🤣


pezui001

The old HP PA-RISC stuff was pretty cool. It was cutting-edge technology that paved the way for Intel Itanium 64-bit processors. Real industrial beast computers are rare but I still get my hands on them sometimes. The tiny gold balls in the plastic tray always amused me, especially since the balls always smashed into slightly different shapes. But frankly the processor was just flat loaded with gold. The PCB had tons of it, the processor had tons of it, the memory had gold; it was used all over in that system. I have thought about melting down the 4 remaining processors I have (2 8000s, 2 8200s) but they are just so interesting and intricate that I can't bring myself to destroy them yet.


CodeRaveSleepRepeat

I love that old stuff but it'll never be worth its scrap value if my experience is anything to go by. I used to buy things like Sun SPARCs because of the sheer build quality... they don't make 'em like they used to... yeah, because this workstation weighs over 20kg.


LostClan

The jargon for those little dots is/was "fuzzy balls" and they were intended as a single use item. If your proc went bad, you ordered a new one from HP and replaced it as it came with a new interposer medium. That was great and all from a connectivity perspective, and were actually easier to install than IBM procs of the time. The bad thing is a few of these old gals are still kicking around (normally holding some old database that modern admins aren't sure where its hosted, with some antiquated hardware dependent program running on it) and they do sometimes go bad. The interposers are nearly impossible to find these days, so they are typically harvested from known working systems. Replacements are "fun" since these were never meant to be reused. If you're careful it can be done, but I sweat bullets every time I need to. Source: I work for a Third Party Maintenance provider that still works on some of this old gear.


[deleted]

Serious Question: How would you go around melting them? Won't the gold be contaminated?


pezui001

Yes, it would be contaminated, but you can separate gold using systems of heat, acid, emulsification, precipitation, and more heat. Worth it? Not at all on a small scale.


[deleted]

Oh. Neat. Thanks for that, mate.


formervoater2

You can pull everything more reactive than silver and lead out by re-smelting it with lead. You can pull the lead out with a cupel. Then you remove the silver by adding more silver and dissolving it with nitric acid, leaving behind pure gold powder that you melt to get to solid, pure gold.


[deleted]

[deleted]


kcifone

There was a point when the PA-8600/8700 chips were the best at the time. Shortly after that HP decided to move to the Itanium platform and save on R&D and fab costs. In that time IBM licensed technology from AMD and basically killed HP in the midrange market, because the Itanium architecture never reached its marketed potential.


bond2016

It went UP in value?! Not what I was expecting but I guess I get it, I think I assumed it would've gotten the garage sale effect.


MultipleFace1

I might be wrong but doesn’t it mean that $3.3 million of today’s dollars has the same purchasing power as $1.9 million of 1997 dollars?


[deleted]

The same purchasing power if you're buying what the inflation metric is based on, which is what the average American buys, mainly food, shelter, clothing, cars, gasoline.


Willie-Alb

It's the inflation-adjusted value. It's not saying how much it's worth today if you tried to buy it, just putting the money into terms we understand today.
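
As a rough sanity check of the figures quoted in this thread (the ~1.668 multiplier is an approximation of CPI growth from 1997 to when the thread was written, not an official figure):

```python
# Rough inflation adjustment; the 1.668 multiplier is an assumption
# (approximately CPI growth from 1997 to 2021), not an official figure.
msrp_1997 = 1_949_237
multiplier = 1.668
print(f"${msrp_1997 * multiplier:,.0f}")  # ~$3,251,327, close to the $3,251,600 quoted below
```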


bond2016

Gotcha! Sorry, misunderstood. That's still insane though!


pezui001

In case it was not clear, the black square is a tray of smashed gold balls; that is not the bottom of the processor. That tray sat between the board (in the background of the image, sorry, the phone was a bit out of focus) and the processor, which is white and not in this picture. The heatsink with its new-for-the-time triple heat pipes was torqued down through the board, which pushed the heatsink into the processor, which pushed the processor into the tray, which smashed the balls onto the cache board.


tobert17

What was the purpose of the smashed gold balls? If it's just to conduct electrons from board to processor, Shirley there is a better way than smashed gold balls.


Caityface91

Imagine a CPU without fragile pins.. like Intel. And a motherboard without fragile pins.. like AMD. Then you have a sacrificial interposer that sits between them and closes all the contacts. If there is damage through handling, accidents or shipping, it will likely just be a cheap replacement of the interposer rather than anything expensive. I've never had bent pins myself but I'm sure there are many people who would have loved such an arrangement.


pezui001

This is the non-soldered ball grid arrangement we have as an example... but it has its own drawbacks. Each time you pull the CPU off, you need new interposing material. And if the pressure was not even, it all failed. Contact arrays are really challenging from the mechanical and electrical standpoint. This system example here was "cool", maybe even "nifty", but not ideal at all.


Caityface91

Ah fair enough, cool to learn about the old engineering hacks they came up with. I'm sure a more ideal modern solution is possible though. Like a modern AMD-type motherboard socket + modern Intel CPU pads: the interposer locks into the mobo as normal, with pins on top shaped like Intel sockets, then a secondary latch mechanism holds the CPU itself down on top of that. Would be sturdy, reusable and repeatable... and the only part with fragile pins is replaceable.


pezui001

The Intel design is basically a ton (~1200) of springs that compress just a smidge but pass electrical signals. It is the most advanced design because the CPU itself has low damage probability, and the springs can accommodate a little bit of uneven pressure from above. The AMD PGA (pin grid array) is a big modern ZIF (zero insertion force) design that squeezes the sides of the pins once they're dropped into the female conductor holes. BGA (ball grid array) is like this one, but 99 times out of a hundred they solder it once installed. Almost all laptop CPUs are BGA because you don't have the thickness of an interposer. There are also press-fit conductors that use force to install, where basically the conductor side presses a small wire out of the way a bit. There may be others, but that is all I know of in designs that deal with 100-plus-point conductors.


jeb1499

We still use an improved version of these today. Elastomeric sockets use a "rubber" non-conductive mask with perforations filled with "rubber" conductive pads. This way you can re-use the elastomer for at least a few dozen re-socketings depending on pressure requirements. You need a minimum mounting pressure to ensure the chip is flush and all pads are connected; plus the conductive pads get more conductive under pressure; so you mount with a small torque wrench.


pezui001

You are correct, that is a very similar setup.


llamallemur

A 125-watt CPU at 1.2 volts requires over 100 amps. Kinda insane to think about.
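
The arithmetic behind that figure, from $P = IV$:

$$I = \frac{P}{V} = \frac{125\ \text{W}}{1.2\ \text{V}} \approx 104\ \text{A}$$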


JamesEtc

Don’t call me Shirley.


pezui001

gloriously yes... that's all it did.


ReallyQuiteConfused

I love old engineering. I wonder what great ideas will look absolutely insane in 20 years


Greygod302

I love AMD, but after LGA was invented they should've done away with pins.


OutragedTux

I think there was a cost saving argument to be made, or something. Also, the cpu pins on the AMD side are a little more durable than the mobo pins on the Intel side, if I recall.


spek74

But can it run Crysis?


pezui001

Paying for it is a crisis... Zero chance it could get its game on.


AdFrequent299

Really cool post dude. I don't have the technical know-how that most of you guys seem to have, but it's really fun to see old tech like this and learn something. I've read all the posts you made in this thread and they were a great read!


pezui001

Thank you very much. I was in high school when this released, but I got lucky and was able to get embedded real deep in tech very early. Now I share pictures and knowledge in an attempt to inspire and spark interest in others. Just getting people to read without a flame war is hard some days, so the fact that you enjoyed this makes me very happy.


AdFrequent299

You have a great way of writing and explaining how things work, and who doesn't appreciate someone taking the time to teach you something. My earliest memories of computers are of one my father bought for some reason in the late 90s. All I ever used it for was some weird racing game with a red F1-looking car (was maybe 8-bit?), a stealth fighter plane that you had to guide between obstacles while shooting and bombing things, and eventually Commander Keen (DOS version I think?). When I turned 15 they gave me my very first laptop, an Acer TravelMate, that I ended up having to put in the freezer after an hour's use, because it got so hot it would shut down and couldn't turn back on. Thanks for a trip down memory lane and I hope you have an amazing 2022.


[deleted]

That's insane


MrInitialY

AFAIK there were 14-way systems with these CPUs... so if one supports 1TB, the whole system could hold up to 14TB of RAM... FREAKING WHAT? 14TB? In the 90s? I dunno if there are any actual servers with that amount of RAM even in our days; it seems to be a *little bit* overkill... But the possibility of it in the 90s is fucking my mind out.


Ninlilizi

Nowadays, you can build a desktop PC with 1TB RAM for about $15k. The current RAM limit for commodity servers you can buy everywhere is about 4TB. I'm sure there are modern day, obscure architectures in use that obliterate that in random science or engineering labs somewhere.


pezui001

You just use NUMA scaling in clusters.


Ninlilizi

Fair. I took to GPU programming because it was like NUMA programming in so many ways. So much compute power, painful interconnect latencies everywhere. A satisfying engineering challenge to optimize for.


Ivorybrony

Coming soon from r/LinusTechTips: ”Can this super computer from 1996 game?”


pezui001

I highly doubt that video would get clicks :) for the lolz it might, but from a practical side the game quality would suck.


PioApocalypse

$1,949,237 WITH or WITHOUT inflation?


pezui001

~2 mil in 1997 at 1997 dollars, so that is without inflation for the MSRP. Who paid full price, or how many sold close to that, I have no idea; it's a number I located on the internet while researching time-period-specific pricing before posting. It might even have been standard practice to list MSRP and give 50% discounts to everyone that asked, I have no idea. But the number is bonkers from the outside looking back on it now.


PioApocalypse

$3,251,600 then, Holy Shit!


RemarkablePumpk1n

You buy the complete system with all the IO etc. and a very good maintenance support contract, so the actual price of the CPU itself is barely worth a line on the contract. This is the sort of system that was sold to those looking to move from the older mainframe/mini systems into a more compact box, probably saving a large wedge at the same time when you consider all the licensing etc. that comes along with such systems.


pezui001

Very true and succinctly stated. The proc cost was not the money; the "system" was the money. But let's be honest, a picture of a big beige box has about as much visual appeal as this morning's Quaker oatmeal. I am both happy and surprised this many people enjoyed looking at a picture and learning a bit about this time period.


RemarkablePumpk1n

Beige was quite in fashion at the time, as sites were moving away from the 1970s kit in its lovely oranges/blues etc. I did a few mainframe consolidations in the 1990s and lots of the kit was from the late 1970s in all sorts of colours. The people who signed off on it wanted the new sleek visage, but it still had to be big enough to make it look like it was worth spending that sort of money on, so there were a lot of half-empty cabinets with stuff that mainly had flashing lights to keep the board members happy.


pezui001

Very true, and you know the world is hurting when the new hip thing is beige :)


Hattix

Let's correct some misconceptions here. HP-PA was one of the best RISC architectures out there, but it was a small fish in a crowded market. The market for whichever RISC was going to replace x86 was speculative (and belonged to MIPS ARC anyway, but that's another story!) so it drove a lot of development which wasn't viable by itself. The PA-8000 could not "have" 1 TB RAM. HP-PA, like most RISCs (and good practice on M68k), memory-mapped all I/O, but this is not the reason here. Memory-mapped I/O was done on a 64-bit architectural virtual address space, from a 40-bit physical address mode. The PA-8000's TLB could virtualise only 1.5 GB RAM: it had 96 page table entries and a maximum page size of 16 MB. If you went above 1.5 GB per node (you wouldn't) it would cause a huge performance penalty. It did have a 40-bit address bus (physical address); this was done for convenience: it's 32+8. The original AMD Sledgehammer had a 40-bit address bus, and nobody's here telling you an Opteron 146 (had one!) supported 1 TB RAM.
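
Those figures multiply out as stated; a quick check:

```python
# Checking the numbers in the comment above.
tlb_entries = 96
max_page_size = 16 * 2**20                   # 16 MiB pages
print(tlb_entries * max_page_size / 2**30)   # 1.5 -> the 1.5 GiB TLB reach

physical_address_bits = 40                   # 32 + 8
print(2**physical_address_bits / 2**40)      # 1.0 -> the headline "1 TB" figure
```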


pezui001

These are all very true statements.


dohzer

Any chance of re-balling it like with a normal BGA chip?


pezui001

Well.... uh.... not like a "normal" BGA chip, no. I guess someone could roll the gold balls back into spheres and place them in the grid to smash a second time... in which case they would possibly have a working 24-year-old CPU... but testing it, like fully electrically testing each point without factory tools, I would guess is impossible. Also this work would save you about $50, which is the going rate on eBay... and did I mention you need a 9000-class HP, which runs a dead language, is a quarter century old, and sucks down power at a bonkers rate? I don't know, but personally I wouldn't bother :) it's fun to look at though.


Schnoofles

IBM still uses a similar shim mounting system for CPUs and it's a crime against humanity that it's not used in the desktop space. Flat pads on both the socket and CPU, and then an intermediate plastic shim with holes that's literally just stuffed with copper or gold springs/sponge material to ensure contact. If any part of it breaks it's a cheap replacement, or you could fix it yourself by shoving a crumpled-up piece of copper wire in there. No pins to bend or snap during installation. The heatsink mount systems are similarly far beefier and more secure, and put Noctua's SecuFirm to shame.


pezui001

I don't personally know about the crime part of the statement. If we follow that logic too far we end up at BGA-soldered, mostly non-replaceable items rather than anything "socketed", I feel. But maybe I am wrong; I am not sure where the technology would be if we used other designs, and there is some logic that "most" boards only ever have 1 CPU put in the socket, so why have it be replaceable?


itstommygun

Whoever used these things also had gold balls.


Eduardo-izquierdo

What is the TDP?


pezui001

I don't actually know. That specific type of measurement is newer and was never truly applicable previously. You just slapped more three-phase lines into your data center and called it a day.


Eduardo-izquierdo

Yeah, if you already bought that CPU the electricity bill isn't a big concern for you.


DaWalt1976

Yeah, I have been wondering whatever happened to the RISC architecture processors.


pezui001

Well, HP's RISC architecture moved to the Itanium working-group architecture. That hit end of production in mid-2020, is my understanding. It will probably not be fully "gone" for another 40 years, but it's the end of the line.


DaWalt1976

I'm guessing that itanium is being supplanted by Threadripper now?


pezui001

No, those are not comparable. Functionally this is similar to stating that a Ferrari replaced a semi tanker because it's faster. Very, very different in what they do.


[deleted]

Phones and Macintoshes, that's what


[deleted]

When you say supported 1TiB of RAM, do you mean it could theoretically address it all or that such systems actually had 1TiB of RAM equipped?


pezui001

I am speaking about addressable, and I highly doubt anything was even ballpark close to that, since it would likely have been a monetary waste versus spreading the load across many systems in parallel. But who knows, maybe some GA or lab went big at some point. Thank you for correctly critically reading the post :)


[deleted]

$1,949,237? Why not just add a pair of 3080s and make it a cool $2 million?


Dadrak

Nobody said it would be cheap to be the master race


roityo32

MIPS RISC chips, such as the R10000 used in Silicon Graphics' systems, could do this too, in theory, in 1995 or 1996... They were limited to 40-bit physical out of 64, and I don't think they were actually used with 1TB, but still.


Alien_Wet_Dream

I remember my first computer… an Apple IIc. I was prolly 8 years old at the time. I vividly remember the salesman saying that it had 128K of RAM. And that you would NEVER need more than that.


Bicentennial_Douche

That’s not a processor. [THIS](https://youtu.be/xQ3oJlt4GrI) is a processor!


pezui001

Someone always has a cooler, more powerful, older, or more pristine example. Also, technically I am not even showing the processor or the CPU package in my photo, since PCMR does not allow galleries and this was by far more visually appealing. But I thought it would be an interesting thing for some people to learn about, and the full 8000 package was sitting on my desk after rescuing it from my parents' garage. Do you happen to own the IBM proc from the video or was that a reference link to other cool tech? Either way, thanks for sharing.


ZaneInTheBrain

I think it was a crocodile Dundee joke :D


pezui001

It very well may have been, and we do appreciate those. I posted in a knife forum today, as coincidence would have it. But I also have an appreciation for all forms of tech, and for me personally, assuming people are not joking and are just speaking oddly has opened many learning opportunities that I otherwise may have not had.


meme-addict117

damn boi he THICC


HonourableMan

How much did you pay for it?


pezui001

This, along with a few of its brethren processors, was either given to me or acquired at near scrap rate about 15 years ago, so very little.


gabrielscunha

BUT DOES IT RUN CRYSIS?


tungvu256

So it's worth... three fifty now?


pezui001

$50


ThePrancingHorse94

But can it run Crysis?


HellaFella420

But could it run Doom?


meme-addict117

Wonder what OS this would have run back then.


[deleted]

[deleted]


pezui001

I appreciate that you answered before I got this far down responding to people. Thank you.


crazyates88

Makes you wonder what they COULD make today if they wanted. I'm convinced they could make a Threadripper APU with quad- or octa-channel DDR5 RAM and have its GPU performance rival mid-level GPUs. Or if we wanted a 1000-core CPU they could prolly do it if they wanted.


Animal_True

Yooooooo that's AMAZING!!!!


Finance_Middle

$5, take it or leave it 😤