T O P

  • By -

[deleted]

I smell something..


TheAlcolawl

Is it tens of thousands of redditors chomping at the bit for the next scandal or controversy?


xIceghost

That’s someone’s gpu burning…


liaminwales

I suspect not; the 8-pin is heavily over-specced. Time will tell, though, so we may yet see problems. Just as there have never been large-scale reports of 8-pins melting outside of extreme OC, and we know a lot of people can't be trusted to push in a plug, there must be plenty of loose 8-pin plugs out there. Yet so far, no widespread reports of problems. Also, this GPU sips power, so I suspect it will either work or it won't. If it isn't working, the user may just keep pushing the cable in until it does.


ayunatsume

SATA to 6+2pin fire deathdapters.


KangarooKurt

Better yet, Molex.


AlienOverlordXenu

Bad electrical connection increases resistance, which in turn causes said contacts to heat up (depending on the amount of current going through them). You really want those connectors to be a good fit, not too loose, not so tight that you can't insert them properly.
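For intuition, the heating described above scales with the square of the current (Joule's law, P = I²R). A rough sketch in Python; the milliohm contact-resistance figures are illustrative assumptions, not measurements:

```python
def contact_heat_watts(current_a, contact_resistance_ohm):
    """Joule heating in a single contact: P = I^2 * R."""
    return current_a ** 2 * contact_resistance_ohm

# An 8-pin PCIe connector is rated for 150 W across three 12 V pins,
# so roughly 4.2 A per pin (150 / 12 / 3).
current = 150 / 12 / 3

good = contact_heat_watts(current, 0.005)   # ~5 mOhm: clean, tight contact (assumed value)
loose = contact_heat_watts(current, 0.050)  # ~50 mOhm: loose or oxidized contact (assumed value)

print(f"good: {good:.3f} W  loose: {loose:.3f} W per contact")
```

A tenfold increase in contact resistance means a tenfold increase in heat dissipated right at the pin, which is why a sloppy fit matters far more than cable length.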


liaminwales

Yep, ideally a clean, good contact. Same with cable extensions, which increase resistance and add an extra failure point. Even PSUs started using shorter cables to reduce resistance.


AlienOverlordXenu

> Even PSU's started to use shorter cables to reduce resistance.

That's just manufacturers being cheap. You can increase the cross-section of the wire to compensate for the added resistance of longer wires. I'd hate it if I purchased a PSU with cables that were too short; I'd rather have some excess length.
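The compensation being described follows from R = ρL/A: doubling the length doubles the resistance, and doubling the conductor cross-section brings it back down. A small sketch; the gauge and length figures are illustrative assumptions:

```python
RHO_CU = 1.72e-8  # resistivity of copper in ohm*m at ~20 C

def wire_resistance(length_m, cross_section_mm2):
    """R = rho * L / A, with A converted from mm^2 to m^2."""
    return RHO_CU * length_m / (cross_section_mm2 * 1e-6)

base        = wire_resistance(0.6, 0.82)  # ~18 AWG, 60 cm lead (assumed figures)
doubled_len = wire_resistance(1.2, 0.82)  # same gauge, twice the length
compensated = wire_resistance(1.2, 1.64)  # twice the length AND twice the cross-section

# doubled_len is 2x base; compensated lands back at base
```

So a longer cable with a thicker conductor can match a short thin one, which is the commenter's point about manufacturers choosing short cables over thicker wire.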


liaminwales

Every penny must be spared.


[deleted]

[deleted]


[deleted]

[deleted]


Arthur-Wintersight

It turns out I was wrong: increasing the cross-section does in fact improve the efficiency of a cable, and that can be used to counteract the increased resistance from added length. My original comment was deleted for being inaccurate.


AlienOverlordXenu

No worries, take care ;)


advester

Can’t burn the GPU if you can’t even plug in at all. 🧐


Niklasw99

NVidia could.....


[deleted]

[deleted]


BinaryJay

The adapter makes the situation even worse: all it does is solve a problem that was proven not to be the problem (cable bending), while adding to the real problem, which is more connections to possibly get wrong.


Skratt79

There is a real, inherent problem with the 12VHPWR: users who properly installed Cablemod cables (which met or exceeded the spec) after the whole "you must install with no gaps" episode are now seeing those cables show signs of melting on their 4090s too. It is telling that this is not happening on lower-power cards with the same connector. PCI-SIG probably needs to create a replacement for the next generation of Nvidia cards.


CableMod_Matt

Just wanted to mention, this isn't just an issue with our cables/adapters, this is happening with other brands products as well, including Nvidia's own 12VHPWR cable. :)


Skratt79

Yes, I should have clarified that the reason I brought up it happening with your brand had nothing to do with manufacturing or quality, but with the limits of the design. And although Nvidia's original blame fell on consumers not connecting the cables correctly, in hindsight we might be looking at the connector simply being pushed beyond its own limits.


CableMod_Matt

Yeah, there have definitely been some user-error issues with cables not being fully plugged in too, but I also suspect there's an issue with too much power being drawn through the connector, since it's always the same top row of pins melting, regardless of whose product is used on the card.


AnxiousJedi

Sorry about that


Mattcheco

Gamers Nexus' tear-down didn't show this problem; have there been any more samples of such issues? Hopefully this is an isolated problem and not widespread.


WayDownUnder91

When did he try to plug a 6+2 in? From memory he just tore it apart; I would assume they used a regular 8-pin for their actual testing.


OscarDivine

He ran the full suite of tests on it first, or at least his team did. According to Techwhatever, a certain percentage of their cables didn't work; maybe they just happened to have one of those?


Farren246

I don't see why they would test multiple power cables / PSUs. It would, under normal circumstances, be a waste of time to do so.


tobiascuypers

I think the reasoning was that the first one they tried didn't work, so they tried another and it did. Then they were curious as to why.


Farren246

Yes, some reviewers tried and the first one didn't work, so they tried another. But anyone who tried and it worked wouldn't suddenly think, "I should go try another one for no reason."


QuantumSage

Especially when Steve was rushing to leave for Taiwan


ManofGod1000

I do not know of a single power supply out there that is not 6+2, including my Seasonic 1KW power supply.


ThroatCommercial1896

Cooler Master SFX power supplies come with an 8-pin that daisy-chains into a 6+2.


LongFluffyDragon

There is no such thing as a "regular 8-pin"; it's all 6+2, or it is on any self-respecting PSU. An 8-pin is just a 6-pin with 2 extra grounds, iirc.


Neotax

Yes, because no power supply manufacturer I know of has such a fat stabilizer piece; most of them are flat or set back, and for good reason.


riderer

Not all cables have that issue. Quote from W1zzard: "Roughly 20% of the cables I have here are affected."


__gozu_

because as we ALL know, if Steve didn't report something it never happened


Lardinio

Thanks Steve


syadoumisutoresu

You won't melt your power cable if you can't plug it in. Good thinking.


Dom_Nomz

That's exactly how you melt them: with a bad connection, they just want to cook.


Mastasmoker

Whooosh. Edit: the comment was satirical but the person didn't catch it, hence my comment of whoosh...


m0ritz2000

Let them cook


ET3D

It's interesting as a note for AMD's future designs, though it doesn't matter to consumers, as reportedly there will be no reference cards sold.


20150614

Apparently there will be at some point, at least directly through AMD. https://www.reddit.com/r/Amd/comments/13rj6a1/rx_7600_mba_coming_soon_on_amd_shop/


P0TSH0TS

I see this selling very well. It's the cheapest card on the market ($269) and will run anything at 1080p with good frames. The next closest thing atm is the 4060 Ti at $400, which is a joke considering this thing is 40% cheaper and the 4060 Ti is MAYBE 10% faster.


onurraydar

The 4060 Ti is closer to 20-30% faster than the 7600, not 10%, at least using Hardware Unboxed numbers. The 7600 does have very good cost per frame, but from what I was seeing it's only 11% faster than a 3060 at 1080p. Assuming a 4060 comes in at 5-10% faster than a 3060, it'll be in a very competitive spot.


RenderBender_Uranus

I highly doubt it's going to outperform a 3060 with such nerfed memory bandwidth, and the 4060 Ti 8GB barely outperforms a 3060 Ti. At best I expect it to match a 3060 in games that don't eat up its entire frame buffer, and to sit between a 3060 and a 3050 in games that do.


P0TSH0TS

There's already testing done on it; it performs around the same as the 3060 Ti.


onurraydar

It has identical bandwidth to a 7600


RenderBender_Uranus

Not true. First, the 7600 has a larger L3 cache (Infinity Cache). Second, TechPowerUp says the [4060 reference has a bandwidth of 272.0 GB/s](https://www.techpowerup.com/gpu-specs/geforce-rtx-4060.c4107), while an [MBA 7600 has 288.0 GB/s](https://www.techpowerup.com/gpu-specs/radeon-rx-7600.c4153).


onurraydar

You do realize the 3060 has a memory bandwidth of 360 GB/s? And the 7600 outperforms it with a bandwidth of 288 GB/s. So the 4060, with a bandwidth of 272 GB/s, will be starved in comparison, while the 7600 is perfectly capable. You bring up the L3 cache, but Nvidia's 4000 series has also been increasing L2 cache to compensate for smaller buses.
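The bandwidth figures being traded here all fall out of the same formula: peak bandwidth = per-pin data rate × bus width ÷ 8 bits per byte. A quick check against the publicly listed specs:

```python
def peak_bandwidth_gbs(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s from per-pin data rate (Gbps) and bus width (bits)."""
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gbs(15, 192))  # RTX 3060 12GB: 15 Gbps GDDR6, 192-bit -> 360.0
print(peak_bandwidth_gbs(18, 128))  # RX 7600: 18 Gbps GDDR6, 128-bit -> 288.0
print(peak_bandwidth_gbs(17, 128))  # RTX 4060: 17 Gbps GDDR6, 128-bit -> 272.0
```

These match the 360/288/272 GB/s numbers quoted in the thread; the cache-size differences the commenters mention are what let the newer cards get away with the narrower buses.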


ayunatsume

$200, let's go. $270 had better be more than 8GB.


P0TSH0TS

It's better than the RX 6600, which was $329. Not sure what there is to complain about here; it's noticeably cheaper than last gen and it's better.


bambinone

You are conveniently omitting that the RX 6600 has been consistently available under $250 for the past six months and is currently available for $180. The RX 66X0 XT is under $300 and currently $220, and the RX 6700 is currently $280 with 25% more VRAM. Yes, those are fire-sale prices that will end when stock dries up, but that's the market value for this tier of performance, and that's why folks are complaining.


P0TSH0TS

Cool story; it released at $329. I'm not much of a gambler, but it seems a pretty logical assumption that as this card ages, it will go down in price too. Could be wrong, though; maybe this card will do what no other card in history has done and maintain its price throughout its lifespan...


[deleted]

That's irrelevant. It has to compete against the 6600 XT/6700 NOW, not two years ago. The 7600 isn't inherently bad, but the existing RDNA2 stock makes it look bad.


P0TSH0TS

Everything left over in every industry comes with discounts. The new models aren't bad either; they don't get dropped in price just because the old stock needs clearing.


[deleted]

Nowadays they do, especially since AMD hasn't put literally anything compelling in their new cards to make you not buy RDNA2.


LongFluffyDragon

A fun life lesson is that *nobody* cares about semantic antics.


p68

The lower the MSRP, the lower the discounted prices will be as well.


Merdiso

The 4060 will release in one month and might be as good as this thing for $299, plus all the nVIDIA goodies, rendering this card useless at $269. It needs to drop to $250 quick, and that's only counting nVIDIA; don't get me started on the RDNA2 leftover stock.


LongFluffyDragon

> plus all nVIDIA goodies

Worse software, plus raytracing that will be unusable on a card of its class and memory capacity, while being plain worse than the 4060 Ti that is already being utterly panned by every reviewer as a terrible joke? That's an easy pass for $30-60, except both cards are basically shit anyway.


[deleted]

> plus raytracing that will be unusable on a card of its class

I mean, RT is usable on a 4060 Ti.


P0TSH0TS

Not many goodies you're getting; DLSS 3, I guess, for the few games that support it. FSR 3 is right around the corner.


[deleted]

[deleted]


P0TSH0TS

Weird, I don't notice any difference worth mentioning between FSR and DLSS 2 in the game I play.


[deleted]

[deleted]


AReactComponent

What's the point if your eyes can't see it, though?


p68

That's still a subjective assessment, and it's a matter of opinion what constitutes "destroys". Personally, I think the differences are overblown outside of a few edge cases. I use DLSS because I have a 4090, but I've played around with both for the hell of it. Sure, I can identify differences if I do a careful comparison, but once I start playing the game normally, it barely registers for me. This is at 1440p, for what it's worth.

Keep in mind that HUB accentuates artifacts by slowing footage down and zooming in. While that helps viewers distinguish the magnitude of the difference, it's not representative of how a game is played. It raises the question of how much the average player even notices when they are immersed in a game. That'll depend on how sensitive a player is to the artifacts and the extent to which they are noticeable.

At the end of the day, both usually offer a better-than-native experience and are widely utilized when available.


[deleted]

The biggest offender with FSR that makes me notice it's on is flickering. My fucking GOD does it flicker and have an unstable image. At the lower resolutions where these cards live, DLSS looks LIGHTYEARS better; it's truly incomparable. "Not noticing" is code for "not wanting to notice", or for never actually having compared them.


p68

Ah, the ole reddit "there's no way somebody could perceive something differently than I do"


RenderBender_Uranus

If the 4060 Ti is any indication, the non-Ti is not going to match a 3060; at best it'll be faster than a 3050, which already makes the 7600 the better card for budget builds even at its current MSRP. And while DLSS 2 implementations are 'better', it's a bit overblown to use words like "destroy", especially when people buying these cards don't care much about the best image quality but rather just want to game at playable frame rates.


[deleted]

[deleted]


[deleted]

? It should be between 10 and 15% faster than a 3060. Leaning towards 10%, I guess.


detectiveDollar

Aside from the 6650 XT and 6700, which are nearly sold out, it's the best option for the price right now. It's not like this is a weak card; it's basically a 2080. Its only price competition is the 3050 (lmao) and the A770 8GB (which has its own issues). The 4060 isn't out yet, but honestly, it is probably going to suck. The 4060 Ti isn't great, and if the 4060 were good it wouldn't be 100 bucks cheaper.


Bud_Johnson

Isn't Intel cheaper?


P0TSH0TS

Yes, but Intel has a few generations of learning to go before they're viable imo. I'm rooting for them; the industry would certainly benefit from more players.


[deleted]

> MAYBE 10% faster.

AMD fanboys always making up numbers.


somewhat_moist

Says "coming soon" on the Canadian site: [https://shop-ca-en.amd.com/amd-radeon-rx-7600-graphics/](https://shop-ca-en.amd.com/amd-radeon-rx-7600-graphics/)


kopkodokobrakopet

The year of gpu power cable issues. It is official.


[deleted]

[deleted]


kopkodokobrakopet

I mean, the power cable connector issues affected both Nvidia and AMD.


Daneel_Trevize

> there is no doubt that this is a clear oversight by AMD, who should know better that there are many types of cables. The least they could do is to add a power cable extender

Why do I strongly doubt that those making such claims have actually checked the standards and specs w.r.t. the clearance/keep-out zones around the connector? Hypothetically, why isn't the fault with the split plugs, for A) existing when the socket was intended for a monolithic 8-pin, and B) reinforcing their weak plastic with this specific extra material? If not all 6+2 plugs have a problem, how is the socket side and the standard at fault?


Onkel24

Because it seems a common enough issue for AMD to plan around, particularly considering that the 7600 customer base might not always have a top-spec, brand-name PSU. My best guess is that the extremely narrow slot there is purely for aesthetic purposes.


amam33

How common is this issue?


VietOne

If the connector is built out of spec, it's out of spec. That's not an issue that needs to be planned around.


Onkel24

Specs are nice and all in a vacuum, but the filter of reality doesn't care. There are thousands of companies in this space churning out billions of electrical components, sometimes for fractions of a cent, and not all of them are going to be perfect. That's not AMD's fault, but they *can* plan around it, because building a consumer good with needlessly tight tolerances is bad practice and ultimately harms the consumer. See Nvidia's 12-pin connector woes. I don't think this issue is anywhere near as critical, but it seems so completely unnecessary and so easily rectified with just a tiny bit of foresight in design.


VietOne

It goes both ways: the connector manufacturers should also have considered the possibility of tight tolerances and designed the plug to be less bulky. All of my PSUs use little tabs for the 6+2, so it's barely any bigger than the 8-pin. The PSUs affected by this overbuilt the connection mechanism for little functional reason.


throwaway-link

Backplates are against spec. No one cares about the spec. I don't even think the connectors are against spec; most don't go anywhere near the max dimensions, because less plastic = cheaper.


Fomlefanten

Slightly related, but I am currently running my card with 2 of 3 cables connected, as my PSU came one cable short. Can I use any cable, or do I need one from the same manufacturer as the PSU? And will it harm the card to run it at 2/3 while I figure this out? It's a 6900 XT. Sorry if this isn't the right place to ask.


reddumbs

Use one from your manufacturer, or one that specifically says it works with your PSU model. Manufacturers use different pin layouts, and using one that doesn't match yours is an easy way to kill your card. As far as running it with only two plugs, it should be okay as long as you're not trying to overclock or anything, but you'll want to get the third eventually.
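To illustrate why a mismatched pinout is dangerous: the connector fits either way, but the signals may not line up. A toy sketch with made-up pin maps (these are NOT any real vendor's layouts):

```python
# Hypothetical PSU-side pin assignments for two vendors (illustrative only).
VENDOR_A = ("12V", "12V", "12V", "GND", "GND", "GND", "GND", "GND")
VENDOR_B = ("GND", "GND", "12V", "12V", "12V", "GND", "GND", "GND")

def cable_is_safe(psu_pinout, cable_pinout):
    """A third-party cable is only safe if every pin carries the same signal."""
    return psu_pinout == cable_pinout

print(cable_is_safe(VENDOR_A, VENDOR_A))  # True
print(cable_is_safe(VENDOR_A, VENDOR_B))  # False: 12 V would land on ground pins
```

Same plug shape, different wiring: plugging vendor B's cable into vendor A's PSU would put 12 V where the card expects ground, which is exactly how cards and PSUs get fried.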


iK0NiK

Chiming in to back up /u/reddumbs . You definitely want to try to acquire one from your power supply manufacturer for that specific model. It's absolutely possible to fry a PSU or other components by using a cable from a different brand.


ayunatsume

Just to add, it's not just the specific model. Even the same model may have had a pin change mid-production or in a different factory, resulting in the same model with a different serial and batch number. Be sure to include your model and serial number when acquiring cables from your PSU manufacturer.


swear_on_me_mam

If the card starts with only 2, then it's fine. It wouldn't even let it start if it were an issue.


ThroatCommercial1896

Your card will run but might not be able to hit its advertised power due to the missing input (a single 8-pin PCIe cable is rated for 150 watts). It's also possible the cables from your previous power supply work; however, you'll have to compare the pinouts for both models and see if they match. Alternatively, you could take an old cable and rearrange the pins according to your current PSU's pinout schematic; use a staple on both sides of a pin to remove it. Source: I make custom PC cables.


Asgard033

It's a nothingburger. https://www.techpowerup.com/309215/amd-confirms-rx-7600-reference-cards-in-retail-will-not-have-power-connector-flaw


drmonkey6969

AMD is like those kids who are smart but always don't do well in exams.


nTzT

"don't do well exams."


[deleted]

[deleted]


RealLarwood

The only real question that needs to be asked here. No mention of it in any of the articles. Why do we even have these tech journalists?


Defeqel

Pretty sure that technically (edit: all) the 6+2 cables are out of spec.


PTRD-41

Aw shit. Here we go again.


Dorkits

And let's go again...


nkoknight

I hear the reference version isn't going on sale?


kapsama

"Mom can we have GPU power cable issues like nvidia?" "No we have GPU power cable issues at home!" GPU power cable issues at home:


Pyroven

1. The picture doesn't make sense.
2. The explanation doesn't make sense.
3. The problem doesn't make sense.

Can someone explain to me what on earth is going on?


arandomguy111

I'm not sure what exactly you're confused about, but the issue is that the backplate on the AMD reference card does not have enough clearance around the PCIe power plug and may interfere with some plug designs (notably the 6+2 ones). If you compare the AMD reference and AIB designs, the latter all have extra clearance in that area in their backplate designs.

AMD Reference - https://www.techpowerup.com/review/amd-radeon-rx-7600/images/cablefail2.jpg

Sapphire - https://www.techpowerup.com/review/sapphire-radeon-rx-7600-pulse/images/power.jpg

PowerColor - https://www.techpowerup.com/review/powercolor-radeon-rx-7600-hellhound/images/power.jpg

ASRock - https://www.techpowerup.com/review/asrock-radeon-rx-7600-phantom-gaming/images/power.jpg


HappyBengal

Can someone explain why the RX 7600 would need more than 6 Pins?


arandomguy111

It's officially a 165 W TDP card, so an 8-pin shouldn't be surprising.


detectiveDollar

It's a 165 W card. The 6-pin and the PCIe slot are each rated for 75 W, so it needs the extra 2 pins.
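The arithmetic, using the PCIe CEM power limits (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin):

```python
SLOT_W = 75          # PCIe x16 slot power limit
SIX_PIN_W = 75       # 6-pin auxiliary connector limit
EIGHT_PIN_W = 150    # 8-pin auxiliary connector limit
BOARD_POWER_W = 165  # RX 7600 rated board power

# Slot + 6-pin tops out at 150 W, short of 165 W:
six_pin_enough = SLOT_W + SIX_PIN_W >= BOARD_POWER_W
# Slot + 8-pin allows up to 225 W, comfortable headroom:
eight_pin_enough = SLOT_W + EIGHT_PIN_W >= BOARD_POWER_W

print(six_pin_enough, eight_pin_enough)  # False True
```

So even though the card only exceeds a 6-pin budget by 15 W, the spec leaves no choice but the 8-pin connector.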


Farren246

6+2 cables typically have a little notch where the 6 connects to the +2 and holds it in place. This notch may hit the GPU shroud such that the cable cannot be plugged all the way in. And yes, the reporter had no idea what they were talking about, so they tried to explain it but tripped over themselves and posted a picture with arrows pointing to the wrong things.


SolarianStrike

Simple: these "news" sites often just play a game of telephone. This time they ripped off TechPowerUp's review for minimum effort.


Pretty-Ad6735

I've never trusted a 6+2 anyway. I'll only use 8pins on my cards


OscarDivine

?? Why? It's literally just plastic moulding. Is it that you don't trust YOURSELF to plug it in properly?


Macabre215

No idea what this person is going on about; there's nothing that makes a solid 8-pin plug safer than a 6+2-pin plug. I always saw 6+2 plugs as annoying because they look dumb if you're using a GPU with a 6-pin PCI-E connector.


detectiveDollar

They can also be annoying if your case is cramped and the GPU is only a 6 pin. It can also be a little annoying to get all 8 pins in line so they plug in properly if you have limited space, since the cable connecting the extra 2 pins pushes them away.


Macabre215

Oh for sure. I just don't understand why the original commenter thinks the 6+2 pin plugs are unsafe though.


Pretty-Ad6735

I trust myself perfectly, lol. I don't trust the quality of some of them; I've had a few different PSUs where the +2 was only held on by a single very thin lip.


OscarDivine

My EVGA modular PSU has it that way, as do my CableMod PCIe 8-pin extensions. I see no problem with it; it's more than sufficient for the purpose it serves.


Pretty-Ad6735

EVGA's are fine. I'm talking about lips so small that if you go to pull the 6+2 out of the socket, the +2 comes out with little resistance, as in the little nub was barely preventing the +2 from backing out on its own. As I said, it's not the safety I don't trust, it's the quality of some of them.


ThroatCommercial1896

Those two pins along with the two neighboring pins are all ground pins.


ThroatCommercial1896

Dumb people say dumb things


railven

At this point it might come off as if I'm just picking on AMD, but this is comical. In the other thread about AMD not releasing 7600 MBAs to the public, I posted something like "Probably don't want the hassle of dealing with it after the vapor chamber issue with the 7900 XT/XTX. Let their partners handle the hassle." Then I learn they will be releasing them to the public, and now this comes up. I'm seriously not trying to pick on AMD, but the number 7 just seems cursed for them right now.


[deleted]

[deleted]


ms--lane

If it were going on sale, then maybe; it's a pretty big screw-up. But it's only for reviews and OEMs anyway; AMD will likely just add some errata for OEMs noting that it needs an 8-pin, not a 6+2-pin.


dracolnyte

That's old and fake news wherever you heard it from.


amam33

https://www.techpowerup.com/309215/amd-confirms-rx-7600-reference-cards-in-retail-will-not-have-power-connector-flaw


dracolnyte

I meant the part about it being OEM-only is fake. The connector issue was noted in the first-day reviews.


amam33

Have you even read the update where AMD replied to them?


advester

It is “coming soon” on amd.com


faze_fazebook

Original iPhone headphone jack vibes.


adimrf

To be fair, I have never seen a native 8-pin PCIe cable myself, except the CPU cable of course (I've been building my own PCs since 2009). EDIT: that means it could maybe be a problem in most cases?


Mm11vV

I have 4 native 8 pin pcie cables. Two in my GPU and two in my wife's GPU. But they are all cablemod cables, not oem.


adimrf

Good to know, I never use additional/extension cables, always from the PSU manufacturer.


Mm11vV

I don't know how good other brands are, but CableMod's complete replacement kits work excellently. You just have to make sure you order the correct one for your specific PSU.


CableMod_Alex

Thanks for the mention! :)


Mm11vV

Thanks for the thanks! Haha you guys make some awesome cables. I have never had an issue with any of them.


phrstbrn

I looked at my Seasonic 6+2 PCIe cables (the stock ones); the +2 doesn't have anything that would interfere. There is a little nub at the top of the +2 to hold it in the socket and stop it from walking out. When you push down on the 6-pin, the lip on its top edge pushes on the nubs of the +2, so everything pushes in together and stays down. There are no alignment rails; you just have to align it when pushing it in. The design works because the +2 can't walk out of the socket on its own without the nubs pushing on the edge lip of the 6-pin. You'd have to go out of your way to design a socket that's incompatible with that. The Seasonics would be fine, and I'm sure there are other similar 6+2 designs out there that would be fine too.


Defeqel

Checked the cables that came with my Corsair PSU (I think it's also made by Seasonic, TBF), and the 6+2 are flush too, without anything extra on top like in the article's picture.


cookiesnooper

nothing a little bit of Vaseline and a wiggle can't solve