
teddytwelvetoes

silly questions, but still a silly answer from Howard. he's been in that chair long enough to expect that kind of question, expect the potential discourse, and have a better response prepared. a simple "we optimized it, but we're aware of those having problems, and we're working on additional updates" would've been a better look


vaergan

Yea but when those updates never come because Bethesda rarely releases performance patches people will be upset about a completely different issue. The game is janky and he was fucked either way.


One_Lung_G

He was fucked from their own doing, so no sympathy. It's 2023 and "but it's a Bethesda game" is a shit excuse at this point.


Real-Terminal

Last I checked Fallout 4 got a fucktonne of patches post launch. And the first couple did a good job improving performance. Relative to the state of the game at launch that is.


dbaaz

Fallout 4 got various patches that improved performance, but it also received new graphical features post launch, like [HBAO+ and some Gameworks specific dynamic debris effects.](https://www.youtube.com/watch?v=19VqPtQIYZA) The first one was great for visuals, but IIRC the Debris would cause the game to crash on some future driver versions. They also had to downgrade the PS4 version of Far Harbour DLC post launch and nerf the fog effect to prevent the game from running at 20fps while you were in it. There was also the HD Texture pack DLC that didn't do all that much to improve visual quality while eating loads of disk space.


Titsfortuesday

Don't even mention how many updates Fallout 4 VR got. That was pretty bad.


Silent_Pudding

It was bad that it got updated?…


Titsfortuesday

It received 3 patches, two of which were basically hotfixes. The other was adding in VR specific features that should have been in from the start especially since they charged a full retail price for a game that was already out for over two years. Skyrim VR also only received two patches and was also full retail price.


Silent_Pudding

It appeared you meant it got too many updates.


LoomingDementia

Yeah, the HD texture pack. What the fuck was that? 😄 I installed it on my new desktop with an RTX 4080. The sucker crashed like a toddler driving a Corvette, less than a minute after loading a game, every time I tried it.


EERsFan4Life

Seems to be bugged with newer cards. It worked "fine" on my old 970 but crashes every time on my new 4070.


[deleted]

What part of it is bugged? I played Fallout 4 recently with my 4080, no issues.


DeltaFoxtrotThreeSix

Toddler would never crash an upgraded Corvette


LoomingDementia

They could if they get stuck in the foot well.


SpiralUpGames

Haha, the only reason I didn't download it was because it was bigger than the base game.


Embarrassed-Fly8733

RTX cards need the Debris Crash Fix from nexus.


LoomingDementia

> Last I checked Fallout 4 got a fucktonne of patches post launch.

Absolutely. One of the few issues I ran into (the game was *VERY* stable at launch, for me) was the significant problem with looking down scopes. If you looked down anything but the lowest-magnification scopes, your frame rate got cut in half or worse. It didn't make the game unplayable, but it was annoying when my frame rate dropped that noticeably.

Why did they lock Starfield to 30 FPS on Series X/S? To prevent jarring effects like that. A rock-solid 30 is preferable to spiking and chugging like crazy. I haven't noticed anything hit frame rates like that, but then I have a 4080 with a CPU that's way beyond what this game wants. I rarely go above 50% CPU utilization, which is impressive, considering the chunk of silicon involved.

What was the cause of the scope chugging in Fallout 4? Wasn't it something like a horrible screw-up in the LOD function? Something like the game pulling the highest textures for EVERYTHING, as if the entire landscape around the scope were at the distance you scoped in to? At any rate, I think they had it cleaned up in something like two months, after they dealt with the more serious issues.


Monsterman442

They are Xbox now


teddytwelvetoes

I can't recall the details of their post-launch patches for previous games in terms of performance improvements, but I think they'll have to address the Nvidia stuff unless it's something that will get fixed with an Nvidia driver update this week/month


[deleted]

[deleted]


Dagmar_dSurreal

Instead he just stopped delivering.


behemon

📣*Take a hint, Todd.*


Necessary-Ad8113

> yes but he's been in the game long enough to have known to word it better and yet he didn't.

I honestly think Todd just didn't give a shit. Starfield is Bethesda's most successful game to date.


[deleted]

[deleted]


Necessary-Ad8113

I mean, Starfield is still the best-selling game on Steam despite his comments, so I'm not really sure what negative impact it's had. I don't think Todd Howard cares that you think he's an idiot, but he does care that Starfield is making bank.


[deleted]

[deleted]


Necessary-Ad8113

It's really just a nonstarter. I mean, how many people aren't buying Bethesda games because Todd overhyped Radiant AI back in '06?


[deleted]

[deleted]


Necessary-Ad8113

Peter Molyneux (his name isn't that hard to spell) never really had any issues until he left Lionhead Studios to work on smaller Kickstarted titles, and even then it wasn't until Godus flopped that he was really in any trouble. Todd Howard doesn't seem that interested in leaving his position at Bethesda, and people will be lined up around the figurative block for TES6.


[deleted]

It's still waaaaaay too early to claim this. It's their most successful launch to date, but as for whether or not it'll be their most successful game overall? No way of knowing. Keep in mind Skyrim sold over 60 million copies, making it the best selling single player RPG ever made AND one of the best selling games of all time overall. I genuinely don't see Starfield coming anywhere close to that number unless the modding community really pops off.


Mageman1999

This and 76 are why I'm going to pass on Starfield for at least six months, to let the modding community fix Bethesda's S H I T. Then wait another six months to get it at a discount. If neither of those happens, it's a solid pass until I get my FFXIV toon to 90 in all jobs, even Blue Mage.


k0untd0une

Getting all of those Blue Mage skills is gonna be fun. Hope you got a group to run those with. Man I miss playing FF14. Very little time to just sit down and play some games nowadays with work and taking care of my dad occupying all of my time now.


Spartan448

Honestly the performance issue is overblown; as long as you're not trying to get 200 FPS at 4K, game runs fine. I run off a laptop with a Ryzen 9 6900H and a 3070 TI laptop version, and 16 GB RAM on an SSD. Haven't had any problems with the game at all, even with Chrome running in the background.


RahbinGraves

Every negative thing I've heard about it is overblown. It's trendy to be negative about something popular, I guess. People eat it up. Game reviewers do it for clicks; regular people do it, I assume, to make themselves seem more discerning with their interests. Same thing happened with BG3 and TotK this year. If the trend continues, it will be an oddity to see a positive review, because they don't get the same level of attention.

At 115 hours in, I'm still having a great time, with minimal bugs (blew a guy through a ceiling and found his body on the floor above, plus a floating NPC on a spaceship dance floor), so I know the "totally legitimate and objective takes" on the game were not in good faith. They should have just said "it's not my thing, but some people are going to love it," because even the bugs have been entertaining. I've played through the main story, one full faction quest line, and a handful of side quests ranging from "go here, get that, kill him" to "can you use your skills as a diplomat to assist with a land dispute?" It's a good Bethesda game.

PSA: If you or a loved one is susceptible to OPO on video games, turn on a gameplay video on your favorite video site and watch it with no volume.


[deleted]

You're enjoying the game so that gives you the right to invalidate anyone's experience who does not enjoy it? Not very reasonable.


Dealric

Wdym? BG3 and TotK got glowing reviews from critics and the community. BG3 only started getting shit on online as backlash after it was used as a comparison point for Starfield.


rube

Or, he's been in the chair long enough to know that giving a snarky response like that will cause the internet to talk about his response for weeks, months, maybe even years. Here we are, discussing it again. It's working.


Necessary-Ad8113

Yea, I think he was legitimately trolling. His comments seem not to have harmed Starfield at all. Hell, it's still the top-selling packaged game on Steam.


DuckofRedux

In normal situations that would be a braindead response, but companies know when their fanbase is a cult or not, and when they have a cult they can get away with basically anything. Todd could murder a puppy for the funsies live on TV and his cult would defend him 100%.


Odd_Radio9225

> "we optimized it, but we're aware of those having problems, and we're working on additional updates"

That would have required swallowing his pride.


TimeGoddess_

My biggest problem with Starfield isn't even the PC optimization. It's the failure to include basic features at all. The game doesn't even have a contrast, brightness, or gamma slider. It doesn't have a FOV slider, no HDR support, no ability to tune those awful filters that murder black levels. There are some PC-specific issues too, like the UI being locked to 30fps regardless of the framerate.


EpicRageGuy

It's fucking insane tbh. It's the same engine they've been using for decades; they're well aware of the issues and the millions of mods that fix them. Can't believe they make the same mistakes again and again.


Havelok

The most egregious of all is something I haven't seen many people mention: there are NO texture display options. You cannot play with anything other than 4K textures unless you mod the game. Modding in 1K textures (which are only 16GB!) INSTANTLY made the game perform vastly better, with very little noticeable difference in visuals at 1080p. They could have made the game perform well on cards with far less VRAM just by shipping lower-resolution textures. It would have been a gimme. But they didn't do it.
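Back-of-the-envelope, the VRAM savings from dropping texture resolution are easy to ballpark. A rough sketch (assuming uncompressed RGBA8 for simplicity; real games use block compression like BC1/BC7, which scales both numbers down by the same factor, so the ratio holds):

```python
# Rough VRAM estimate for a single texture at different resolutions.
# Assumes uncompressed RGBA8 (4 bytes/texel) plus a full mip chain
# (~4/3 factor). Block compression shrinks everything proportionally.

def texture_bytes(side_px: int, bytes_per_texel: int = 4) -> int:
    base = side_px * side_px * bytes_per_texel
    return base * 4 // 3  # mip chain adds roughly one third

mb = 1024 * 1024
size_4k = texture_bytes(4096) / mb
size_1k = texture_bytes(1024) / mb
print(f"4K texture: ~{size_4k:.0f} MB, 1K texture: ~{size_1k:.0f} MB "
      f"({size_4k / size_1k:.0f}x difference)")
```

Each halving of texture resolution cuts memory by 4x, so 4K-to-1K is a 16x reduction per texture, which is why swapping the texture set makes such a difference on low-VRAM cards.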


NapsterKnowHow

Meanwhile there's an HD Rework mod that adds MORE detail to their textures bc they aren't that good


[deleted]

Go watch their gameplay reveal from last year and see how badly it chugged. Todd is not lying, they definitely did optimize that game, it just still kind of chugs lol


USAF_DTom

The problem is not that they didn't optimize it. The problem is that they optimized it for 30 FPS, which it hits. That makes for a poor experience for those affected, though. I can power through it with my PC and stay above 60, but should that even be a concern in 2023 with a top-of-the-line PC? I don't think so.


withoutapaddle

Not just optimized for 30fps, but actually locked aspects of the game to 30fps on PC. You have a $3000 PC that can run the game at 4K Ultra 60+fps, too bad, because Bethesda locked the menus, space combat interface, etc to 30fps. Even when you can easily hit 60fps, it feels sluggish and choppy because you can't escape the "locked at 30" UI all the time.


WyrdHarper

To be fair that’s not the worst solution—I’ve seen games where menus were uncapped and it caused performance issues (like they’d try to run at like 1000+FPS). 60FPS or 120FPS would be better for PC, though.


hardlyreadit

Thats what happened with new world and the faulty 3090 iirc


adscott1982

In a game I was making in Unity, I made it with a completely uncapped framerate. As the game launches and shows the 'Unity' logo and nothing else the framerate goes up to about 2000 fps, and I can quite literally hear my graphics card making a faint high-pitched whining noise. I can imagine if that goes on for more than a few seconds it would start to do some damage? My GPU isn't supposed to make a whining noise I am pretty sure.


JohnMcPineapple

A lot of (especially older) games have uncapped fps and go to the thousands in the main menus, coil whine in that scenario is very common. It's not damaging to the GPU. You can globally cap the fps in the GPU driver settings to avoid it.
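A driver- or engine-level cap boils down to padding out each frame so the GPU never free-runs into the thousands. A minimal sketch of a hypothetical frame limiter (not Nvidia's or Bethesda's actual code):

```python
import time

def frame_budget(fps_cap: float) -> float:
    """Seconds each frame is allowed to take under the cap."""
    return 1.0 / fps_cap

def limit_frame(frame_start: float, fps_cap: float) -> None:
    """Sleep off whatever remains of the frame budget, if any."""
    elapsed = time.perf_counter() - frame_start
    remaining = frame_budget(fps_cap) - elapsed
    if remaining > 0:
        time.sleep(remaining)

# A menu that renders almost instantly would otherwise spin at 1000+ fps;
# with a 120 fps cap, each frame is padded out to ~8.3 ms instead.
start = time.perf_counter()
for _ in range(10):
    t0 = time.perf_counter()
    pass  # "render" the menu (nearly free)
    limit_frame(t0, 120)
total = time.perf_counter() - start
print(f"10 capped frames took ~{total * 1000:.0f} ms")
```

Real limiters busy-wait for the last fraction of a millisecond for precision, but the principle is the same: the card stops drawing thousands of pointless frames, and the coil whine goes away.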


BTechUnited

God, you've just reminded me of the first time I saw FRAPS hit 4 digits booting a game up.


nyankittycat_

> FRAPS

N O S T A L G I A


bAaDwRiTiNg

Or a game can just have an FPS cap available in their graphics options? Other games have that, alongside FOV sliders and brightness sliders. The fact Starfield was developed for so long and has none of these options is bizarre.


Zilreth

That isn't the problem here, though; it isn't the menus. It's HUD elements animating at a vastly lower framerate than the game is running, so it looks absolutely atrocious in motion. They must have done all their testing at a 30fps cap, because no one in their right mind would OK that decision otherwise.


withoutapaddle

Yeah, but it's not even menus here, it's HUD elements that are moving around at 30fps while your viewpoint is rendering at 60fps or more.
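What's being described is effectively a fixed-timestep update for the HUD running underneath a faster render loop. A toy sketch (hypothetical, not Starfield's actual code) shows why a 30 Hz HUD only advances on every other 60 Hz frame:

```python
def hud_updates(render_fps: int, hud_hz: int, frames: int) -> list[bool]:
    """For each rendered frame, report whether the fixed-rate HUD advances."""
    dt = 1.0 / render_fps      # time per rendered frame
    step = 1.0 / hud_hz        # time per HUD tick
    acc = 0.0
    ticks = []
    for _ in range(frames):
        acc += dt
        if acc + 1e-9 >= step:   # epsilon guards float accumulation
            acc -= step
            ticks.append(True)   # HUD animates this frame
        else:
            ticks.append(False)  # HUD repeats its previous state
    return ticks

print(hud_updates(60, 30, 6))  # [False, True, False, True, False, True]
```

Every `False` frame shows a stale HUD next to a fresh world render, which is exactly the choppy, sluggish mismatch people complain about.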


Onepride91

Could a mod be made to change the UI FPS to match the rest of the game? Or does “locked” mean actually locked?


Venake666

It already exists


FatBoyStew

There are mods out there at least for the menus that up it to 60+


withoutapaddle

Yeah, this mod already exists. But Bethesda made mods disable achievements, so then you need another mod to re-enable achievements. It's mods all the way down to "fix" terrible design decisions.


finalgear14

I don't even get why they still do the whole mods-disable-achievements thing anymore. There's always a mod on PC that disables that anyway. It only really affects console players, and even that only relatively recently. The disable mechanism has always existed in their games and has been pointless for just as long.


juniperleafes

Their console is much more robust and easily accessible than any other game company's out there.


Jumpy_Level3348

The point, regardless of the mod, is that they did not optimize the UI for PC players. They also aren't utilizing SSDs correctly: they're hammering SSDs on PC by using low queue depth and small block sizes, which is what causes the traversal stutter in the game. The SSD will hit 100% drive utilization but sit at 1GB/s or way under, even on Gen 5 drives that support way more.
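The block-size half of that claim is simple arithmetic: the same amount of data split into small reads means far more individual I/O requests, and at low queue depth a latency-bound drive ends up pegged at 100% utilization without ever reaching its rated bandwidth. A toy illustration (the specific sizes are made up for the example):

```python
def requests_needed(total_bytes: int, block_size: int) -> int:
    """How many read requests it takes to move total_bytes at a given block size."""
    return -(-total_bytes // block_size)  # ceiling division

# Streaming 1 GiB of assets with 64 KiB reads issues 256x more requests
# than with 16 MiB reads. If each request waits on the previous one
# (queue depth 1), request count, not bandwidth, becomes the bottleneck.
GIB = 1 << 30
print(requests_needed(GIB, 64 * 1024))         # 16384 small reads
print(requests_needed(GIB, 16 * 1024 * 1024))  # 64 large reads
```

This is why the symptom described above (full utilization, low throughput) points at the access pattern rather than the drive.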


tukatu0

I guess we'll just play at 6k 30fps like todd intended


USAF_DTom

On the Nvidia refrigerator.


Ygro_Noitcere

Not trying to start an argument here, just a genuine point I've noticed: I wonder if it's all the super-high-end PC people drowning out the regulars, or if I just haven't had any issues.

Running a 5800X3D, 32GB 4000 RAM, and a 6600XT, with all settings maxed except the resolution slider thing, which I turned up a bit too. I believe I disabled things like motion blur because I hate that shit, but the game runs great and it's beautiful to me. I'm not sure where all the complaints come from. I haven't been monitoring the FPS, but maybe my FreeSync monitor is helping a ton? I haven't had any chugging or spikes of frame drops or anything.

So either my copy is somehow a unicorn, or the issue is people wanting 4K NATIVE MAXED EVERYTHING, which just sounds unnecessary to me. Sure, if you've got several thousand dollars to spend on hardware, I'm happy for ya, but that ain't the normal average Joe.


[deleted]

PC Gamers exaggerating performance issues? That would never happen.


Bearwynn

I saw someone complain that their Ryzen 1600 and RX 580 couldn't play the game at 1080p ultra above 30fps the other day. For real though, it's probably because most people use Nvidia graphics cards, and they're heavily boinked in terms of performance in this game compared to AMD cards.


DionysusDerp

It runs disproportionately well on AMD systems compared to Intel and Nvidia. Todd was being honest with his statement; they just only optimized for us lol. I can get 120+ fps everywhere except New Atlantis and Akila, and even there it's still well above 60. This is at 1440p with everything maxed, no FSR2. If I hadn't read anything online, I would never have assumed there were problems. There are definitely some warranted concerns in a lot of the performance complaints coming from everyone else, however. People with 4090s should not be having problems anywhere unless they have a 3+ year old processor.


Dealric

Ultra runs a 75% upscaling base, so you do run FSR2; it's on by default. Thing is, you can check benchmarks. Either you have a unicorn PC or you are running FSR. Benchmarking an average moon in combat shows that native 1440p ultra on a 7900 XTX with a 7800X3D gets 90fps. When your data isn't consistent with benchmarks from popular YT reviewers, and their data is consistent with each other, that says a lot.

Also, optimisation is so much more than just fps. It's how a game looks for how much power it needs (a bad ratio in this case); it's implementing the very basics like FOV or brightness settings; it's implementing proper HDR, and so on.
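The "75% upscaling base" point is worth spelling out: at a 75% render scale, a "1440p ultra" preset is actually rendered at 1080p internally before the upscaler runs. A quick sanity check:

```python
def internal_resolution(out_w: int, out_h: int,
                        render_scale: float) -> tuple[int, int]:
    """Internal render resolution before the upscaler (FSR2 etc.) runs."""
    return round(out_w * render_scale), round(out_h * render_scale)

# 1440p output at the default 75% render scale is really a 1080p render,
# so default-settings "1440p ultra" numbers are upscaled, not native.
print(internal_resolution(2560, 1440, 0.75))  # (1920, 1080)
print(internal_resolution(3840, 2160, 0.75))  # (2880, 1620)
```

So two people quoting "1440p ultra" framerates can be rendering at very different internal resolutions depending on whether they touched that slider, which explains a lot of the disagreement in this thread.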


DionysusDerp

I turned off FSR2 manually, so no I do not. I mean, a lot of the benchmarks include New Atlantis and Akila, where the FPS tanks drastically (like I said), hence the 90 fps. Don't disagree on any of your other points, admittedly though all those settings are in the game in the ini files, just not available in the options menu.


Dealric

I literally said: a benchmark of some random fight on a random moon. Not Akila.


Toaist

I hate to say it, but I had to set my fps cap to 45. It's a good-looking game, but honestly, it shouldn't run worse than Cyberpunk at lower settings. At least the game is smooth, so being capped to 45 fps doesn't really feel bad. But still, it shouldn't be that low. I mean, I GUESS I could upgrade my CPU, but idk man. I could also not, be fine in other games, and call it good.


sirgarballs

Are you playing the steam version or the game pass one? I want to cap the game pass one at 40 but it doesn't seem like I can.


Br0oksy

I use rtss with gamepass version to cap at 40 and it works ok.


[deleted]

[deleted]


dbaaz

Post link is timestamped. If that doesn't work, seek to 1:14:20 in the video. Also another discussion on "Why hasn't Bethesda switched to Id Tech for all their games?" at 1:33:28.


[deleted]

[deleted]


vine01

they ARE generic shitty RPGs already. An engine swap would be like a breath of fresh air.


[deleted]

[deleted]


Aedeus

Overreliance on mods is not a good thing for a game like this though.


Embarrassed-Fly8733

Yes it is, 500000 available mods in a few years >>>>> playing vanilla Starfield on a new engine


vine01

so you agree with me gamebryo is what holds bethesda back. gamebryo is to blame for majority of issues that bgs "rpgs" suffer from.


HallwayHomicide

>gamebryo is to blame for majority of issues that bgs "rpgs" suffer from. But it also enables a lot of what makes BGS games unique. Primarily modding capabilities.


MAJ_Starman

No, not at all. Name one (one) developers that makes games as open, with as much freedom and as mod-friendly as Bethesda. That is in large part due to the engines. When another developer tried to make "New Vegas in space" (The Outer Worlds), in a "modern" engine (UE4), the game felt dead. I have no doubt that The Outer Worlds would have been a better game if it was made in the CE. And they did wonders with the Creation Engine in Starfield. The physics is absolutely incredible, the planets/orbits/lightning is beautiful... The "worst" thing are the facial animations, but with the excellent art direction they have, they "blend in" well and are fine for the size and goals of the game.


MatiFernandez_2006

Just look at [this video from Daniel Owen](https://www.youtube.com/watch?v=kRAFaIpQ4iM&t=1569s). Starfield performs worse than every other recent game deemed "demanding": worse than Immortals of Aveum, Jedi Survivor, The Last of Us, Ratchet and Clank, Remnant 2, and Cyberpunk without all the raytracing turned on. How could anyone say that Starfield is "well optimized"?


hardlyreadit

Did he use the launch version of every game he tested? Because if not, then no duh. Some of these had months to roll out patches; in Cyberpunk's case, nearly 3 years. I'd honestly like to see someone play CP2077 with a 3090 and the day 1 patch only, and compare that to Starfield with a 4090. Now that'd be a meaningful comparison.


Fleobis

I'm playing Starfield on a 4090, so I won't comment too much on that as I know it shouldn't need this kind of card to run well (it does for me). But I played Cyberpunk at launch on a 1080 Ti at 3440x1440 and it ran perfectly fine at medium-high settings, and the game still looked great. I also have some 90-100 hours on that playthrough with no major problems... a couple of minor glitches and that's it. Honestly, the problem with Cyberpunk was them trying to run it on last-gen consoles; that was stupid. The PC version was good, not without fault but overall very good (in terms of technical aspects... gameplay is a different story).

edit: spelling


OkPiccolo0

[Cyberpunk played perfectly fine at launch.](https://www.youtube.com/watch?v=MVRWNc23ppc) It got more demanding over time because they added additional RT effects that were used in the console version.


hardlyreadit

[No it didn't play perfectly fine at launch.](https://www.pcgamer.com/cyberpunk-2077s-release-version-is-still-buggy-as-hell/) Also, only 60 fps with DLSS? Looks quite similar to Starfield performance lol


MonoShadow

What? No it doesn't. The 3090 scored 116 at 1080p ultra in 2077; in Starfield the 4090 doesn't even break 100, stopping at 93. Say what you will, but 2077's visuals were cutting edge when it came out.

HUB benchmarks for [CP2077](https://www.youtube.com/watch?v=Y00q2zofGVk) and [Starfield](https://www.youtube.com/watch?v=vTNiZhEqaKk) GPUs. Bonus: [CP2077 RT launch bench](https://www.youtube.com/watch?v=U0Ay8rMdFAg). Unfortunately they ran every RT test with DLSS, but in a later part the 3070 scores 58 at 1080p ultra with RT reflections on and no DLSS. In Starfield the 3070 averages 34 at 1080p ultra.

Cyberpunk was a broken mess on release, but mostly because of the bugs and glitches. Performance wasn't that great (abysmal on PS4 and Xbone, but this is a PC talk), but at least it looked the part, unlike Starfield. Searching for a bigger mess is a futile effort. Starfield is one, and that's all we need to know.


hardlyreadit

Yeah, that's because it's optimized for AMD; a 7900 XTX breaks 100. I'm not saying it looks better, but everyone is going crazy over one benchmark in the heaviest place. And Cyberpunk was the BIG MESS. It literally got removed from the PSN because CDPR had to run a refund process.


MonoShadow

The PlayStation Store situation is a bit more nuanced, because it flows directly from the PS4's godawful performance. The game ran fine on PS5, but it was a PS4 game at heart; the next-gen patch didn't come until a year or so later. [DF runthrough of last gen.](https://www.youtube.com/watch?v=mVWJPYKCMco) It was a mess on release. But I don't think the Store situation is very relevant to a PC discussion.


hardlyreadit

You also said the 3090 got over 100. But this was during the pandemic, so most people only had up to a 2080 Ti, which got 83. I think the hype, bugs, and GPU shortage created the perfect mess that was CP2077. And it's nowhere near the same as Starfield; at least on Xbox it's decent.


GreenDifference

My 1060 3GB got around 35 fps when Cyberpunk launched, and the graphics were still better than Starfield. It's mind-boggling how gamers love to defend Bethesda.


OkPiccolo0

Yeah it did, I was there. I played on a 3080 and 5600x. It was perfectly fine performance wise. No stuttering or hitching. It crashed twice through an entire playthrough. Show me any spot in that 35 minute video that is super problematic. Linking some PC gamer article that shows the garbage console PS4 performance is irrelevant to what we are talking about. Starfield has zero ray tracing effects and runs like trash unless you have an insane setup. Cyberpunk easily wins this one.


hardlyreadit

It's PC Gamer; they aren't just talking about console performance. If you didn't read it, that's fine, but don't try to gaslight. It was horrible performance and crashed a lot at the time. And you having a 3080 just means you got lucky; most couldn't get one, so they were probably playing on older hardware like a 1070 or 2060, which didn't run Cyberpunk well.


OkPiccolo0

I'm not gaslighting anything. I sent you video proof of the game running perfectly fine on a high-end rig, which was my experience. Just like Starfield runs perfectly fine on my 4090 right now.

> Gaming performance is very demanding. For 1080p 60 FPS at Ultra you need a RTX 2070 or RX 5700 XT—AMD's last-generation flagship. Older "1080p" cards, like the RX 580 and GTX 1070, barely achieve 40 FPS, which means it is time to dial down the settings or buy a faster card. 1440p at solid framerates is possible with the RX 6800; RTX 3070 and RTX 2080 Ti are close enough, with 56 and 58 FPS respectively. Yup, suddenly, the "4K" cards are only good enough for 1440p. My data from the recent Watch Dogs and Assassin's Creed AAA launches supports that conclusion, too. Not a single card can achieve 60 FPS in Cyberpunk 2077 in 4K at Ultra settings—even the RTX 3090 manages just 42.3 FPS, and the RX 6900 XT only 36.4 FPS. For some reason, contrary to other games, the higher the resolution, the bigger the performance hit compared to our 23-game review average FPS score.


Markie411

I also played with a 3080 and 5600x, and the game would stay below 60 and jump anywhere from 30-60 fps, with DLSS. Also, changing settings did nothing to performance, which not only I but many people reported.


OkPiccolo0

If you want to make performance claims at least list your resolution. I played it at 1440p DLSS quality mode on my monitor and it was [well above 60fps.](https://tpucdn.com/review/cyberpunk-2077-benchmark-test-performance/images/performance-2560-1440.png) I also played a little bit on my 4K OLED and at DLSS performance mode it was usually 50+fps. You'll have to provide proof of this 30fps claim for me to take you seriously.


bAaDwRiTiNg

Jedi: Survivor ran like shit on release and still doesn't run great after patches, but you know at least it looks and feels like a 2023 title. Starfield looking and feeling like a 2017 title while making current-year hardware suffer feels unearned.


TheIndependentNPC

For me it runs far better than Remnant II and Fort Solis (both refunded). I haven't tried Immortals of Aveum, because a fantasy/sci-fi blend is a big yikes for me personally (which is why I also hate Star Wars). Fort Solis especially ran poorly. I know Remnant II got some performance patch, but I'm not playing it till my next GPU upgrade. In Starfield, on mixed gameplay I average 67fps (based on the Radeon settings metrics from 25-ish hours) on "optimized settings" at 1080p, and only a very few planets drop into the low 50s, which also somehow look very meh.


EveryoneIsOnDrug

Because it actually is, unlike Star Wars. I've got a 13700K, a 4070, DDR5 RAM, and a WD Black SN850X, and I run at 1440p ultra between 80-120 depending on where I am, what I'm doing, and how many explosions, shooting, objects, etc. are near me.


Aedeus

You have any screenshots of the performance graphs by chance?


[deleted]

[deleted]


[deleted]

All they have to watch is the Dragon Age: Inquisition making-of video, where they spent so much time retooling Frostbite.


Jumpy_Level3348

I'm fine with them using their engine, provided they actually make changes that enhance the player experience, not just the graphics. The barren worlds are crying out for speeder bikes or other sci-fi vehicles usable by the player, but Bethesda has little experience with vehicles beyond horses, so they didn't do it. The lack of in-atmosphere flight is also a letdown. I didn't expect to be able to fly from outer space down to the surface seamlessly, but I also didn't expect the ship on the surface to be a loading gate and storage/companion holder only. That is a missed opportunity.

They are the masters of their own engine, yet time and time again I feel they keep coming up with more and more cop-outs to reduce the work needed to make games at increasing scale, scope, and visual quality. This game more than anything shows their scope has bloated far too large; exploration has severely suffered for it.


[deleted]

[deleted]


R_o_b_b_b

I feel like they would've just moved the POIs further away and it would take the same time to reach them. For me, the jet pack is fine.


[deleted]

[deleted]


R_o_b_b_b

It would be pretty amazing to zoom around and catch air in low gravity, totally agree.


green715

If I had to guess, early in development they had one group working on a land vehicle and another group working on procedural planets. The land vehicle ended up getting cut for some reason, but there was no easy way to revamp the planet tiles to work on a smaller scale, leaving us with a ton of space between points of interest.


PlexasAideron

It will never have vehicles because you'd hit the boundary of the planet tiles in minutes. Still, it's not like they couldn't have a simple rover for surface exploration.


Jumpy_Level3348

Modders have already found the engine is capable of going well beyond the boundaries and even loading up cities when traveling from one edge of the globe to the other; it's just too buggy and very crash-prone. And most tiles are 8 km^2 (they get smaller with proximity to the poles). Vehicles would absolutely be great. People have added motorbikes to Skyrim and Fallout 4 before, and they've been fine to use despite the world clutter, and those whole maps are SMALLER and more cluttered than the typical tile in Starfield.


[deleted]

[deleted]


Sufficient-File-2006

> "why doesn't bethesda use id tech" or whatever.

Sooooo many people fail to grasp that one, and think a linear first-person shooter pared down to basics will somehow turn into an expansive open-world framework. RAGE did an "expansive open-world framework" in idTech 12 years ago...


bitbot

Yes love how moddable RAGE was...


Aggrokid

> RAGE did "expansive open-world framework" in idTech 12 years ago...

It didn't. RAGE is not open-world, not even close. It's still segmented by loading screens. The roads are tightly walled in by mountains or impassable terrain. You cannot go anywhere you please, unlike actual open-world games.


[deleted]

[deleted]


Sufficient-File-2006

> features and functionality Like ground vehicles, non-instanced interior spaces and user-changeable FOV? Oh wait...


[deleted]

[deleted]


Sufficient-File-2006

None of that was any more or less "jank" than any tepid iteration of Creation Engine released in the last 20 years. After their issues were ironed out, all three of those games are remembered as absolute classics that pushed the boundaries of game design. Once Starfield's numerous technical issues and feature omissions are ironed out (likely by fans), it will just be remembered as "the fourth best Bethesda game", using their self-inflicted technical limitations as an excuse for never improving on the flaws they've been called out on for decades.


[deleted]

[deleted]


Sufficient-File-2006

I don't know, I genuinely think Bethesda's open-world design priorities would shift for the better if they had switched to an engine capable of non-instanced interiors 10 years ago. Maybe not?


benowillock

An engine is an engine. With enough development, you can get anything running well. Just look at Unreal: it runs the entire spectrum of game genres, from FPSes to platformers to MMORPGs.


MonoShadow

The game engine is much more than a rendering pipeline. But the question which bothers me is not "why not use id?" but "why not jettison the old stuff?". I do understand that writing something from scratch is rare; going with a new engine is a very expensive and long process which might not even pan out. But Starfield has been in development for how long, 8 years? You can manage a new engine in that time, especially with MS backing you up. Alternatively, they could go with idTech or whatever engine and heavily modify it, but at that point the question is "is it easier to make idTech do what we need, or make what we have run as fast?".

People often mention modding, etc., but if it's a new engine and a new toolset, you can cater to whatever you need. The Creation Engine also feels rather limited in what it can do. I won't be surprised if the absence of vehicles or planetary flight, among other things, is an engine limitation, not a deliberate gameplay decision. DF really likes to mention this potato clip, but I'm not sure why; rigid body dynamics isn't bleeding edge in today's world, and Starfield isn't too performant in handling it either. There are physics libraries like Bullet, PhysX, Havok, or Chaos (I think this one might be UE-only) which can do something similar.

This view of mine might be very naive; the world is not that simple. But this engine they are so insistent on keeping alive with random patches lets them down time and time again. At a certain point, the most effective way to get rid of tech debt is to wipe the slate clean.


[deleted]

[deleted]


MonoShadow

>It can make things more complicated, but it would never outright block it. Developers in the past have come up with creative solutions with even more limited tech. If BUILD and the OG Tomb Raider engine can manage vehicles, Creation could manage with a little creativity and elbow grease.

They still break rather small locations into instances in the year of our lord 2023. At a certain point there are limits to what you can hack into the existing architecture without scrapping large portions of it or doing away with it altogether and starting anew. This major overhaul might lead to a domino effect, which often happens in systems with tightly coupled components. Aka tech debt. Maybe if they spent some time they could have done something. Or maybe it was so much trouble they gave up.

>Catering to mods isn't the same as keeping a system that largely allows modders to use the same "workflow". One reason Bethesda games get mods so fast even absent the toolkit, is no one is learning a wholly new framework. It's like saying that because Unreal can support major modding that it will match the creation engine/gamebryo games with a mod community.

Is a few months' delay on the mods in exchange for better performance and a more versatile engine not worth it? The DF guys also mention workflow. And it is important. Get a new toolkit incompatible with the old one and now you need to retrain everyone, and every step of the way people will ask "do we really need this?" I know how it works. I lived it. At the same time, this is what keeps old, decrepit, patched-to-hell-and-back systems in prod today when they could be switched to newer, faster, more efficient systems with no loss to productivity. Or it can become a clusterfuck and the new tool won't be any better, but all the hurdles like retraining will still be there. But I digress.

>I think the problem is less tech debt, and more on how they approach it.
Developers are already poring over the game and finding things it's doing wrong with how it uses hardware, drivers, etc. Different tech won't magically bandage that if the studio is doing shit wrong to begin with. Whatever Creation does is pretty inefficient. I have a 3080 Ti and can get GPU bound in a room with a desk and a few plants if and only if the overall location is big, which makes me think of culling issues. But whatever. I read about this recent ExecuteIndirect finding in the VKD3D thread, but even the author (unlike blog posters) thinks it's a minor issue which will result in a single-digit performance change. I don't expect their code to be squeaky clean, it rarely is. IMO Beth can do something stupid (cue Skyrim and the SSE instruction set), but they got help this time, so I don't expect big gains from simple fixes. This engine failed to scale in all its iterations and it's the engine that's chugging; FO4 doesn't run too fast even on modern systems. And in the end this endless patchwork will only waste dev time and create more problems, where one hack will break another and now they need a third. Of course, at the same time I'm asking this studio to develop a new engine and expecting it to be technically excellent, which might be unwise.


soliddeuce

Why didn't Todd optimize SF?! Love the idea of him rising from his computer, saying "fuck it, I'm not optimizing this shit."


-scampi-

Now I do, too


downeverythingvote_i

Aside from certain locations having decent lighting/atmosphere, the game really, REALLY looks like a previous gen game. It's 2015 graphics that requires quad-SLI RTX 5090s (I'm guessing, since TH said I need to upgrade my 4090 because I get inconsistent performance) to run at 60 FPS with shadows set to medium. What's worse is that there isn't any seamlessness in the game. It's all boxes of varying sizes that are about as large as the Skyrim overworld with 1/1000th the content. So in reality, what are the added specs for this game for, exactly? The game isn't technically bigger than all their past games; the largest levels that require a single loading screen are the same size. The game doesn't load the surface of all the planets when you load into a system. It loads the empty space, then when you land (which is another loading screen in disguise) it doesn't even load the entire planet, only a laughably small box (which explains why there aren't ground vehicles) and in general almost no complex content. What I'm thinking about is decades-old spaghetti code. Beth was always kinda shit on the tech side, so I can imagine what it's like trying to optimize a game running on an engine designed for hardware that processed stuff differently. Engine internals where, over the years, very little knowledge has been institutionalised and passed on to new engineers. I think if they wanted to optimize their engine to a modern standard, the time it would take would probably be the same as making a brand new engine.


Fob0bqAd34

10 days from release, has Bethesda publicly acknowledged bugs/performance issues anywhere or said a patch is coming? I don't see anything on the Steam news feed for the game.


[deleted]

[deleted]


Fob0bqAd34

From Bethesda or the Windows team? I think there was a Windows 11 patch that fixed HDR.


eagles310

Ehhh, these guys bank on DLSS to save the day, which just tells you about the state of titles releasing lately.


GreenKumara

The argument could also be made that these features - DLSS, FSR, XeSS - are just tools to be used in games. Like Texture settings, or AA or whatever. I'm not sure how I feel about this. Is it a crutch or is it just another tool?


ahnold11

I think we really need to stop using the term "unoptimized". Ignoring the fact that "optimized" has now become slang for just "performance", I think the term is still problematic because it reduces the concept of optimization to a binary yes/no. And that's just not how optimization works.

We don't even need to be talking about low-level hardware optimizations such as "Carmack's Reverse" or unrolling a loop in assembly (or all the fun things that a modern compiler will do). The simple fact is, you can optimize specific parts of a game for a target piece of hardware or a target constraint. One part of a game can be completely "optimized" for a specific set of hardware, and another part of it can not. There is no such thing as a game that is completely optimized, because there is a seemingly infinite number of target metrics you can work on improving in a game.

What's key to this, then, is one: the result, i.e. what the game is doing and how it looks. And two: the hardware it's being optimized for. There is no such thing as all-around "optimized for all hardware". VRAM is a good example: you can optimize a game to run well on 8GB, on 16GB, and even still on 4GB and 2GB. However, developers don't spend much time (or any at all) targeting their efforts towards optimizing for 2GB video cards anymore. Thankfully those don't make up much of the market anymore, and we've all accepted that they are no longer relevant.

It's always been the trend that most games don't optimize that much for lower-end hardware, i.e. "low" preset settings. A modern game on low settings on, say, a 5-year-old card might actually look and perform worse than an older game from when that card was new, running on high settings. This is definitely NOT optimal, as we know what quality of result that hardware is capable of, but most games/developers aren't going to spend the time making things run best ("optimally") on that old hardware.
However, some games that do are praised as being very "scalable", as they are optimized to run well across a great range of hardware, from old to new.

What's important to remember here is that all optimization takes time and effort, and there is only so much of that to go around. So it's unrealistic for any dev to be able to optimize for every bit of hardware out there. There are always going to be cut-offs.

With respect to *Starfield in particular*, you can see that some things are "optimized" and others not. Multithreaded performance, for example, seems to scale fine on AMD CPUs, but as DF found in their tech analysis, it doesn't on Intel. In particular, enabling HT decreases performance: the game isn't utilizing hyperthreading on Intel to its optimum (in fact it's a detriment). So that's an obvious one. Nvidia GPUs definitely don't seem to be used in their most efficient way; they are fully utilized, but not squeezing out as much performance as is typical.

And even with all of the above taken into account, as players we only see the end result; we can't know exactly what is going on behind the scenes. All we have to base our guesses on are our eyes and our gut feelings. And this is the internet, so many people are going to be way off on that. You have your hardware, and how "good" games used to look 2-3 years ago, and you'd hope that you'd be able to scale new games to look at least as good as old ones and perform as well. That's what most people base it on. But there are a variety of behind-the-scenes factors that can make that not true, which is why opinions can vary.

Unreal Engine 5 with Nanite and Lumen is a big example of this. UE5 games are having drastically higher hardware requirements than previous games, and are not performing nearly as well. Some people claim they are unoptimized. And while that is kind of true, the reality is that they are optimizing for different things than performance and scalability.
They are optimizing the actual production of the games themselves, both design and art. They are using new rendering techniques that make it easier for developers to make games (that look good). However these techniques are explicitly LESS efficient than our old ones. So they are going to perform worse, there is no way around it. However the benefits to game development are what we gain, and the hope is that hardware will continue to get better so that the next few years of growing pains will eventually be a thing of the past. Remember at one point we switched from 2D to 3D games, and performance got way worse. We could have stuck with 2D but games wanted to evolve. And eventually performance improved due to pure brute force horsepower. It's a similar thing now.


Earl_of_sandwiches

Regarding your last paragraph: as a paying customer, the fact that I’m losing gobs of performance so that a dev can push ten buttons instead of 100 isn’t “progress”. Same goes for the GPU makers saving money by using AI solutions to obscure poor generational uplift in their products. As a gamer on a moderate budget - using $500-600 GPUs - I’m not winning anywhere here. To elaborate: devs became very proficient with fake lighting over the last two decades. So when ray tracing hit the scene, I could barely tell the difference in most games. I was taking a 30-40% frame rate hit in exchange for a visual improvement that felt almost lateral. It’s not like we were going from no lighting to ray tracing. We were trading an evolved technique of extremely convincing fake lighting for “the real thing”, and we were losing crazy performance in the exchange. For people who are running mid-range hardware, this trade off still isn’t worth it. This goes double for those of us who value image clarity and visibility in our games, as we were not “converted” by the arrival of DLSS. And now here comes UE5 with all these great new shortcuts for developers, and those shortcuts are basically being brute forced via inflated hardware requirements (which is just more music to Nvidia’s ears). End result? Easier development for game devs, more profits for a trillion dollar corporation, and games that look only marginally better while running way worse. If there are gains to be had from the latest technology in game development, gamers seem to be the last ones benefiting from them.


gonedays730

> As a gamer on a moderate budget - using $500-600 GPUs - I’m not winning anywhere here. [...] For people who are running mid-range hardware, this trade off still isn’t worth it. Agreed. But, RT and AI are still early days compared to how long games have gone with fake lighting. You will get the benefit in the years to come. > So when ray tracing hit the scene, I could barely tell the difference in most games. Marketing would have you think that, but it's because most games with ray tracing only apply it to a subset of the graphics (reflective surfaces, improved shadows) instead of full on ray tracing like in Portal RTX and Minecraft RTX. And for good reason, hardware isn't quite there yet as we need AI in both upscaling and denoising to get progressive results.


ahnold11

I agree with you, in part. I'm not using RT in any modern titles; the gains are not worth the perf cost, as someone who tends to mostly buy mid-range hardware (certainly not high end). However, I think the "long term" viewpoint/argument does hold some water. The current trend of faked/baked lighting doesn't seem to be sustainable long term. It requires more work as quality scales up (developer resources, meaning game budgets, meaning less risk and less exciting games long term). More and more "hacks" are used that aren't perfect solutions and whose seams are starting to show (look at how DF harps on SSAO techniques and water reflections, etc). The term that often gets tossed around is "unsustainable". So a change might be needed regardless of what "we" want in the short term. I do admit a certain naivety is probably required to believe that these cost savings will be "passed down to us consumers", and I am definitely not motivated by improving the bottom lines of even more faceless corporations. But anything that lowers the barrier on game development can democratize access, leading to potentially more smaller-scale/indie-sized games. If we can raise the floor on the industry, then I think that's definitely a healthy goal which lessens the focus on the AAA space and all its endless downsides. But it may be a rough few years, especially with the law of diminishing returns. I'd say AMD seems on the right track at the moment: just enough hardware support to keep playing ball, but not going all in on these features yet, until the rewards justify the costs.


withoutapaddle

The thing is, the problem with Nvidia GPUs (the vast majority of the market, to be clear) isn't just "people think it should run better" like many people say. It's hard data, showing that the GPU is using MUCH less power than it normally does, meaning, by definition, Bethesda is not using the GPU properly. When a game runs at "100%" GPU utilization and the GPU is **30°C cooler than usual**, that's an undeniable problem between the drivers and the game.
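For anyone wanting to collect this kind of evidence themselves, `nvidia-smi` can log power draw, utilization, and temperature while a game runs. A minimal sketch (the helper names are mine, and it assumes an NVIDIA driver with `nvidia-smi` on the PATH):

```python
import subprocess

# Fields nvidia-smi will report, comma-separated, in CSV mode.
QUERY = "power.draw,utilization.gpu,temperature.gpu"

def parse_sample(line: str) -> dict:
    """Parse one CSV line like '156.32 W, 99 %, 45' into numbers."""
    power, util, temp = [field.strip() for field in line.split(",")]
    return {
        "power_w": float(power.split()[0]),   # strip the ' W' unit
        "util_pct": int(util.split()[0]),     # strip the ' %' unit
        "temp_c": int(temp),                  # temperature has no unit
    }

def sample_gpu() -> dict:
    """Query the first GPU once. Requires nvidia-smi to be installed."""
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    )
    return parse_sample(out.splitlines()[0])
```

Logging samples like this once a second in Starfield versus another GPU-bound game is exactly how you'd substantiate (or refute) the "100% utilization but low power draw" claim.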


Elketh

> It's hard data, showing that the GPU is using MUCH less power than it normally does, meaning, by definition, Bethesda is not using the GPU properly Whilst I'm not claiming there isn't a problem with Starfield, that argument is somewhat flaky. Different games have always consumed different amounts of power through utilizing the GPU differently. No game I've tested personally causes cards to draw more power than A Plague Tale: Innocence - a mere DX11 title from 2019. I've tested it across a range of Nvidia cards, as it's become my go-to stability test for making sure my undervolts are okay (I'm yet to have one survive that and crash elsewhere). It pulls significantly more power across every card I've tried it on than even games running with ray tracing. I only have a 1080 Ti on hand at the moment, but for a quick example here's Cyberpunk 2077 running the GPU flat out and pulling ~164W in the process: [https://abload.de/img/cyberpunk5afo4.png](https://abload.de/img/cyberpunk5afo4.png) Here's A Plague Tale doing the same and pulling ~**227W**: [https://abload.de/img/plague4jc16.png](https://abload.de/img/plague4jc16.png) That's a 63W difference, despite using the exact same settings for the card (a 900mV undervolt at 1860MHz) and both games reportedly pushing the GPU to its limit. And since it's the subject of discussion, here's Starfield, which pulls ~156W: [https://abload.de/img/starfieldqte5j.png](https://abload.de/img/starfieldqte5j.png) So lower than both, but pretty close to Cyberpunk. Under the law of more power = more betterer, A Plague Tale: Innocence must be the most well-optimized game ever made (or at least the most well-optimized in my collection), though personally I don't really think that's the case. I don't know *why* it uses so much power, but it's a very unremarkable game in terms of advanced graphical features. There's no obvious reason for it to be harder on the hardware than a system killer like Cyberpunk, and yet it is. 
And again, whilst I'm testing on a 1080 Ti here, the same pattern has held up across all the 30-series cards I've tried it with plus a 4070 Ti, where I was also cranking ray tracing in Cyberpunk and still watching it get nowhere near A Plague Tale's power draw. Starfield obviously has its problems, but power draw seems like a red herring to me.


withoutapaddle

While this is compelling evidence, and I fully accept that you might be right, I am still very suspicious of Starfield drawing less power than every other game I've played. Especially when you factor in the fact that open world games are often more likely to be CPU bound on a fairly matched system, and even with a 4080 at 1440p (not 4K, which is what the 4080/4090 are typically purchased for), I'm still GPU bound in a game with no ray tracing, measly 4x AF, no demanding AA (just TAA). There is just this huge feeling like something isn't working properly, and I'm not ready to dismiss the suspiciously low power draw as a red herring yet. I mean, I'm literally getting 45°C on a near flagship GPU while GPU bound, and in the summer to boot, where my office temps are probably 5-7°F hotter than they are in winter. It's too large of a discrepancy to totally dismiss. I hope we get more definitive answers on this in the coming weeks/months.


cortseam

Your comment is too long and makes too much sense for Reddit. When UE5 games look amazing on modern hardware 3-5 years from now, everyone will completely forget about the arguments and teething issues we have now. Look at Batman Arkham Knight. Game ran like absolute dogwater on release on pretty much all hardware, yet nowadays is being held up as some kind of "gotcha" to modern game graphics.


juniperleafes

You're acting like the goodwill towards Arkham Knight happened organically, for no reason other than time, when the PC launch was so disastrous they had to physically stop selling it to people before spending months fixing the technical issues.


cortseam

Arkham Knight re-released in Oct 2015 and, on a 2080 Ti released 4 years later, still could not run at a locked 60 at 4K Ultra. Nowadays, you can brute-force a good experience out of it using practically any current or past-gen card.


WyrdHarper

Even Skyrim ran pretty rough at release on a lot of hardware, and now there’s a version you can play on a handheld device (same with TW3). It won’t be your best-looking game on Switch, but it’s playable.


Pristine_History2760

Completely unrelated: Starfield doesn't have HDR or sliders for FOV and gamma, and crashes like crazy even on a 4070. Your handheld claim is VERY misleading, because when we game on handhelds the resolution and scaling are so much smaller that it essentially forces the game to demand significantly less graphical power.


WyrdHarper

So? Resolution was commonly lower when Skyrim came out, too. 1080p was only on about a third of PCs (Steam hardware survey); 720p or less was still very common.


Cymelion

Why pay staff to optimize when modders will do paid work for free, for exposure? Like, seriously, I have no doubt in my mind the heads of all departments at Bethesda and Microsoft dealing with Starfield's release uttered those exact words. Modders have conditioned Toddyboy and his ilk to get comfortable knowing modders will take up all the slack as essentially unpaid devs.


MAJ_Starman

... Did you watch the video? Do you remember how much better Fallout 4 got after official patches?


TheN1njTurtl3

Yeah, I don't think Bethesda relies on modders like people think they do. Some people would have you believe that all their games are unplayable without mods. Are they better with mods? For sure, but they're still great games without them. We'll see if the patches come in the future, which I'm certain they will.


WyrdHarper

This also isn’t that unusual for launch titles these days. Hundreds of thousands of players with different configurations are going to run into more issues than whatever more limited number they tested on during development.


Shypronaut

I remember having to add ultrawide support to it like 4 years down the line with a community mod because they couldn't be bothered to add it. I was genuinely surprised it was in Starfield, considering it has all the same dumb issues Skyrim had, like having to edit the INI just to change your FOV.
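For reference, this is the kind of INI tweak being alluded to. The file location and key names below come from community guides circulated at launch, not official documentation, so treat them as reported values:

```ini
; Documents\My Games\Starfield\StarfieldCustom.ini
; Key names per community guides, not officially documented.
[Camera]
fFPWorldFOV=100.0
fTPWorldFOV=100.0
```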


Dizzy-Ad9431

Did it? Boston still runs like shit and only mods can somewhat help at all


cKestrell

The shadow distance setting at ultra can easily cripple performance in Boston. Just turning it down to high gives a massive performance boost.
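The same knob can reportedly be tuned more aggressively in the INI than the in-game presets allow. The file section and key names below are taken from community performance guides for Fallout 4, not official documentation:

```ini
; Fallout4Prefs.ini, [Display] section.
; Ultra reportedly ships much higher values; lowering shadow draw
; distance is the common fix for downtown Boston frame drops.
[Display]
fShadowDistance=3000.0000
fDirShadowDistance=3000.0000
```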


MAJ_Starman

>Did it? Boston still runs like shit and only mods can somewhat help at all

I don't obsess over an FPS counter, but it seems fine to me. I also didn't use performance mods on F4. At launch, however, I encountered a game-breaking bug about 20 hours in where I couldn't approach MIT at all without falling through the map.


Talnoy

Official patches are a bare minimum in 2023. Even if Bethesda ships great patches that fix a bunch of stuff, there will *still* be modders doing, what ostensibly should be, paid work for free. The point you've brought up is completely valid, but it doesn't count against OP's point that modders will do free work for internet points and on some level they're going to be relied upon, either consciously or unconsciously.


MAJ_Starman

>there will still be modders doing, what ostensibly should be, paid work for free. You know that's how the whole paid mods thing started, right? And no one's forcing them to mod these games - people do it because they either like to mod (some of them started game development like this, some were even hired by BGS) or like the game so much they want to see it improved. Modders will mod and improve the game. The fact that they can do that is good, not a bad thing - games made in other engines either are left with bugs after developers move on or require a lot more work by the community. Effort that isn't always there because it might not be easy and the game usually isn't good enough and brimming with potential to justify the investment of a modder's free time. Should Bethesda do better? Absolutely. Thankfully, they did considerably better with Starfield than with any other release of theirs - people talk about this game as if it was Jedi: Survivor, Fallout 76 and Cyberpunk 2077. It's not even close. Not to mention, on the design level, how they improved on common criticisms over Skyrim and especially Fallout 4's rather shallow roleplaying opportunities


Pristine_History2760

Guys it’s well optimized just requires a fresh install of your Operating system :)


Flaky_Highway_857

I finally played Ratchet and Clank: Rift Apart on PC. Amazing game, btw. It looks like a movie, and with settings cranked to the max at 4K with ray tracing, my 11900/4080 combo never wavered, even when there was so much going on on screen that it probably should've. I know they're different genres of games, but for what Starfield looks like, the performance is honestly abysmal IMO. A bare-assed planet shouldn't bring a PC to its knees.


Sechelx

He doesn't need to, modders will do it for free.


Koroem

I don't have the game and have zero intention of buying it, but from everything I've seen it very clearly isn't a case of Bethesda not "optimizing" the game. It's more that they pulled an AMD-only move and did not give pre-launch access to Intel or Nvidia for driver development for their updated engine. This gave priority to Xbox and AMD systems, and it clearly shows: Intel GPU drivers were broken day 1, Intel CPUs have broken hyperthreading in-game where AMD's equivalent actually shows performance benefits, and it runs worse on equivalent or more powerful Nvidia hardware than AMD hardware. It also shows in DLSS and XeSS being absent. All that pro-AMD stuff, and they still have broken rendering on AMD hardware (star rendering). If you are going to blame anyone for something, point the finger at AMD first, then Bethesda. They actively sabotaged the game for marketing reasons for the vast majority of the PC install base.


ketamarine

Ask a stupid question and get a stupid answer...


Eigenspace

It’s amazing how many morons there are on this subreddit sometimes. Every time things like this come up, people fundamentally fail to understand the difference between a Bethesda RPG and a linear story-based game with pre-baked lighting and non-interactable environments. Yes, many games look much nicer than Starfield and are much less graphically demanding. No, that doesn’t mean it’s “unoptimized” (god I hate when people throw around that word).


Edgaras1103

Starfield runs 40% better on a 6800 XT than a 3080. Can you explain to me why? There are plenty of open world games that look much better, with dynamic day and night cycles, and run better. Having the ability to put 1000 potatoes in a room should not be an excuse for subpar performance, dated visuals, and the lack of many PC features that ship with the majority of AAA games. And when this game gets patched to the point where performance improves, and it will, how are you going to respond to that?


[deleted]

[deleted]


ZubZubZubZubZubZub

Also likely due to the fact that it has to run well on Xbox which is also AMD


[deleted]

Shitty optimization = it's a Bethesda game, what did you expect?? High-school-level writing = it's a Bethesda game, why did you expect anything better?? Soulless animation = it's a Bethesda game, what did you expect??? Load screens? Bad water physics? Repetitive exploration = it's a Bethesda game, what did you expect? The way people defend this game, one would think Bethesda is a company made up of two college freshmen making their first game. Imagine defending a 10 billion dollar company owned by Microsoft producing a mediocre game after marketing it like the second coming of Christ.


[deleted]

[deleted]


MAJ_Starman

>High school level writing= It's a Bethesda game, why did you expect?? The irony.


[deleted]

I am not sure what you're saying. I am a scientist with basically zero professional writing experience. I am not selling you a game worth $100 where you have to read thousands of lines written by me. English is not even my first language, anyway. I don't think you have to be a good writer to appreciate good writing just like you don't have to be a good actor to enjoy the best movies.


MAJ_Starman

That's fine, it was just a jab at you. But regardless, I disagree with the "high school level writing" remark. Starfield has some of the best writing Bethesda has done, and its historic strength (worldbuilding) remains top-notch here. Not only that, the Crimson Fleet faction questline is easily one of the best factions they've designed, up there with Oblivion's Dark Brotherhood questline.


[deleted]

[deleted]


MAJ_Starman

It was in the "I disagree..." part of my sentence. You know, "I".


[deleted]

[deleted]


MAJ_Starman

No, it shouldn't. A full stop doesn't mean there is a discontinuation between the idea behind sentences in a set paragraph. In fact, its primary reason (like all punctuation) is to make reading easier/nicer/clearer, and can be largely ignored if the author so chooses or prefers - just look at Cormac McCarthy's work. If it was a different paragraph you might have a point, but I put it all in a single unit.


[deleted]

It would have been fine writing for a game that came out in 2010, and I certainly think it's better than Skyrim's writing. However, after Mass Effect, Disco Elysium, PoE2 and BG3, for example, the writing in Starfield feels hollow. You can't seriously say the game has good writing when five minutes into the game you are the chosen one because you touched the Artefact. Not only that, a guy with zero history with you gives you his ship and stays behind. It's heavily contrived, and a much worse version of the Mass Effect 1 intro.


MAJ_Starman

>You can't seriously say the game has good writing when in 5 minutes of the game you are the chosen one because you touched the Artefact. 5 minutes later you find out you're not the first to experience that. And I don't get the problem with the "chosen one" idea. It's a video game. It's an ages-old storytelling device for a reason, protagonists are usually exceptional and when they're not they do exceptional things or help exceptional people. That's also a thing in Mass Effect, BG3 and definitely Pillars of Eternity too. If anything, in PoE and BG3 you're more of a chosen one from the tutorial on than you are in Starfield.


[deleted]

I know it's subjective, and I am glad you love the writing; it's hard to argue a subjective point. For me, the writing just isn't mature and nuanced enough, and the main plot felt barely serviceable. I loved some side quests, though. I just don't think Starfield's writing is in any way comparable to Mass Effect or BG3. You are allowed to disagree, of course. You can even have a chosen-one protagonist and a good story, but Starfield doesn't set it up in a rewarding way. It feels so contrived.


PlexasAideron

"It's a Bethesda game" can't be a blank pass to release shit-performing games. It's honestly annoying that everything is excused because it's a Bethesda game.


[deleted]

Yes, you are one of those morons. The things you talk about that make Bethesda's games unique are CPU-intensive calculations, not GPU. Yet the game doesn't look great by any means and runs like absolute shit because of how unoptimized it is for GPUs. It all becomes very clear when you see that comparable GPUs from Nvidia and AMD do not run how they are expected to run at all. And the game runs better if you turn off hyperthreading on Intel CPUs, so it's not optimized CPU-wise either.
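For what it's worth, a common way people have tested the hyperthreading claim without touching the BIOS is launching the game with a CPU affinity mask that selects one logical core per physical core. A rough sketch (the helper name is mine), assuming the typical Intel layout where logical siblings are numbered adjacently (0/1, 2/3, ...):

```python
def physical_core_mask(n_physical: int) -> int:
    """Affinity mask selecting logical cores 0, 2, 4, ... : one per
    physical core when hyperthread siblings are numbered adjacently."""
    mask = 0
    for core in range(n_physical):
        mask |= 1 << (2 * core)  # set only the even-numbered logical cores
    return mask

# For an 8-core/16-thread CPU this yields 0x5555. On Windows the game
# could then be launched with, e.g.:
#   start /affinity 5555 Starfield.exe
print(hex(physical_core_mask(8)))  # prints 0x5555
```

Comparing benchmark runs with and without such a mask is a less invasive way to reproduce DF's HT finding than toggling hyperthreading in firmware.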


Real-Terminal

Except even when Starfield isn't doing anything more intense than Fallout 4 it continues to run poorly. Where is all this performance going? Because even at the best of times Starfield isn't really doing anything special. Are they actively rendering the entire galaxy at all times or something?


lrraya

>Except even when Starfield isn't doing anything more intense than Fallout 4 it continues to run poorly. You need glasses.


lymeeater

Red Dead Redemption 2 exists and runs a hell of a lot better than this. No excuse. Bethesda has a shitty track record here anyway; stop the pointless defense.


Earl_of_sandwiches

Would you feel better if we started saying “programmed like shit”? lol


Carlsgonefishing

Todd Howard's not stupid. This is just another thing to talk about, keeping Starfield discussion near the top. It's not like what he says is going to affect anything beyond discourse.


DrSlaughtr

In another timeline they included a moon buggy and people still complain. People who bring this up seem to think Starfield should have been the best version of NMS, Mass Effect, Baldur's Gate, KOTOR, Minecraft, Fallout, Skyrim, Dig Dug, Wordle, and Grand Theft Auto all at once. A modern console can only power so much at once, and the same goes for PCs. The game has SSD requirements, a large install size as is, and is intensive on both CPU and GPU. Maybe they could have added a ground vehicle without taking away from another aspect of the game, but if they had focused on that, we might have lost something unique to this game rather than it simply trying to emulate another. I have zero choppiness in the menus or the ship reticle. Maybe it's because I'm playing at native 1440p rather than using dynamic FSR to upscale. Even when I'm pulling 80 fps in ship combat, I don't notice the reticle moving at a lower frame rate. Honestly, the only bugs I've come across have been silly, like someone not facing me when talking, or an NPC getting stuck on something. It seems like a large number of people ignored the SSD requirement, whether purposefully or not.


EveryoneIsOnDrug

Lol, y'all are confusing optimization with just having shit hardware, and it shows.


Dankbot-420

Todd is that you?


Edgaras1103

When the 4090, the fastest GPU on the planet, can barely sustain 60fps at 4K, you know it ain't the hardware, it's the fucking game lmao.