30fps for top end consoles in 2023 is absolutely pathetic.
Not really, developers always push hardware more as the generation progresses and these machines are three years old now. That’s almost half a generational cycle.
Of course it is. As always a choice between visual quality and framerate.
DF did a pretty decent video on the whole 30fps question. https://www.youtube.com/watch?v=i9ikne_9iEI
My guess is that gunplay in any version of “creation engine” is going to be janky as …
30 FPS for this kind of game shouldn’t matter too much in-world, but I agree that it’s pretty disappointing. I’m extremely skeptical about the whole release anyway, though.
No Man’s Sky runs at a very stable 60fps, and I personally know people who have wrangled it up to 120fps. I know they don’t have the same underlying tech, but they’re very similar in terms of gameplay (from what we’ve seen).
They’re a wildly different level of detail though. The NMS physics engine is pretty simplistic, mostly affecting NPCs and a few physics objects. Starfield is like other Bethesda games: tons of little items and junk that all have their own physics and interactions.
“Game dev here,” Carlone writes, adding that they are a “big fan” of Dreamcast Guy. “Wanted to clarify: it’s not a sign of an unfinished game. It’s a choice. 60fps on this scale would be a large hit to the visual fidelity. My guess is they want to go for a seamless look and less ‘pop in.’ And of course, [it’s] your right to dislike the choice.”
Sure. Maybe. It could be this. Or…
Armchair babbling idiot who plays too many video games here, I am one hundred percent convinced that it has nothing to do with visual fidelity and everything to do with that asthmatic engine they’ve been dragging since Morrowind. Can’t prove it but… you know. Just a hunch I get from playing their games.
Call of Duty still runs on the Quake 3 engine, if we go off of the logic people uncharitably use for Bethesda’s games specifically.
No, it’s most definitely a choice. You can make any engine run at 60 FPS if you sacrifice something else for it. The RE engine runs beautiful games at 60 FPS, but they had to make all sorts of sacrifices to fidelity to get World Tour in Street Fighter 6 to run at all, let alone at 60 FPS on current gen consoles.
that asthmatic engine they’ve been dragging since Morrowind
I don’t believe that’s true at all, though. At least per Wikipedia, Morrowind was NetImmerse, Oblivion was Gamebryo (NetImmerse’s successor, with Havok physics), and Skyrim was Creation. And I remember in the announcements for Skyrim that they remade the engine for the game. And Starfield is an updated engine, Creation Engine 2.
People constantly complain about the engine they use, but no other game engine is as flexible when it comes to modding, and no other engine has the same level of complexity when it comes to picking stuff up and moving it around. You can take items off a shelf or desk in Skyrim and Fallout and stack them somewhere else. You can, if you want, hoard a bunch of garbage you stole and stack it into a pyramid in your home base area.
Are there quirks? Sure: the physics being tied to framerate in Skyrim was a problem, the games are always buggy, and they aren’t usually the prettiest games out there (though Skyrim looked decent when it first came out, and graphical fidelity mods can work magic).
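For anyone curious why physics tied to framerate even happens: if the simulation is stepped once per rendered frame with a fixed assumed timestep, in-game speed scales with FPS. This is a minimal illustrative sketch (not Bethesda’s actual code) of that failure mode next to the standard fixed-timestep accumulator fix:

```python
# Toy demo: a simulation that assumes a fixed step breaks when you run it
# once per rendered frame, because real elapsed time per frame varies.
# The accumulator pattern decouples simulation rate from render rate.

FIXED_DT = 1.0 / 60.0  # physics always steps at 60 Hz

def simulate(state, dt):
    # toy "physics": move at a constant 1 unit per second
    return state + 1.0 * dt

def frame_tied(frame_dt, frames):
    """Naive loop: one fixed-size physics step per rendered frame."""
    pos = 0.0
    for _ in range(frames):
        pos = simulate(pos, FIXED_DT)  # ignores the real frame time!
    return pos

def fixed_timestep(frame_dt, frames):
    """Accumulator loop: steps physics as often as real time demands."""
    pos, acc = 0.0, 0.0
    for _ in range(frames):
        acc += frame_dt
        while acc >= FIXED_DT:
            pos = simulate(pos, FIXED_DT)
            acc -= FIXED_DT
    return pos

# One second of wall time rendered at 30fps vs 144fps:
print(frame_tied(1/30, 30), frame_tied(1/144, 144))          # ~0.5 vs ~2.4
print(fixed_timestep(1/30, 30), fixed_timestep(1/144, 144))  # both ~1.0
```

In the naive loop the same second of real time moves the world half as far at 30fps as at 60fps, which is exactly the class of bug Skyrim showed when uncapped.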
As for the premise, does it have to do with fidelity? Of course it does. Setting a frame cap on consoles means they’re able to use higher-resolution assets, better lighting effects, and more complex models. I understand preferring to give up fidelity for some smoothness and frames, but 30fps isn’t uncommon in console spaces, and this is a Bethesda game, not a twitch shooter or a 2D fighter.
Outside the PC space, gamers hardly ever talk or think about frame rate. Graphical effects, detail, and visual fidelity are a higher priority in a game where you mostly just walk around and explore.
It would be nice if they had a lower-res or less detailed mode with a 60fps target, but I get why they made the choice they did, and I’m sure it’ll run at a normal framerate on PC.
Now if it runs poorly on PC then we can riot.
It’s also a personal choice of Bethesda not to rename their engine. Many other studios do this same thing and reuse engines, but they often rename them after significant rewrites. Bethesda just doesn’t do that.
Also they aren’t worried about how the game will be released. Their games have legs. So a 60fps version will eventually come out. Then they’ll release it 5 more times.
But they did? For Oblivion it was Gamebryo, for Skyrim it was the Creation Engine
I mean that they haven’t changed it from the Creation engine. Which has been used since Skyrim despite some big rewrites for Fallout and I’m sure more big rewrites or additions for Starfield
But it’s only been two games since Skyrim, right? And for Starfield it’s being renamed Creation Engine 2. Either way, the statement “Bethesda just doesn’t do that” doesn’t seem accurate when they have done it multiple times.
Huh okay yeah that’s fair. I guess I’m thinking more about the time span since that game engine is now well over a decade old whereas the previous examples are separated by a handful of years. And I didn’t know about them putting a ‘2’ in front of it for Starfield.
Yeah that’s a weird choice. Todd Howard said that the game has performed upwards of 60fps in some places, but they made the choice to lock it down to 30fps on console for full graphical fidelity.
I get that not everyone has a TV that supports VRR, but they should be able to programmatically check what the Xbox currently supports. If it’s a Series X and does support VRR, they should be able to unlock FPS up to 60. I mean, even 40fps on the Steam Deck is surprisingly good, whereas 30 can be really jarring. Or give the user a choice: 4K@30, or 1440p@60 with VRR.
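For what it’s worth, a 1440p@60 mode wouldn’t even be a heavier pixel workload than 4K@30. Back-of-envelope math (rough only: CPU, physics, and geometry costs don’t scale with resolution):

```python
# Rough pixel-throughput comparison of the two hypothetical modes above.
# GPU cost isn't purely pixel-bound, so treat this as a sanity check only.

def pixels_per_second(width, height, fps):
    return width * height * fps

quality = pixels_per_second(3840, 2160, 30)      # 4K @ 30fps
performance = pixels_per_second(2560, 1440, 60)  # 1440p @ 60fps

print(f"4K@30:    {quality / 1e6:.1f} Mpix/s")      # ~248.8 Mpix/s
print(f"1440p@60: {performance / 1e6:.1f} Mpix/s")  # ~221.2 Mpix/s
```

So a 1440p@60 mode actually pushes about 10% fewer pixels per second than 4K@30; the usual blocker is per-frame CPU work (simulation, AI), which doubling the framerate also doubles.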
Ya, I’d also like to see a 40fps mode. Really adds to the smoothness. Todd Howard suggested that they still had a decent amount of overhead, just not enough to hit 60 consistently. Would be nice if 40 became a new standard option, at least.
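The reason a 40fps mode punches above its weight is that what you perceive is frame *time*, not frame rate, and 40fps sits exactly halfway between 30 and 60 in frame time. It does need a 120Hz display, since 40 divides 120 evenly:

```python
# Frame-time arithmetic behind 40fps modes.

for fps in (30, 40, 60):
    print(f"{fps}fps -> {1000 / fps:.1f} ms per frame")
# 30fps -> 33.3 ms, 40fps -> 25.0 ms, 60fps -> 16.7 ms

# 25 ms is the exact midpoint between 33.3 ms and 16.7 ms
assert abs((1000/30 + 1000/60) / 2 - 1000/40) < 1e-6

# on a 120Hz panel, 40fps maps cleanly to 3 refreshes per frame
assert 120 % 40 == 0
# ...but on a plain 60Hz panel it judders (uneven refreshes per frame)
assert 60 % 40 != 0
```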
I don’t understand it. If it’s a problem on console, why not have a full-fidelity “quality” mode but also offer a reduced-fidelity “performance” mode? Presumably there could be options like that similar to the PC build.
I just hope it plays 60+ on PC, who knows it could be pretty rough. I haven’t watched the digital foundry video on it yet, I assume it’s just Xbox version.
If there is a ‘mods’ system like Skyrim on Xbox, it should be possible to remove the frame rate cap. People managed it with Xbox before they added FPS Boost to Skyrim, using INI tweaks and a dummy ESP plugin. That’s without VRR, though.
This incessant nagging about fps is the most tiresome thing in gaming since gamergate.
Yeah how dare consumers expect their products to be good
Good ≠ a single metric.
Every video game and every TV program for DECADES ran at 30fps. 29.97, actually. Nobody was motion sick or got eye strain.
Just because you’re okay with 30FPS doesn’t make it “fine” or “good” either. Higher FPS is objectively better. Period. That means 30FPS is bad when the other option is 60FPS (or higher, because the console is being DIRECTLY MARKETED to consumers as a 60FPS-120FPS console).
Nobody was motion sick or got eye strain.
Wow, I didn’t realize you could speak on behalf of everyone’s personal reaction to FPS
Sure, but a game is objectively better if it can run at a higher framerate.
Bloodborne is excellent, but it would 100% be better if it ran at a solid 60 FPS.
Computers (including consoles) have limited resources, so at some point you need to deal with tradeoffs. For example: do you prioritize graphics quality, or do you prioritize FPS? Do you want or need more resources available for the physics engine? That eats into the maximum possible FPS. Do you want to do real-time procedural generation? Do you want to use the GPU to run some kind of AI? All these are design considerations, and there’s no one-size-fits-all prioritization for all video games. Clearly the people working on Starfield believe that for their intended game experience graphic fidelity is more important than FPS, and that’s a perfectly valid design choice even if you don’t agree with it.
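The tradeoff above is easy to see as a frame budget: everything a frame does has to fit inside the frame time, so every system you enrich shrinks what’s left for rendering. A toy sketch with made-up numbers (these are illustrative, not Starfield’s actual costs):

```python
# Toy frame-budget model: the FPS ceiling is just the reciprocal of the
# total per-frame cost. Numbers below are invented for illustration.

def max_fps(costs_ms):
    """FPS ceiling implied by summed per-frame costs in milliseconds."""
    return 1000 / sum(costs_ms.values())

modest = {"render": 14, "physics": 4, "ai": 3, "procgen": 2}  # 23 ms total
heavy  = {"render": 22, "physics": 6, "ai": 4, "procgen": 3}  # 35 ms total

print(f"modest workload ceiling: {max_fps(modest):.0f} fps")  # ~43 fps
print(f"heavy workload ceiling:  {max_fps(heavy):.0f} fps")   # ~29 fps
```

With the heavier (more simulation, richer rendering) workload the ceiling drops below 60 and even below 33ms-per-frame territory, which is the shape of the choice Bethesda is describing.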
It’s a matter of optimization and Bethesda games have all had pretty poor optimization. They could get it running at a higher framerate but there’s no need because people will buy it even if it runs 30fps.
If it was only a matter of optimization we would all still be playing games on the original NES.
What’s so revolutionary or ambitious about Starfield that it couldn’t be optimized to an “acceptable” framerate? Pretty much everything Starfield does has been done before, and the Creation Engine isn’t some visual marvel that would burn down graphics cards. So where’s the performance going?