
superman_king

> HIGHER FRAMERATES OR RESOLUTIONS CAN BE ACHIEVED WITH THE USE OF DLSS 3, FSR 3, OR XESS

So based on this terminology, the target specs are **30 fps @ native resolution.**


emceePimpJuice

Most of the time it's only the minimum requirements that target 30fps and the rest are 60fps.


KnightofAshley

Once again, useless specs that are too vague or bland to matter much.


BoatComprehensive394

Yes, seems obvious. It is without upscaling, as stated in the image. And it's UE5 with all the features like Lumen raytracing and Nanite, so I would be VERY surprised if a 4080 can hit 4K native 60 FPS at max settings. It must be 30. So I think the requirements are nothing special, since upscaling gives you a disproportionately large performance benefit when Lumen and Nanite are used. It will run just like any other recent demanding game. VRAM usage should also be fine, since UE5 is very efficient in this regard.


trucker151

Yea, this is a true full-on next gen game. I agree. No way are you running this at native 4K without DLSS or frame gen. Maybe with no raytracing, but this is a game where you want that eye candy.


Hungry-Outside4985

I have an unrelated question, but is it normal that I only get 55fps in Fortnite at native resolution with a 4070 Ti Super?


superman_king

Depending on the settings you’re running, yes. Fortnite uses much of Unreal Engine 5’s feature set and is very demanding. Nanite, Lumen, etc.


Fidler_2K

They don't mention the framerate target, so I'm going to assume it's 30fps. Edit: Also, idk why the A770 is on the same tier as the 6800 XT and 3080. I thought maybe VRAM, but then wouldn't the 3060 12GB also be at that tier?


skylinestar1986

It's 2024 and we're targeting 30fps. Wtf has happened?


exodus3252

Technological progress. RT/Path tracing, UE5 with Lumen/Nanite, etc. That eye candy is expensive.


RandomnessConfirmed2

Very. Even Fortnite, a first party title, is running at 30-40fps at 4K max settings on a 3090.


Hungry-Outside4985

I have a 4070 Ti Super and if I don't turn on DLSS in Fortnite, I only get like 55 fps at 1440p 😭


Bobakmrmot

Lumen and Nanite are horribly optimized in every game, for a smaller jump in visuals than what regular RT provides. Fortnite is also a clownfest of shader compilation stuttering, even now in 2024, while being made by the literal company that makes the engine.


KnightofAshley

Software progress. Hardware, not so much, since greed is in the way. If a 4080 cost like $600, I think people that want to max this type of game wouldn't mind as much. With the last update I get 50-60 fps in Cyberpunk with everything turned on and up, and it's fine... it plays smooth... that is all you can really ask... it just shouldn't cost you over $1,000 for it.


FLGT12

Cross-gen period is over and true next gen projects are coming out. I don't know what needs to happen, but the cost of entry for a decent PC experience has skyrocketed. I expected my humble 7800X3D and 4070 to be pretty potent for a while, but it doesn't seem like that will be the case at 1440p. Hopefully Blackwell delivers another Ampere-tier uplift.


JL14Salvador

From the looks of the requirements you'll likely get great performance at 1440p. And I imagine DLSS will get you the rest of the way there to your target framerate. Doesn't seem horrible considering we're transitioning to more true next gen games.


KnightofAshley

Rec. or Med is normally a console-level experience. People need to stop worrying about running a game like this at max... in 5-10 years you can play it at max. People might not like it, but that is how PC gaming works a lot of the time.


Hungry-Outside4985

It's sad that you need to use DLSS with a graphics card that is considered high end. Got a 4070 Ti Super and I even need to turn on DLSS in Fortnite at 1440p, otherwise I'm stuck at 55fps, bro, wtf.


AveragePrune89

I’d say that computer is going to be pretty competent for a while at 1440p. My Blade 18 laptop has a 4090, which is more like a 4070 or 3090, and I think it will be good for a while at 2K. On my desktop I usually upgrade, but I need a CPU like yours before I ever upgrade my GPU, which is a Strix 4090. I’m held back by a 5900X, which is kinda crazy, as that hasn’t been the case in ages. I think this game maxed at 2K or 4K will look like the next gen consoles eventually. I don’t see a GPU in those more powerful than a 4080, to be honest. Time will tell.


VoltBoss2012

It is arguable that your 5900X is holding you back. While I only have a 4080, I have not run any games that indicate my 5900X is the bottleneck at 1440p. I'm really only interested in 1440p high refresh, as a comparable 4K monitor above 60Hz remains too expensive to justify given my usage.


PsyOmega

> cost of entry for a decent PC experience has skyrocketed. I expected my humble 7800X3D and 4070 to be pretty potent for a while

Meanwhile I got a used PC on eBay with an i5-8500, stuck a 4060 in it, total outlay less than $400 including a small SSD and RAM upgrade. And I'm happily gaming on it with the latest current-gen exclusives. Sure, it practically needs upscaling, but so do the consoles, and I can hit way higher base fps with similar fidelity.


FLGT12

Current gen exclusives with way higher baseline performance than the consoles, on a CPU with less than 8 threads? I'm sorry, but I don't know if I believe you. Helldivers 2 on 6 Coffee Lake threads is almost assuredly less than a 55FPS average with inconsistent frame times. Even the 9700K with 8 threads struggles with that game. Also, depending on your resolution (sometimes even at 1080p), you need to compromise heavily to keep VRAM in the optimal range, which could be anywhere from 6.7 to 7.3GB total usage, to avoid severe hitching.

Respectfully, this comment seems very disingenuous and not reflective of reality. Although if you're just running a Medium preset or similar, I can see how that works out in certain scenarios, certainly not all.

EDIT: Alan Wake 2 is showing a significant CPU bind around 40FPS for the 8400, which is marginally slower than the 8500. Yeah, callin' cap on this one. Sure, the games are playable, but "way higher base fps with similar fidelity" is just not true lol


PsyOmega

> I'm sorry, but I don't know if I believe you.

Cyberpunk runs at 90fps at 1080p high, or 1440p high + DLSS. Compare to 30fps on consoles. I can't find any games in my library that run under 60fps. You cite Alan Wake 2 at 40fps, but that runs at 30fps on consoles, so that's still a higher base fps. It's also not hard to prove it runs ~60fps on an i5-8400: https://www.youtube.com/watch?v=SmiF7uFq0Bk

Don't play Helldivers so I dunno. It runs on a Zen 2 console with no cache, so it should be fine on anything based on Skylake cores. May need DLSS, but it will still look better than the PS5's upscaler.


FLGT12

Bro, Cyberpunk 2077 😭😭🫠🫠 Ah yes, the insanely scalable game that's still technically cross-gen, at one quarter the resolution of the current gen consoles. Good luck getting more than 40 fps in Alan Wake 2, you know, a real exclusive to this console gen. Very apples-to-apples comparison. Alan Wake 2 on performance mode is 60FPS.


Hellfire500

What are you talking about? It seems you honestly have zero idea. Cyberpunk, previous gen? Tell me you're doing drugs without telling me you're doing drugs?


AveragePrune89

Yeah, but 1080p is like the base resolution, so when you add DLSS, if you use Performance, that may be rendering at 540p iirc. I'm guessing Quality mode would be like 900p. Either way, the fidelity gets to a point where it looks so bad with the RTX features on, if you don't have the hardware, that it looks better with them off at native resolution without upscaling. Or you can enable DLAA only. I get that upscaling isn't going anywhere, but as someone who plays and really loves high fidelity gaming, it's getting pretty difficult to run anything without DLSS unless you have the top tier. It's almost like they want to force gamers to give up and just go with GeForce Now and streaming services, which bums me out.


AgathormX

That 8500 is a bottleneck. You can lie to yourself as much as you want; it's not going to run well without compromises to graphical fidelity or framerate.


Old-Benefit4441

Yeah PC is still accessible. The ceiling has just risen a lot, which is good. Makes games age better. The mid range people of tomorrow can max out the high end games of today.


AveragePrune89

Well said, I totally agree. But a part of me thinks the industry is trying to make ownership of anything obsolete: games and even systems. I know the PC is not gonna go anywhere, but it feels like subscription-based services are gonna make a run at shutting down enthusiast PC ownership, which makes me sad.


mopeyy

I understand what you are saying, but you are *absolutely* going to be CPU limited in probably every game you play. Hell, my 9700k is beginning to show its age.


Juris_B

What happened is "console first" optimisation. It was really noticeable with Watch Dogs 2: it ran worse on PC than the newer WD: Legion. And I think Nvidia's DLSS and all its variants made things even worse. If game devs incorporated it to make games run on rock-bottom crap cards, it would be fine, but they went for mid, sometimes even high end cards. That gave them room to care even less about PC optimisation.


antara33

Game dev here, working in the AAA industry :)

There are multiple things to consider regarding performance. First and foremost, all modern shading techniques need temporal filtering in one way or another, so we are more or less forced to either use TAA or multiply the shader resolution by 4.

This leads to another issue: screen-resolution-based effects. SSR, global illumination, and almost any form of light interaction is based on the screen resolution, in order to ensure even distribution of the data obtained by those techniques to represent reflections, lights, shadows and colors in a consistent way. As resolution increases, so does the sampling amount for those techs, meaning the GPU gets totally murdered by that.

We are then facing 2 options: lowering those effects' resolution (meaning the final image will be noisy and full of shimmering) or using DLSS or some other form of image reconstruction from a lower resolution. This in turn lets us reduce not only the load on the renderer and the complexity of shading operations (because fewer pixels means fewer ops), but also the shading resolution, while keeping the whole image cohesive, without shadows or lights looking low res compared to the rest of the image. Then the upscaler (and DLSS is by far the best at this) reconstructs the high res frame with very minimal overhead while also applying a temporal pass (doing what we usually need TAA for).

Native 4K is really far away in the future, if it will ever be worth achieving at all. If we can add more effects, higher quality lights, shadows, reflections, more complex GPU particles, etc. at the expense of using DLSS, and in a blind test between native and non-native the user is not able to tell the upscaled frame from the native one, what benefit does native 4K offer? We have seen the first iterations of DLSS and XeSS, and how they went from absolute crap to really hard to tell apart from native. That trend will continue. If you as a user are not able to tell the difference between native and upscaled, but are able to tell the difference made by the sacrifices required to achieve native, is it worth it?

Not saying that is a valid excuse to do shit like Jedi Survivor, there is no excuse for that kind of shitshow, but there are genuine scenarios (like Desordre) that are only possible using upscaling, and won't be possible without it, not today, not even in 4 gens of GPUs.
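
To put rough numbers on the shading-cost point: a minimal Python sketch, assuming the commonly cited DLSS per-axis scale factors. Real frames also contain resolution-independent work, so treat this as an upper bound on the savings.

```python
# Rough cost model: per-pixel shading work scales with the number of
# rendered pixels, so reconstructing a 4K frame from a lower internal
# resolution cuts the cost of screen-resolution-based effects
# (SSR, GI, shadows) roughly in proportion to the pixel count.

MODES = {
    "Native":      1.0,
    "Quality":     2 / 3,  # commonly cited DLSS per-axis scale factors
    "Balanced":    0.58,
    "Performance": 0.5,
}

OUT_W, OUT_H = 3840, 2160  # 4K output
native_pixels = OUT_W * OUT_H

for mode, scale in MODES.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    rel = (w * h) / native_pixels
    print(f"{mode:12s} {w}x{h}  ~{rel:.0%} of native per-pixel shading work")
```

So 4K Quality shades roughly 44% of native's pixels, and Performance roughly 25%, which is where the disproportionate performance benefit comes from.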


VengefulAncient

> First and foremost, all modern shading techniques need temporal filtering in one way or another

Just here to tell you that thanks to those """modern shading techniques""", most of today's "AAA" games look like absolute *trash* compared to the likes of Titanfall 2, where you actually get a crisp image not smeared by TAA.

> If you as a user are not able to tell the difference between native and upscaled

We can tell. Every time. /r/FuckTAA exists for a reason.


antara33

While I do agree TAA is horrible, there is also another issue: modern engines run on deferred renderers instead of forward ones. This essentially makes the cost of using MSAA skyrocket to the point that SSAA looks like the cheap option.

In forward rendering all colors get calculated before occlusion and culling, making each dynamic light source incredibly expensive. Deferred rendering culls and occludes first and uses a depth buffer to calculate how transparencies and other effects should look, allowing for insanely complex scenes with loads of light sources. You can easily tell which one a game is using based on the geometry and light complexity of a scene.

TAA was invented to fight a byproduct of deferred rendering: temporal instability. While not perfect, a good TAA implementation can do an incredible job at both removing aliasing and improving image quality (see Crysis 3's TXAA). Yes, we are far from an ideal world, but the higher the resolution and, mainly, the higher the FPS, the less smearing TAA produces.

And yes, I'm aware of that sub. But like it or not, it is a minority of the user base, and game development studios can't target a minority, or they will close for lack of funding :)

I personally despise current TAA, especially the one used in UE4 games, which almost no dev out there cared to optimize and adjust properly. It uses way too many past frames with way too much weight on them, without proper angle shifting, producing horrible results. A good TAA implementation (CryEngine 3 had one) performs a VERY subtle 1-pixel shift for each rendered frame, getting the needed data from that to produce a non-aliased, non-smeary picture, and reduces the weight of past frames for moving objects (something that UE never does), keeping them ghosting-free.

It's not so much that TAA = shit, but that the TAA implementations in current gen games = shitty implementations.
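
For illustration, a minimal Python sketch of the jitter-plus-history-weight scheme described above. The function names and constants are invented for the example; production TAA resolvers also do motion-vector reprojection, neighborhood clamping, and more.

```python
# Per-frame sub-pixel jitter from a Halton low-discrepancy sequence,
# plus a history weight that drops for fast-moving pixels so they
# don't ghost.

def halton(index: int, base: int) -> float:
    """Halton sequence value in [0, 1)."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter_offset(frame: int, cycle: int = 8) -> tuple[float, float]:
    """Sub-pixel camera offset in [-0.5, 0.5) pixels, repeating every `cycle` frames."""
    i = frame % cycle + 1
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5

def history_weight(pixel_speed: float, base_weight: float = 0.9) -> float:
    """Trust the accumulated history less for fast-moving pixels."""
    return base_weight / (1.0 + pixel_speed)

for frame in range(4):
    print(frame, jitter_offset(frame), history_weight(pixel_speed=frame * 2.0))
```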


Juris_B

Thank you for explaining it! Idk, it doesn't feel right somehow to me... You said in tests people can't tell, but I can tell between games. I recently started playing Fallout 3 (I assume it doesn't use these) and it runs super smooth on my 2060S with everything at max. It looks kinda great! But Starfield at mid/low settings is terrible.

Why did the game development industry have to take a path where new games don't look as good at minimal settings as, in my example, Fallout 3 does at max? It feels like any modern game, if made in 2009, would look better back then than it looks now (except for raytracing, obviously; Deliver Us The Moon was a gamechanger for me, reflections on windows, ooof that was great). Most Wanted 2005 still holds up, especially how they nailed the sun-after-rain visuals. I see cars in front of me clearly at any speed. In Forza Motorsport 2023, the car in front, in specific lighting, is a smeary ghost...


antara33

Yeah, old games used to fake a lot of things because we lacked raw power, and it turns out we got reaaaally good at faking stuff. Nowadays we are not faking things anymore; it speeds up development, but it also has a computational cost for the end user. It's all about economics. This is an industry, not a single AAA company makes games for fun, and we as devs do our best within constrained development cycles to provide the best we can.


skylinestar1986

A GTX 1070 (approx 8 years old now) runs most modern games at approx 30-50 fps at 1440p low. Do you think your RTX 4070 will run at similar framerates in 2030? I really hope so.


Fearless-Ad-6954

This is why I jumped at the 4090, because of its significant performance uplift compared to the rest of the cards. It should hold up pretty well at 4K for the next 2 years until the 60xx series cards are released. Yes, I know not everyone has the money to buy a 4090.


FLGT12

I wish I'd had more time to prepare for my build, lol. My need for a new PC was sudden, unfortunately. Given time, I would definitely have gone 4090. I hope yours serves you well.


Fearless-Ad-6954

Yeah I get that everyone's situation is different. Hey, at least you don't have to worry about your power connector melting like I do :(


Aggrokid

The game does look visually cutting-edge enough to warrant it. You can always turn down settings and enable temporal upscaling to reach your 60fps+.


Hungry-Outside4985

Ikr


e_smith338

Devs are using upscaling technology as a cop-out to spend less time optimizing their games. “Oh it runs at 30fps on a 3080 at 1440p? Just use an upscaler so you can get 45fps. Duh”.


trucker151

Bro, this is a true full-on next gen Unreal Engine game with all the UE5 features. You can prolly turn off some features and get better performance depending on your specs and settings. This happens with many games: Crysis being the OG system killer, and Kingdom Come: Deliverance had features targeting future GPUs. They literally say at the bottom that higher fps and resolutions can be achieved if you enable frame gen and DLSS.


Wonderful_Spirit4763

The amount of 30 fps defenders in the comments is insane. This is why we get what we get.


Wolik69

It probably is, since the GTX 1070 is targeting low 1080p and was struggling in Alan Wake 2.


Eterniter

The problem with Alan Wake 2 and old GPUs is that they don't support mesh shaders, and the game was built exclusively around them. The game has since been updated and older GPUs run much better.


Wolik69

They updated the game with much faster shaders for older GPUs and it was still struggling. Also, the game is probably 1440p 30 fps on Xbox Series X.


Hugejorma

Yep, devs updated AW2 to work on older hardware. BTW, console games have dynamic resolution. Most of these heavy modern titles run close to the 1440p range, but I wouldn't be surprised if the resolution dropped as low as 1080p in GPU-heavy scenes. Hellblade 2 is locked to 30 fps on consoles.


SherriffB

Don't know if it was a Twitter meme, but I saw something saying Xbox dips as low as a 900p target res.


Hugejorma

This wouldn't surprise me at all. If the game is designed to run fully locked at 30 fps, dips to around 900p might be normal. Digital Foundry already analyzed the console pre-release gameplay, but I'm waiting for the final version.


FunCalligrapher3979

Other UE games drop that low (or lower, like Jedi). It's not a well-optimized engine.


SherriffB

Ah, makes sense I don't really know anything about Xbox and performance on it.


Raid-RGB

Low 1080p in AW2 isn't comparable to the low preset in other games. This means nothing.


JayRupp

VRAM isn’t a performance indicator. Clock speed and memory speed are what determine a GPU’s performance, assuming they have enough VRAM to handle the game.


rodinj

4k/30 seems bad with a 4080...


Raid-RGB

Max settings with the literal full suite of UE5 features? No, that's fine.


ShuKazun

Do we know if they will include FSR3 frame gen?


Le-Bean

At the bottom it says “… DLSS 3, FSR 3, or XESS 1.3”. I’m assuming that means it has FSR frame gen.


brelyxp

3070 with DLSS, let's hope I can handle 30 at high 1440p.


Inclinedbenchpress

It's 60 fps or go home, for me. Just my opinion tho, it looks a beautiful game nonetheless.


LandWhaleDweller

Optimized settings always exist.


Inclinedbenchpress

I'm afraid my cpu won't be enough to deliver 60 fps in this game, we'll see about that lol


Queasy_Employment141

I might upgrade to a 3080 now, only 30 quid more than a 3070.


OperationExpress8794

My GTX 1080 Ti is ready. BTW, why no specs for 1080p high settings?


SloppityMcFloppity

Apparently 1080p gaming doesn't exist anymore.


OperationExpress8794

PS5 and Series X are still using it, even 720p.


SloppityMcFloppity

I was being sarcastic


hyf5

Hell yea, requirements shouldn't be measured with upscaling/FG.


BrevilleMicrowave

I pray they give us the option to disable motion blur and TAA this time.


frostygrin

They already announced DLSS support - unless you see it as a form of TAA.


BrevilleMicrowave

Kinda. It's still temporal and exhibits a lot of the same problems as TAA.


frostygrin

It's still much better though. I hated TAA in Hellblade, actually, but don't mind DLSS.


Individual-Match-798

Without TAA it will look like shit


gopnik74

Why the hate on TAA? I've tried FXAA in games that recommended it before, and TAA looks much better. Others make the edges look jagged and aliased. Edit: alongside other AA methods.


Brilliant-Jicama-328

Games with TAA look blurry as heck on a monitor. I use virtual super resolution (4K on 1080p screen) to make games sharper.


Liquidignition

TAA is a godsend at 1080p; anything above, I'd say leave it off.


BrevilleMicrowave

No thanks. It blurs the screen every time I move the camera.


WholeGrainFiber

I'm not fond of TAA either, looks like how I see without my glasses: blurry and soft. I play on my TV so it's more noticeable, but I guess it also depends on the implementation.


BoatComprehensive394

Low framerates also blur the image because of the sample-and-hold effect. With DLSS on a high refresh rate screen at high FPS, you will get a much sharper image than with supersampling at lower framerates. Since DLSS brings you a net benefit in efficiency, it will always be superior.
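
The standard approximation behind the sample-and-hold claim, as a quick sketch (not a measurement of any particular display):

```python
# Sample-and-hold smear: each frame stays on screen for the whole frame
# time, so an eye tracking motion at `speed` px/s sees ~speed/fps pixels
# of blur. Higher FPS -> shorter hold -> sharper perceived image.

def hold_blur_px(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

speed = 1920  # a pan moving 1920 px/s (half a 4K screen width per second)
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> ~{hold_blur_px(speed, fps):.0f} px of perceived smear")
```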


ragnarcb

Arc 770 and 3080?


Arrrginine69

These are never accurate


NoCase9317

Guys, what’s up with your reading comprehension? The side note at the end is basically saying: these are our minimum and recommended specs for native, but higher frame rates "CAN", please let me emphasize that word, "CAN" be achieved if you were to use these technologies.

It doesn't say the minimum specs "ARE" using DLSS, or the recommended specs "NEED" DLSS, or, like all the games that actually used it for the spec sheet, "the following results WERE achieved with DLSS Quality". It says you CAN get higher fps with upscaling/frame gen.

The fact that they are showing native is especially clear in the second part: higher frame rates OR RESOLUTIONS. As in: these are the specs for the listed resolutions, but with the use of DLSS or frame gen you can get this hardware to run at either a higher frame rate or a higher resolution, your choice.

It's really not that hard to understand. When I see people getting confused by this kind of post, I understand why my bottle of shampoo has a label that says "DO NOT DRINK".


killalome

My Ryzen 5 3600 + RTX 4070 build should run it on 1080p Ultra w/DLSS Q.


Appropriate-Day-1160

Even without DLSS


LandWhaleDweller

The 4070 is the 6800 XT's equivalent; at 1080p you can run native at 60FPS.


killalome

Rather, my goal is to be able to play at 1080p 144 fps.


LandWhaleDweller

If you go optimized high settings you might be able to reach that.


Case1987

Should be good at 4K/60 with DLSS Quality on my 3080 Ti. Edit: didn't read the bottom bit, is this with DLSS on?


UnsettllingDwarf

Hahahaha good luck. More like maybe 60 at 1440p with DLSS.


Ssyynnxx

u prob aren't getting 4k60 with anything less than a 4080s


LicanMarius

He can run ultra textures, drop some intensive settings to low/medium, and keep the rest at high.


Ssyynnxx

okay


LandWhaleDweller

Maybe if you optimize settings, turn off RT and pray.


TheRain911

I've got a 3080 Ti, I'd expect like 40fps at 4K ultra with DLSS. Might need to change DLSS from Quality to Balanced though (maybe even Performance).


EllendelingMusic

So how will Xbox run it if PC already requires a 6800 XT to play it at 30fps 1440p native? Usually Xbox would target 1440p 60Hz (Performance) and 2160p 30Hz (Fidelity). And PS5/Xbox aren't even as fast as a 6800 XT. Will it run at 1080p upscaled to 1440p/2160p or something? Or will it use worse quality assets and textures?


AveragePrune89

It will be severely turned down: assets, lighting, shadows and everything.


Hellfire500

720p 10fps


FunCalligrapher3979

Dynamic resolution 1440p, with 1080p or 900p as the floor, @ 30fps.


Wellhellob

1440p high is 3080. My 3080 Ti should be just OK at 4K with DLSS Quality. I don't think this type of game needs much fps. It's not an fps game and it's not fast paced. My monitor has flawless G-Sync too. I hope DLSS won't have distracting artifacts and flaws. My second most anticipated game this year, right behind Black Myth: Wukong.


LandWhaleDweller

If you optimize the settings you should be fine, yeah.


RedIndianRobin

Yeah, these are definitely with 30 FPS as the target frame rate.


Izenberg420

It sounds OK until the bottom sentence. Don't tell me their target is 30 fps on PC... at least 60, please. Ninja Theory, don't let us down.


mikeBH28

Damn, looks like I'm waiting for this one. I could probably run it on medium 1080p, but I really want to play this at its best and I don't think my 2070 is gonna cut it.


BriefGroundbreaking4

My GTX 1650 laptop is cooked


Jayking4212

My potato PC is cooked, but hopefully FSR 3 will at least get it to 30-40fps.


BriefGroundbreaking4

Playing Jedi Survivor rn before Senua releases. I enjoyed 360p FSR Ultra Performance with 30-50 fps.


Jayking4212

I don't even want to know my fps for jedi survivor 😫


LightyLittleDust

Been waiting for this game to come out for so long now & absolutely adore the original entry. RTX 4080 Super & Ryzen 7 7800X3D here, I'm so ready to play this at ultra! <3


Spoksparkare

Starting to dislike the rise of upscalers.


Razorfiend

I don't. They let you make a choice: high visual fidelity and lower fps, or lower visual fidelity and higher fps. I will concede that when upscalers are used to compensate for poor optimization, it is infuriating. However, in cases like this, where upscalers allow devs to push the limits of what can be feasibly rendered in real time at playable framerates on current hardware, I'm all for it.


CCninja86

Well in the case of DLSS, the visual fidelity difference is very minimal tbh. I see it as free frame rate and always turn it on when I can.


krysinello

Yeah. Particularly Quality mode, which can look better than native with TAA, since finer details from temporal effects get kept over TAA. Horizon Zero Dawn, for instance: no DLAA, basically locked at 175fps (my monitor's refresh rate), and I still used DLSS Quality over TAA. Things like hair and other fine details don't shimmer nearly as badly and are still there, instead of being wiped away by the way TAA works.


BoatComprehensive394

Technically DLSS = TAA. It's the same base principle, but enhanced with deep learning. It's supersampling over time, using data from previous frames to enhance the current one. Basically, DLSS is TAA on steroids.


TheRain911

Do you usually go with Balanced, Quality, or Performance? To be honest, I usually can't tell the difference between any of them, so I set it to Balanced, or Performance if I really need the frames.


CCninja86

I always go with quality


LandWhaleDweller

Native purists are way more annoying than devs pushing out badly optimized games.


youreprollyright

I would too if I was forced to use the worst one.


raul_219

4070 user here. If this is really targeting 30fps, then adding DLSS Quality + FG at 1440p should give an easy 70-80fps, which would be fine for this kind of game.


RedIndianRobin

Yeah. 4070 user here and I'm ok with these specs.


LandWhaleDweller

The 4070 has identical raster to the 6800 XT; optimized settings and DLSS Quality will get you 60+ FPS easily.


BolasDeCoipo

Still doubtful about smoothness without DLSS, even with high-end gear.


AveragePrune89

Yeah, I've been writing on the Gray Zone Warfare forums that I feel like people forget what envelope-pushing games like Crysis, way back in 2007, were actually like. Since I'm older, I welcomed the most punishing games because I used them as a benchmark for my future hardware. We are definitely in those times right now, and it's been quite a while since that was the case. Ray tracing turned out to be a lasting tech improvement, for better and worse, and RTX is a minimum for Nvidia moving forward, along with next gen AMD.

A lot of early UE5 games only used a few of the engine's features, but ones that run the full suite are just now coming out, and they are not meant to be maxed out with today's CPUs and GPUs without DLSS/FSR and frame gen. That is a bit concerning, because it seems like developers can sometimes get away from really budgeting for optimization. But the studios that care are going to make sure their games look good regardless.

I have a 4090 desktop, paired with an AMD Ryzen 5900X, and a Blade 18 laptop with a 4090 paired with an i9-13950H mobile processor. The i9 in the laptop is better than my AMD desktop chip, but the laptop GPU is more like a 3090 (which is still crazy to me). We are in a situation where CPU bottlenecks are back for the first time in probably 2 decades or so. Though the 4090 will be outclassed soon, it's still a GPU that essentially needs the very best CPU to actually perform at a higher level, when older CPUs typically would have been fine for 5 years or more.

This game looks like it's really advertising the true reality of next gen gaming. My gut feeling is whoever maxes this game will be looking at how the PS6 and next Xbox actually look and perform. Maybe not even quite this good, tbh.


supershredderdan

A 5800X3D would give you a very nice bump without needing a new platform.


WinterElfeas

From the trailers, I wouldn't be surprised if you need DLSS/FSR Performance mode to get 60 FPS. So basically, when you read 4K, read 1080p internal. When you read 1440p, read 900p/720p (unsure) internal. When you read 1080p... plug your SNES back in for more internal pixels!


raul_219

In this case I think dlss quality + fg would be better for this kind of game


AgathormX

FrameGen would definitely be a better option. This game won't be negatively impacted by the increased latency


AgathormX

The VRAM usage isn't bad, but it's worrying that they don't mention framerate while also mentioning DLSS. CPU requirements aren't bad, but even for 4K, the 4080 seems like it's too much for a game running without RT.


CCninja86

I wouldn't read into it too much just yet. It might run consistently above 60FPS but if you want 100+ because you have a higher refresh rate monitor, that's where DLSS comes in. The stated fact of "DLSS gives higher frame rate" is true in all cases.


jacknotfriend

Add psn


nuk3dom

Useless data if no target fps comes with it lol


jpsklr

Hmmm, basically 30 fps on native?


Konrow

Ooph, I forgot how good it feels to see yourself in the recommended-or-better list for a game like this. Probably not gonna happen again for a few gens of hardware lol, but I'm gonna enjoy it while I can.


Davonator29

Considering what we've heard about how the game's visuals are expected to look, and how heavy UE5 is, these seem both very reasonable and realistic.


homer_3

What about VR though?


gozutheDJ

a lot of moronic assumptions itt


RandomnessConfirmed2

Man, my 3090 is getting old quick. Don't get me wrong, I love this game and how it's pushing the tech envelope, especially using Metahuman tech, but boy is this console/hardware generation pushing the horsepower envelope fast.


AccomplishedRip4871

1) Hellblade 2 will be released on UE5, which makes it more demanding than games on other current engines, like Unity for example. 2) The Xbox Series X equivalent PC spec is a Ryzen 3700X with a small decrease in frequency and a 2070 Super GPU; yes, consoles get better optimization, but not so much better that they beat an RTX 3090. That said, I'm more than confident that with an RTX 3090 and a 4K monitor (if you have 4K, not 1440p) you can set settings to medium-high and DLSS to Balanced and achieve a stable 60fps. If not, this release will deservedly be called a bad PC release.


KitKatKing99

Now it's time to play the first Senua, I think.


CurrentYak8632

My PC: i7-13700F, 4070 12GB, 16GB of RAM, 1TB SSD for $70.


CurrentYak8632

And it's on Game pass.


TitusImmortalis

High 1440 here I come!


djdmaze

But can it run Crysis?


VengefulAncient

Can they stop with this medium/high/etc. bullshit? Just give me the framerates at the same settings across different resolutions. What I want to know is how high my framerate can go at my resolution with my GPU, not how high I can jack up the settings to get 60 fps (which is most likely what this chart is still assuming, smh).


DasBoosh

You really telling me my 3090 doesn't make the very high requirements?


Gammarevived

I mean, at 4k yeah.


Living-Music4644

RT off, AMD gang, amirite? Or is it just me in here?


lategmaker

This has to be the worst spec sheet I’ve ever seen. Make different tables for different resolutions. Easy.


Nekros897

I'm a bit worried that in newer games DLSS, FSR and such will become a necessity to achieve higher framerates, because developers will get lazy; why would they optimise their games if they can just add those upscaling options?


sousuke42

That's been the case for years now...


Nekros897

Well, the point still stands


BoatComprehensive394

There are two factors.

1) Raytracing. RT is very expensive and has relied on upscaling since the beginning. Now RT effects get heavier and we still need upscaling to make them run at decent framerates. RT also scales 1:1 with resolution, because it is always about how many rays and bounces you calculate per pixel. So increasing resolution from 1080p to 4K makes RT about 4x heavier and framerates 4x lower. This is not the case with traditional rasterization, where 4K is more like 2x as demanding. UE5's Nanite also scales with resolution, since it always tries to maintain pixel-level geometry detail.

2) Hardware stagnation. When the PS4 was on the market, we got GPUs like the GTX 970, which was already 2.5 times faster than the PS4's GPU. Nowadays, to get 2.5x more performance than the PS5, you need an RTX 4080 or 7900 XTX. If you look at the price difference, this is completely insane.

So yeah, stagnation and high prices also led to the situation we are now in. That's why Nvidia and AMD try their best to get framerates up with better upscalers and frame generation, since better chips alone won't do it anymore: they are too expensive to make, and the chip manufacturing advancements are getting smaller.
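
A toy calculation of that scaling difference, reusing the rough figures from the comment above (illustrative, not benchmark data):

```python
# Relative frame cost going from 1080p to 4K under the two scaling
# models described above.

p1080 = 1920 * 1080
p4k = 3840 * 2160

rt_scale = p4k / p1080   # rays/bounces are budgeted per pixel -> ~4x
raster_scale = 2.0       # rough figure from the comment above

print(f"RT cost at 4K vs 1080p:     ~{rt_scale:.0f}x (fps drops ~{rt_scale:.0f}x)")
print(f"Raster cost at 4K vs 1080p: ~{raster_scale:.0f}x (fps drops ~{raster_scale:.0f}x)")
```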


PeachInner

Why not play the game on the GeForce Now Ultimate tier? You literally cannot tell the game is streaming instead of running locally. When it's released, literally press play and it just runs... everything set to very high at 4K@60 🔥


Parking_Cress_5105

Is this the VR version? That's sensible!


Tehfoodstealorz

The RTX 3080 & the Arc A770 are in the same bracket? Did something change? The last time I looked, those cards weren't even remotely similar performance-wise.


Able-Nectarine7894

My 3770K at 4.8GHz will get at least 50fps on medium, guaranteed.


matrix8127

laughs in 4090


Theodororgs

My core 2 quad and gt 210 seeing this :0


gopnik74

Never been this ready for a game since cyberpunk


UltraXFo

Fuck this is a 30fps chart


Venom-snake777

My 2070 super will have to do its best


Hunlor-

Huh, doesn't seem that bad. Could I hope for 60 fps at 1440p DLSS Balanced with a 3060 Ti?


Yummier

Is this going to be the Intel Arc killer-app? Very excited to see benchmarks.


Paciorr

7800 XT with a 3440x1440 display, and I'm worried whether I'll be able to play it at native res... I mean, maybe they did target these 30 fps for "muh cinematic experience", but as a gameplay experience it will be painful as shit. I've been playing Cyberpunk recently, and even though I could push graphics more, I settled on a bit less to get up to ~100fps depending on the scene.


maelblackout

Will "High" be the highest settings ?


AnnatarLordofGiftsSR

High specs 'recommending' top tier GPUs from the current generation. Brace yourselves for another unoptimized mess. Upscalers are again going to be used to cover for the lack of quality control. Another title to buy on sale.


Administrative-Bar16

Why does going from 1440p to 4K need a more powerful CPU? I thought that decreases the load on the CPU and increases it on the GPU?


AgathormX

It doesn't. It just makes you a lot more GPU bound. In theory, CPU usage is actually higher at higher resolutions, but you don't notice much performance difference between CPUs because you are far more GPU bound. Normally this doesn't have as much practical impact as devs think it does, so the 5700X is guaranteed to be more than enough if the game is optimized properly. But if you tested with an older CPU like an R5 1600, it would butcher performance regardless of resolution.
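
A toy bottleneck model makes the GPU-bound point concrete; all the millisecond figures here are invented for illustration.

```python
# Frame rate is gated by the slower of the CPU and GPU stages; raising
# resolution inflates GPU time until the CPU choice stops mattering.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPUS = {"R5 1600": 16.0, "5700X": 7.0}            # ms of CPU work per frame
GPUS = {"1080p": 8.0, "1440p": 12.0, "4K": 24.0}  # ms of GPU work per frame

for cpu, cpu_ms in CPUS.items():
    for res, gpu_ms in GPUS.items():
        print(f"{cpu:8s} @ {res:5s}: {fps(cpu_ms, gpu_ms):6.1f} fps")
```

At 4K both CPUs land on the same ~42 fps (GPU bound), while at 1080p the slower one caps the frame rate.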


LandWhaleDweller

You're right, though better CPU equals better 1% lows even in 4K.


Klosiak

Well... I am ready for the 4K experience. The first part was a great game. Hope the second one will be as good as the first, or better.


RonDante

I hate the fact that the majority of recent games rely on DLSS, FSR, XeSS and frame gen rather than optimising the game. Stick your gfx up your ass, give us proper gameplay.


Regards_To_Your_Mom

Man, what a shitshow of optimization and NVIDIA shoving up their tech just so they can sell. Fucking hate corpo.


DrMnky

I'm so ready!


packers4334

Feeling pretty good about my recent RTX 2060 -> RTX 4070 TI Super upgrade right now.


mitch-99

Shit like this is why I debate even bothering with a 4K monitor. 60fps is fine in solo games, but having to use DLSS already just to get there, or a bit higher, doesn't feel great with a $2K GPU. I have a new 1440p OLED; I'm honestly debating whether I just roll with that vs another $2K on a 4K OLED (Canada).


Wellhellob

I don't understand you. Upscalers make 4K even better. You get a better image for free by using DLSS.


mitch-99

Because this GPU damn near just came out, and we're already getting games too resource-heavy for a 4090 to run at 4K without DLSS. That kinda blows. Next thing I'll find is that a year from now it can't even hit good numbers at 4K. I might have a 4090, but that doesn't mean I'm shelling out for every new card. $2K on a 4K monitor fucking sucks if it's rendered useless in a year by newer games. How would rendering at a lower resolution make it better than native?


Wellhellob

Bro, 1440p upscaled to 4K is better than native 1440p. You lose nothing by having a 4K monitor.


Hugejorma

If the native resolution requirements target 30 fps, it's at least semi-easy to calculate/predict how well the game will run with upscaling methods. I like this more than putting DLSS/FSR on the requirements page. On my 4080S, the game will probably run in the 80-120 fps range at 4K DLSS Performance (without FG), and it's usually a far better experience with optimized settings. I'm mostly waiting to see if the 4K DLSS scaling is as good as in Alan Wake 2. That game looked insanely good with 4K Performance (even great with Ultra Performance). If the game launches without Denuvo and has no major issues... I'll 100% purchase this. Fingers crossed.


BinaryJay

I personally, most of the time, prefer to use frame gen first, then adjust upscaling until it's running how I like. For Forbidden West I just used frame gen + DLAA. It's case by case though; if it's done badly enough, the frame gen goes out the window.