r/pcmasterrace Desktop | RTX 3070 | Ryzen 5 3600 Feb 07 '23

The Hogwarts Legacy experience on PC Meme/Macro


1.2k comments


u/RopAyy Feb 07 '23

I was reading this like, why, what's the issue? Then I watched an owl fly over a cutscene, it loaded me into the castle doors opening, and I hit 7 fps for a good few minutes 😂


u/KlimCan Feb 08 '23

Is my 3070 obsolete already?


u/BIMASO2 Feb 08 '23

Whoo baby dual graphics card meta is back


u/LunaMunaLagoona Feb 08 '23

They follow the Alpha-Beta-Live model.

Except the Alpha is now all of us testing it the first month, the Beta is the patches we get the first 6 months, and the final Live version is the GOTY version you buy next year.


u/Lev_Astov Feb 08 '23


u/RdRunner Specs/Imgur Here Feb 08 '23

Every time


u/Golisten2LennyWhite Feb 08 '23

Shit I'm over here playing gta 5, mini motorways, and cities skylines/civ6 like a madman on a 1050ti. 4k display makes them look amazing.


u/AdonisGaming93 Feb 08 '23

Not just that, but any GOTY version will be 50%-75% off within 6-9 months of release on average in Steam sales. Buying games on release is just not the smart route anymore. You are essentially PAYING to be a beta tester, when that used to be something corporations paid people to do as a job. Game tester was a job with an income; now WE pay to be the beta testers, so that the final consumers 9 months later can enjoy a finished game at half the price.


u/Hegemon_Smith Feb 08 '23

I have been gaming since the Apple IIc and am moderately embarrassed to say that the game that finally broke my hasty, uninformed purchasing habit on PC was Callisto Protocol. I've bought plenty of stinkers over the years, but I could afford it and didn't mind a middling game as long as there was fun to be had.

Now, married with a kid, another on the way, and working on a house / generally being adult-life busy, I am never going to finish that game. I played too much of it trying to salvage the feeling of playing Dead Space way back when to return it, so there it abides. Heck, I barely have enough free time as is; spending it on a full-price lackluster title just grinds my gears, and it's entirely preventable.

Argh. Onwards and upwards!


u/AdonisGaming93 Feb 08 '23

That's the other thing: now, as adults, we have full-time jobs (I'm single, so no kids though) and adult responsibilities. There just isn't enough time to game as much as I used to as a kid.

Today I kinda have to go out of my way to schedule when I'm gonna be able to actually load up a game and dedicate some significant time to it. Being single with no kids does add to the free time, but my job still takes up almost my whole day.


u/TrueHawk91 i5 8600k, RTX 2080, 32gb RAM Feb 08 '23

Felt this hard with Cyberpunk. Got it for Christmas when it came out and didn't really play it until a year later; a combo of my poor RAM config and the game just being poorly optimised led to me really hating the game until the performance patches made it better. Good game, but fuck CDPR for releasing such a shitty-performing title.



u/jtypin Ryzen 1700/Quadro K4200 "Never go AS5 to mouth." Feb 08 '23

I fucking hate it here.


u/PostTail Feb 08 '23

Stop buying on release and let the beta testers spend money



u/IdealIdeas 5900x | RTX 2080 | 64GB DDR4 @ 3600 | 10TB SSD Storage Feb 08 '23 edited Feb 08 '23

Nvidia, AMD and Intel would have to add support for Dual cards again first.

The post got locked, so I'm editing it in here now: Nvidia doesn't support multiple GPUs for GAMING anymore. Other applications like the ones you specified are still getting multi-GPU support.


u/maxdamage4 Feb 08 '23

They're gonna have to come to my house and stop me because I tell you my motherboard is chock full of slots to stuff cards into and I ain't holdin' back.


u/TheRealRolo R9 5900X | RTX 3070 Feb 08 '23

Was it manufacturers that stopped supporting it, or was it developers? I'm pretty sure AMD still supports CrossFire.



u/Tuckertcs Feb 08 '23

Gotta upgrade for triple the necessary price I guess


u/chipforclips23 Feb 08 '23

Triples makes it safe, triples are best.


u/savagethrow90 Desktop Feb 08 '23

That nova deal is a sure thing now



u/paul_park Desktop Feb 08 '23

Planned gamed obsolescence


u/Dr_Jabroski Feb 08 '23

No, the devs just discovered this one neat trick to make a bunch of money by mining on your card while you play the game.

For legal reasons this is a joke.


u/Boo_R4dley Feb 08 '23

No, it’s just poor optimization.

The PS5 CPU is basically a Ryzen 7 3700X, and the GPU is pretty close to a 5700 XT with Ray Tracing added.


u/alphaevan Feb 08 '23

I was asking the same question. I have the 3070 Ti and I overclock the bitch; how is that not enough to run the game at at least medium 60 fps? But in reality they just did us PC players bad again and didn't give a fuck to test or optimize the game's final touches before they released it to us.

I know it's 3 days before release, but still, I paid extra to play the actual game early, not to do a 10-dollar playtest.


u/majora11f Feb 08 '23

My 3080 was dropping to single digits a few times.



u/OrphanWaffles GTX 970 | i5 4690k | 8 GB Feb 08 '23

Are you talking about the very first time you walk into Hogwarts? That dropped to a slideshow for me too, but was fine after that



u/JoshuaNLG Feb 07 '23

I'm convinced the game has a memory leak; the longer I play it, the worse it gets.


u/puredotaplayer Feb 08 '23

Just check in Task Manager whether private bytes keep increasing as you play (and probably stay in the same area, to ensure nothing new is streamed in) over some period of time. That should confirm it.
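The sampling idea above (watch whether private bytes keep climbing while you idle in one spot) can be sketched generically. This is an illustrative Python sketch, not anything from the game; in practice the reader callback would wrap something like `psutil`'s per-process memory info, but the trend logic is shown with a pluggable reader so it runs anywhere. The thresholds are made-up illustrative values:

```python
import time

def looks_like_leak(read_bytes, samples=6, interval=0.0):
    """Sample memory repeatedly and flag a leak-like trend.

    read_bytes: zero-argument callable returning the process's current
    private bytes. A leak is suspected when almost every step is
    non-decreasing AND the last reading is clearly above the first.
    """
    readings = []
    for _ in range(samples):
        readings.append(read_bytes())
        time.sleep(interval)
    rising_steps = sum(b >= a for a, b in zip(readings, readings[1:]))
    steadily_rising = rising_steps >= 0.9 * (len(readings) - 1)
    grew_overall = readings[-1] > readings[0] * 1.05  # >5% total growth
    return steadily_rising and grew_overall

# Demo with fake readers; against a real game you'd pass a reader that
# queries the game process (e.g. via psutil) at a real interval.
leaky = iter(range(100_000_000, 135_000_000, 5_000_000))
flat = iter([100_000_000, 100_100_000, 99_900_000,
             100_050_000, 100_000_000, 99_950_000])
print(looks_like_leak(lambda: next(leaky)))  # steady growth: leak-like
print(looks_like_leak(lambda: next(flat)))   # normal jitter: not a leak
```

Staying in one area matters because asset streaming legitimately grows memory; the check only means something when nothing new should be loading.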


u/ChopperGunnerNL PC Master Race Feb 07 '23

It’s called Denuvo unfortunately


u/lessfrictionless Feb 08 '23

Just don't cast Denuvo then.


u/Few_Occasion1891 Feb 08 '23

Experienced the same thing in Dying Light 2. Denuvo sucks


u/Minimum-Hearing-418 Feb 08 '23

Sorry, but wtf is Denuvo?


u/Galaxymicah Feb 08 '23

Anti-piracy software.

On a veeeeery basic level, it checks whether you bought the game. If you did, it's like, cool, you can play. If not, it locks the game up.

Denuvo is particularly... obstructive is probably the best word, and it tends to tank the performance of the game. It's really difficult to bypass though, so publishers keep putting it in games, making them objectively worse but keeping them from being pirated for those first few months when most sales happen.


u/MrrQuackers PC Master Race Feb 08 '23

Don't downvote this guy for asking a question.


u/Cephalopod77 Feb 08 '23

Anti-Piracy tool


u/ShallowBasketcase CoolerMasterRace Feb 08 '23

That's weird because Denuvo has definitely made me pirate more games than it has prevented me from pirating.


u/marbar8 Feb 08 '23

Yeah, and I don't know how all these games with anti-piracy end up on my computer, super weird! I must be getting hacked.


u/howiMetYourStepDad Feb 08 '23

It slows down the PC and causes extra writes on the SSD, which reduces its lifespan. It's supposed to stop piracy and cheaters.


u/MrrQuackers PC Master Race Feb 08 '23

And Callisto Protocol.



u/kor34l Feb 08 '23

lol yeah, the "anti-piracy" malware that forces me to pirate the games, because it doesn't work on Linux.



u/Ghost_In_A_Jars Feb 08 '23

I'll buy the game in a month, when Denuvo costs too much to justify keeping it in.


u/fairguinevere Feb 08 '23

Lmao, imagine paying full price to preorder a game that not only sucks, but also has Denuvo. It's like paying to kick yourself in the balls.



u/DvisionX Feb 08 '23

If it's a memory leak, then saving and restarting should fix the problem. If not, then it's probably not a memory leak.


u/2drawnonward5 Feb 08 '23

Unless the memory leaked into a pensieve and when you load the game, it pops out like a Jack in the box!


u/CtrlAltViking AMD 7900x | NVIDIA RTX 3080 FE | 32GB DDR5 5200 | Evolv Shift Feb 08 '23

Having played it on PS5, I've noticed a few times where it got sluggish, and doing this fixed it. May be the same on PC.



u/SuperCool_Saiyan Eye 5 13600Kay | Em Ehhs Eye Are Ekks 6600 Feb 08 '23

Memory leak? Nah, unused memory is wasted memory /s


u/mrfat187 i9 9900k 3080thai 120hz Feb 08 '23

Just need some flex tape is all, should fix that memory leak for ya


u/leaderx22 Feb 08 '23

The game has denuvo



u/KodaksMoment Feb 07 '23

Aren't these games developed on a PC? Why do they run like ass? Genuine question.


u/CantBeMadifYouBad Ryzen 9995WX | RTX 5090 Ti | 2TB DDR5-6900MHz | Windows 12 Feb 07 '23

Seems like a lot of them lately get made for console then ported to PC.


u/CloudWallace81 Ryzen 7 5800X3D 32GB DDR4 3600MHz C16 RTX2080S VG248Q 144Hz Feb 07 '23



u/Kiltymchaggismuncher Feb 07 '23

The Batman port was fucking awful. I had a 1070, which was still a fairly beefy card at the time, and I had to turn the settings down to low; all the background cosmetics were practically 2D.


u/vidoardes 3700X | RTX 2070S | 32GB Feb 07 '23

I remember that port; I got the game free with my graphics card. It was like they ported it, but then no one thought to boot it up and see if it actually worked. They just clicked the "port to x64" button and went, fuck it, job done mate.


u/Kiltymchaggismuncher Feb 07 '23

Warner Brothers only made the console versions; they paid another studio to do the port for them, and that third party had no history building a game like this. It was a total clusterfuck.

I got Black Flag free with my graphics card, an infinitely more enjoyable title. Though plenty of the Assassin's Creed games have also been broken on launch...


u/Thickboijuice Feb 07 '23

At least they had the courtesy of giving you a refund early on. But damn, it's been 8 years and it's only gotten worse.


u/Kiltymchaggismuncher Feb 07 '23

Mm, and now we have huge publishers releasing their titles as early access, which apparently makes it acceptable to be riddled with bugs and poorly optimised.

They are also 25-30% more expensive at launch than before, yet tend to have a myriad of day-one DLCs, and several different editions that you need to Google to find the difference between.

I especially liked how the season pass for all DLC reached the stage where it only covered the first x number of DLCs. After that, surprise! You need another season pass.

This is why I wait at least 6 months. Preferably a year or two, where we can buy the game of the year edition. Not that it won game of the year, but publishers decided that's a good name for the edition that has the most essential DLCs included. And then I can also download the community-created patch, which fixes all the shit the developer decided wasn't worth their time.


u/Mighty_Hobo Ryzen 5 5600X | GTX 1080 | 32 GB 3600 DDR4 Feb 08 '23

This is why I wait at least 6 months. Preferably a year or two, where we can buy the game of the year edition.


You always win when you wait. Either the game is good, and it's still good or better in 12 months; or it's bad, and it might be good in 12 months. And even if it's still bad, you're not out the full price.


u/Koitenshin http://steamcommunity.com/id/Koitenshin/ Feb 08 '23 edited Feb 08 '23

Or it was good only to find out it was removed from Steam and never added back. Q_Q

EDIT: Or a substandard "Remaster" was added instead (like Grand Theft Auto Vice City)


u/Mighty_Hobo Ryzen 5 5600X | GTX 1080 | 32 GB 3600 DDR4 Feb 08 '23

The Fable 3 situation.

This kind of thing is far less likely these days thanks to the death of GFWL but it’s still annoying.


u/Koitenshin http://steamcommunity.com/id/Koitenshin/ Feb 08 '23

So many games Capcom said they'd remove GFWL from and put the games back on Steam but never did.



u/Sega-Playstation-64 Feb 08 '23

Plus, once they fixed it, they REALLY fixed it. I commonly use that game's benchmark tool to see how my systems run.


u/TT_207 5600X + RTX 2080 Feb 08 '23

Do you mean Arkham Knight? I recently started playing it (left it a long time after the original bad press) and honestly it runs great, I've never had an issue. They really fixed it up over time I think.


u/Cireme https://pcpartpicker.com/b/PQmgXL Feb 08 '23

They fixed it long before the GTX 1070 even came out; the last time the game was patched was early March 2016. Obviously his issues had nothing to do with the port.



u/Cireme https://pcpartpicker.com/b/PQmgXL Feb 08 '23 edited Feb 08 '23

Something doesn't add up. The GTX 1070 was released in June 2016, a year after Arkham Knight, and by that time the PC port had already been fixed and was in its current state. Even a GTX 970 could run it very well at 1080p with everything maxed out.



u/x7universe Feb 08 '23

FYI, Arkham Knight runs perfectly on PC now. I have a decently high-range setup and really enjoyed it.



u/nogap193 Feb 07 '23

I still can't get black ops cold war to run on my 3070


u/johnbowser_ RX 5700 | R7 3700x | 16GB ram | 1080p 165hz Feb 07 '23

That game seems to just depend; on my 5700 it runs at 100 fps on high most of the time.


u/ButtPirateer PC Master Race Feb 07 '23

My 1660ti ran it fine on high, too. Had a friend with a 1080ti who was having issues, tho.



u/Vizjun Specs/Imgur here Feb 08 '23

Right, people say "lately". This has been going on for the last 10 years.



u/subarcticeel48 Feb 07 '23

This has seemingly been standard practice for a long time. Consoles are less powerful, so a lot of time has to be put into optimizing games to run as well as they can on them. PCs, however, are often more powerful and can make up for a lot of the missing optimization with raw power. Unfortunately this means that PC optimization doesn't get nearly as much time as the console versions. Purely speculation on this part, but I'd guess that sales also play a role: if consoles sell more, it makes sense to spend the most time on that version rather than the PC one.

Edit: Console hardware is also the same across each device, so it's much easier to optimize specifically for that hardware, unlike PC, where most people's builds are different.


u/PeacefulKnightmare Feb 08 '23 edited Feb 08 '23

There's also the lack of standardized hardware on PC. It's tricky to optimize for the infinite combination of components, which have their own variance in quality control. However with consoles each one is basically the same.


u/greg19735 Feb 08 '23

Yeah, you design it for the base console, Series X or S or whatever. Then you make tweaks for the better one to just up the FPS a smidge or add some shadows. Then you're done.

On PC, I've had the best consumer CPU on the market with lots of good RAM and a shit-tier graphics card.


u/ThrowAwayBBY46 Feb 08 '23


Lol...you mean "optimize" sir.


u/PeacefulKnightmare Feb 08 '23

Haha! Yes I did, that was the epitome of how one or two things make all the difference.


u/Krogothian 12600k 980Ti Feb 08 '23

Yeah, but consoles run x86 APUs now, and Xbox runs a stripped-down Windows kernel.

We can't act like porting is anywhere close to how it was with the PS3 or something like that. If this were all just an issue of hardware support, then AMD APUs would run this game the same as a console.

It's just that they expect you to have a system so powerful it'll push through the problems, and they didn't want to spend the time on the PC version that they did on PS5/XBSX.



u/chubbysumo 7900X, 64gb of 5600 ddr5, EVGA RTX 3080 12gb HydroCopper Feb 07 '23

Which makes zero sense, since modern consoles are literally just PCs. The Xbox Series X (and even the older Xbox One/One X) uses a modified, stripped-down version of a Windows operating system, and the hardware is just a computer, so you would think porting to literally another PC would just be a matter of making the game run. This is usually not the case, though.

Because the consoles run close to the metal (no drivers), programming them is significantly different than on a PC, where the game interacts with drivers and API calls instead of direct instructions. This means game devs need to do a lot of work to convert their games over. Add to that that studios don't want to put in the money, so they just don't optimize.


u/Hello_I_need_helped Feb 07 '23

I get what you're saying, but idk if it's really fair to say they have "no drivers"; they run through software developed with the graphics card manufacturers that does get updated. It's not like game devs are programming assembly to get Harry Potter on the PlayStation.



u/-cocoadragon Feb 07 '23

This describes the AAA problem with the Switch to a T as well. Complete lack of optimization, even from Game Freak, which was only ever gonna be on one system lolz. Why would you not go bare-metal max there?



u/20fpsgameryep PC Master Race Feb 07 '23

yeah that makes sense



u/bogas04 Feb 07 '23

If we ignore shader compilation stutter, PC and console architecturally aren't very different these days, except for the unified memory on consoles.

The issue comes when the preconfigured high/medium/low settings aren't properly tuned, and/or you as a player crank everything to max thinking the settings scale linearly. They usually don't: the higher the settings, the smaller the gains in image quality and the higher the cost. Console ports are usually "optimised" by fine-tuning each setting, sometimes even going lower than the PC's lowest, to ensure a stable frame rate throughout the game. This is easier on a fixed platform than on an open platform like PC. Still, using optimised settings from Digital Foundry, or manually testing each setting on your hardware, gets close to a stable experience.

One issue that's very hard to overcome is a CPU bottleneck, though. No amount of changing graphics settings can help with that, and ray tracing is quite taxing on the CPU too. Gotham Knights was one such example.


u/ItsMeSlinky Ryzen 5600X / X570 Aorus Elite / Asus RX 6800 / 32GB RAM Feb 07 '23 edited Feb 07 '23

The PS5 OS and graphics API are quite different from Windows or Xbox.

DX12 and Vulkan are both considered "light" by PC standards because they reduce CPU overhead and require more manual graphics programming.

The PS5 graphics API makes DX12 and Vulkan look "heavy" in comparison, which means that graphics programming on the PS5 requires more expertise and precision but you get better performance because the OS and the API incur less overhead than the bloated Windows environment.

You can't look at hardware while ignoring the software environment.


u/Obosratsya Feb 07 '23

PS5 has high-level and low-level APIs; the low-level one actually isn't too far off from DX12 or Vulkan. The OS overhead on PS5 is better in some regards but worse in others: better in that it requires fewer CPU resources, worse in that it takes almost 4 GB of memory, leaving a bit over 12 GB available to games. Having a single spec surely helps, but for AAA games Sony usually sends their engineers to assist with optimizing; same for MS. On PC you can have Nvidia or AMD send their engineers, but it happens way less frequently. Overhead on PC is usually not as bad as most people think; PCs have much faster CPUs, so the overhead isn't much of an issue. For multiplats, it's usually dev time that makes or breaks the PC port.



u/MVPizzle 13700k @ 5.5 GHz | RTX 3080 | 32GB DDR5 Feb 07 '23

Since you're in the mood... can you explain to me the difference between Vulkan and DLSS like I'm a newborn golden retriever?


u/ItsMeSlinky Ryzen 5600X / X570 Aorus Elite / Asus RX 6800 / 32GB RAM Feb 07 '23 edited Feb 08 '23

A graphics API is how a graphics programmer "talks" to the GPU. It's how we make it do, well, anything. If we didn't have an API, then we'd have to write a whole bunch of GPU-level machine code and fuck that.

Direct3D (a part of DirectX) and Vulkan are APIs. On PC, DX11 and DX12 are most commonly used. On the Steam Deck and Linux, the Proton/WINE compatibility layer "translates" DirectX commands into Vulkan. The PS5 has its own APIs, since Sony writes its own operating system and tools for the PlayStation platform.

DLSS and FSR are something else altogether. DX12/Vulkan are how the game engine talks to the GPU and makes it draw stuff; DLSS and FSR take what the GPU draws and try to upscale it to higher quality (specifically, they take a lower-resolution image and reconstruct it at a higher resolution). They're a newer but separate piece of the chain.

BONUS ROUND: Shaders are special graphics mini-programs that run to create effects in-game (think water, fire, sparks, magic, etc). Like any program, they have to be compiled for the environment in which they're about to run. The reason consoles don't have shader compilation stutter and PCs do is because every PS5 has the same GPU, running the same drivers, so devs can just package the shaders with the rest of the game's data files.

Meanwhile, on PC, you have potentially thousands of GPU/driver version combinations, so the shaders have to be compiled for YOUR specific combo. When a game doesn't do this up front and tries to compile shaders on the fly, it hammers the CPU and creates that stutter that PC gamers are getting so tired of.
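The last two paragraphs can be sketched in miniature. This is an illustrative Python sketch, not a real driver API: the GPU model and driver strings are made up, and `compile_shader` merely stands in for the slow driver-side compile. It shows why a cache keyed by the machine's GPU/driver combo can ship pre-filled on a console but must be filled on each individual PC:

```python
compile_count = 0  # tally of expensive "driver compiles" performed

def compile_shader(source, gpu_model, driver_version):
    """Stand-in for the slow, CPU-heavy driver compile step."""
    global compile_count
    compile_count += 1
    return f"binary({source} for {gpu_model}/{driver_version})"

class ShaderCache:
    """Compile each (shader, GPU, driver) combination exactly once.

    On console, every machine shares one (gpu, driver) pair, so the
    vendor can ship this cache pre-filled with the game's data. On PC
    the key differs per machine, so the first time each shader is
    needed it must be compiled locally; paying that cost mid-frame is
    the classic shader-compilation stutter.
    """
    def __init__(self, gpu_model, driver_version):
        self.key = (gpu_model, driver_version)
        self._cache = {}

    def get(self, source):
        if source not in self._cache:          # miss: pay compile cost
            self._cache[source] = compile_shader(source, *self.key)
        return self._cache[source]             # hit: essentially free

cache = ShaderCache("RTX 3070", "528.24")      # hypothetical identifiers
for frame in range(3):                         # same shaders reused each frame
    cache.get("water.frag")
    cache.get("fire.frag")
print(compile_count)                           # compiled once each, not once per frame
```

A precompilation step at game startup is just this loop run over every shader before the first frame, so the misses happen on a loading screen instead of mid-gameplay.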


u/T0XICxN1GHTMAR3 UNRAID 10900K 48GB 3080Ti 1070 Feb 07 '23

Yeah, I'd rather load the game, go make a pot of coffee, and play my game in 15 minutes if it means it's gonna run smoothly.


u/swangjang Feb 08 '23

Horizon Zero Dawn does this. When starting the game for the first time, it optimizes itself based on the PC system. It took like 20 minutes to complete but gameplay had very consistent fps.


u/DeLaArapaklas PC Master Race - i7 11700k | MSI RTX 3080 Feb 08 '23

Warzone 2 does this too (it takes about 3 minutes to compile shaders on startup) and YET it still stutters like crazy when you drop from the plane.



u/AirOneBlack R9 5900X | RTX 3080 | 64GB RAM Feb 08 '23

Graphics programmer here. All of this is correct; I just want to add a couple of small notes:
- Without graphics APIs, we would have to write machine-level code for every single GPU model.
- DX11 and earlier, alongside OpenGL, are older-style graphics APIs. DX11 is still technically supported and not replaced by DX12 as per Microsoft's specs. That's because DX12 and Vulkan are new-style APIs that give graphics programmers much more access to low-level stuff, letting us customize how rendering proceeds and optimize every single part of it much more. This also means you can fuck this part up even more and end up with worse performance; it's also much more complex to work with.
- Shaders aren't just for effects but for everything you can see, be it a surface, a post FX, or a particle effect. Some shaders work on geometry, others on pixels, and others don't work directly on graphics at all but instead offload calculations from the CPU, because GPUs can be an order of magnitude faster for that specific calculation.
- Some games precompile shaders at boot. That's doable if they use uber-shaders (a single big shader that handles a lot of use cases: compile it once and profit) or if the shader count is low enough; others need to compile at runtime. It also depends on the engine and how the rendering pipeline is structured. There is no one-size-fits-all solution, and compiling all shaders at game boot might actually be impossible (without rewriting way too much stuff) due to how the engine works.
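The point about boot-time compilation sometimes being impractical comes down to combinatorics: many engines generate a separate shader permutation for every combination of feature toggles, so the variant count multiplies. A toy Python illustration, with entirely made-up toggle names and counts:

```python
# Hypothetical feature toggles an engine might bake into separate
# shader permutations; every combination is its own compiled shader.
features = {
    "skinning":   [False, True],
    "normal_map": [False, True],
    "fog":        [False, True],
    "shadows":    ["off", "hard", "soft"],
    "max_lights": [1, 2, 4, 8],
}

# Permutation count is the product of the option counts.
variant_count = 1
for options in features.values():
    variant_count *= len(options)

# 2 * 2 * 2 * 3 * 4 = 96 permutations for this ONE material shader.
# Multiply by hundreds of materials and compiling everything at boot
# quickly becomes impractical; an uber-shader collapses the toggles
# into runtime branches so only one (bigger) compile is needed.
print(variant_count)
```

This is why the comment frames it as an engine-level design choice rather than something a port team can just switch on.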



u/SPECTR_Eternal Feb 08 '23

To add to your comment:

Upscalers like FSR and DLSS don't technically upscale "to a higher quality" but to a higher pixel density. They take an image (a frame) rendered by the GPU at, say, 720p and, using temporal logic (comparing against previous frames and frames that are rendered but not yet shown), try to "adapt" it to a higher-pixel-density output (like 1080p) without just stretching the image. The result is an image that generally keeps its level of detail but suffers from invented pixels: shimmering at object edges and "shadowy trails" behind moving objects.

Also, from personal experience modding in UE4 and writing my own shaders: a shader is... a material setup. It's an internal instruction for the engine that tells the renderer how to draw an asset with that "material" applied: which set of textures, tiled how, with height detail (tessellation/parallax mapping), normal-map detail, and roughness set for the intended use case. For example, I wrote a shader for asset wetness and a basic fake-raindrop effect; all it is is a set of instructions telling the renderer how to manipulate the given resources. Sufficiently complex shader instructions take some time to compile/build.



u/MLG_Obardo 5800X3D | 4080 FE | 32 GB 3600 MHz Feb 07 '23

I assume you mean DX12 (or 11, 9, etc) and Vulkan. DLSS is a resolution upsampling technique and not really comparable.



u/thereAndFapAgain Feb 08 '23

Sometimes it's just a bad port, but most of the time people aren't expecting the settings to work the way they do and don't have a full understanding of what each setting in the game's graphics menu does.

Take RDR2, for example: many people were saying it was unoptimised, but that wasn't the case. Many of the game's lowest settings were equivalent to or higher quality than the settings on the console version, so people cranked everything to maximum, it ran like crap, and then they lowered it to medium on their 1060 and it still ran like crap.

After Digital Foundry did an analysis and released their console-equivalent settings, players who used those as a base for their own settings found that the game ran really well.

I suspect that will be part of the reason people are having issues here.


u/GhostsinGlass Intel - AMD pays me not to shitpost on their sub. Feb 07 '23

I imagine the workstations they're using are oodles more powerful than the average home PC.

Hell, in just a few years computing power has grown to the point that a lot of creative 3D work can just be brute-forced. When I first started, UV unwrapping was a religion, a cult with monk-like focus, as strict as a German nun.

Now you can get stupidly good results with any half-assed auto-unwrap. Hell, why even bother unwrapping: just get out the procedural materials, and fuck baking them, because GPU goes brrrrrrr.

Not that I'm complaining.

The shit part is that video games don't benefit as much as creative tasks because of the nature of the two. A video game is a garden hose filling a pool: flow is fast, flow is slow, but the flow is never enough to completely fill the gigantic space the water COULD occupy. Creative work is different; it's like filling a pool with a pool, one big dump to compute.


u/carnathsmecher RTX 4090 Asus TUF OC/I9 13900K/64GB DDR5 Feb 08 '23

Look at the GTA 6 leaks: Rockstar is mainly using RTX 2060s and GTX 1080s; only rarely did I see a 3070.



u/Alone-Monk Core i7 10700 / Radeon RX 6650 XT / 32GB DDR4 Feb 07 '23

For real. Microsoft Flight Sim 2020 was probably the biggest offender in this respect; those menu-screen frame rates were something else. There is no reason my GPU should be hitting the 90-100 degree range from just a menu screen.

Personally, I applaud Death Stranding for its optimization: consistently 90 to 130 fps at high settings on an okay GPU, and it still looks beautiful.



u/FerretsAteMyToes Feb 07 '23 edited Feb 07 '23

Same reason consoles with outdated hardware can look/run way better than a PC with similar hardware. It's 100x easier to optimize for the set specs consoles have, whereas with PC gaming you have to try to optimize for tons of different combinations.



u/MrXenonuke R7 5700x | RTX 3070 | 16GB 3600mhz Feb 07 '23

This is why I always wait for Digital Foundry's pc performance video.


u/That-Soup3492 Feb 08 '23

Never fucking pre-order digital content. People will just never learn.


u/QuickNature Feb 08 '23

And here I am just being cheap, and it works. I mostly wait till games go on sale, like a year after they're released. Not only do I get the game cheaper, but lots of issues are usually fixed by then.


u/nbunkerpunk Ascending Peasant Feb 08 '23

I can't wait to play Hogwarts legacy...in six months when it's on sale and had bug fixes. Same thing I do with pretty much every game.


u/Rampant16 Feb 08 '23

Even if you wanted to play it day 1, now you know you should probably hold off for a bit.



u/danivus http://steamcommunity.com/profiles/76561197998450381 Feb 08 '23

Apparently there's a "day 1" patch that claims to fix the frame drops and stuttering, but because this is still the early-access release, I believe that patch isn't coming until the 10th.



u/ProfPring Feb 07 '23

I'm a little confused. After the first hour the game is crap? Is this down to crashing, or to performance issues?


u/cmgr33n3 Feb 07 '23 edited Feb 08 '23

The first hour is the opening story, where you walk around and fight a little in small enclosed environments. After that you enter Hogwarts and, presumably (I saved and had to step away once I got there earlier today), the first zone of the open-world environment. People are saying that's killing their systems' performance. It's still indoors, so I'm not sure why it would be so taxing, but like I said, I've not played through any of it yet.


u/LBozoYBBetterRatio Feb 08 '23

The first hour is in a small area. After that is when you actually enter Hogwarts, and that's when your fps starts to dip.


u/Gunningham Feb 08 '23

r/patientgamers get better performance as well as better prices.

Cyberpunk 2077 isn’t so bad now.


u/LifeOnMarsden 3700x / RTX 3080 / 32GB 3600mhz Feb 08 '23

"If you pre-order a game, you're paying the highest price for the worst version of that game." - TotalBiscuit

It’s so sad that TB fought until literally his dying breath and no one has fucking learned


u/AsmRJ Feb 08 '23

Reminds me of the Cartman quote.

"Sir Kyle, pre-order doesn't mean shit, ok? When you pre-order a game you're just committing to paying for something that some assholes in California haven't even finished working on yet. You know what you get for pre-ordering a game? A big dick in your mouth."



u/gh0u1 PC Master Race Feb 08 '23

Guess it's a good thing I'm absolutely flat broke right now. By the time I have money for a new game it'll hopefully be in a better state



u/kevmaster200 Feb 08 '23

I just started playing half life 2. Damn the physics in that game are so good!



u/rubixd PC Master Race Feb 07 '23

Badly optimized huh?


u/vidati Feb 07 '23 edited Feb 08 '23

Well sort of, first time I booted the game with my 5800x3d, 32gb 3800 cl 14 and 2080ti I was pleasantly surprised by the performance, dlss quality at the highest settings was hovering at 85-90fps. When I got to Hogwarts I was at about the same fps but suddenly would dip to 15 and stutter until I pressed escape and back into the game and it would be back to 70ish fps.

Turning on RT with DLSS Balanced, I get about 50-70fps, which is playable for me.

Edit: Running the game at the highest settings with RT causes a... "memory leak", maybe, as performance crashes to 7fps at times. At the moment I have turned RT off and the performance is amazing, with no memory leak for now; I hope that was the cause.


u/Senkoin Desktop 5800x 3080 32gb ultrawide gang Feb 07 '23

Is this 1440p?


u/vidati Feb 07 '23

Oh sorry I'm on 3440x1440


u/Senkoin Desktop 5800x 3080 32gb ultrawide gang Feb 07 '23

Based 3440 gang.


u/Strimp12 Feb 08 '23

There’s dozens of us! DOZENS!


u/wozman Feb 08 '23

3840x1600 gang here, well not a gang it's just me /shrug

→ More replies (1)
→ More replies (5)
→ More replies (2)


u/R0GUEL0KI Feb 08 '23

I have a feeling if you turn RT off you’re gonna have a better time.


u/vidati Feb 08 '23

You are right!

Running the game at the highest settings with RT causes a... "memory leak", maybe, as performance crashes to 7fps at times.

At the moment I have turned RT off and the performance is amazing, with no memory leak for now; I hope that was the cause.
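The "memory leak" suspicion above is just a guess from the fps behavior, but the pattern being described — memory use climbing steadily until performance collapses — can be sketched as a simple heuristic over periodic memory readings. Everything here (the sample values, the 90% threshold) is illustrative, not anything measured from the game:

```python
def looks_like_leak(readings: list[int], grow_ratio: float = 0.9) -> bool:
    """Crude leak heuristic: flag a process whose memory readings
    (e.g. resident-set-size samples in MB taken every few seconds)
    grow on nearly every sample instead of plateauing or fluctuating.
    """
    if len(readings) < 2:
        return False
    ups = sum(1 for prev, cur in zip(readings, readings[1:]) if cur > prev)
    return ups / (len(readings) - 1) >= grow_ratio

# Steady growth looks leak-like; a healthy process fluctuates around a plateau.
print(looks_like_leak([4_100, 4_900, 5_600, 6_400, 7_300]))  # True
print(looks_like_leak([4_100, 4_000, 4_200, 4_100, 4_150]))  # False
```

A real check would sample the game's working set over minutes of play; monotonic growth alone isn't proof of a leak (caches grow too), but combined with the fps collapse it's a reasonable first suspect.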


u/whothefoofought Feb 08 '23

Yeah, I didn't want to be "that person", but I played most of the evening on a pretty average, bordering on shitty, computer and I have not had any performance/graphical/memory-leak issues, so I'm not sure why this thread is popping off so hard.

→ More replies (10)
→ More replies (3)


u/baka_arin R9 3900XT@4.4Ghz | RTX 3080 Feb 07 '23

It runs like a PowerPoint slideshow at Hogwarts.


u/RichiPete Desktop | RTX 3070 | Ryzen 5 3600 Feb 07 '23

Yep, the first time they showed the Great Hall in a cutscene, that shit turned into a PowerPoint presentation


u/bambiindistress PC Master Race Feb 08 '23

Had the same problem. 3080 is too weak. Time to upgrade boys



u/N-aNoNymity Feb 08 '23

Waited for almost a year in line to get my 3080, and now I need a 4090 because optimization doesn't matter in 2023. NAH

→ More replies (2)
→ More replies (3)
→ More replies (1)
→ More replies (26)


u/JakeMac96 PC Master Race Feb 07 '23

Do people who buy games at launch never learn? Wait for reviewers to inform you of the technical state of the game before purchasing.


u/pkisbest pkisbest Feb 07 '23

I always tend to wait 2-3 weeks. Usually enough time for emergency patches to fix all the critical issues.


u/klubsanwich AMD Ryzen 7 5800X | GTX 3080 10GB | 32 GB RAM Feb 08 '23

I always wait until the game is at least 50% off


u/ok-confusion19 Feb 08 '23

So like 6 weeks unless it's Nintendo, then never.

→ More replies (2)


u/Kontrolgaming 1st gen i5 760, 970 GTX Feb 08 '23

People don't care, they will forever pre-order games.


u/RichiPete Desktop | RTX 3070 | Ryzen 5 3600 Feb 07 '23

I mean it's chill, steam gives you a refund if you play less than 2 hours


u/coffeejn Feb 07 '23

Give me a demo instead. 2 hours might just get you out of some games' tutorials.


u/chubbysumo 7900X, 64gb of 5600 ddr5, EVGA RTX 3080 12gb HydroCopper Feb 07 '23

2 hours might just get you out of some games' tutorials.

I have noticed a trend in a lot of modern games: the tutorial area is designed to take 2 hours, or slightly more. This forces you past the refund window before you ever experience the actual game. Forspoken was about 1 hour of what I would call tutorial, but could easily end up being 2 if the game is buggy or crashes a lot on a PC.


u/Key-Tie2214 Feb 07 '23

Steam often allows refunds beyond that time. I think a friend got a refund something like 5 hours into a game? That might be massively inflated since it was ages ago, and could be exaggerated by him.

Steam only enforces a strict 2-hour limit if you request refunds constantly.

→ More replies (9)
→ More replies (3)


u/yes19991 M1 Mac mini Feb 07 '23

Is this why every game now has a stupid long tutorial?


u/sur_surly Feb 07 '23

Lol yup.

→ More replies (7)


u/GrandJuif R9 5950x, RX 6900 XT, 64GB 3400MHz Feb 07 '23

2h is really short in my experience. Between the company intros, tweaking the settings, the game's intro, making a character (if applicable), and re-tweaking settings if there are issues, you'll already be beyond the 2h.

→ More replies (22)
→ More replies (12)
→ More replies (17)


u/Traze- Feb 07 '23

These posts make me feel like I’m the only one who hasn’t had performance issues, at least not to this extent.

I should mention I have a 6600xt with only 16 gigs of ram and I’m running high at 1080p


u/Twinkies100 Desktop Feb 07 '23



u/T0BIASNESS RX 6600 XT / RYZEN 5 3600 Feb 07 '23



u/gestalto 5800X3D | RTX4080 | 32GB 3200MHz Feb 08 '23

Yeah, but...it's a smooth 9

→ More replies (11)


u/houdinikush FX-6300 @ 3.5GHz| R9 270 OC | 8GB DDR3 Feb 08 '23



u/tiediesunrise Feb 08 '23

No, this is Patrick.

→ More replies (1)


u/dont_ban_me_bruh Feb 08 '23

It's all just reporting bias; people whose games are running fine aren't on Reddit whining about it.


u/Youngsaley11 Feb 07 '23

No, I'm with you; played about an hour and haven't had any issues at 4K ultra with a 3080 Ti

→ More replies (11)
→ More replies (20)


u/FreeDeerSociety Feb 07 '23

I'm on a Ryzen 7 5800X, RX 6800 and 32GB of RAM at 3600MHz. I play at 1440p with everything on High (no RT). I'm not experiencing any issues. The average fps AMD Adrenalin shows is 101.


u/FreeDeerSociety Feb 07 '23

And yeah, no upscaling (AMD FSR 2.0 is off).


u/sur_surly Feb 08 '23

Maybe those having issues are using 16GB ram


u/[deleted] Feb 08 '23 edited Feb 08 '23

No, it's fucking RT and DLSS, and Nvidia folks just don't want to admit it. And before anyone replies to me: I have a 3070. Turned ray tracing and DLSS off and not a single issue.

→ More replies (3)
→ More replies (2)


u/Icemasta Feb 08 '23

Same except RX6800 XT.

1080p with FSR2.0 set to maximum quality.

Lowest dip I've seen so far is 88fps, generally at 144.
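Numbers like "lowest dip 88, generally 144" are usually summaries derived from per-frame render times; average FPS and a "1% low" can be computed from those times as below. This is a generic sketch (definitions of "1% low" vary between overlay tools — some average the slowest 1% of frames, as here; others take a percentile frame time):

```python
from statistics import mean

def fps_stats(frame_times_ms: list[float]) -> tuple[float, float]:
    """Return (average FPS, '1% low' FPS) from per-frame render times in ms.

    Average FPS is frames divided by total time; the '1% low' here is the
    frame rate implied by the average of the slowest 1% of frames.
    """
    avg_fps = 1000.0 / mean(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    low_fps = 1000.0 / mean(worst_1pct)
    return avg_fps, low_fps

# 99 smooth frames at 10 ms plus one 50 ms hitch:
avg, low = fps_stats([10.0] * 99 + [50.0])
print(round(avg), round(low))  # 96 20
```

This is why a single hitch barely moves the average but tanks the 1% low — which matches the "generally fine, occasionally a slideshow" reports in this thread.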


u/Ocronus Q6600 - 8800GTX Feb 08 '23

Could it be a trend with AMD? Maybe it's just anecdotal, but I'm seeing a bunch of people with AMD cards saying the game runs fine.


u/ICallFireStaff Feb 08 '23

Yeah, mine's running amazing on my 6800, 1440p high. I don't think I've been below 60 fps once.

→ More replies (1)
→ More replies (6)


u/Soggy_Reflection7990 i5 12400f RTX 3070 Trio 32GB 3600Mhz Feb 07 '23

That's weird. Everyone's saying they're getting fps drops at Hogwarts; I'm running at 100fps max settings on a 3070 and an i5 12th gen.

→ More replies (25)


u/lopnk Feb 08 '23

I have not had any issues in a few hours of play... let's hope it stays that way!

Sorry some of you have. That sucks!! I will keep an eye out.

8700k + 3060 with 16gb ram. Was streaming on discord for my sister to watch me play.


u/MrLuckyTimeOW Feb 08 '23

I'm glad to hear someone else with a 3060 getting good performance. I also have a 3060 and was starting to get worried that it wasn't going to perform well on my PC.


u/Gwynbleidd2077 PC Master Race Feb 08 '23

I have a 3060 and it ran fine once, but I closed it to mess with the lighting because it was so dark, and now it won't get past the shaders screen

→ More replies (2)


u/RichiPete Desktop | RTX 3070 | Ryzen 5 3600 Feb 07 '23

Got pretty consistent frame drops to around 15fps in the cutscenes once Hogwarts started, and walking around the school I usually get around 90fps but with frequent drops to 20-30.
First time buying a big release on launch for PC, and of course it does this :)

Heard other people having the same issue too, cmon Warner Bros..


u/KOZTIC88 PC Ryzen 5600x | RTX-3060 Feb 07 '23

Pretty much every fucking release this year has been optimized by cavemen. Don't stress too much about it, bro.


u/RichiPete Desktop | RTX 3070 | Ryzen 5 3600 Feb 07 '23

Yeah, I'm not too worried about it. Refunded it and gonna wait for some patches or something because it was shaping up to be really fun

→ More replies (16)


u/TomLeBadger PC Master Race Feb 07 '23 edited Feb 08 '23

OK, so I'm mostly chilling at 100 FPS. The lowest I've seen inside the castle is 70. I've got a 7700K and a 6700 XT; I left it at the auto settings (all medium), 1440p, and turned on FSR2 Quality.

It is running better than I was expecting. Try FSR.

Edit: Played some more this morning and was more conscious of FPS. I'm actually sitting at my 144 cap more often than not, and the lowest I've seen is still ~70 FPS; average FPS during my hour of playing this morning (in and around the castle) was 126. It's entirely playable, but only because of FSR. If I run at native 1440p it fucking dies, so yes, it is another case of FSR/DLSS being used in place of optimisation, but the game's good to play and still looks good, so I'm overall still happy.

When drivers release, I'd probably opt for upping the quality settings and keeping FSR on, as there's no noticeable loss in fidelity but it runs at triple the FPS with FSR on.
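The big FPS gain from FSR mostly comes from shading far fewer pixels. As a rough illustration, here's a calculator using AMD's published per-axis scale factors for the FSR 2 quality modes (the example resolutions are just the ones mentioned in this thread; actual frame-rate gains depend on how GPU-bound the game is, so "triple" isn't guaranteed):

```python
# Per-axis upscaling factors for AMD FSR 2 quality modes.
FSR2_SCALE = {
    "quality": 1.5,
    "balanced": 1.7,
    "performance": 2.0,
    "ultra_performance": 3.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU actually shades before upscaling."""
    s = FSR2_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# 1440p output with FSR2 Quality shades ~44% of the pixels (1 / 1.5^2):
print(render_resolution(2560, 1440, "quality"))   # (1707, 960)
print(render_resolution(3440, 1440, "balanced"))  # (2024, 847)
```

So even Quality mode cuts the shaded pixel count by more than half, which is why a game that "fucking dies" at native 1440p can still hold a high frame cap with FSR on.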


u/Ryamus Ryzen 7 5700X/32 GB 3000Mhz/6700 XT Feb 08 '23

6700XT/5700X at 1080p, all ultra, and dipped to 70 once in the castle. Avg anywhere from 100-130fps. FSR2 puttin' in work I guess

→ More replies (23)
→ More replies (34)


u/salomonisch Feb 07 '23

Considering other games with similar FPS drops at the same points, even in cutscenes... it sounds like the typical Denuvo problem. Should EMPRESS break Denuvo as she announced (a true removal, not 'just' a bypass), we can make a performance comparison.


u/Ganda1fderBlaue Feb 07 '23

How, how are people still surprised that games don't work well at launch?


u/Kosen_ Feb 08 '23

So, I decided to be an absolute degen and play for 12 hrs today because I'm sick. The performance got steadily worse the longer I played, to the point where going into combat I had 7fps with a 3070 and 32GB of RAM.

On the bright side, I'm really enjoying it. I just hope there's some way to increase the performance.


u/getgroovyloony PC Master Race Feb 07 '23 edited Feb 07 '23

How are y'all even playing the new Hogwarts game? I was looking at it on Steam the other day and it wasn't being unlocked until the 10th.


u/astrorebel Feb 07 '23

Deluxe edition owners get early access, 72 hrs in advance


u/WolfgangSho Feb 08 '23

Paying more to stress test it for everyone else. Hilarious.

→ More replies (2)
→ More replies (5)


u/CarnageMunky Feb 07 '23

Having no issues here at max settings, 1440p, locked at 75fps. Some minor graphical bugs, but otherwise flawless for me.

→ More replies (11)


u/AdditionalOne8319 Feb 08 '23

Finding a well-optimized modern AAA game is like finding a needle in a haystack these days.

I have to say, as someone who plays on both console and PC: at least I know I'm getting a playable game on console when I buy it, whereas with PC there are way more variables and a higher likelihood that it's not playable.


u/happymemories2010 Feb 07 '23

Gamestar already showed proof of this yesterday in their PC review. The performance on PC has problems. You were warned.


u/Kot4san Feb 07 '23

You were Warned Brothers

→ More replies (2)
→ More replies (1)


u/PRSMesa182 Feb 07 '23

For shoots and ladders I went full send on the graphics settings: everything on ultra, ray tracing also on ultra with all 3 options enabled, and DLSS off. Running around Hogwarts on a 7950X/4090 combo on my C1 OLED, I saw frame dips to around 40ish but generally stayed 60-70ish. Tossed on the DLSS2 Quality preset and frames were back at 80-90. This game is so poorly optimized it's wild lol


u/bogas04 Feb 07 '23

The 4090 is pretty powerful, but running games maxed out with ray tracing and no upscaling is asking a lot of the hardware. Consoles appear to get better ports not because they're magically faster or the code is drastically different, but because they use low-to-mid settings, upscale from sub-1440p resolution, and enable at most one ray tracing option at 30fps. I'm not defending the game; I'm just saying that getting ~60fps at absolute max isn't "poor".

→ More replies (1)


u/Neoreloaded313 Feb 07 '23

There goes my hopes of just brute forcing it to run well with the power of the 4090.


u/N-aNoNymity Feb 08 '23

Lol, you still haven't upgraded to the 5090? That's what you get! /s


u/PRSMesa182 Feb 07 '23

Dropping ray tracing, I'm basically locked at 120 fps, which is the refresh rate of the C1; if I use my 1440p panels I can hit around 200 fps.

→ More replies (6)


u/Lord_MagnusIV i6-1390KSF, RTX 1030 Mega, 14PB Dodge Ram Feb 07 '23

I have been watching my friend play it over the last 3 hours. He has a 3060 and it runs smoothly on normal graphics at 75-90 fps.

→ More replies (3)


u/Desert_Walker267 Feb 08 '23

We still have the day-one patch coming up


u/Pandeamonaeon Feb 08 '23

I'm running it with a 2080 Ti on ultra at 1440p, and I only got fps drops in Hogsmeade village; it happened 3-4 times in a 6-hour game session. This is not that bad tbh.


u/lordofspearton Feb 07 '23

Idk why people are still surprised by this. Nearly every major AAA release in the last 10 years has been a fucking buggy disaster, PC or otherwise.

It's gotten to the point that it's more noteworthy to me when a game isn't broken on launch.

→ More replies (4)


u/SunMcLob Feb 08 '23

Fiancée is playing it currently on a budget build: Ryzen 5 5600X / 16GB RAM / Radeon RX 580, on medium graphics @ 1080p. She's having no issues at all and it's running smoothly; haven't benchmarked yet, but it looks like about 60fps.


u/eightleafclover_ Feb 07 '23

Haha, wait until you get to Hogsmeade... my RTX 4090 dips to 35fps at 1440p max settings without DLSS


u/Educational_Box_4079 Lenovo Legion 5 Pro | AMD Ryzen 7 5800H | RTX 3070 | 16gb Feb 08 '23

True. Playing it on an RTX 3070 mobile. In Hogwarts I was getting 50-60 fps at 4K with DLSS Performance; Hogsmeade was 30-35 fps.

→ More replies (5)