Yeah punk. Living that 5700G life. Rocking a 3700U as a daily driver, what? Yeah I have a fast graphics card, it's called a 780M, you might have heard of it from how it can barely play Baldur's Gate 3 but it counts as playable. Gaming on integrated graphics is cool because it's a lower-TDP option for the environment. Fight me. I'm over here playing 2017's greatest offerings at nearly 60fps 720p and I'm enjoying myself. Oh, Cult of the Lamb? Yes, hahaha, I get over 40fps in Cult of the Lamb. The only other gamers who can @ me are Switch and Xbox Series S gangsters. Mortal Kombat 9? Ah, "the good one," of course.
It runs, I’m gaming on it. I don’t care bro.
Soon APUs will be strong enough that half of us will be gaming on mini PCs barely bigger than a phone.
I don’t want a massive PC that runs Crysis 6 at 7680p and 480fps.
I want an SFFPC that runs 1080|60, and we're getting closer all the time! Goat Simulator: 62fps at 1080p, 70% GPU use, 2W avg TDP, 5W peak.
My GPU? A smartphone’s Mali-G68 😁
I spent my early gaming years breaking games to make them run on lower than minimum settings so I could maybe get 15fps.
Now I'm a performance snob and get annoyed seeing it dip under 100 at 4K.
I used to be; still am. 1080p, 73 fps.
“Once you try 144hz in 4K you’ll never want to go back”
Me underclocking my GPU, upscaling from 720p and doing 40FPS on a 60hz FHD monitor in the summer because the room gets hot:
My setup (R5 3600 and GTX 1660 Super) can play Cities: Skylines 2 just fine on Windows, but it runs like shit under Linux. I guess this is because of the shared GPU memory.
We really don't talk enough about how the worst-rated game of the Tomb Raider reboot trilogy, from the B studio for the series, ended up being the default gaming benchmark for the better part of a decade.
Good for Eidos Montréal. Guardians of the Galaxy deserved better, too.
GTX 1650 mobile works fine for me. 1980s–mid/late 2010s and solo-dev games, yeah 😁
yeah i played through 2013 Tomb Raider with my A6-5400K, very playable