I just think developers are reducing game quality on PC due to console issues.
All my life I've been a PC gamer, and I can't describe my passion for playing games and building PCs. But watching YouTube videos comparing how games looked in beta versus after release has led me to believe that developers downgrade games because consoles make them more money: you can't pirate or file-share games on a console the way you can on PC, and corporations like MS and Sony buy game studios to make console exclusives. Because of this, devs haven't brought anything revolutionary to gaming. IDK, I could be wrong, but the last revolutionary games were Half-Life and Crysis; I don't see any revolutionary game today. Devs just cram tons of unnecessary stuff into a game that graphics cards can't keep up with, and then people call those games "graphically demanding," while I don't see anything demanding about them. Prove me wrong. Just look at Crysis 2: the game came out in 2011 and still looks better than a lot of games released today. Games made specifically to be top notch on PC always melted console hardware and looked ugly on consoles, like Crysis, Crysis 2, Battlefield 3, and probably other games I can't think of right now.
But games that have been epic on PC are few compared to consoles, which leads me to the question: WHY? Why focus on making games better on consoles with their limited hardware instead of on PC? Why, when mouse and keyboard is superior to a controller in almost every way (OK, it's better to play a racing game with a controller)? Look at the Silent Hill franchise. The original Team Silent made the games specifically for the PS2, while the PC ports were and still are crap. They didn't even bother making the games better on PC (by "better" I mean optimization and proper ports). We can see the same thing persisting even today: RDR, Borderlands 3, Sekiro, you name it.
I'm really disappointed by what devs have been doing these past two decades or more with PC launches, ports, and optimization, and then they themselves cry over "piracy" while giving us half-baked products padded with DLC. The day I'll see a revolutionary game is the day devs stop favoring backwater hardware over the PC. The PC is already a multi-platform ecosystem, so there's no point pushing console-first development only for us to get half-baked products. Sitting around getting showered with money because of console hardware won't get devs to innovate. The clearest example is Intel.
I started this thread to hear your thoughts and opinions on this subject, because I can't be the only one who thinks and feels this way.
Gaming PCs aren't really good for productivity either, so even if I did agree, I wouldn't. I'm thankful for what I have and grateful to live in a comfortable place where I can enjoy myself. The problem with people like you is entitlement: thinking consoles limit you is just an ego issue. You spend more, so you automatically feel entitled to something. Gamers are gamers, and consoles don't hold gaming back, especially with the direction they're heading.
Keep in mind that sometimes, especially with games from the 6th generation, console-made titles didn't always perform well on PC due to how they leveraged the hardware. The opposite is also true, as seen with MOBAs moving to consoles or Valve titles. Silent Hill 2 on the PS2 is a case in point, using its high fill rate to enhance atmosphere. The PS2's capabilities made alpha effects like fog work great, whereas Xbox and PC hardware lacked that same advantage. It's worth noting that its Windows version falls short in comparison.
To add to Arika's insightful comment: things become clearer when you understand the specific hardware you're targeting, because optimizing for one known platform is much more straightforward. When deciding where to cut off support for certain options, review the market and you'll notice a most common build or parts list. It's similar to a console, but with enough variation that achieving the same level of optimization takes significantly more effort.
I don't accept any compensation for you playing Tetris with various color blocks. You're someone who would defend, for example, CDPR after reducing The Witcher 3 graphics and continuing to mislead customers by claiming they hadn't done so. You didn't start a "rant" and I haven't referenced phrases like "PC master race" or "console plebs."
This discussion isn't about the PC master race, so please stop exaggerating. Who knows, maybe they're not committed to PC because other hardware is more profitable? What's unfair is showcasing a game's quality and then suddenly lowering its graphics just months before release for financial reasons. All the effort they invested ends up thrown away, and later they lie to customers about the game not being downgraded. Don't make promises only to let them slip away for profit.
'Dark Souls II' Was 'Unplayable And Broken' On Consoles Before Graphics Downgrade
Dark Souls 2 | April 2013 Vs Final Game Graphics/Lighting Comparison
CD Projekt addresses The Witcher 3 downgrade concerns directly
The Witcher 3 Downgrade 3 years later – How CD Projekt Misled PC Gamers
Ubisoft responds to Assassin's Creed: Unity downgrade claims
Modder finds pre-downgrade graphics files in 'Watch Dogs' PC code [Updated]
Games often get adjusted during development due to various factors. All of these titles likely underwent graphical reductions for specific reasons, particularly AC: Unity, which performed poorly across all platforms regardless.
Not every justification for lowering graphics is valid, but changes can happen for numerous reasons.
Take Super Speedway from Gran Turismo 3: A-spec as an example. Its early versions appear markedly different from the final product, with the demo version representing a revised take on the original Gran Turismo 2 track. The reasons behind these modifications remain unclear, but it's evident that developers frequently alter games during the development phase for reasons that aren't always obvious.
This also applies to in-game footage, which doesn't always match the shipped experience. Most Battlefront II (2017) trailers were captured in-engine, yet they end up only a minor improvement over the final product, since the developers were aiming for playability. A 30fps render for a trailer or demo doesn't translate into a satisfying experience for the average PC player without expensive hardware, and console and lower/midrange PC gamers have their own priorities too. I understand why CDPR stamps "demo build, subject to change" on Cyberpunk 2077 footage: it covers them for any redesigns they might need to handle later.
I'm not clear about the whole "downgrade" concept.
Depending on the game, development begins with Xbox- and PlayStation-optimized versions. When those reach beta, a "port" of the Xbox version is adapted for PC (quotes because the Xbox version is developed on standard PC hardware in the first place and then ported to Xbox).
Therefore, a PC version is typically an Xbox version with improved resolution textures and sometimes better models.
There are numerous PC exclusives available, and some of them look impressive. Alongside that, several titles are still in development, advancing the graphics capabilities beyond what we usually see (For example, Star Citizen recently released a new Alpha that unlocks the first jump hole and a second system).
Moreover, when a game is showcased at events like E3, it’s not the full experience. Usually, it’s a highly refined version—a polished slice—sometimes only half the game. This approach lets developers focus heavily on one aspect, resulting in a visually impressive but incomplete product.
This strategy often leads to game downgrades because the enhanced graphics may cause performance issues, or because developers lack time or resources to fully realize the potential of the new visuals. Occasionally, graphics are reduced for arbitrary reasons or due to poor management. However, this isn’t the standard practice.
Even when a developer builds a game for multiple platforms, that doesn't mean the PC version has to lose out. The PC release of GTA V actually surpassed the console version in quality, and there are many other cases where PC versions ship with significantly improved graphics.
In short, the core issue seems to revolve around expectations versus reality, and the importance of understanding what developers truly aim to achieve.
Ooooh, Crysis 2. The game nobody talks about because everyone shunned it at launch for targeting consoles, even though that strategy turned out to be a major advantage for the PC release. It looks much better graphically than the original, and it's actually well optimized.