Question for Cyber Monday - video card spec recommendations?
Kickaha99
Member, Alpha Two
Hi! Just in time for Cyber Monday deals, I'm looking for a new video card. I know AoC is still in A1 and moving to A2 (soon-ish), but what are the minimum and recommended video card specs for AoC expected to be by release? Any thoughts?
Comments
Are you playing at 1080p, 1440p, 4K, or ultrawide?
Just a thought.
These are the (old) system requirements as per the wiki: https://ashesofcreation.wiki/System_requirements
But honestly, I would ignore those for now.
And yeah, you need to specify the resolution you'll be playing at as well, as Enigmatic Sage mentioned. Also, your CPU needs to not bottleneck your GPU choice too much, and like Kotter is saying, MMORPGs are typically CPU heavy. I am pretty sure AoC will be very CPU heavy in large battles like sieges.
Nvidia doesn't like to drop their prices; instead, they stop production of older models.
The worst part is the lack of VRAM standards, which keeps the lower tiers from holding up in terms of relative "future proofing".
We shouldn't have to rely on ray tracing/path tracing, upscalers, and frame generation to make a game look good. Most of those features shouldn't be selling points either. Then there are the VRAM issues with resolution and performance. The market is in such a stupid place right now.
So, when the time comes, do we overspend on the CPU before the GPU? Or do we just shout "screw it" and overspend on both?!
Definitely the latter.
Here's a link to a screenshot of my question and answer, so you all know it's legit.
GTX cards are essentially end of the line for modern games, as they just can't perform past 1080p at competitive frame rates and standard graphics settings.
Just make sure you have enough VRAM in your card choice, as that's very important for resolution and performance. Anything under 12 GB is going to take performance hits in modern gaming.
EDIT:
I'm still using a 1660 Super from 2019 lol. Going to squeeze every last bit out of it until I get something to bridge me from now until 4K gaming in a couple of years, as the market is going to make a big shift.
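If you want a quick way to check how much VRAM your current card actually has, here's a minimal sketch, assuming an Nvidia card with the standard nvidia-smi CLI on your PATH (the 12 GB figure is just the rule of thumb from above):

```python
import subprocess

# Ask nvidia-smi for total VRAM; each output line looks like "12288 MiB".
out = subprocess.check_output(
    ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader"],
    text=True,
)
total_mib = int(out.strip().splitlines()[0].split()[0])  # first GPU if you have several
print(f"Total VRAM: {total_mib / 1024:.1f} GB")
if total_mib < 12 * 1024:
    print("Under the 12 GB rule of thumb; expect performance hits in modern games.")
```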
Hey man, I had been gaming at 4K on my GTX 1080 Ti for several years. It was doing quite OK! I am not one of those "anything below 120 FPS is literally unplayable" snobs of course, so that helps. But aside from the 1080 Ti perhaps, yeah, the GTX series is getting a bit old now for sure.
I played A1 in 4K on my 1080 Ti, and it ran the game really well. UE4 of course, but still. The larger siege battles we had with close to 300 people still ran at 30-60 FPS depending, and that might have been CPU bottlenecked (overclocked 8700K).
Nothing to do with snobs; it's just facts, with proof, about raw optimisation performance. It's playable, just not good enough as a standard for competitive gaming yet. 1440p is essentially replacing 1080p in PC gaming. Literally 3% of the market games at 4K, and most games aren't even optimised for it, as there's no point. I wouldn't waste my time with 30-60 FPS at 4K. Give it a few more years as it gains a footing in the market and industry.
GTX cards can't really handle modern games anymore, even at 1080p or 1440p; older games, sure. On the Nvidia side, most of the 20xx and 30xx cards, and some 40xx cards with less than 12 GB of VRAM, just don't perform in the high-end games coming out either. They'll begin to shit bricks due to lack of VRAM. DLSS and ray tracing features still have so many problems, which is why I said it would be better if they just optimised games properly. Rasterization still outperforms, and Lumen is pretty much making ray tracing pointless. Frame generation causes latency issues. So many of the problems they "fix" just cause more game latency and graphics issues.
I can game at 1440p, sit back from my screen a couple more feet, and it looks just like 4K lol
It really depends on your available funding. Looks like an RTX 3070 would serve just as well right now.
Sales were awful; not really many deals, if you ask me.
For Nvidia, they're announcing their 40xx Super cards in January. Essentially better than the normal 40xx but less than the Ti... unless you want a Ti Super lol
Fucking marketing lol
The snobs thing was tongue in cheek. I don't really want this discussion to turn into a whole thing, so I will only say this: for pros in shooters and other games based on super-fast twitch skills, sure, every little millisecond helps. I don't dispute that.
Going from 30 to 60 FPS is a fairly big deal for most. From 60 to 120 much less so. It's not the extra 8 ms that decides the outcome of matches for 99.99% of the players out there. For online play, ping matters more, and general skill level obviously. The 8 ms is but a small fraction of the ping for most. Things look visually smoother and more pleasing at 120 FPS, sure, especially with a matching monitor, but it's not what is making you win or lose fights, unless maybe if you are in that very small top tier of players. From 120 to 240 FPS is completely pointless in terms of competitive play outcomes for the vast, vast majority.
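To put numbers on that 8 ms, here's the quick frame-time math behind it (just 1000 ms divided by FPS, nothing more):

```python
# Frame time in milliseconds at a given frame rate: 1000 / FPS.
for fps in (30, 60, 120, 240):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")

# 30 -> 60 FPS saves ~16.7 ms per frame; 60 -> 120 only saves ~8.3 ms,
# a small slice of a typical online ping.
print(f"60 -> 120 FPS saves {1000 / 60 - 1000 / 120:.1f} ms per frame")
```

Each doubling halves the frame time, so every jump buys you less wall-clock time than the one before it.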
For an MMORPG like Ashes, 30 FPS is playable, but not super enjoyable. It will not affect most fights if it's a stable 30 FPS. 60 FPS is better for sure and may affect some fights a little, perhaps for some classes like the ranger and rogue. 120 FPS is nice and looks smooth, but it will affect exactly no combat outcomes compared to 60 FPS.
Not quite
Frames matter in all games, even MMORPGs, regardless of PvE or PvP, tab or action combat. It's almost 2024; the proof is already evident from many sources and players. Also, higher frame rates can reduce the noticeable dips people can expect to see at mid-range frame rates. VRAM plays a big part in that as well, which is why I brought it up.
Don't get me wrong, I've played lots of games at 30-60 FPS over the years, regardless of resolution and/or HDR, perhaps even lower depending on the decade and the game lol.
Personally, I don't care what people play the game at or what hardware they get. It's their money and their choice. I just know what I would get/want if I didn't need a super high-end card for work, or couldn't afford one just because.
Consumerism and marketing.