Question for Cyber Monday - video card spec recommendations?

Kickaha99 Member, Alpha Two
edited November 2023 in General Discussion
Hi! Just in time for Cyber Monday deals, I am looking for a new video card. I know AoC is still in A1 and moving to A2 (soon-ish), but what are the minimum and recommended video card specs for AoC by release? Any thoughts?

Comments

  • edited November 2023
    I'm guessing we're probably at least 2 years from release. All graphics card tiers will come down by then, and NVIDIA will keep pushing FOMO on its product lines for profit, as always. No idea what your current or future build looks like, but your GPU can only push so hard.

    Are you playing at 1080p, 1440p, 4K, or ultrawide?
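    To put rough numbers on why the resolution matters so much, here's a quick back-of-the-envelope sketch (nothing official, just pixel counts):

        # Illustrative pixel counts only; real GPU load also depends on settings,
        # engine, and upscaling, so treat the ratios as ballpark figures.
        resolutions = {
            "1080p": (1920, 1080),
            "1440p": (2560, 1440),
            "ultrawide 1440p": (3440, 1440),
            "4K": (3840, 2160),
        }
        base = 1920 * 1080
        for name, (w, h) in resolutions.items():
            px = w * h
            print(f"{name}: {px / 1e6:.1f} MP, ~{px / base:.1f}x the pixels of 1080p")

    Roughly: 1440p is ~1.8x the pixels of 1080p, ultrawide 1440p ~2.4x, and 4K ~4x, which is why the target resolution changes the card recommendation so much.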
  • Kotter Member, Founder, Kickstarter, Alpha Two, Early Alpha Two
    Many MMOs are CPU-heavy.
    Just a thought.
  • Nerror Member, Alpha One, Alpha Two, Early Alpha Two
    edited November 2023
    Definitely don't buy a GPU for AoC release right now. Buy one if you need one now, sure, but only for games that you are playing right now or predict you'll be playing within the next few months. Unless you are just looking for an excuse to get a 4090, then go for it :wink: I am fairly sure it'll be able to handle AoC at release, but it will probably be a generation or two old by then.

    These are the (old) system requirements as per the wiki: https://ashesofcreation.wiki/System_requirements

    But honestly, I would ignore those for now.

    And yeah, you need to specify the resolution you'll be playing at as well, as Enigmatic Sage mentioned. Also, your CPU needs to not bottleneck your GPU choice too much, and like Kotter is saying, MMORPGs are typically CPU heavy. I am pretty sure AoC will be very CPU heavy in large battles like sieges.

  • edited November 2023
    @Nerror The RTX 6090 will be out by then, probably for the same price as the 4090 now lol

    Nvidia doesn't like to drop its prices; instead it stops production of older models.

    The worst part is the lack of VRAM standards, which keeps the lower tiers from holding up in terms of relative "future proofing".
  • RazThemun Member, Alpha Two
    What is your budget? What all do you do with your PC? Reality is you could buy something like a 2080 and probably be fine. Or you could purchase a 3070 or 4060 for a higher price, depending on where you buy. It's all about what kind of coin you wish to drop. I would recommend a 3000 or 4000 series card though if going Nvidia, so you can still get driver updates... You do not have to spend a lot of money to have a good system and still enjoy multiple games. Especially since Ashes will not be a 4K game, as it were.
  • edited November 2023
    4K gaming is kind of dumb in 2023. It makes up about 3% of players. Unless you can get 120+ FPS from proper optimisation rather than ray tracing/path tracing, upscalers, frame generation, etc., there really is no point. Those features create a lot of back-end latency, so you should only ever use them in single-player games, and even then they have a lot of problems. Even with an RTX 4090 you're paying way too much and relying on that tech. You're going to spend thousands on a GPU and a monitor at a proper pixel density for something that is just a few feet in front of you.

    We shouldn't have to rely on ray tracing/path tracing, upscalers, and frame generation to make a game look good. Most of those features shouldn't be selling points either. Then there are the VRAM issues with resolution and performance. The market is in such a stupid place right now.
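    To put a very rough number on the latency point, here's a simplified sketch. It assumes interpolation-style frame generation has to hold the newest real frame until the in-between frame is shown, so treat it as ballpark only:

        # Very simplified model: frame generation adds roughly one native frame
        # time of extra latency on top of the normal render latency.
        def frame_time_ms(fps: float) -> float:
            return 1000.0 / fps

        for native_fps in (30, 60, 120):
            ft = frame_time_ms(native_fps)
            print(f"native {native_fps} FPS -> {ft:.1f} ms per frame; "
                  f"frame gen adds extra latency on the order of {ft:.1f} ms")

    So the lower your real frame rate, the more latency you pay for those generated frames, which is exactly when people lean on the feature.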
  • daveywavey Member, Alpha Two
    Kotter wrote: »
    Many MMOs are CPU-heavy.
    Just a thought.

    So, when the time comes, we overspend on CPU before GPU? Or do we just shout Screw It and overspend on both?!
    This link may help you: https://ashesofcreation.wiki/


  • Nerror Member, Alpha One, Alpha Two, Early Alpha Two
    daveywavey wrote: »
    Kotter wrote: »
    Many MMOs are CPU-heavy.
    Just a thought.

    So, when the time comes, we overspend on CPU before GPU? Or do we just shout Screw It and overspend on both?!

    Definitely the latter.
  • Saixgone Member
    edited November 2023
    I asked this exact question when I was building my gaming rig like 6 months ago. I got an official dev answer saying they are playing the game in the studio on maxed-out graphics at 80-100 FPS at 1440p with an RTX 3070 and an RTX 2070. The game is well optimized, as they say. So don't worry about spending much money. I personally built my i9 rig with an RTX 3090 Ti for under 1k euro; it's cheap now.

    Here is a link to a screenshot of my question and the answer, so you all know it's legit.

    [screenshot: jqy0cq9m4qhm.png]
  • Nerror Member, Alpha One, Alpha Two, Early Alpha Two
    Nice, good to see some actual info on what their target is and where they are at.
  • edited November 2023
    Developers have a plethora of graphics cards they test their game on for optimisation. It seems logical that they would pick more modern cards for a modern game that is a few years away from release.

    GTX cards are essentially end of the line for modern games, as they just can't perform past 1080p at competitive frame rates and standard graphics settings.

    Just make sure you have enough VRAM in your card of choice, as that is very important in terms of resolution and performance. Anything under 12 GB for modern gaming is going to take performance hits.

    EDIT:

    I'm still using a 1660 Super from 2019 lol. Going to squeeze every last bit out of it until I get something to bridge me from now to 4K gaming in a couple of years, as the market is going to make a big shift.
  • Nerror Member, Alpha One, Alpha Two, Early Alpha Two
    edited November 2023
    GTX cards are essentially end of the line for modern games, as they just can't perform past 1080p at competitive frame rates and standard graphics settings.

    Just make sure you have enough VRAM in your card of choice, as that is very important in terms of resolution and performance. Anything under 12 GB for modern gaming is going to take performance hits.

    EDIT:

    I'm still using a 1660 Super from 2019 lol. Going to squeeze every last bit out of it until I get something to bridge me from now to 4K gaming in a couple of years, as the market is going to make a big shift.

    Hey man, I had been gaming 4K on my GTX 1080 Ti for several years. It was doing quite ok! I am not one of those "anything below 120 FPS is literally unplayable" snobs of course, so that helps :wink: But aside from the 1080 Ti perhaps, yeah, the GTX series is getting a bit old now for sure.

    I played A1 in 4K on my 1080 Ti, and it ran the game really well. UE4 of course, but still. The larger siege battles we had with close to 300 people still had 30-60 FPS depending, and that might have been CPU bottlenecked (overclocked 8700K)
  • edited November 2023
    Nerror wrote: »
    GTX cards are essentially end of the line for modern games, as they just can't perform past 1080p at competitive frame rates and standard graphics settings.

    Just make sure you have enough VRAM in your card of choice, as that is very important in terms of resolution and performance. Anything under 12 GB for modern gaming is going to take performance hits.

    EDIT:

    I'm still using a 1660 Super from 2019 lol. Going to squeeze every last bit out of it until I get something to bridge me from now to 4K gaming in a couple of years, as the market is going to make a big shift.

    Hey man, I had been gaming 4K on my GTX 1080 Ti for several years. It was doing quite ok! I am not one of those "anything below 120 FPS is literally unplayable" snobs of course, so that helps :wink: But aside from the 1080 Ti perhaps, yeah, the GTX series is getting a bit old now for sure.

    I played A1 in 4K on my 1080 Ti, and it ran the game really well. UE4 of course, but still. The larger siege battles we had with close to 300 people still had 30-60 FPS depending, and that might have been CPU bottlenecked (overclocked 8700K)

    Nothing to do with snobs; it's just the facts on raw optimisation performance. It's playable, just not good enough for competitive gaming as a standard yet. 1440p is essentially replacing 1080p in terms of PC gaming. Literally 3% of the market uses 4K, and most games are not even optimised for it as there is no point. I wouldn't waste my time with 30-60 FPS at 4K. Give it a few more years as it gains a footing in the market and industry.

    GTX cards can't really handle modern games anymore, even at 1080p or 1440p; older games, sure. In terms of Nvidia, most of the 20xx, 30xx, and some 40xx cards with less than 12 GB of VRAM just don't perform either for the high-end games coming out. They'll begin to shit bricks due to lack of VRAM. DLSS and ray tracing features still have so many problems, which is why I said it would be better if they just optimised games properly. Rasterization still outperforms, and Lumen is pretty much making ray tracing pointless. Frame generation causes latency issues. So many problems they "fix" while causing more game latency and graphics issues.

    I can game at 1440p, sit back from my screen a couple more feet, and it looks just like 4K lol
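    There's actually simple geometry behind that. A rough sketch, with assumed panel sizes and viewing distances (my own example numbers, nothing official):

        import math

        # Back-of-the-envelope: perceived sharpness tracks pixels per degree (PPD).
        # Panel widths and distances below are assumptions for illustration only.
        def ppd(h_res: int, panel_width_in: float, distance_in: float) -> float:
            fov_deg = 2 * math.degrees(math.atan(panel_width_in / (2 * distance_in)))
            return h_res / fov_deg

        # A 27" 16:9 panel is ~23.5" wide; a 32" 16:9 panel is ~27.9" wide.
        print(f'1440p 27" at 36": {ppd(2560, 23.5, 36):.0f} PPD')
        print(f'1440p 27" at 48": {ppd(2560, 23.5, 48):.0f} PPD (a couple feet further back)')
        print(f'4K 32" at 36": {ppd(3840, 27.9, 36):.0f} PPD')

    With those assumed numbers, 1440p viewed from a bit further back lands in roughly the same pixels-per-degree range as a 4K panel at normal desk distance.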
  • Mhyth Member, Founder, Kickstarter, Alpha Two, Early Alpha Two
    If you avoided the urge to buy a card this Black Friday season (it's a month long now), the hypothetical price-to-performance sweet spot for a long while is going to be the 4080 Ti coming out early next year. Getting a 4080 Ti sometime next year should carry you through both testing and release. RTX 5000 series cards are rumored not to be coming out until fall of 2025.

    Really depends on your available funding. Looks like an RTX 3070 now would serve just as well.
  • Mhyth wrote: »
    If you avoided the urge to buy a card this Black Friday season (it's a month long now), the hypothetical price-to-performance sweet spot for a long while is going to be the 4080 Ti coming out early next year. Getting a 4080 Ti sometime next year should carry you through both testing and release. RTX 5000 series cards are rumored not to be coming out until fall of 2025.

    Really depends on your available funding. Looks like an RTX 3070 now would serve just as well.

    Sales were awful; not really many deals, if you ask me.

    As for Nvidia, they're announcing their 40xx Super cards in January. Essentially better than the normal 40xx but less than the Ti... unless you want a Ti Super lol

    fucking marketing lol
  • Nerror Member, Alpha One, Alpha Two, Early Alpha Two
    edited December 2023
    Nerror wrote: »
    GTX cards are essentially end of the line for modern games, as they just can't perform past 1080p at competitive frame rates and standard graphics settings.

    Just make sure you have enough VRAM in your card of choice, as that is very important in terms of resolution and performance. Anything under 12 GB for modern gaming is going to take performance hits.

    EDIT:

    I'm still using a 1660 Super from 2019 lol. Going to squeeze every last bit out of it until I get something to bridge me from now to 4K gaming in a couple of years, as the market is going to make a big shift.

    Hey man, I had been gaming 4K on my GTX 1080 Ti for several years. It was doing quite ok! I am not one of those "anything below 120 FPS is literally unplayable" snobs of course, so that helps :wink: But aside from the 1080 Ti perhaps, yeah, the GTX series is getting a bit old now for sure.

    I played A1 in 4K on my 1080 Ti, and it ran the game really well. UE4 of course, but still. The larger siege battles we had with close to 300 people still had 30-60 FPS depending, and that might have been CPU bottlenecked (overclocked 8700K)

    Nothing to do with snobs; it's just the facts on raw optimisation performance. It's playable, just not good enough for competitive gaming as a standard yet.

    The snobs thing was tongue in cheek :wink: I don't really want this discussion to turn into a whole thing, so I will only say this: For pros in shooters and other games based on super fast twitch skills, sure, every little millisecond helps. I don't dispute that.

    Going from 30 to 60 FPS is a fairly big deal for most. From 60 to 120 much less so. It's not the extra 8 ms that decides the outcome of matches for 99.99% of the players out there. For online play, ping matters more, and general skill level obviously. The 8 ms is but a small fraction of the ping for most. Things look visually smoother and more pleasing at 120 FPS, sure, especially with a matching monitor, but it's not what is making you win or lose fights, unless maybe if you are in that very small top tier of players. From 120 to 240 FPS is completely pointless in terms of competitive play outcomes for the vast, vast majority.

    For an MMORPG like Ashes, 30 FPS is playable, but not super enjoyable. It will not affect most fights if it's at a stable 30 FPS. 60 FPS is better for sure and may affect some fights a little perhaps, for some classes. Like perhaps for the ranger and rogue. 120 FPS is nice and looks smooth, but it will affect exactly no combat outcomes compared to 60 FPS.
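    For anyone who wants the arithmetic behind that 8 ms figure, it's just frame-time math (nothing Ashes-specific):

        # Frame time is simply 1000 / FPS, so each doubling of FPS saves less and less.
        steps = [30, 60, 120, 240]
        for low, high in zip(steps, steps[1:]):
            t_low, t_high = 1000 / low, 1000 / high
            print(f"{low} -> {high} FPS: {t_low:.1f} ms -> {t_high:.1f} ms per frame "
                  f"(saves {t_low - t_high:.1f} ms)")

    30 to 60 FPS saves about 16.7 ms per frame, 60 to 120 only about 8.3 ms, and 120 to 240 about 4.2 ms, which is why the jump matters less and less.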
  • edited December 2023
    Nerror wrote: »
    Nerror wrote: »
    GTX cards are essentially end of the line for modern games, as they just can't perform past 1080p at competitive frame rates and standard graphics settings.

    Just make sure you have enough VRAM in your card of choice, as that is very important in terms of resolution and performance. Anything under 12 GB for modern gaming is going to take performance hits.

    EDIT:

    I'm still using a 1660 Super from 2019 lol. Going to squeeze every last bit out of it until I get something to bridge me from now to 4K gaming in a couple of years, as the market is going to make a big shift.

    Hey man, I had been gaming 4K on my GTX 1080 Ti for several years. It was doing quite ok! I am not one of those "anything below 120 FPS is literally unplayable" snobs of course, so that helps :wink: But aside from the 1080 Ti perhaps, yeah, the GTX series is getting a bit old now for sure.

    I played A1 in 4K on my 1080 Ti, and it ran the game really well. UE4 of course, but still. The larger siege battles we had with close to 300 people still had 30-60 FPS depending, and that might have been CPU bottlenecked (overclocked 8700K)

    Nothing to do with snobs; it's just the facts on raw optimisation performance. It's playable, just not good enough for competitive gaming as a standard yet.

    The snobs thing was tongue in cheek :wink: I don't really want this discussion to turn into a whole thing, so I will only say this: For pros in shooters and other games based on super fast twitch skills, sure, every little millisecond helps. I don't dispute that.

    Going from 30 to 60 FPS is a fairly big deal for most. From 60 to 120 much less so. It's not the extra 8 ms that decides the outcome of matches for 99.99% of the players out there. For online play, ping matters more, and general skill level obviously. The 8 ms is but a small fraction of the ping for most. Things look visually smoother and more pleasing at 120 FPS, sure, especially with a matching monitor, but it's not what is making you win or lose fights, unless maybe if you are in that very small top tier of players. From 120 to 240 FPS is completely pointless in terms of competitive play outcomes for the vast, vast majority.

    For an MMORPG like Ashes, 30 FPS is playable, but not super enjoyable. It will not affect most fights if it's at a stable 30 FPS. 60 FPS is better for sure and may affect some fights a little perhaps, for some classes. Like perhaps for the ranger and rogue. 120 FPS is nice and looks smooth, but it will affect exactly no combat outcomes compared to 60 FPS.

    Not quite.
    Frames matter in all games, even MMORPGs, regardless of PvE or PvP, tab or action combat. It's almost 2024; the proof is already evident from many sources and players. Also, higher frame rates can reduce the noticeable dips people can expect to see at mid frame rates. VRAM plays a big part in that as well, which is why I brought it up.

    Don't get me wrong, I've played lots of games at 30-60 FPS over the years regardless of resolution and/or HDR, and perhaps even lower depending on the decade and the game lol.

    Personally, I don't care what settings people play the game at or what hardware they get. It's their money and their choice. I just know what I would get/want if I didn't need a super high-end card for work, or wasn't buying one just because I can afford it.

    Consumerism and marketing.