I want that game to look THE BEST IT CAN LOOK, so if some AMD people can't get the full fidelity - sorry, but they'll just have to wait until AMD includes hardware raytracing acceleration (maybe next gen of cards?). IMO at this point there's no reason to leave raytracing out just for the sake of AMD's current-gen cards.
Besides, it'll probably run like garbage anyway, so we'll ALL enjoy it more next gen. I know this happened with Witcher 3 for me, where 2 or 3 card gens later I got to enjoy it fully at 5K Ultra xD
Turing is Pascal with GDDR6, RT and Tensor core support at "12nm", IPC improvements? None... And Pascal is Maxwell with the SMX split in two to be able to put more FP64 and FP16 cores for the Quadro GPUs, IPC improvements over Maxwell? None...
Pascal to Turing was a big leap. Being able to execute both FP16 and FP32 calcs at the same time gave some games a huge jump. It was something stupid like a 70% fps increase in the Witcher 3 going from 1080Ti to 2080Ti, if I recall correctly.
P.S.: I love how Psychoman is always bitter about tech. And sometimes wrong. But mostly bitter.
IDK where that 70% comes from, maybe Nvidia, but here look:
It ended up being 30-40% higher... which is simply how much faster the RTX 2080 Ti is in general.
Sure, that would be good if games actually started utilizing FP16 a butt-ton more, but so far nope.
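For anyone wondering why FP16 is tempting in the first place, and why game devs have been slow to lean on it: a minimal numpy sketch (CPU-side, not GPU shader code — just illustrating the number format itself). Half the bytes per value means bandwidth-bound work can move roughly twice the data, but the 11-bit significand runs out of precision fast:

```python
import numpy as np

# FP16 stores each value in 2 bytes vs 4 for FP32, so memory-bandwidth-bound
# work can move roughly twice as many values per second.
print(np.dtype(np.float16).itemsize)  # 2
print(np.dtype(np.float32).itemsize)  # 4

# The catch: FP16 has only an 11-bit significand. At 2048 the spacing
# between representable values is already 2, so adding 1 does nothing.
a = np.float16(2048)
print(a + np.float16(1) == a)  # True: 2049 isn't representable, rounds back

# FP32 handles the same sum fine.
b = np.float32(2048)
print(b + np.float32(1) == b)  # False
```

That precision cliff is why engines only use FP16 for things that tolerate it (colors, normals, some lighting math) and keep positions and depth in FP32 — it's extra engineering work per effect, which fits the "so far nope" adoption rate.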