The reality is that there will be a new GPU series out 12-18 months after the 20x0 series. During this time there won't be many RTX titles, and for those that are released, why bother when your FPS will be worse than the non-RTX version running on 10x0 series GPUs? The 30x0 series, once released, will probably get over these hurdles. IMHO, rather buy a used 1080 Ti off Carbonite for much cheaper and upgrade two years down the line; price/performance wise this makes way more sense to me.
The reality is also that there have always been, and always will be, new graphics cards out soon. If you want the best NOW, get it. If you wanted ABS in 1978, you'd have paid R 2,500,000 for it (in today's terms). The RTX cards are the best of the best, and not aimed at the budget conscious.
I'd largely ignore frame rate for now for two reasons.
Tomb Raider is a slow moving game, so 100 FPS+ isn't a necessity. You'll be spending more time taking in the environment than anything else.
Battlefield V and Metro Exodus (along with many others) will have far more time for optimization than the weeks leading up to launch. It has been said countless times that the Tomb Raider demo was extremely unoptimized, and the devs are still working on optimizations to reach a larger-than-necessary frame rate. Such frame rates would be necessary in the likes of BFV, which is where you'll see the added optimization time come into effect.
In short, slow paced games can focus on the beauty of the environment, while faster games can focus on optimization. Don't base an entire technology on one game demo.
I'm not against Ray Tracing.
They introduced the card too early, and it costs a lot of money.
No average gamer can afford that.
No matter when they introduced it, it would cost a lot of money. The market is absolutely ready for it, and the early adopters will bring the price down for everyone.
If they introduced it ten years from now at a high price, you'd still say it came too early. You need to learn how technologies progress from cutting edge to mainstream. The first mainstream GPS receiver was the Magellan NAV 1000 in 1989, costing over $2,000 (back then). Now GPS is available in most entry-level cellphones costing under R 2,000 today.
+1
Not reading further back, so unsure of what you're commenting on, but it can have infinitely more FPS - if those frame rates are unplayable, then it's not worth it anyway.
I'd still take a 1080 Ti over a 2080, but I am definitely not in that market, as I am still gaming at 1080p and maxing it all out with a GTX 1060. My other R9 280X also nearly maxes out most games (the only exception I can think of is SoW, where it couldn't due to memory; I think I played on high instead).
See my reply to ponder (top of this post).
Games are going to take 2-4 years to be developed with ray tracing (minus the 5-20 or so that are coming in 2019). Games developed with ray tracing are going to be more expensive; by how much remains to be seen. Indie game devs aren't going to have the cash/skills to develop with ray tracing for 4-6 years.
Evidence for this? Battlefield V isn't unusually expensive and has RTRT support. Indie games are rarely about cutting edge technology, so that's also a bit of a moot point.
Nvidia is guaranteed to launch new cards in 2019, 2020, and 2021 that will be vastly superior to what the RTX cards are now.
It's been that way since day one - you can't keep holding out forever.
The GTX 480 is coming soon, don't buy a GTX 280. Hold out another year.
The GTX 580 is coming soon, don't buy a GTX 480. Hold out another year.
The GTX 680 is coming soon, don't buy a GTX 580. Hold out another year.
The GTX 780 is coming soon, don't buy a GTX 680. Hold out another year.
At some stage you have to click the buy button, and unless you're RIGHT at the end of a product's life cycle the best time is generally as soon as you can afford it.
Basically, buying one of these cards now is expensive (R7 000 more expensive than the 1080 Ti), and by the time you can use ray tracing your card will be 1-2 gens old and won't be able to keep up with the ray tracing demands, so it's pretty much a money sink for something that will be obsolete (even though it's the best card ever).
You can use ray tracing almost right away - there are games being released before the end of the year that will support ray tracing.
C'mon, Nvidia could drop the price of their cards by 25% (probably more) and still make massive profits. This just doesn't sit right with me, but they have no competition at the moment and they are one of the most successful businesses in the world, so what can the consumer do other than pay, I guess.
You need to understand where the money from the MSRP goes. A chunk goes to the retailer. A chunk goes to the distributor. A chunk goes to the vendor. A chunk goes towards components other than the GPU itself, such as memory, PCB manufacturing, VRMs, coolers, the physical connectors, etc. A chunk goes towards the fabrication of the GPU itself. NVIDIA's share isn't insignificant, but it's nowhere near the full amount, and not all of it is gross profit. On top of that, NVIDIA has to recoup 10 years' worth of R&D that went into RTX. Dropping the price by 25% would knock out most of the opportunity to recoup the money they've already poured into it.
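To make the point concrete, here's a toy calculation. Every number below is hypothetical - these are not NVIDIA's real figures or margins, just an illustration of how an MSRP gets carved up along the chain before anything reaches the chip maker.

```python
# Illustrative only: the price and the split fractions are made-up figures,
# used purely to show how an MSRP divides along the supply chain.
msrp = 21000  # hypothetical card price in Rand

splits = {
    "retailer margin": 0.10,
    "distributor margin": 0.08,
    "vendor (AIB) margin": 0.12,
    "non-GPU components (memory, PCB, VRMs, cooler)": 0.25,
    "GPU fabrication": 0.20,
}

for party, frac in splits.items():
    print(f"{party}: R{msrp * frac:,.0f}")

# Whatever is left is NVIDIA's cut - and R&D still has to come out of it.
nvidia_share = msrp * (1 - sum(splits.values()))
print(f"NVIDIA's remainder (before R&D recovery): R{nvidia_share:,.0f}")
```

Even with generous made-up numbers, the remainder is a fraction of the sticker price, which is why a 25% price cut would come almost entirely out of NVIDIA's side.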
On the topic of AMD vs Nvidia: all AMD has to do is make a 1080 Ti equivalent for R9 000-R14 000 and they are set for the next 2-3 years. Can they do this? Who knows.
That's way too expensive, and by the time their next cards are ready NVIDIA is gearing up for their next gen as well. The next gen's xx60 card should equal or even outperform the GTX 1080 Ti while costing far less. If you doubt this, compare a GTX 1060 to a GTX 780 Ti, or any other previous generations.
On the ray tracing side, I've only watched a few vids and don't understand it all. What I understand is that RT is one ray that brings in a light source and makes things pretty with a fancy denoiser. What they overcome with the denoiser is doing what a CPU in 10-20 years' time will be able to do, which is pretty cool, but it's still only one ray. This part is a bit confusing to me and I'm probably way off here.
Ray tracing traces the path each ray of light takes between the light source and the camera. The light can be modified several times along this path, through reflection, refraction, diffusion, chromatic aberration, etc. This is done at least once for each pixel on the screen, not just once per frame.
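To show the "one ray per pixel" idea in the simplest possible form, here's a toy ray caster I wrote for illustration - it's nothing like NVIDIA's RTX pipeline, just one primary ray per pixel against a single sphere with basic Lambert shading from a point light. The scene layout and names are my own invention.

```python
# Toy ray caster: one ray per pixel, one sphere, one point light.
# Purely illustrative - real ray tracers add secondary rays for
# reflection, refraction, shadows, etc.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def ray_sphere(origin, direction, center, radius):
    """Distance t to the nearest hit along a unit-length ray, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so the quadratic's a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

WIDTH, HEIGHT = 16, 8
sphere_center, sphere_radius = (0.0, 0.0, 3.0), 1.0
light_pos = (2.0, 2.0, 0.0)

for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # One primary ray per pixel, from a camera at the origin
        # through the corresponding point on the image plane.
        px = (x / WIDTH - 0.5) * 2
        py = (0.5 - y / HEIGHT) * 2
        direction = normalize([px, py, 1.0])
        t = ray_sphere((0, 0, 0), direction, sphere_center, sphere_radius)
        if t is None:
            row += "."  # ray missed everything
        else:
            # Shade the hit point by how directly it faces the light.
            hit = [d * t for d in direction]
            normal = normalize([h - c for h, c in zip(hit, sphere_center)])
            to_light = normalize([l - h for l, h in zip(light_pos, hit)])
            brightness = max(0.0, sum(n * l for n, l in zip(normal, to_light)))
            row += " .:-=+*#"[min(7, int(brightness * 8))]
    print(row)
```

Scale WIDTH and HEIGHT up to a real resolution and you can see why the per-pixel cost explodes - and that's before any of the bounced rays that make the technique expensive enough to need dedicated hardware and a denoiser.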
I'm still waiting for the dx12 hype train to arrive 2 gpu generations later
DX12 adds very little compelling reason to force its use, so don't hold your breath too long.
Good points.
Is ray tracing not one of those features that you can put into your game, but only leave it accessible to top range machines, while everyone else can still play it without it?
Or is it more of a cost/benefit thing, where it's not worth pouring the resources into developing the games with ray tracing until it becomes more mainstream and accessible across the graphics card range?
Yes, at least for now it will be - just like way back when 3dfx's proprietary Glide was the in-thing, you could normally choose between software rendering, Glide, or OpenGL. This continued throughout Glide's market success.
As RTRT isn't just an alternative software library for rendering but rather an entirely different render method, we may see traditional rendering (rasterization) being phased out one day.
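The fallback pattern described above is straightforward in practice. Here's a sketch of how a game might gate ray tracing behind a capability check, much like engines once picked between software, Glide, and OpenGL renderers - the class and method names are made up for illustration.

```python
# Sketch of an optional ray-tracing path in a renderer (hypothetical API).
# RT is purely additive: if the hardware can't do it, or the player turns
# it off, the game falls back to plain rasterization.
class Renderer:
    def __init__(self, hardware_supports_rt: bool, user_enabled_rt: bool):
        self.use_rt = hardware_supports_rt and user_enabled_rt

    def render_frame(self) -> str:
        if self.use_rt:
            # Hybrid path: rasterize the frame, then trace rays for
            # reflections/shadows where it matters.
            return "rasterize + ray-traced reflections/shadows"
        return "rasterize only"

print(Renderer(hardware_supports_rt=True, user_enabled_rt=True).render_frame())
print(Renderer(hardware_supports_rt=False, user_enabled_rt=True).render_frame())
```

Because the RT path only ever adds effects on top of the rasterized frame, the same game runs everywhere - which is exactly why devs can ship it even while RTX owners are a tiny slice of the audience.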
I'd be interested to know how Nvidia convinced Square Enix to even do the work of adding ray tracing to Shadow of the Tomb Raider (for the demo version). There's not much upside for a dev house in including tech that only a tiny part of the audience can run - it just doesn't make sense. Add in PS and Xbox to the non-RT-capable group and it becomes even less compelling. It was probably paid for, imo, or part of some kind of existing partnership. It's going to be cool to see how it plays when/if they release the patch to enable RT on that game.
NVIDIA works closely with game developers and puts a lot of money and resources into helping advance the market. Until it becomes mainstream, NVIDIA is likely to continue to assist with RTRT, as they've poured ten years of R&D into it already so they're not likely to walk away and let it die out.
NVIDIA working with game developers isn't something new. Remember the Batman: Arkham Asylum debacle, where anti-aliasing was only available on NVIDIA cards? The world was quick to cry that NVIDIA paid off Rocksteady Studios to disable anti-aliasing on AMD cards, but the truth was the game engine didn't support anti-aliasing out of the box. NVIDIA worked with the devs to implement something for NVIDIA cards which otherwise would not have been available to anyone. GameWorks (unfairly called GimpWorks by those without any understanding of it) is just the same.