AMD reveals Radeon RX 5700 graphics cards

Bryn

Doubleplusgood
Joined
Oct 29, 2010
Messages
14,246
#4
The power usage on these cards is insane considering they use a 7nm node process.
7nm isn't the full picture though. Intel's 10nm is much denser (nearly twice the transistor density) than most other 10nm chips. AMD's 7nm tech could still be early days atm and far from being particularly optimised.
 

Stevewarren

Well-Known Member
Joined
Sep 5, 2010
Messages
452
#5
7nm isn't the full picture though. Intel's 10nm is much denser (nearly twice the transistor density) than most other 10nm chips. AMD's 7nm tech could still be early days atm and far from being particularly optimised.
They use TSMC's 7nm process, which is comparable to Intel's 10nm process.

I think it's more of an architectural thing, and I hope that their next RDNA cards will have better power usage. Still, it's not a bad card; it competes with the RTX 2070 and it's cheaper, but that it uses more power than the RTX 2070 is mind-boggling to me considering the RTX uses 12nm.
 

Herr_Koos

Well-Known Member
Joined
Nov 20, 2008
Messages
321
#6
These are GPUs; comparing to Intel's fab is not really relevant. You need to compare this to the efficiency of Nvidia's 12nm FinFET process, which is very refined.

The prices are super disappointing. Given how AMD cards are priced in RSA, these will end up being way more expensive than the Nvidia RTX competitors. I was really hoping one of these would be a viable RX580 replacement, but not for that kind of money.
 

Johnatan56

Honorary Master
Joined
Aug 23, 2013
Messages
24,535
#8
7nm isn't the full picture though. Intel's 10nm is much denser (nearly twice the transistor density) than most other 10nm chips. AMD's 7nm tech could still be early days atm and far from being particularly optimised.
https://wccftech.com/analysis-about-intels-10nm-process/

TSMC 5nm products should show up next year though; the 2020 iPhone is set to have it according to rumors.
https://9to5mac.com/2019/04/08/5nm/

https://www.anandtech.com/show/1272...-scaling-but-thin-power-and-performance-gains
Note that's power or performance, not both (or at least that's my understanding of it after reading discussions).


AMD does have a node advantage compared to Intel (and note the graphs in those articles can't be taken at face value; Intel's 14nm got optimized so much that it beats their current 10nm product line).
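For a rough sense of scale, these are the ballpark density figures that get thrown around by analysts; treat them as approximate, since real density depends heavily on the cell library and the actual design (a quick Python sketch, not gospel):

Code:
# Approximate logic transistor density estimates (MTr/mm^2) commonly cited by analysts.
# Ballpark figures only -- actual density varies a lot with cell library and design.
density = {
    "Intel 14nm": 37.5,
    "TSMC 10nm": 52.5,
    "TSMC 7nm": 91.2,    # mobile/high-density libraries; HPC libraries come in lower
    "Intel 10nm": 100.8,
    "TSMC 5nm": 171.3,   # projected
}

for node, mtr in sorted(density.items(), key=lambda kv: kv[1]):
    print(f"{node:>10}: {mtr:6.1f} MTr/mm^2  ({mtr / density['TSMC 10nm']:.1f}x TSMC 10nm)")

# Intel 10nm and TSMC 7nm land in the same ~90-100 MTr/mm^2 range, roughly double the
# other foundries' 10nm nodes -- the node names by themselves don't tell you much.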
It will be interesting on the graphics side too: Nvidia last week decided to go with Samsung, as they outbid TSMC: https://www.tomshardware.com/news/samsung-nvidia-7nm-ampere-tsmc,39583.html
 

Neoprod

Honorary Master
Joined
May 21, 2004
Messages
16,729
#9
The price-performance (even using AMD's probably cherry-picked numbers) isn't attractive enough to make a dent in Nvidia's market share.

And that's without considering the implications of "Super". How many 5700s are going to be sold if the 2060 (already cheaper) gets a ~$30 price cut?
 

Herr_Koos

Well-Known Member
Joined
Nov 20, 2008
Messages
321
#10
The price-performance (even using AMD's probably cherry-picked numbers) isn't attractive enough to make a dent in Nvidia's market share.

And that's without considering the implications of "Super". How many 5700s are going to be sold if the 2060 (already cheaper) gets a ~$30 price cut?
Very true, sadly. Nvidia doesn't really need to do anything, but odds are they will release "Ti" versions of the 2070/2060 and cut prices on existing cards. That leaves the 5700 dead on arrival.
 

Johnatan56

Honorary Master
Joined
Aug 23, 2013
Messages
24,535
#11
Very true, sadly. Nvidia doesn't really need to do anything, but odds are they will release "Ti" versions of the 2070/2060 and cut prices on existing cards. That leaves the 5700 dead on arrival.
I doubt they'd do that, as currently the price is close enough that Nvidia can just leave it and the vast majority of consumers will buy Nvidia cards. Even when Nvidia had a worse product, consumers still bought Nvidia.

AMD cards might be interesting due to e.g. Radeon Chill, and I expect they would go on sale quite quickly. If they're near equivalent, I'd be tempted to go AMD instead of Nvidia; if the difference is >10%, Nvidia, which is what it looks like it's going to be.
 

Fulcrum29

Honorary Master
Joined
Jun 25, 2010
Messages
28,722
#13
I believe the 2 GPUs are in a good position, but NVidia will hit back. Seeing the AMD benchmarks, both those cards are destroying the Vega 56/64 in the games shown.

This Hardware Unboxed review includes close to all the high-end cards up to the 2070, so you can see where the 64 sits against the competition; I believe they use the i9-9900K CPU.


Going by the AMD slides and the improvements the 5700 XT is showing over the 2070, the 5700 XT is close to 40% better than the 64, which I find hard to grasp.

See the Apex Legends bench (1080p):

2070 = 154 FPS
2060 = 142 FPS
64 (LC) = 131 FPS
64 (non-LC) = 131 FPS

and the slide shows the 5700 to be 9% better than the 2060, which, um, puts it on level with the 2070, but they don't show the Apex Legends percentage in the 5700 XT slide. Then for The Division 2, Hardware Unboxed shows:

64 (LC) = 118 FPS
2070 = 110 FPS
64 (non-LC) = 102 FPS
56 = 100 FPS
2060 = 95 FPS

and AMD's slides indicate the 5700 XT to be 26% better than the 56 and 3% better than the 2070, but compared to the Hardware Unboxed review it doesn't sit right... and so I can go on with the other games.
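Quick back-of-napkin check of those slide percentages against the Hardware Unboxed numbers above (treating AMD's percentages and HU's FPS as directly comparable, which they probably aren't given different test rigs, drivers and settings):

Code:
# Back-of-napkin check: apply AMD's claimed slide uplifts to Hardware Unboxed's FPS numbers.
# Rough illustration only -- different test systems and settings, so ballpark at best.
hu_fps = {
    "Apex Legends": {"RTX 2070": 154, "RTX 2060": 142, "Vega 64": 131},
    "The Division 2": {"Vega 64 LC": 118, "RTX 2070": 110, "Vega 56": 100, "RTX 2060": 95},
}

# (AMD card, game, baseline card on the slide, claimed uplift)
claims = [
    ("RX 5700", "Apex Legends", "RTX 2060", 0.09),
    ("RX 5700 XT", "The Division 2", "Vega 56", 0.26),
    ("RX 5700 XT", "The Division 2", "RTX 2070", 0.03),
]

for amd, game, base, uplift in claims:
    implied = hu_fps[game][base] * (1 + uplift)
    print(f"{amd} in {game}: +{uplift:.0%} over {base} -> ~{implied:.0f} FPS implied")

# 142 x 1.09 ~= 155 FPS, so the plain 5700 would land right on the 2070's 154 FPS, while
# the two Division 2 claims imply ~126 FPS and ~113 FPS for the same 5700 XT -- which is
# exactly why the slides don't sit right against the Hardware Unboxed numbers.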

Also, their World War Z live benchmark still indicates that Vega loves that game way more...

The new 5700 series is a big step up, though strictly speaking it is still GCN, which AMD is trying hard to move away from; this is basically an upgraded and enhanced GCN architecture, with the next iteration being a more natively RDNA design. Maybe this was the trouble at AMD that AdoredTV spoke about when he said the engineers were having troubles…

I still don’t know why some AMD people are raving about Radeon Chill and Enhanced Sync, which are probably the two worst settings to enable in a quick reactive game, and now they're promoting Anti-Lag to competitive players as they did with Enhanced Sync. Cuts input latency by up to 35%... I want to see these case scenarios. Strangely they didn’t show CSGO numbers, but Apex Legends is at 33%, The Division 2 is at 33%, Fortnite is at 35% and TC: RB 6 Siege is at 31%, which are all enormous input latency cuts to boast about. I’m saying it now, these are all gimmicks; Radeon Chill does have a purpose, but it can’t be used as an always-on setting.

Fidelity and the image sharpening look nice; they're announced as an alternative to RTX in marketing terms, but in truth are more of an alternative development tool to NVidia GameWorks, so AMD moved the goalposts here. I wonder, since current RDNA is still the latest GCN (well, a hybrid), whether Polaris and Vega will support Fidelity and image sharpening, or whether it will be locked down in AMD Radeon Settings?

All in all, I’m happy with this announcement, but good guy AMD is playing the man where bad guy NVidia is playing the ball. On the most positive note, I really believe that Navi 20 is going to be awesome.

Now to see whether NVidia will launch a 2070Ti and what Super is all about, maybe a super price reduction :laugh:
 

Herr_Koos

Well-Known Member
Joined
Nov 20, 2008
Messages
321
#14
Best to wait for third-party benchmarks; until then we won't really know how these cards perform relative to existing models. Stage demos from the manufacturer tend to skew towards optimism. Either way, these are going to be out of my price bracket. Give me something that beats the 1660Ti at a lower price point, and I'm there.
 

Stevewarren

Well-Known Member
Joined
Sep 5, 2010
Messages
452
#15
These are GPUs; comparing to Intel's fab is not really relevant. You need to compare this to the efficiency of Nvidia's 12nm FinFET process, which is very refined.

The prices are super disappointing. Given how AMD cards are priced in RSA, these will end up being way more expensive than the Nvidia RTX competitors. I was really hoping one of these would be a viable RX580 replacement, but not for that kind of money.
I don't follow. Nvidia don't make their chips; they use TSMC too. So it's more about the architecture?
 

Fulcrum29

Honorary Master
Joined
Jun 25, 2010
Messages
28,722
#16
Best to wait for third-party benchmarks; until then we won't really know how these cards perform relative to existing models. Stage demos from the manufacturer tend to skew towards optimism. Either way, these are going to be out of my price bracket. Give me something that beats the 1660Ti at a lower price point, and I'm there.
Back in the ATi X-series days they had editions like this:

Entry-level: SE, (null), GT

Mid-range: GTO, PRO, XL

High-end: XT

Enthusiast: XT PE

They have now launched the 5700 and 5700 XT (high-end).

I guess, since there are 5 registered 5700 cards, that we may still see a 5700 PRO, XL and perhaps SE. Their ‘Ti’ versions may be dubbed the 5750 and 5750 XT.

Navi 20 may then be called 5700 XT PE and 5750 XT PE.

All assumption though; AMD did register 5 GPUs. My best guess is that the 5700 XL may slide in with the 1660Ti, the 5700 PRO with the 1660, and the SE will ultimately replace the 570/580 to show NVidia exactly why they should never have launched the 1650. NVidia will hit back with price cuts and new direct competitors.
 

Herr_Koos

Well-Known Member
Joined
Nov 20, 2008
Messages
321
#18
All assumption though; AMD did register 5 GPUs. My best guess is that the 5700 XL may slide in with the 1660Ti, the 5700 PRO with the 1660, and the SE will ultimately replace the 570/580 to show NVidia exactly why they should never have launched the 1650.
One can live in hope. Fully agreed that the 1650 is an utterly useless card at the current price point. Nvidia will still sell a metric crapton of them though. Either way, the lower midrange of the market has not yet been hit with Navi, and AMD are traditionally very strong at the $200 price point.
 

Fulcrum29

Honorary Master
Joined
Jun 25, 2010
Messages
28,722
#19
One can live in hope. Fully agreed that the 1650 is an utterly useless card at the current price point. Nvidia will still sell a metric crapton of them though. Either way, the lower midrange of the market has not yet been hit with Navi, and AMD are traditionally very strong at the $200 price point.
There was also an odd rumour a while back that AMD is planning another Polaris rebrand… the 590 was already pushing it… It is only a rumour, and now with Fidelity and image sharpening, which sound to be RDNA-attached things, I doubt that Polaris will be pushed any further than it currently is, given the developer toolkits and driver optimisations needed to support developers. My question is rather what will happen to Vega? Radeon VII is still their GPU king, and it will be bitter to see it denied access to Fidelity and image sharpening; the 56/64 are in the exact same boat.

I’m still saying to this day that GCN is where it is due to unoptimised 'game' drivers and developer optimisations. See games like World War Z and Dirt 4 (using CMAA): they love GCN. Desktop/workstation RDNA is still GCN, which is why I can’t see AMD dropping GCN from their optimisation work. AMD tested path tracing, and even released demos, on Vega 56, so Vega had better support Fidelity and image sharpening.

The console makers are working with true RDNA, and we will see ray tracing there, as the console makers have said many times, with those developments taken to Navi 20; but this is highly dependent on the IP and IP licensing agreements between AMD, Sony and MS.
 

Herr_Koos

Well-Known Member
Joined
Nov 20, 2008
Messages
321
#20
There was also an odd rumour a while back that AMD is planning another Polaris rebrand… the 590 was already pushing it… It is only a rumour, and now with Fidelity and image sharpening, which sound to be RDNA-attached things, I doubt that Polaris will be pushed any further than it currently is, given the developer toolkits and driver optimisations needed to support developers. My question is rather what will happen to Vega? Radeon VII is still their GPU king, and it will be bitter to see it denied access to Fidelity and image sharpening; the 56/64 are in the exact same boat.

I’m still saying to this day that GCN is where it is due to unoptimised 'game' drivers and developer optimisations. See games like World War Z and Dirt 4 (using CMAA): they love GCN. Desktop/workstation RDNA is still GCN, which is why I can’t see AMD dropping GCN from their optimisation work. AMD tested path tracing, and even released demos, on Vega 56, so Vega had better support Fidelity and image sharpening.

The console makers are working with true RDNA, and we will see ray tracing there, as the console makers have said many times, with those developments taken to Navi 20; but this is highly dependent on the IP and IP licensing agreements between AMD, Sony and MS.
I honestly cannot see them pushing Polaris any further; the rumour may relate to the Polaris cards being pushed further down the product stack to replace the entry-level cards such as the RX550; that would then simply be a rebrand of existing silicon. As for what happens with Vega... well, that's a good question. Given that the margins on Navi are likely to be better, I don't see them making more Vega-based consumer cards. The new drivers may enable some of the new features on Vega silicon; we have less than a month to wait to find out.
 