The latest Ryzen processor series has been out for six weeks, and yet it was only about a week ago that we got our hands on the Ryzen 7 3800X for the first time. The delay meant we received numerous comments asking us to review it and compare it with the 3900X, 3700X and 3600, all of which we've already reviewed.
So what's the deal? Why was the 3800X so hard to get, how different is it from the 3700X, and why did the TDP increase by over 60% for a mere 100 MHz frequency gain? It likely comes down to binning and yields, which may not be as good as AMD had hoped, though demand may also have played a role.
Silicon Lottery recently released some binning data on Ryzen 3000 which suggests that higher quality silicon is reserved for the 3800X. The top 20% of all 3800X processors tested passed their 4.3 GHz AVX2 stress test, while the top 21% of all 3700X processors were only stable at 4.15 GHz. Moreover, all 3800X processors passed the 4.2 GHz test, while all 3700X processors were only good for 4.05 GHz, meaning the 3800X has about 150 MHz more headroom when it comes to overclocking.
Silicon Lottery AMD Ryzen 3000 Binning Results
In other words, the average 3800X should overclock better than the best 3700X processors, though there is still only a small 6% frequency difference between the absolute worst 3700X and the absolute best 3800X in their testing. For more casual overclockers like us, the difference is likely to be even smaller. Our 3700X appears stable in our internal stress test and to date has not crashed even once at 4.3 GHz, the same frequency our 3800X tops out at. The TDP situation is confusing to say the least, but first we'll look at some performance numbers and then discuss what we think is going on.
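To sanity check those figures, the gaps work out as follows. The clock speeds are the ones Silicon Lottery published; the little helper function is ours, purely for illustration:

```python
def pct_gain(base_ghz: float, new_ghz: float) -> float:
    """Percentage frequency gain going from base_ghz to new_ghz."""
    return (new_ghz - base_ghz) / base_ghz * 100

# Silicon Lottery's published stable clocks (GHz)
worst_3700x = 4.05   # every 3700X passed 4.05 GHz
best_3700x  = 4.15   # top 21% of 3700X chips passed 4.15 GHz
worst_3800x = 4.20   # every 3800X passed 4.2 GHz
best_3800x  = 4.30   # top 20% of 3800X chips passed 4.3 GHz

# Worst 3700X vs. best 3800X: the ~6% frequency spread quoted above
print(f"{pct_gain(worst_3700x, best_3800x):.1f}%")

# Top-bin gap between the two parts: the ~150 MHz of extra headroom
print(f"{(best_3800x - best_3700x) * 1000:.0f} MHz")
```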
The benchmarks you see below were run on a Gigabyte X570 Aorus Xtreme with 16GB of DDR4-3200 CL14 memory. The graphics card of choice for CPU testing is, of course, the RTX 2080 Ti, so let's get into the numbers.
First up is Cinebench R20 and hold onto your hats: we're looking at a 3% increase in multi-core performance for the 3800X, but at least it's now faster than the 9900K.
In terms of single-core performance, we're seeing about a 2% increase here, which allows the 3800X to match the 3900X and 9900K.
Encoding time in Premiere is reduced by 3%, shaving 12 seconds off our test. Given what we saw in Cinebench R20, this result is not surprising.
We also see a 3% reduction in render time when testing with Blender Open Data, so there's not much more to say here. Let's take a quick look at a few games before moving on to power and thermals.
We're really looking at margin-of-error differences in Assassin's Creed Odyssey between the 3700X, 3800X and 3900X. Needless to say, they all deliver a similar gaming experience in this title.
A similar thing is seen in our Battlefield V testing: the 3800X is 1-2 frames faster than the 3700X thanks to a very small increase in operating frequency; again, there's no way you'd notice this performance difference.
Nearly identical performance is also seen when testing with The Division 2, where we're looking at Core i7-8700K-like performance in this title.
Finally we have Shadow of the Tomb Raider, and here the 3800X was 2 fps faster than the 3700X. Let's move on to power consumption.
Here we see a 9% increase in power consumption, which is surprising given that we're only talking about a 2-3% increase in operating frequency. However, the frequency increase comes with an increase in voltage, and this likely causes the 3800X to be a little more power hungry than you might expect.
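That voltage effect is enough to explain the gap: dynamic CPU power scales roughly with frequency times voltage squared, so a small voltage bump amplifies a small clock bump. The sketch below uses an assumed ~3% voltage increase, which is our illustrative guess rather than a measured value:

```python
def dynamic_power_scale(freq_ratio: float, volt_ratio: float) -> float:
    """Dynamic CMOS switching power scales roughly as P ~ f * V^2."""
    return freq_ratio * volt_ratio ** 2

# Illustrative numbers: a 2.5% clock increase paired with a ~3% voltage increase.
# The voltage figure is an assumption for demonstration, not a measurement.
scale = dynamic_power_scale(1.025, 1.03)
print(f"{(scale - 1) * 100:.0f}% more power")  # ≈ 9%, in line with what we measured
```

A 2-3% clock bump alone would only cost 2-3% more power; it's the accompanying voltage that pushes the figure toward what we saw.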
As a result of the increased power draw, the 3800X runs about 3 degrees hotter with the box cooler and 4 degrees hotter with the Corsair H115i Pro. Interestingly, the 3800X clocked 100 MHz higher with the box cooler, but only 75 MHz higher with the Corsair AIO.
With PBO enabled, the 3800X still ran about 2-3 degrees hotter, but now it was only 25-50 MHz higher than the 3700X, so if you want to turn your 3700X into a 3800X, just enable PBO.
The power consumption figures show that the 3800X draws far more power than the 3700X, at least relative to the extra performance it offers. We saw a 3% improvement in performance at the cost of roughly a 12% increase in power consumption.
The margins remain almost the same with PBO enabled; the 3800X still consumes about 12% more power than the 3700X.
What's the Difference?
This wasn't the most exciting benchmark session we've had, but it did answer the question: what's the difference?
As it turns out, not much. Under heavy workloads the 3800X clocks 100 - 150 MHz higher, a frequency increase of 2.5 to 4%. That raised the CPU's power consumption by about 12%, which meant it ran a few degrees hotter and potentially a little louder.
For this small increase in performance, AMD increased the MSRP by 21%, from $330 to $400, so the biggest percentage increase, if we ignore TDP, comes from the price. And that's really all you need to hear: you'll get 3% more performance at best while spending 21% more money.
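Putting the price and performance numbers side by side makes the value proposition plain. This is just arithmetic on the figures quoted above; the helper function is ours:

```python
def pct_change(old: float, new: float) -> float:
    """Percentage change going from old to new."""
    return (new - old) / old * 100

msrp_3700x, msrp_3800x = 330, 400   # USD MSRPs
perf_gain = 1.03                    # ~3% faster in our benchmarks, at best

price_increase = pct_change(msrp_3700x, msrp_3800x)
# How much more you pay per unit of performance with the 3800X
value_penalty = pct_change(1.0, (msrp_3800x / msrp_3700x) / perf_gain)

print(f"price up {price_increase:.0f}%")                 # ≈ 21%
print(f"cost per unit of performance up {value_penalty:.0f}%")  # ≈ 18%
```

In other words, even crediting the 3800X its full 3% advantage, you're paying roughly 18% more for every unit of performance.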
If you are interested in this deal, we have a bunch of old Xeon systems … you can have them at a really good price.
Moving on, let's quickly talk about the 105w TDP, which is 62% higher than the 65w TDP rating of the 3700X. AMD seems to be basically saying this: with a cooler designed to dissipate 65 watts of heat, the 3700X will run no lower than its base clock. The 3800X, whose base clock is 300 MHz higher, may fail to maintain 3.9 GHz with a 65 watt cooler.
The confusion creeps in because AMD skipped over a 95-watt rating and went straight to 105 watts with the 3800X. We accept that the 3800X may not be able to sustain 3.9 GHz with a 65 watt cooler, but it almost certainly can with a 95 watt cooler.
As far as we can tell, TDP is a guideline for OEMs, who usually try to cut as many corners as possible. If an OEM puts a 65w cooler on a 95w part and a buyer complains "I'm not hitting 3.9 GHz," AMD can respond "well, the OEM didn't meet the base cooler specification."
At the end of the day, we don't understand why TDP is something they advertise at all when it's less than useful to consumers. AMD might be better off simply advertising the rating of the cooler they bundle with each processor, telling customers what cooler rating they'll need if they want to upgrade. For example, the Wraith Prism is a 105w cooler, so you'd want something rated above that if you want to upgrade, 150 watts for example.
Brief buying advice: we strongly recommend skipping the 3800X and taking the 3700X instead; if you deem it necessary, upgrade the box cooler with the money saved. The Ryzen 5 3600 remains the undisputed value king, and the 3900X offers more cores and more performance, though the extra cores may not help much in games, as we saw in our GPU scaling benchmarks.