
Ongoing Discussion: 16 Core 32 Threads @ 5.1GHz Mainstream Zen 2

Recommended Posts


Where did you find this!? If the 3700X is really like this at that price, it will blow the 9900K out of the water... so long as you can effectively OC it to 5.0GHz. This could be the shift that AMD has been pushing for a few years now.

  • 3 weeks later...

AMD Will Be Launching The Ryzen 3000 Series CPUs, APUs And A Radeon GPU At CES

Ryzen Family               | Ryzen 1000 Series | Ryzen 2000 Series | Ryzen 3000 Series | Ryzen 4000 Series
Architecture               | Zen (1)           | Zen (1) / Zen+    | Zen (2)           | Zen (3)
Process Node               | 14nm              | 14nm / 12nm       | 7nm               | 7nm+
High End Server (SP3)      | EPYC 'Naples'     | EPYC 'Naples'     | EPYC 'Rome'       | EPYC 'Milan'
Max Server Cores / Threads | 32/64             | 32/64             | 64/128            |
  • 5 months later...

No more inaccurate rumor mills. The real info is out, and it's simply not as good as some of you thought.


  • The Ryzen 9 (3900X) has 12 cores at 4.6GHz instead of 16 cores at 4.7GHz and costs $499 instead of $449
  • There is no 16 core version, and nothing boosts to even 4.7GHz, let alone 5.0 or 5.1GHz. Maybe there will be a 16 core version at a later date though.
  • The Ryzen 7 (3800X) has 8 cores at 4.5GHz instead of 12 cores at 5.0GHz, and costs $399 instead of $329
  • The Ryzen 5 (3600X) has 6 cores at 4.4GHz instead of 8 cores at 4.8GHz, and costs $249 instead of $229
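As a quick sanity check on those numbers, here's a back-of-envelope price-per-core comparison, an illustrative sketch using only the launch MSRPs listed above:

```python
# Back-of-envelope price-per-core for the announced Ryzen 3000 SKUs,
# using the launch MSRPs listed above. Illustration only.
lineup = {
    "Ryzen 9 3900X": (12, 499),
    "Ryzen 7 3800X": (8, 399),
    "Ryzen 5 3600X": (6, 249),
}

for name, (cores, price) in lineup.items():
    print(f"{name}: ${price / cores:.2f} per core")
```

Per core, the 3900X and 3600X both land around $42, while the 3800X sits near $50, which says something about where the value is in this lineup.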

The IPC is purportedly significantly better, but they still only beat the equivalent Intel chip by 1% in single-threaded performance, in one benchmark cherry-picked by AMD under unknown conditions. A beat is still a beat, but will it still beat Intel in all or even a majority of benchmarks, let alone gaming, when AMD doesn't get to cherry-pick? History suggests not a chance.


Still, competition is good, and some of you are going to pre-order these because you're AMD fanbois. If this applies to you, I suggest you get either:

  • The Ryzen 7 3800X @ $399. It has eight cores at a higher base clock than the 3900X, and its boost clock is only 0.1GHz lower.
  • The Ryzen 5 3600X @ $249. It has six cores at the same base clock as the 3900X, and its boost clock is only 0.2GHz lower.
  • Avoid the Ryzen 9 3900X, at least until it's shown that there is no penalty for its chiplet design. I suspect there will be a penalty though, or else all of these chips would use chiplets.
  • Avoid the Ryzen 7 3700X and Ryzen 5 3600; these two are down-clocked versions that will be significantly worse for gaming than either the 3600X or 3800X.


Above all, I urge you all to avoid the hype and wait for reviews and gaming benchmarks before ordering. Pre-ordering is bad. The release date is July 7th, barely more than a month away; you can wait. If these are legit, I'll be getting one myself. But I have significant doubts.

  • 3 weeks later...

The 3000 series' 16 Core Ryzen 9 is revealed: https://www.anandtech.com/show/14516/amd-16-core-ryzen-9-3950x-up-to-4-7-ghz-105w-coming-september

Compared to the rumor mill's 16 core CPU, this one closely matches the lower end model rather than the higher end one listed. Still 16 cores / 32 threads, but the base clock is 3.5GHz instead of 3.9 or 4.3GHz, the boost clock is 4.7GHz instead of 5.1GHz, and the price is a whopping $749 instead of $449 or $499. It's also a chiplet design, like the 12 core. And it won't be sold until September.

I can't think of a good reason to get this $749 16 core instead of the significantly cheaper 12 core chip. Unless you're live streaming 4K video, you don't need it. Or unless you're not into price/performance, but in that case, just go Intel. And like the 12-core Ryzen 3000s, I think this chip will be significantly worse for gaming than one of the 8 core or 6 core Ryzen 3000s.

  • 2 weeks later...

The new Ryzen lineup is great: pretty much the first 12 and 16 core CPUs on a mainstream platform, and with good pricing. Everything releases next month except the 3950X, which arrives in September.


The new Navi GPUs, priced at $379 and $449, provide slightly higher performance than the RTX 2060 and 2070, respectively. Note that Nvidia will also be releasing its Super RTX series and cutting prices on older RTX cards to compete.


  • 3 weeks later...

And the reviews are popping up on the web.  Here's Anandtech's:



Gaming Performance

When it comes to gaming performance, the 9700K and 9900K remain the best performing CPUs on the market. Even without an IPC advantage anymore, Intel's high clockspeeds and supporting elements such as the core ringbus still give them the best performance in the kind of lightly-threaded and tightly-threaded scenarios that games often follow.

That being said, the new 3700X and 3900X are posting enormous improvements over the 2700X. And we can confirm AMD’s claims of up to 30-35% better performance in some games over the 2700X. So AMD has not been standing still.

Ultimately, while AMD still lags behind Intel in gaming performance, the gap has narrowed immensely, to the point that Ryzen CPUs are no longer something to be dismissed if you want to have a high-end gaming machine. Intel's performance advantage is rather limited here – and for the power-conscientious, AMD is delivering better efficiency at this point – so while they may not always win out as the very best choice for absolute peak gaming performance, the 3rd gen Ryzens are still very much a very viable option worth considering.


Now why didn't Anandtech review the 3800X instead of the 3700X? The 3700X is the down-clocked version, and the 3800X was the one I thought was most interesting, but they didn't review it (yet?).

Still, it went exactly as I predicted. If you're a gamer, the already-existing i7 and i9 are better, albeit only slightly. But they're also no more expensive. You can certainly make a case for AMD for other tasks, though, like video encoding. In fact, if you software-encode your gaming sessions on one PC instead of using NVEnc/Shadowplay or a two-PC setup, the 3900X is a clear winner.

I don't plan to spring for these; I'm sticking with my i7-4770K and waiting for Intel's long-delayed process node shrink. Intel had better not wait too long, because at this rate I'll be getting AMD's 4th Gen Ryzens if they come out first.



Anandtech didn't review it because they likely received the 3700X and 3900X from AMD, just like all the other reviewers.

The 3600, 3700X and 3900X look like the best price/performance picks in the lineup. With limited manual overclocking options and an improved IMC, the 3600X and 3800X don't seem to show a measurable difference over the 3600 and 3700X so far, though new BIOS/microcode from AMD might change this in the future.

But even if the differences end up being as advertised, it doesn't seem worth the $50 or $70 to go from one to the other.

The 3600, 3700X and 3900X look like great picks though, especially combined with a 5700 XT, which seems really well placed ($400) against Nvidia's latest offerings. Expect more Navi GPU releases this year or the next. AMD might not try to match Nvidia at the 2080 Ti/Titan level, but they always fight for the mid-range and high-end stuff.

A 3600/3700X + 5700 XT seems like a great combination. Instead of the expensive X570 options, the Asus X470 Crosshair VII Hero is also a good motherboard to go for: it has a daisy-chain memory topology for superior memory overclocking performance, good onboard audio, a solid NIC, and great VRMs.


Yeah, it's unclear to me whether the 3800X is "worth it" or not. Certainly not from a price/performance aspect, but possibly from a "more performance for not that much more money" aspect. It's going to be faster than the 3700X, obviously; its stock clocks are higher. But will it be much faster than the 3900X? The chiplet design doesn't seem to have any real drawbacks from what I've read, so that fear seems to be unfounded.

I found the 3800X in stock a few days ago, hopefully some review sites bought it and are in the process of reviewing it.  Right now we have little more than a couple anecdotes.

Then there's the whole overclocking aspect, which is still a complete unknown. I wish HardOCP were still in business; they'd have all this done by now, and if not, they'd have been transparent about what was holding them up without being asked.


Yeah, so more info has been rolling in from credible sources, and I don't think the 3900X is a good buy, since one of its two chiplets (CCDs, Core Complex Dies) is badly binned. You're pretty much guaranteed one good and one bad chiplet on it right now. One might hit max boost under some loads; the other likely won't.

3600X and 3800X still don't seem to be really worth it for a 100 MHz frequency increase either.

We've also had more overclocking info come in, and it seems Ryzen 3000 performs best with an IF (Infinity Fabric) overclock. It caps at around 1900 MHz, which means 3800C16 or 3800C14 RAM timings depending on the kit you have. The 2-CCD CPUs (3900X/3950X) seem to be more amenable to 2 DPC (DIMMs per channel) overclocking (4 DIMMs). Lower SKUs will likely prefer fewer DIMMs.

16 GB Micron E-die kits, SR (Single Rank), 1 DPC, are pretty cheap, and they'll hit 3600C16 just fine. For 3800C14/3800C16, you might need a 1 DPC SR B-die kit.
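To put those RAM ratings in perspective, the usual first-word latency approximation is CAS cycles divided by the memory clock (which is half the DDR transfer rate). A minimal sketch of that arithmetic, ignoring secondary timings:

```python
def true_latency_ns(cas_latency: int, data_rate: int) -> float:
    """Approximate first-word latency in ns.

    data_rate is the DDR transfer rate in MT/s; the memory clock
    (and the 1:1 coupled Infinity Fabric clock) is half of it.
    """
    return cas_latency / (data_rate / 2) * 1000

# The ratings discussed above; coupled IF runs at half the data rate.
for label, cl, rate in [("3600C16", 16, 3600),
                        ("3800C16", 16, 3800),
                        ("3800C14", 14, 3800)]:
    fclk = rate // 2  # 1:1 coupled Infinity Fabric clock in MHz
    print(f"{label}: {true_latency_ns(cl, rate):.2f} ns (IF at {fclk} MHz)")
```

So 3800C14 works out meaningfully tighter (roughly 7.4 ns) than 3600C16 (roughly 8.9 ns), which is part of why B-die commands a premium.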

I'd also like to put this screenshot here:


So this is a 3900X running on an X370 Taichi (a ~2.5 year old first-gen motherboard) with the latest BIOS updates. It's running a 2 DPC DR (Dual Rank) Micron E-die kit (4x 16GB) at 3600C16 with coupled IF (the IF is running at 1800 MHz). 64GB is pretty much the maximum amount of RAM the AM4 platform supports, and the IF caps at around 1800 MHz for dual-rank kits most of the time.

This kit seems to be priced around ~$300, which is relatively cheap for 64 GB of RAM. So if you need a ton of RAM for your workload, this would be a good option.

So everything mostly seems to be working fine with the 3000 series, except some SKUs falling short of their advertised boost clocks by around 25 MHz or so. We also have older motherboards working fine ~2 weeks after launch once BIOS updates are applied, even though you're likely to hear otherwise.


Some RAM benchmarks with a 3600X and an RTX 2080 at 1440p:


