Intel vs AMD vs Apple

Member
7 posts
Joined: Jan. 2019
Hi,

I'm thinking of upgrading my current PC and I don't know which processor is best for Houdini.

Intel 13900K
AMD 7900X
AMD 7950X
AMD 7900X3D (V-Cache)
Apple silicon (Mac Pro/Mac Studio).
My preference is AMD or Apple. I don't know if the 7900X3D is aimed purely at gaming, and I don't know if there is much performance difference between the 7900X and the 7950X either.
Currently I have an AMD 3900X.

Thanks!
Member
466 posts
Joined: Aug. 2014
I find it curious that you're including a Mac on your list, because if I'm not mistaken (and I'm most likely not, because I double-checked it), the M1 and M2 CPUs (except maybe the 20-core M1 Ultra) are slower than your current 3900X. In fact, all the amd64 processors on your list would probably eat all of Apple's CPUs for breakfast, and unlike Apple's hardware, they are reasonably priced and don't lock you up in the jail of Apple's ecosystem. If you ask me, I would never buy any of Apple's ludicrously overpriced products and bind myself to their proprietary software.

The "3d" version of Ryzen 7 series comes with a MUCH larger L3 cache and a lower base frequency (and in consequence, lower TDP) than standard models. According to AMD, this gives a significant ~20% boost of processing power in 3D computer games and around 11% in 3D applications (source [youtu.be]). The non-technical justification of lowered base frequency is:

AMD (https://youtu.be/0yJZjegRDpA?t=254)
It doesn't really need to boost super high in order to get that game performance because there's much memory on the chip.

These AMD CPU models might be quite a good choice in light of the sick and twisted, centrally driven green energy policy of our EU "garden" (a term coined by the unelected Josep Borrell, not me), which results in skyrocketing electricity prices in all member countries, and in rampant inflation in general. The only problem is that those CPUs have just had their La Grande Première, so it will take a while for them to hit the retail stores en masse, and for their prices to settle at an acceptable level once the mass production pipeline is fully established.

Like you, I'm currently in the middle of choosing components for my new workstation. I'm strongly leaning towards the 7950X, unless its "3D" version hits the stores before I make the purchase, doesn't cost a whole lot more, and is already supported by the Linux kernel version I'm currently running (I haven't checked on that yet).

Intel processors are a no-no for me. I will never forget the Meltdown and Spectre CVE debacle with Sandy Bridge CPUs. Some wise man once said that one can build trust for decades and lose it in a single day. Well, that's what happened.
Edited by ajz3d - Feb. 12, 2023 20:00:47
Member
7 posts
Joined: Jan. 2019
Thanks for your reply. I'm waiting to find out the price of the 3D version, and then I'll choose between that one and the 7950X.

As for the graphics card, I know the best option is the 4090, but the price is very high. I don't know if an AMD card is a good solution for Houdini. An NVIDIA 4070 Ti, 4080 or 4090, depending on my budget, seems like the best choice.
Member
466 posts
Joined: Aug. 2014
Regarding video cards, I think the choice is obvious, as you have already noticed. There was a thread about this somewhere on this forum. I don't have time to search for it at the moment, but the conclusion was something along the lines of NVIDIA being the best option because of how widely adopted CUDA and OptiX are in our niche. Besides, Karma XPU at its current stage of development supports OptiX devices only. Therefore, for the time being, I suggest buying an RTX card with as much video memory as you can fit in your budget. I own an RTX 3070, and its 8 GB of VRAM fills up very fast when I'm texturing, rendering or simulating.
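If you want to watch that happen in real time, here's a minimal sketch (assuming an NVIDIA card with the standard nvidia-smi CLI on PATH; the 5-second polling interval is an arbitrary choice of mine) that you can leave running in a terminal while you texture or render:

    # Poll VRAM usage via nvidia-smi while Houdini is working.
    import subprocess
    import time

    def vram_used_mib() -> int:
        """Query the VRAM currently in use (MiB) via nvidia-smi."""
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        # nvidia-smi prints one line per GPU; sum them for multi-GPU boxes.
        return sum(int(line) for line in out.strip().splitlines())

    if __name__ == "__main__":
        while True:
            print(f"VRAM in use: {vram_used_mib()} MiB")
            time.sleep(5)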
Edited by ajz3d - Feb. 13, 2023 11:06:36
Member
466 posts
Joined: Aug. 2014
I stumbled upon what is possibly the first Blender benchmark of the 7950X3D [opendata.blender.org]. To my surprise, when compared to the 7950X, the result is not in favor of the 7950X3D, which I find curious for a processor that is meant to be $100 pricier. Of course, we are talking about seven hundred 7950X benchmarks versus only one; nevertheless, it makes me wonder if the benefits that the extra L3 cache brings aren't in fact negated by the limited base frequency. I guess to know that, we'll have to wait for the other benchmarks that I'm sure will soon follow.

This benchmark is referenced by a recent piece [www.tomshardware.com] on the 7950X3D that I found on Tom's Hardware, where they also mention Geekbench results. Those turned out lower as well.
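To put the single-sample caveat in perspective, here's a toy sketch (all numbers invented for illustration, not real benchmark data) of how easily one run can land below the median of seven hundred, even when the underlying chip is faster:

    # Why one benchmark sample is weak evidence: with realistic run-to-run
    # variance, a single 7950X3D result can fall below the 7950X median
    # purely by chance. Every number here is made up.
    import random
    import statistics

    random.seed(42)

    # Pretend population: ~700 7950X scores with some spread.
    scores_7950x = [random.gauss(600, 30) for _ in range(700)]
    median_7950x = statistics.median(scores_7950x)

    # A hypothetical 7950X3D that is "truly" 5% faster, sampled once.
    single_x3d_run = random.gauss(600 * 1.05, 30)

    print(f"7950X median over 700 runs: {median_7950x:.1f}")
    print(f"Single 7950X3D run:         {single_x3d_run:.1f}")
    # Depending on the draw, the single run may still sit below the
    # 7950X median even though its underlying mean is higher.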
Member
7770 posts
Joined: Sept. 2011
ajz3d
Of course, we are talking about seven hundred 7950X benchmarks versus only one; nevertheless, it makes me wonder if the benefits that the extra L3 cache brings aren't in fact negated by the limited base frequency. I guess to know that, we'll have to wait for the other benchmarks that I'm sure will soon follow.

Wait for the special version of Houdini that fits entirely in the L3 cache.
Member
7 posts
Joined: Jan. 2019
OK! Thanks for the information! It was very useful for me.
Member
248 posts
Joined: May 2017
ajz3d
I stumbled upon what is possibly the first Blender benchmark of the 7950X3D [opendata.blender.org]. To my surprise, when compared to the 7950X, the result is not in favor of the 7950X3D, which I find curious for a processor that is meant to be $100 pricier. Of course, we are talking about seven hundred 7950X benchmarks versus only one; nevertheless, it makes me wonder if the benefits that the extra L3 cache brings aren't in fact negated by the limited base frequency. I guess to know that, we'll have to wait for the other benchmarks that I'm sure will soon follow.
Think of it as a 7950X running in a tuned Eco mode with much more cache. The default TDP is 170 W, Eco mode is 105 W, and the 3D is 120 W - it all adds up.

Generally speaking, if the non-3D model is cheaper where you are, it's not a bad idea to get that one and run it in Eco 105 mode; you will lose around 7-8% of raw horsepower in exchange for roughly 40% lower power draw. Keep in mind that AMD's TDP numbers are not literal - they use a funny equation to look smaller - so the 170 W TDP effectively results in about 250 W of heat and noise.
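For what it's worth, the equation usually cited for AM5 is PPT ≈ 1.35 × TDP (my assumption of what "funny equation" refers to, not something stated in this thread), which lands in the same ballpark as the numbers above:

    # Back-of-the-envelope sketch of AMD's AM5 power terms, assuming the
    # commonly cited PPT = 1.35 * TDP relation.
    def ppt_watts(tdp_watts: float, factor: float = 1.35) -> float:
        """Approximate socket power (PPT) from the advertised TDP."""
        return tdp_watts * factor

    for label, tdp in [("7950X default", 170), ("Eco mode", 105), ("7950X3D", 120)]:
        print(f"{label}: TDP {tdp} W -> ~{ppt_watts(tdp):.0f} W at the socket")
    # 170 W -> ~230 W, in the same ballpark as the ~250 W of heat and
    # noise mentioned above.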
Edited by osong - Feb. 19, 2023 00:00:53
https://twitter.com/oossoonngg [twitter.com]
Member
466 posts
Joined: Aug. 2014
osong
it's not a bad idea to get that one and run it in Eco 105 mode; you will lose around 7-8% of raw horsepower in exchange for roughly 40% lower power draw.
This indeed sounds like a good idea. I guess manually reducing power consumption, while keeping that additional headroom in reserve at any time, is better than being locked into a permanent eco mode.

Those 3D versions are still unavailable in retail stores anyway.
Member
466 posts
Joined: Aug. 2014
What can you folks say about the difference between X670 and X670E motherboards? As I understand it, both come with two PROM21 chiplets, a configuration that is supposed to increase I/O throughput while costing less than the single X570 chip of the previous generation. But I can't find any informative article about the actual gains the E version of the X670 chipset offers over the standard-issue X670. I believe the "E" stands for "Extreme", and some sources I found suggest that X670E offers better overclocking capabilities. If that's the case, I don't think I want to pay extra for it, because I never overclock my machines. Also, most (perhaps even all) X670E motherboards tend to come with Wi-Fi, which I also don't need.

I have my eye on the Asus Prime X670-P, mostly because it's on the cheap side as far as X670 mobo prices go (and they're high enough!), it has plenty of SATA3 ports (six), and so far I've had a good experience with Asus hardware running on GNU/Linux.

The Prime's X670E version is much more expensive ($400 vs $310), has fewer SATA3 ports (four) and fewer PCI-E x16 slots (two vs three), but they've put Bluetooth and Wi-Fi in it, which can always be bought as separate hardware anyway, and I prefer modularity to monolithic designs. There are also some other, rather minor differences between the two, like the complete lack of USB 2.0 ports in favor of 3.2, but frankly, nothing that would justify such a huge price difference.

So, is there any good reason to choose E over standard X670?

PS: @jmmrfx, I hope I'm not stealing your thread. If I am, I apologize.
Member
7 posts
Joined: Jan. 2019
Haha, don't worry. Feel free to discuss anything related to the topic (a new PC). It's additional information for me as well.