Mark Broadbridge, the Senior Render Wrangler at YellowDog, shares insight from his team’s investigation into the assumption that rendering across multiple brands of CPU may produce inconsistent results.
There is a lot of myth and speculation surrounding the effect that rendering with different brands of CPU has on the consistency of the final rendered output. Does it matter? Or should you stick to one brand? You’ll find online evangelists on both sides of the fence, with most opinions seemingly based on theories and gut feelings.
Prompted by our own suspicions, and after I failed to find any conclusive evidence one way or the other, I challenged our team to conduct a pseudo-scientific investigation. We ran a series of tests using the two market leaders, Intel and AMD, to try to lay this issue to rest.
We ran a series of simple renders on YellowDog’s Intel Xeon 2.6GHz 32-core cloud nodes, and also on AMD EPYC 2.0GHz 32-core nodes. We also sanity-checked our results across a variety of high-end production scenes, and those results were consistent with the ones displayed in this report below.
We compared the results of the output renders using the image comparison tool online-image-comparison.com. The tool works by overlaying two images and highlighting any pixels with variations in their RGB values. This allows us to see minute variations between two seemingly identical images. An example image is below.
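For readers curious about what such a tool does under the hood, here is a minimal sketch of the idea (not the actual implementation of online-image-comparison.com, whose internals we don't know): overlay two same-sized RGB images and paint any pixel whose RGB value differs in red. The function name and the nested-list image representation are illustrative assumptions.

```python
# Sketch of the comparison approach: highlight differing pixels in red.
# Images are represented here as 2-D lists of (r, g, b) tuples; a real
# tool would operate on decoded image buffers instead.

RED = (255, 0, 0)

def diff_highlight(img_a, img_b, tolerance=0):
    """Return a copy of img_a where any pixel whose RGB value differs
    from img_b by more than `tolerance` (per channel) is painted red."""
    out = []
    for row_a, row_b in zip(img_a, img_b):
        out_row = []
        for px_a, px_b in zip(row_a, row_b):
            # Largest per-channel difference for this pixel
            delta = max(abs(a - b) for a, b in zip(px_a, px_b))
            out_row.append(RED if delta > tolerance else px_a)
        out.append(out_row)
    return out

# Tiny example: two 1x3 "images" differing only in the middle pixel.
a = [[(10, 10, 10), (20, 20, 20), (30, 30, 30)]]
b = [[(10, 10, 10), (21, 20, 20), (30, 30, 30)]]
print(diff_highlight(a, b))
```

With a `tolerance` of 0, even a single-unit difference in one channel is flagged, which is why such tools can surface variation invisible to the naked eye.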
The following scenes were rendered using the same Intel Xeon cloud node. We rendered a scene in Maya with V-Ray, 3ds Max with V-Ray, and Cinema 4D (C4D) with the physical render engine.
Here are the outputs of those renders before using the comparison tool:
We then rendered the same scenes again using the same settings and the same Intel CPU. The outputs were fed into the online comparison tool and the pixels that showed any RGB variance were highlighted in red. We ran the same test scenario on multiple scenes of varying complexity including YellowDog customer scenes where permission was granted. The pixel RGB variance reported was consistent with the results of the sample scenes throughout.
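The report doesn't define how "pixel RGB variance" is quantified, but one simple interpretation, sketched below as an assumption rather than the actual methodology, is the percentage of pixels that differ at all between two renders of the same frame:

```python
# Hypothetical "pixel RGB variance" metric: the percentage of pixels
# whose RGB values differ between two renders of the same frame.
# Images are 2-D lists of (r, g, b) tuples for illustration.

def variance_percentage(img_a, img_b):
    """Return the percentage of pixels that differ between two
    equal-sized images."""
    total = differing = 0
    for row_a, row_b in zip(img_a, img_b):
        for px_a, px_b in zip(row_a, row_b):
            total += 1
            if px_a != px_b:
                differing += 1
    return 100.0 * differing / total

# Example: 2x2 renders where one pixel differs by a single blue unit.
render_1 = [[(0, 0, 0), (128, 128, 128)], [(255, 255, 255), (64, 64, 64)]]
render_2 = [[(0, 0, 0), (128, 128, 129)], [(255, 255, 255), (64, 64, 64)]]
print(variance_percentage(render_1, render_2))  # → 25.0
```

A metric like this makes it possible to compare runs across scenes of varying complexity, as described above, rather than relying on visual inspection alone.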
By their very nature, biased render engines estimate pixel values to a level of accuracy acceptable to the naked eye. So despite total consistency in hardware from one rendered frame to the next, there is still variance between renders from biased engines when analysed with a comparison tool.
This investigation suggests that if you stick to the same hardware, there is a negligible difference in consistency between rendering on AMD and rendering on Intel. We encountered a small increase in pixel RGB variance when using the AMD CPU compared to our Intel CPU, but more testing over a range of scenarios is required to confirm this observation.
5 bonus points to you if you spotted the single red pixel in the Maya scene for this test.
So… what happens if you want to mix things up in your render farm? YellowDog works with many studios who have a mixture of different CPU types, so we wanted to investigate. The following test shows the variation of the same scenes rendered once with Intel and once with AMD.
The image comparison tool shows a noticeable difference, markedly increased from the tests that we ran on identical CPU types earlier in this investigation.
Of note is the clear difference along the edges of the bucket calculations in the Maya image. Parts of both the Max and Maya images are so varied that, for the first time in the investigation, we observe an output variance that is visible to the naked eye when the two frames are played as a looping video.
Are the Test 3 results due to the difference between AMD and Intel, or to a broader difference between any two hardware configurations?
We decided to run the same tests but this time compare Intel to Intel: rendering with Intel Xeon 2.6GHz 32-core nodes and rendering with Intel Xeon 2.3GHz workstations.
The variance in pixel RGB value, while reduced compared to Test 3, remains remarkably similar. Flicker is still present in some areas of the scene, as we observed in Test 3.
So what does this mean for your local render farm or for your chosen cloud render provider? If you want consistent renders, the evidence from this investigation suggests that you should render with a consistent batch of hardware. Whether the CPU is AMD or Intel is largely irrelevant for consistency; in my view, you just need to pick one and stick with it throughout a production. You should also ensure that the hardware configuration is consistent: we have shown that rendering with a consistent CPU type but an inconsistent hardware configuration produces the same results as switching between AMD and Intel. The ideal best practice is total consistency of render nodes between frames, between shots, and between scenes.
Maintaining consistency and quality in rendering will become more important than ever in the years to come, due to increasing demand for higher-resolution productions and immersive experiences, which put render quality closer to the eyes of the viewer than ever before.
YellowDog renders productions with identical cloud nodes, containing identical CPU and hardware configurations, for every customer, to mitigate quality risk. The same is true of our GPU technology.