By: Ireland (boh.delete@this.outlook.ie), January 25, 2017 9:11 am
Room: Moderated Discussions
RichardC (tich.delete@this.pobox.com) on January 25, 2017 4:26 am wrote:
> And googling around, I found a description of Pixar's render farm network from 2010 which mentioned
> 300 10Gbit ports and 1500 1Gbit ports, which sounds very much like 1Gbit ports for most of the
> rendering boxes. Maybe they have some shared data on fileserver boxes which need the 10Gbit ?
> Or maybe those are just for the higher-level interconnect between switches. Anyhow,
> this is the creme de la creme, and it is (or recently was) predominantly 1Gbit.
>
The thing to remember about a rendering farm is that it's like a coal furnace. It's impossible to keep the thing fed all of the time with creative storytelling that finds its representation as electronically created universes. As the film director Ridley Scott said not too long ago, we probably make about twice as many movies per year (many of them containing large dollops of CGI now) as we have stories that are worth telling in moving-picture format. Go to any introductory book on computer animation and it will start with something like Toy Story. The animation was rough around the edges, but the story was incredible.
A rendering farm is really 'two' problems, not 'one' problem.
There's a ratio of maybe 10:1 between non-production, instant-feedback test rendering (that is what's flowing constantly through the 1Gbit side of the system) and the final-quality rendering that flows through the 10Gbit side of the same infrastructure. Before you even 'press the button' on the final render (where you add on all of those extra layers of calculation to get greater richness and depth in the images), you've debugged those frames as many as nine times, using a much lower level of calculation load. The advantage is that if you find yourself running down a 'blind alley' from an animated story-telling point of view, you learn about that blind alley a lot quicker - on 'Monday' or 'Tuesday', as opposed to on 'Thursday' or 'Friday'.
The problem with the 10Gbit side of the infrastructure is low utilization. As I explained, the storytellers simply cannot come up with good animated motion-picture stories quickly enough to keep the 10Gbit side of the infrastructure (the 'coal furnace') going all the time. You need a lot of smaller furnaces that are constantly burning, and one big one into which you 'pack and stack' the rendering jobs that flow through it - sort of like air traffic control. You can bet, too, that the large furnace burns 24 hours around the clock and is never switched off, because you don't need human beings around while it is burning. On the other hand, when people do arrive in to work at Pixar at some respectable part of that 24-hour clock, you need enough small furnaces available for them to burn small amounts of their 'tinder' in and get instant feedback - so that they can more easily direct themselves, and the efforts of their teams and work groups. You see?
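To make that two-furnace idea a bit more concrete, here's a minimal sketch in Python of how such a split might look. It's purely illustrative - the class and job names are hypothetical, not a description of Pixar's actual queueing system - just two priority queues, one for the draft 'tinder' and one for the final-quality jobs that get packed onto the big furnace:

import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class RenderJob:
    priority: int                        # lower number = run sooner
    name: str = field(compare=False)
    quality: str = field(compare=False)  # "draft" or "final"
    est_hours: float = field(compare=False)

class TwoTierFarm:
    # Draft jobs go to the small, always-warm tier for instant feedback;
    # final-quality jobs are packed onto the big tier that burns around the clock.
    def __init__(self):
        self.small_tier = []  # interactive draft renders (the many small furnaces)
        self.big_tier = []    # batch final-quality renders (the one big furnace)

    def submit(self, job):
        queue = self.small_tier if job.quality == "draft" else self.big_tier
        heapq.heappush(queue, job)

    def next_draft(self):
        # Artists pull from here during the working day.
        return heapq.heappop(self.small_tier) if self.small_tier else None

    def next_final(self):
        # The 24-hour furnace pulls from here whenever it has capacity.
        return heapq.heappop(self.big_tier) if self.big_tier else None

farm = TwoTierFarm()
farm.submit(RenderJob(1, "shot_042_blocking_pass", "draft", 0.1))
farm.submit(RenderJob(5, "shot_042_final_frames", "final", 6.0))
print(farm.next_draft().name)  # -> shot_042_blocking_pass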
You will observe some version of that work process I described at Pixar in the scientific laboratory nowadays too.
One cannot divorce the work process from the technological process and discuss one in isolation from the other - which is what tends to happen when technologists sit around a table. It's in the smaller-furnace space that the ARM-based technology will first begin to appear, I reckon - and mainly just to cut down on overheads, such as its consumption of electricity. Nothing more complicated than that. The ARM-based technology may even be useful for adding a 'third layer' to the infrastructure - where one would have the large furnace, smaller furnaces, and then smaller again.
This all applies to other computations too. After all, a rendering algorithm is nothing more than a mathematical computation based around principles of physics. It doesn't matter whether one is searching for black holes in the cosmos or putting a shine on a ceramic teacup image. All the best.