[SLUG] 3D Graphics, SGI and Linux

From: Derek Glidden (dglidden@illusionary.com)
Date: Tue Jun 05 2001 - 14:30:41 EDT

Bob File wrote:
> So does this mean that the Intel CISC architecture has caught up with the SGI
> and (Solaris) RISC architecture? I have only seen SGI workstations from afar,

In some ways yes. If you look at the core of any Intel or AMD chip made
today, it's really a RISC chip. What's happened is that the core of the
Intel-compatible architecture has become RISC, with a lot of extra
circuitry that, in effect, translates on-the-fly from the "legacy" Intel
CISC instructions into what the CPU really talks. The reason that
Intel-type chips are still considered CISC is because of all that extra
circuitry to read and translate the legacy Intel instruction set. (I
read an article somewhere once that put the ratio at something like 1/4
of the chip was actual instruction units, the rest was all this
"translation" layer.)

(And it's particularly funny to see that, in their quest to build a
"true" RISC CPU, Intel, in building their IA-64 architecture, has
designed a chip that's larger, consumes more silicon, runs hotter, and
has much higher power requirements than any IA-32 chip ever made...)

Transmeta is doing essentially the same thing, but has taken the
approach of stripping out the 3/4 of the silicon needed to do the
translation and turning it into software. The result is a chip with a
lot less silicon, lower power requirements, and a lot less heat output,
one that is software-upgradable to new instruction sets but is really
doing the same basic thing as any Intel-compatible CPU on the market.

There are some things about the Intel architecture that make it very
appealing to 3D people as well. RISC has always taken the approach (by
definition really) of having as few hardware opcodes as they can get
away with. Unfortunately, that means there's no hardware op for
"matrix multiply", so it has to be "simulated" with a series of other
ops. Intel, in their eternal quest to sell more CPUs to the end user,
decided at one point that having a hardware "matrix multiply" op would
be cool, because then you can sell your chips to 3D gamers by promoting
the fact that there's hardware ops for doing a lot of the things games
do. (While completely overlooking the fact that most of those ops happen
in the 3D accelerator card nowadays...) This has the side effect of
being of benefit to the 3D community, because those same hardware ops
apply to many different things, including rendering. Unfortunately, it
also makes the CPU much larger and more expensive, with higher power
requirements and more heat output than the average RISC CPU.

But Alphas still totally kick butt over Intel in terms of
floating-point, no matter what you're trying to do. That's why a lot of
3D studios are looking to run Alpha render farms. Alphas are still
relatively expensive compared to Intel-compatibles, though, which is the
other area where CISC is "winning" over RISC - there are millions more
Intel CISC CPUs out there than anything MIPS or Sun or Digital ever
produced.

> but allways kind of lusted after them especially when I was doing a lot of
> cad work. Also I thought that SGI in particular paid a lot of attention to
> the video display and put a lot of power into that...

Yes they do. And they have much higher-quality hardware overall
compared to the PC architecture. There is something to the "you get
what you pay for" saying. But when it comes down to it, it's often
better to get five PC-class workstations at $5000 each, each only
slightly less powerful than the single $25000 SGI-class workstation you
would otherwise have to buy.

And think on this fact: the fellows who founded nVidia came from SGI's
graphics hardware development team.

They saw the coming marketplace shift away from expensive workstations
towards cheap commodity PC hardware, where most of SGI either did not
see it or did not believe it. They knew that if they could produce a
graphics chip that sold for only $200, but sold it into the huge PC
market instead of the small SGI market, it would be better for them
in the long term. And they did. nVidia's chips may not have as many
bells and whistles as a high-end SGI graphics solution, but as they say,
would you rather have 50% of a 1,000,000 unit market, or 10% of a
100,000,000 unit market?

There is really one place where SGI still rules over the PC and that's
in straight I/O. PCI sucks compared to an SGI's backplane crossbar bus
architecture. 200MHz DDR is nothing compared to a switched 2GHz path
directly to your memory. And the architecture is cool enough that, if
you have a system that supports it, adding another backplane doubles
your overall bandwidth.
But the thing is, PCI and a 200MHz memory bus are "Good Enough" for 99%
of the market. Particularly when it's 1/5 the price.

> Maybe it's the combo of the two (Unix on Intel) that does it. I would think
> that it made it a lot easier to port the software, since most of it was
> already 'nix of some sort. Anyone have any direct experience comparing SGI to
> Intel hardware?

It's hard to compare them directly because they're so different. You
could as easily compare Macintosh to PC and be about as accurate.

The thing to focus on is that SGI got really smart and, after they
realized they missed the boat when the market shifted away from really
expensive workstation-class machines to commodity PC hardware, decided
they _wouldn't_ miss the Linux boat and have thrown a lot of resources
into supporting Linux on their hardware. Porting code from IRIX, with
all the hardware support that SGI machines give to the OS, isn't easy,
but they've done it in several cases, and they've done an excellent job
of porting most of their low-level stuff to Linux so that, when the
time comes to port the big software, the underlying infrastructure will
be in place.
The most important fact to keep in mind: SGI gifted the computer world
with the _real_ opening up of the OpenGL reference implementation and
the GLX spec/protocol. If it weren't for SGI deciding to really be open
about these so-named "Open" specs, XFree86 4.0 would not have the
excellent 3D support it has now, and Linux would be nowhere near as
tempting a platform for 3D work as it has become.

IMNSHO, this is probably one of the smartest things SGI has ever done.

They saw Linux on the horizon and bet the farm - at a time when they
were floundering in the marketplace, they released what, to them, were
very important technologies that they had kept very close, in the hopes
that in the future they'd be able to capitalize on the fruits of their
offering. And it looks like their longshot bet is going to pay off
bigtime. The 3D industry is looking seriously at moving to using Linux
as their primary platform for 3D/CG work. While SGI may lose some
workstation hardware sales out of it, they now have a whole new market
to which they can sell software, services and expertise, and because
that new market is based on commodity hardware and free software, the
market can really grow like it was never able to when you had to have
expensive proprietary hardware to break in.

Good for SGI.


usage: qrpff 153 2 8 105 225 < /mnt/dvd/VOB_FILENAME \ | extract_mpeg2 | mpeg2dec -

http://www.eff.org/ http://www.opendvd.org/ http://www.cs.cmu.edu/~dst/DeCSS/Gallery/

This archive was generated by hypermail 2.1.3 : Fri Aug 01 2014 - 17:57:10 EDT