Larrabee and rasterization

This is a story that has needled me for a while, but never to the point of writing about it. Until today, when I read a blog entry by Tom Forsyth about how Intel’s Larrabee will actually be able to do rasterization, and not just real-time ray-tracing:

I’ve been trying to keep quiet, but I need to get one thing very clear. Larrabee is going to render DirectX and OpenGL games through rasterisation, not through ray-tracing.

I’m not sure how the message got so muddled.

Yeah, gee, I wonder why. Maybe because Intel has been pushing real-time ray-tracing so hard? Take a look at this fair and balanced article called “Rendering Games with Raytracing Will Revolutionize Graphics”. It wasn’t written by anyone from Intel, but it describes Intel’s efforts in that area – note the picture of an Intel poster saying ‘Ray-tracing: The future for games!’.

Ray-tracing for games doesn’t make any sense to me. The visual advantages of ray-tracing over rasterization are tiny and not worth switching to a completely new rendering approach for. Ray-tracing always evokes mirrored balls on infinite chessboards to me, and hey look! Intel’s latest screenshots contain mirrored balls! You need to grab people by the neck and point them at the stuff ray-tracing is better at than rasterization. And I am not the only one who is skeptical: so are John Carmack and Crytek. They should know (more than me, I haven’t done graphics programming in ages), and their opinion counts.

What Intel is trying to do with Larrabee seems awfully transparent. NVidia is moving into general-purpose processing with CUDA, so Intel is moving into graphics processing with x86 cores. Let’s not pretend this is automatically good for developers or for gamers. I listened to Intel’s hype about MMX in 1996, and MMX turned out to be completely useless for games. They’re not fooling me twice. I’d be even more skeptical, except that Kim Pallister is working on Larrabee, and I have a lot of respect for him. Still… Larrabee makes me sneer. I couldn’t care less about it.

Am I wrong? Is Larrabee or real-time ray-tracing the wave of the future? Tell me in the comments. Don’t tell me Tom Forsyth is not responsible for Intel’s PR, I know that. He’s a good guy.

Comments (6)

  1. Aras Pranckevičius wrote:

    By the same TomF on mollyrocket forums:

    “Don’t confuse the whims of marketing departments with technical reality”.

    Posted 18 Jun 2008 at 6:44
  2. Jurie wrote:

Yeah, this rant is more aimed at PR than at developers.

    Posted 18 Jun 2008 at 7:00
  3. Andreas wrote:

I wouldn’t bet my money on the ray-tracing horse just yet…

    Posted 18 Jun 2008 at 7:24
  4. fluffy wrote:

    IMO, the best approach is a hybrid one. Raytracing is great for reflections and refractions, while rasterization is good for everything else. Some sort of architecture which allows a fragment shader to also cast a ray and use the intersected fragment’s output would be (in theory) great, although of course it raises a lot of HUGE concerns about how one would actually go about doing that (since the GPU will need to know all the scene geometry up-front, which largely conflicts with how most graphics engines work, especially with respect to visibility determination).

    I think some sort of deferred shading model might be able to solve it, though – for example, do a raster pass which also collects the cast ray information into another target (possibly lower-resolution) and then do a raytrace pass afterwards, with the GPU telling the engine “hey I need geometry to cover these areas.” (Obviously with such an architecture you want to limit your multiple bounce passes quite a lot.)

    Of course what this buys you is only slightly better-looking reflections and refractions than just faking it by rendering a few cubemaps from strategically-placed origin points and then using a traditional vertex+pixel shader to combine them, so I’m not sure it’s really worth the effort.

    Posted 18 Jun 2008 at 10:32
  5. Jurie wrote:

    “Of course what this buys you is only slightly better-looking reflections and refractions than just faking it by rendering a few cubemaps from strategically-placed origin points and then using a traditional vertex+pixel shader to combine them, so I’m not sure it’s really worth the effort.”

    Exactly my point. And yes, you can come up with concepts that require it, but meh. (Did that real-time ray-tracing game that was announced some years back ever come out? I think it was called Seed.)

Also: O.o Fluffy, I always thought of you as more of an artsy type.

    Posted 18 Jun 2008 at 11:02
  6. Jurie wrote:

    I ran into Kim Pallister at GDC and we discussed this. He didn’t seem too miffed that I dissed Larrabee ;) More later.

    Posted 26 Jun 2008 at 2:40
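
[Editor’s note] The two-pass idea fluffy sketches in comment 4 — a raster-style pass that queues ray requests into a separate buffer, followed by a trace pass that resolves them — can be illustrated with a toy sketch. Everything here (the `Fragment` type, the constant “hit color” standing in for real ray traversal, the 50/50 blend) is an illustrative assumption, not any real GPU API:

```python
# Toy sketch of a hybrid deferred approach: pass 1 shades fragments and
# collects ray requests for reflective pixels; pass 2 "traces" the queued
# rays against the scene and blends the result in. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class Fragment:
    color: float       # toy "shaded color" as a single intensity value
    reflective: bool   # does this fragment need a secondary ray?

def raster_pass(fragments):
    """First pass: write base colors to the framebuffer, and collect the
    indices of fragments needing a secondary ray into a request buffer."""
    framebuffer = [f.color for f in fragments]
    ray_requests = [i for i, f in enumerate(fragments) if f.reflective]
    return framebuffer, ray_requests

def trace_pass(framebuffer, ray_requests, scene_hit_color):
    """Second pass: resolve each queued request (real code would intersect
    scene geometry; here a constant stands in) and blend with the base color."""
    for i in ray_requests:
        framebuffer[i] = 0.5 * framebuffer[i] + 0.5 * scene_hit_color
    return framebuffer

frags = [Fragment(0.2, False), Fragment(0.4, True), Fragment(0.8, False)]
fb, reqs = raster_pass(frags)
fb = trace_pass(fb, reqs, scene_hit_color=1.0)
print(fb)  # only the reflective fragment picks up the traced contribution
```

The point of splitting the passes is the one fluffy raises: the trace pass needs scene-wide geometry, so batching requests into a (possibly lower-resolution) buffer lets the engine know up front which rays it must service, and multiple bounces just mean re-running the second pass on the new requests.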