Intel is no stranger to raytracing - we've seen demonstrations such as Quake IV ported to an Intel-designed raytracer, along with a number of other demos in the past. The promise of raytrace renderers over today's more conventional raster engines for games and desktop 3D has always been increased realism and theoretically near-linear scaling. Of course, the problem until now has been that raytracers haven't been able to maintain a playable framerate at desktop resolutions.

Yesterday Intel demonstrated a new example of raytraced graphics on the desktop: a raytrace-rendered version of Wolfenstein. This time, the demo was built around a cloud-centric model where frames are rendered on four servers, each with a 32-core chip codenamed Knights Ferry at its core.

Knights Ferry is a Many Integrated Core (MIC) architecture part with 32 cores that Intel showed off at the International Supercomputing Conference this year.

We saw the Wolfenstein demo raytraced at 1280x720, averaging between 40 and 50 FPS, all rendered on the four Knights Ferry servers. Intel showed off the game visualized on a Dell notebook acting as a thin client; all of the frames are sent to it over ordinary gigabit ethernet.
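As a rough sanity check (my arithmetic, not Intel's figures beyond the resolution, framerate, and link speed quoted above; assuming uncompressed 24-bit RGB frames), the reported numbers land almost exactly at gigabit ethernet's raw capacity, which suggests the frame stream must be compressed at least lightly before hitting the wire:

```python
# Back-of-envelope bandwidth estimate for the demo's frame stream.
# Assumptions (mine, not stated by Intel): 24-bit RGB pixels, no compression,
# and 45 FPS as the midpoint of the reported 40-50 FPS range.

WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 3   # 24-bit RGB, assumed
FPS = 45              # midpoint of the reported 40-50 FPS

bits_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * 8 * FPS
print(f"{bits_per_second / 1e9:.3f} Gbit/s")  # prints "0.995 Gbit/s"
```

At roughly 0.995 Gbit/s of raw pixel data, protocol overhead alone would push an uncompressed stream past a gigabit link's limit, so some compression on the server side is the natural design choice.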

Interesting visual features of the Wolfenstein raytracer include physically correct refractions and reflections at interfaces like glass and water - taking into account the actual refractive index of the material - and very recursive examples like a surveillance station camera whose feed shows the surveillance station itself. Check out the gallery for those examples, screenshots, and more.
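Refraction that respects the material's actual index comes straight from Snell's law. As an illustrative sketch only (the helper name and vector convention are mine, not taken from Intel's renderer), a raytracer typically computes the transmitted ray direction like this:

```python
import math

def refract(d, n, eta_i, eta_t):
    """Refract unit direction d at a surface with unit normal n, going from
    a medium with refractive index eta_i into one with index eta_t
    (Snell's law). d points toward the surface; vectors are (x, y, z) tuples.
    Returns the refracted direction, or None on total internal reflection."""
    cos_i = -sum(a * b for a, b in zip(d, n))   # cosine of incidence angle
    eta = eta_i / eta_t
    k = 1.0 - eta * eta * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None  # total internal reflection: no transmitted ray
    coef = eta * cos_i - math.sqrt(k)
    return tuple(eta * di + coef * ni for di, ni in zip(d, n))

# A ray hitting water (index ~1.33) head-on passes through undeflected:
print(refract((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 1.0, 1.33))
```

When the transmitted ray vanishes (total internal reflection), a renderer falls back to the reflected ray alone - which is exactly what produces those bright mirror-like patches under water surfaces.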

Update

Fixed a major typo (thanks everyone for catching that)! For more info, check out the project's homepage.

Comments

  • FaaR - Wednesday, September 15, 2010 - link

Events occurring at dozens per second is slower than snail-paced, even for a single-digit-MHz computer of the 1980s, let alone anything money can buy today...

    Not sure why you think this demo's so incredibly impressive; the cloud servers only draw what they're told, each frame in sequence in a predictable manner, and the client buffers the results and displays them.
  • chochosan - Tuesday, September 14, 2010 - link

    That was exactly what I was going to say too. Everyone bickered about how raytracing is so power-hungry and super inefficient; what they all missed was that this demo was played on a measly laptop using cloud graphics. That, for me, is the future of gaming: you won't need a powerful computer, just something to connect to the cloud and play on demand. The TCO of a traditional gaming platform will be hugely higher than this type of setup, especially if you factor in constant upgrades for your computer. Latency, of course, is vital for that.
  • dreamlane - Monday, September 13, 2010 - link

    I think this raytracing stuff is really showing some potential!
  • iwodo - Tuesday, September 14, 2010 - link

    I am sorry, but even Nvidia admits that ray tracing won't be in games for another decade, and it would be another decade for hybrid ray tracing + rasterization to work out before we use ray tracing only. That is like 30 years; I hope I am still alive then.

    There are fundamental problems with ray tracing. It may be perfect for reflections and the like, but otherwise rasterization takes the lead.
  • Stas - Tuesday, September 14, 2010 - link

    ...ly sh*t O.O
  • Murloc - Tuesday, September 14, 2010 - link

    But you're still playing on a huge server in the same room, because this can't work over an internet connection; they are too slow.
