Categories
computers data center technology wintel

Intel readying MIC x64 coprocessor for 2012 • The Register


Thus far, Intel's Many Integrated Core (MIC) is little more than a research project. Intel picked up the remnants of the failed “Larrabee” graphics card project, rechristened it “Knights,” and put it solely in the service of the king of computing, the CPU.

via Intel readying MIC x64 coprocessor for 2012 • The Register.

Ahhh, alas, poor ol’ Larrabee, we hardly knew ye. And yet, somehow, your ghost will rise again, and again, and again. I remember the hints of the 80-core CPU, which then fell to 64 cores, then 40 cores, and now just today I read this article to find out it is merely Larrabee with a grand total of (hold tight, are you ready for this shocker?) 32 cores. Wait, what was that? Did you say 32 cores? Let’s turn back the page to May 15, 2009, when Intel announced the then-new Larrabee graphics processing engine as a 32-core processor. That’s right, nothing (well, maybe not nothing) has happened in TWO YEARS!

Or very little has happened: a few die shrinks, and now the upcoming 3D (tri-gate) transistors for the 22nm design revision of Intel Architecture CPUs. It also looks like they may have shuffled around the floor plan/layout of the first-gen Larrabee CPU to help speed things up a bit. But other than these incremental touch-ups, the car looks very much like the model from two years ago. Now, what we can also hope has improved since 2009 is the speed and efficiency of the compilers Intel’s engineers have crafted to accompany the release of this re-packaged Larrabee; a rough sketch of the kind of code those compilers have to handle follows below.
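To make that compiler point concrete, here is a minimal, hypothetical sketch in plain C with OpenMP of the sort of data-parallel loop such a compiler and runtime would have to spread across the rumored 32 cores and then vectorize onto each core’s wide vector unit. The function name and the use of OpenMP are my own illustration, not Intel’s actual MIC/Larrabee offload syntax.

```c
/* Hypothetical sketch (not Intel's actual MIC/Larrabee offload syntax):
 * the kind of data-parallel loop a many-core x86 coprocessor compiler
 * would be expected to split across ~32 cores and then vectorize onto
 * each core's wide vector unit. */
#include <stddef.h>

void saxpy(size_t n, float a, const float *x, float *y)
{
    /* The compiler/runtime spreads these iterations over all the cores;
     * each core's chunk should then be auto-vectorized. */
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}
```

The hard part, and the part worth hoping has improved since 2009, is everything that one pragma hides: scheduling, vector code generation, and moving data on and off the card.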

Intel shows glimpse of 32-core Larrabee beast (Chris Mellor @ http://www.theregister.co.uk)

Categories
gpu technology wintel

Intel Gets Graphic with Chip Delay – Bits Blog – NYTimes.com

Intel’s executives were quite brash when talking about Larrabee even though most of its public appearances were made on PowerPoint slides. They said that Larrabee would roar onto the scene and outperform competing products.

via Intel Gets Graphic with Chip Delay – Bits Blog – NYTimes.com.

And so now, finally, the NY Times nails the coffin shut on Intel’s Larrabee saga. To refresh your memory, this is the second attempt by Intel to create a graphics processor. The first failed attempt came in the late 1990s, when 3dfx (later bought by nVidia) was tearing up the charts with its Voodoo 1 and Voodoo 2 PCI-based 3D accelerator cards. The age of Quake and Quake 2 was upon us, and everyone wanted smoother frame rates. Intel wanted to show its prowess by designing a low-cost graphics card running on the brand-new AGP slot it had just invented (remember AGP?). What resulted was a similar pattern of delays and poor performance as engineering samples came out of the development labs. Given the torrid pace of product releases from nVidia and, eventually, ATI, Intel couldn’t keep up. Its benchmark targets had been surpassed by the time its graphics card saw the light of day, and they couldn’t give them away. (See Wikipedia: Intel i740)

1998 saw the failure of the Intel i740 AGP graphics card

The Intel740, or i740, is a graphics processing unit using an AGP interface released by Intel in 1998. Intel was hoping to use the i740 to popularize the AGP port, while most graphics vendors were still using PCI. Released with enormous fanfare, the i740 proved to have disappointing real-world performance, and sank from view after only a few months on the market.

Enter Larrabee, a whole new ball game at Intel, right?! The trend toward larger numbers of parallel processors on GPUs from nVidia and ATI/AMD led Intel to believe it might leverage some of its production lines to make a graphics card again. But this time things were different: nVidia had moved from single-purpose GPUs to general-purpose GPUs in order to create a secondary market for its cards as compute-intensive co-processors. It called the effort CUDA and provided a few development tools in the early stages. Intel latched onto this general-purpose GPU idea and decided it could do better. What’s more general purpose than an Intel x86 processor, right? And what if you could provide the libraries and hardware abstraction layer that could turn a large number of processor cores into something that looked and smelled like a GPU? (A rough sketch of that idea follows below.)
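Here is a rough, hypothetical sketch in C of that pitch: a GPGPU-style “kernel” is just a function applied to every element of a grid, and a runtime or HAL for a many-core x86 part could map chunks of that grid onto plain old threads. The names (kernel, launch_kernel) and the 32-core count are illustrative assumptions, not nVidia’s CUDA API or Intel’s actual Larrabee software stack.

```c
/* Conceptual sketch only (not actual CUDA or Larrabee code): a GPGPU-style
 * "kernel" is a function applied to every element of a grid. A runtime/HAL
 * for a many-core x86 part could map blocks of the grid onto ordinary
 * threads, which is roughly the pitch described above. */
#include <pthread.h>
#include <stddef.h>

#define NUM_CORES 32   /* assumed core count, per the articles */

typedef struct { float *out; const float *in; size_t begin, end; } span_t;

/* The per-element "kernel": square each input value. */
static void kernel(float *out, const float *in, size_t i)
{
    out[i] = in[i] * in[i];
}

static void *worker(void *arg)
{
    span_t *s = (span_t *)arg;
    for (size_t i = s->begin; i < s->end; i++)
        kernel(s->out, s->in, i);
    return NULL;
}

/* "Launch" the kernel over n elements by carving the index range into
 * one contiguous chunk per core. */
void launch_kernel(float *out, const float *in, size_t n)
{
    pthread_t tid[NUM_CORES];
    span_t    spans[NUM_CORES];
    size_t    chunk = (n + NUM_CORES - 1) / NUM_CORES;

    for (int t = 0; t < NUM_CORES; t++) {
        spans[t].out   = out;
        spans[t].in    = in;
        spans[t].begin = (size_t)t * chunk < n ? (size_t)t * chunk : n;
        spans[t].end   = (size_t)(t + 1) * chunk < n ? (size_t)(t + 1) * chunk : n;
        pthread_create(&tid[t], NULL, worker, &spans[t]);
    }
    for (int t = 0; t < NUM_CORES; t++)
        pthread_join(tid[t], NULL);
}
```

On a real GPU the “launch” fans out over thousands of hardware threads; Intel’s bet was that a few dozen beefy x86 cores with wide vector units could fake that well enough.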

For Intel it seemed like a win/win/win: everybody wins. The manufacturing lines using the older 45nm design rules could be used for production, making the graphics card pure profit. They could put 32 processors on a card and program them to pull double duty for the OS (graphics for games, co-processor for transcoding videos to MP4). But each time they showed a white paper and a demo at a trade show, it became obvious the timeline and schedule were slipping. They had benchmarks to show, great claims to make, and future projections of performance to declare. Roadmaps were the order of the day. But just last week the rumors started to set in.

Similar to the graphics card foray of the past, Intel couldn’t beat its time-to-market demons. The Larrabee project was going to be very late and was still using 45nm manufacturing design rules. Given that Intel’s top-of-the-line production lines moved to 32nm this year, and that nVidia and AMD are doing process shrinks on their current products, Intel was at a disadvantage. Rather than scrap the thing and lose face again, they decided to salvage what they could and put Larrabee out there as a free software/hardware development kit, to see if that was enough to get people to bite. I don’t know what benefit, if any, development on this platform would bring. It would rank right up there with the Itanium and the i740 as hugely promoted dead-end products with zero to negative market share. Big Fail – Do Not Want.

And for you armchair, Monday-morning technology quarterbacks, here are some links to enjoy leading up to today’s NYTimes article:

Tim Sweeney Laments Intel Larrabee Demise (Tom’s Hardware Dec. 7)

Intel Kills Consumer Larrabee Plans (Slashdot Dec. 4)

Intel delays Larrabee GPU, aims for developer “kit” in 2010 (MacNN Dec. 4)

Intel condemns tardy Larrabee to dev purgatory (The Register Dec. 4)

Categories
computers technology wintel

More word on Larrabee, the i740 of new GPUs

Remembering that the Intel Itanium was supposed to be a ground-breaking departure from the past, can Larrabee be all that and more for graphics? Itanium is still not what Intel had hoped, and poor early adopters are still buying vastly over-priced, minor incremental revs of the same CPU architecture to this day. Given the delays (2011 is now the release date) and its size (650 mm²), how is Intel ever going to make this project a success? It seems bound for the Big Fail heap of the future, as it bears an uncanny resemblance to Itanium and the Intel i740 graphics architecture. The chip is far too big and the release date too far into the future to keep up with developments at nVidia and AMD, who are not going to stand still waiting for this behemoth to be released to manufacturing. I just don’t know how Larrabee is ever going to be successful. It took so long to release the i740 that the market for low-end GPUs had eroded to the point where Intel could only sell the card for a measly $35, and even then no one bought it.

Larrabee GPU

According to current known information, our source indicated that Larrabee may end up being quite a big chip–literally. In fact, we were informed that Larrabee may be close to 650mm square die, and to be produced at 45nm. “If those measurements are normalized to match Nvidia’s GT200 core, then Larrabee would be roughly 971mm squared,” said our source–hefty indeed. This is of course, an assumption that Intel will be producing Larrabee on a 45nm core.

via Intel’s ‘Larrabee’ to Be “Huge” – Tom’s Hardware.
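For the curious, that 971 mm² figure looks like simple area scaling: die area grows roughly with the square of the process-node ratio, and 650 mm² × (55/45)² ≈ 650 × 1.49 ≈ 971 mm², which lines up with the 55nm GT200b die shrink rather than the original 65nm GT200 (which would come out closer to 1,350 mm²). That is my reading of the source’s arithmetic, not something Tom’s Hardware spells out.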