SeaMicro has come up with a scenario it can win, but it’s too specific, esoteric and niche to anchor a winning advertising campaign. Suffice it to say, if you need SeaMicro you probably already know it and have bought one by now. If you don’t, you are most likely doing fine with what you already have. Intel Xeon versus Atom x64: you be the judge.
I still have great hopes for Tilera in the data center cloud marketplace. But the only real competition out there now is SeaMicro’s own SM10000-64, which is tearing up the charts with Intel’s Atom N570. Once Tilera is able to ship its chips in volume and get manufacturers building servers with Tilera CPUs inside, it will be a true horse race.
Most recently I have detected a disturbance in the Force. The purveyors of the predominant data center storage paradigm, large drive arrays costing $100,000 and up, are about to have their lunch eaten by a young upstart. Hitachi Data Systems, EMC, NetApp and IBM: you better watch out, you better not cry, you better be faster ’cuz I’m a-tellin’ ya why.
Intel is doing its level best to spread Fear, Uncertainty and Doubt about how viable ARM-based chips would be in a data center server rack. ARM is the engine of many a cell phone, but server loads? That’s the question Intel keeps raising even as data center floor space and cooling costs grow more expensive. Calxeda, for its part, is emphasizing lower energy consumption, trying to put a greener face on its potential data center installations.
Visualizations and their efficacy always take me back to Edward Tufte’s big hardcover books on infographics (or chart junk, when it’s done badly). In terms of this specific category, visualization leading to a goal, I think it’s still very much a ‘general case’. But examples are always better than theoretical descriptions of an ideal. Goal oriented visualizations? (via Erik Duval’s Weblog)
All the Fear, Uncertainty and Doubt (FUD) spread by the big legacy manufacturers of data center hard drive storage is an attempt to stem, or at least delay, the burgeoning tidal wave of Flash memory based storage. Yes, the economics of Flash-based storage are not quite there yet, but for the high-performance, high-throughput folks, the future is now.
I don’t know if you have ever heard of relational databases or Structured Query Language (SQL). They became de rigueur in most corporate data centers after 1977, pushing more power into the hands of users instead of programmers. But that type of structured data can only carry you so far before you bump against its limits. In this age of social networking and data gathering on users, we are severely testing the limits of the last big thing in databases.
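To make that concrete, here is a toy sketch of the relational model the post is talking about, using Python’s built-in sqlite3. The tables, names and data are all invented for illustration; the point is that one declarative SQL query answers a question that would otherwise need custom code, which is exactly the power-to-the-users shift described above.

```python
import sqlite3

# Minimal relational setup: structured tables, queried with SQL.
# All table names and rows here are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE follows (follower INTEGER, followee INTEGER);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob'), (3, 'carol');
    INSERT INTO follows VALUES (1, 2), (1, 3), (2, 3), (3, 1);
""")

# One declarative query answers "whom does alice follow?" --
# no programmer-written traversal code required.
rows = conn.execute("""
    SELECT u2.name
    FROM users u1
    JOIN follows f ON f.follower = u1.id
    JOIN users u2 ON u2.id = f.followee
    WHERE u1.name = 'alice'
    ORDER BY u2.name
""").fetchall()
print([name for (name,) in rows])  # ['bob', 'carol']
```

The social-graph flavor of this example is deliberate: each extra hop (friends of friends of friends) multiplies the joins, which is one face of the scaling limit the paragraph alludes to.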
What darkness lurks in the hearts of men? Only the Shadow knows, right? Or possibly a Tilera chip sitting in an NSA data-skimming operation located at your local Internet GigaPOP.
In spite of yesterday’s news about Intel’s “3D” transistor in its upcoming 22nm production fab, there’s other Intel research still ongoing that might prove groundbreaking as well. I’m talking about the experimental 80-core Intel CPU that followed on from Intel’s failed attempts at a graphics processor: the i740 and its follow-up, Larrabee. The latter GPU, notably, was created from a bunch of general-purpose Intel Pentium (P54C) cores, shrunk down and crammed together on a PCI card. From Larrabee sprang the 80-core CPU research, which shrank to 48 cores and now 24.
Amazon has a data center that it both uses for its own internal commerce website and shares out to anyone willing to pay hourly rates for access to the Amazon data cloud. Part of the whole constellation of services is fault-tolerant data storage (think a farm of hard drives, all in racks) that will automatically detect problems and switch over to a different location without human intervention. Well, that didn’t happen during an outage on Amazon Web Services back in April.
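The automatic switchover idea can be sketched in a few lines. This is a toy model, not Amazon’s actual mechanism: the replica names, the ping check and the election logic are all hypothetical, but they show the shape of "detect a failure, promote a healthy location, no human in the loop."

```python
class Replica:
    """A storage location that may or may not be responding (toy model)."""
    def __init__(self, name, healthy=True):
        self.name = name
        self.healthy = healthy

    def ping(self):
        # Stand-in for a real health check (heartbeat, timeout, etc.).
        return self.healthy

def elect_primary(replicas, current):
    """Keep the current primary if it answers; otherwise fail over."""
    if current.ping():
        return current
    for r in replicas:
        if r is not current and r.ping():
            return r  # automatic switchover, no human intervention
    raise RuntimeError("no healthy replica available")

# Hypothetical two-location setup; simulate the primary going dark.
us_east = Replica("us-east")
us_west = Replica("us-west")
us_east.healthy = False
primary = elect_primary([us_east, us_west], us_east)
print(primary.name)  # us-west
```

The April outage was, in effect, a case where the real-world equivalent of this loop failed to converge: the detection and re-replication machinery itself got overwhelmed, so the promised hands-off failover never completed.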