Posts Tagged ‘apple’
The schedules may help back mounting beliefs that the iPhone 5 will ship with more storage. A 64GB iPhone 4 prototype appeared last month that hinted Apple was exploring the idea as early as last year. Just on Tuesday, a possible if disputed iPod touch with 128GB of storage also appeared and hinted at an upgrade for the MP3 player as well. Both the iPhone and the iPod have been stuck at 32GB and 64GB of storage respectively since 2009 and are increasingly overdue for additional space.
Toshiba has revised its flash memory production lines again to keep pace with the likes of Intel, Micron and Samsung. Higher densities and smaller form factors suggest it is gearing up for a big production run of the highest-capacity memory modules it can make. A new iPhone looks like the prime candidate to receive the newer multi-layer, single-chip 64GB flash memory modules this year.
A note of caution in this arms race of ever-smaller feature sizes on flash memory modules: the smaller you go, the fewer read/write cycles you get. Each new generation of flash memory production has lost some robustness. This problem has been camouflaged, maybe even handled outright, by over-provisioning, stocking a Solid State Disk (SSD) with more chips than its rated capacity requires (sometimes as little as 17% more than what is typically used when the drive is full). Through careful statistical modeling and algorithms, an ideal shuffling of the deck of available flash memory chips spreads the load: no single chip fails, because its workload is continuously shifted to ensure it never gets anywhere near the maximum number of reliable read/write cycles. Similarly, attempts to 'recover' data from failing memory cells within a chip module also make up for these problems. Last but not least, outright error-correcting hardware has been implemented on-chip to ensure everything just works, from the beginning of the life of the SSD to the final days of its useful life.
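That continuous shuffling of workload is usually called wear leveling. A toy sketch of the idea (purely illustrative, not any vendor's actual firmware, which also remaps logical addresses and relocates static data): always route the next write to the least-worn block, so erase counts stay roughly even across the whole pool and no block hits its cycle limit early.

```python
# Toy dynamic wear-leveling sketch: send each write to the least-worn
# block so erase counts stay balanced across the pool.
# Illustrative only -- real SSD controllers are far more elaborate.
import heapq

class WearLeveler:
    def __init__(self, num_blocks, max_cycles=3000):
        self.max_cycles = max_cycles
        # Min-heap of (erase_count, block_id): least-worn block pops first.
        self.pool = [(0, b) for b in range(num_blocks)]
        heapq.heapify(self.pool)

    def write(self):
        cycles, block = heapq.heappop(self.pool)
        if cycles >= self.max_cycles:
            raise RuntimeError(f"block {block} worn out")
        heapq.heappush(self.pool, (cycles + 1, block))
        return block

wl = WearLeveler(num_blocks=4, max_cycles=3000)
for _ in range(1000):
    wl.write()
print(sorted(c for c, _ in wl.pool))  # 1000 writes spread evenly: [250, 250, 250, 250]
```

Without this shuffling, a hot logical address would hammer one physical block to death while its neighbors sat nearly unused.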
We may never see the SSD eclipse the venerable kind of high-density storage, the Hard Disk Drive (HDD). Given the point of diminishing returns on Moore's Law (scaling down increases density, increases speed, lowers costs), flash may never get down to the level of density we enjoy in a typical consumer-brand HDD (2TBytes). We may have to settle for schemes that reach that target through other means. Which brings me to my favorite product of the moment, the PCIe-based SSD: nothing more than a big circuit board with a bunch of SSDs tied together in a disk array, with a big fat memory/error-correction controller sitting on it. Riding the PCI Express bus, current products beat single SATA 6Gb/s SSDs by a factor of two. And given the generous PCIe form factor, any given module could be several times bigger, and built from chips two generations older, and still reach the 2-terabyte capacity of a typical SATA hard drive of today. That sounds like a great deal to me, if we could also see drops in price and gains in reliability by using older, previous-generation products and technology.
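The factor-of-two speedup is mostly just parallelism: stripe reads across several drives behind one controller and bandwidth adds up until the bus saturates. A back-of-the-envelope sketch (the throughput figures are my own illustrative assumptions, not measured specs of any real product):

```python
# Rough aggregate-bandwidth estimate for a striped array of SSD modules
# behind a single PCIe controller. All numbers are illustrative
# assumptions, not specifications of any shipping product.

def array_read_speed(per_drive_mb_s, num_drives, bus_limit_mb_s):
    """Striped reads scale with drive count until the bus saturates."""
    return min(per_drive_mb_s * num_drives, bus_limit_mb_s)

sata3_ssd = 550        # ~MB/s, a fast single SATA 6Gb/s SSD (assumed)
pcie_bus = 4000        # ~MB/s usable on a wide PCIe slot (assumed)

print(array_read_speed(sata3_ssd, 2, pcie_bus))  # 1100 -> ~2x one drive
print(array_read_speed(sata3_ssd, 8, pcie_bus))  # 4000 -> bus-limited
```

The same arithmetic shows why older, slower chips are still viable on a PCIe card: add more of them in parallel and the array can still outrun a single current-generation SATA drive.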
But the mobile market is hard to please, and it drives most decisions about what kind of flash memory modules get ordered en masse. No doubt Apple, Samsung and everyone else in consumer electronics will advise manufacturers to keep shrinking their chips to increase density and keep prices up on the final shipping product. I don't know how efficiently an iPhone or iPad uses the available memory on, say, a 64GByte iPod touch. Most of it goes into storing the music, TV shows and apps people want readily available while passing time. The beauty of that design is it rewards consumption, providing more capacity and raising marginal profit at the same time. This engine of consumer electronics design doesn't look likely to stop in spite of the physical limits of shrinking flash memory chips. But there will be a day of reckoning soon, not unlike when Intel hit the wall at 4GHz serial processors and had to go multi-core to keep its marginal revenue flowing. Processor performance has made very lateral progress since then. It is more than likely flash memory chips cannot get much smaller without becoming unreliable and defective, thereby sliding into the same lateral incrementalism Intel has adopted. Get ready for the plateau.
Cisco killed off the much-beloved Flip video camera Tuesday. It was an unglamorous end for a cool device that just a few years earlier shocked us all by coming to dominate the video-camera market, utterly routing established players like Sony and Canon.
I don’t usually write about consumer electronics per se. This particular product got my attention due to its long gestation and its overwhelming domination of a market category that didn’t exist until it was created: the pocket video camera with a built-in flip-out USB connector. Like a USB flash drive with an LCD screen, a lens and one big red button, the Flip pared everything down to the absolute essentials, including the absolute immediacy of online video sharing via YouTube and Facebook. Now the revolution has ended, devices have converged, and many are telling the story of why this happened. Wired.com’s Robert Capps claims the Flip lost its way after Cisco bogged down the Flip 2 revision trying to get a WiFi-connected camera out there for people to record their ‘Lifestream’.
Prior to Robert Capps, different writers for different pubs all spouted the conclusion of Cisco’s own media relations folks: the Flip camera was the victim of inevitable convergence, pure and simple. Smartphones, in particular Apple’s iPhone, kept adding features once available only on the Flip. Easy recording, easy sharing, higher resolution, a bigger LCD screen, and it could play Angry Birds too! I don’t cotton to that conclusion as fed to us by Cisco. It’s too convenient, and the convergence myth does not account for the one thing the Flip has that the iPhone doesn’t have, has never had and WILL never have: a simple, industry-standard connector. Yes folks, convergence is not simply displacing cherry-picked features from one device and incorporating them into yours. True convergence is picking up all that is BEST about one device and incorporating it, so that fewer and fewer compromises must be made. Which brings me to the issue of the Apple multi-pin dock connector that has been with us since it debuted on the iPod in 2003.
See, the Flip didn’t have a proprietary connector, it just had a big old ugly USB connector, just as big and ugly as the one your mouse and keyboard use to connect to your desktop computer. The beauty of that choice was the Flip could connect to just about any computer manufactured after 1998 (when USB was first hitting the market). The second thing was that all the apps for playing back the videos you shot, or for cutting them down and editing them, were sitting right on the Flip, just like a hard drive, waiting for you to install them on whichever random computer you wanted to use. It didn’t matter whether the computer already had the software installed; it COULD be installed directly from the Flip itself. Isn’t that slick?! You didn’t have to first search for the software online, download it and install it; it was right there, just double-click and go.
Compare this to the Apple iOS cul-de-sac we all know as iTunes. Your iPhone, iPod touch, iPad or iPod cannot get to know your computer simply by communicating through its USB connector. You must first have iTunes installed AND have your proprietary Apple-to-USB cable to link up. Then and only then can your device ‘see’ your computer and the Internet. This gated community provided through iTunes allows Apple to see what you are doing, market directly to you and watch as you connect to YouTube to upload your video, all with the intention of one day acting on that information, maintaining full control at each step along the pathway from shooting to sharing your video. If this is convergence, I’ll keep my old Flip Mino (non-HD), thank you very much. Freedom (as in choice) is a wonderful thing, and compromising it in the name of convergence (mis-recognized as convenience) is no compromise. It is a racket, and everyone wants to sell you on the ‘good’ points of the racket. I am not buying it.
- RIP Flip cameras.. You will be missed! (chatootsboots.wordpress.com)
- Alternatives to the dearly departed Flip camera (trafcom.typepad.com)
- Farewell, Flip Camera (www.readwriteweb.com)
- Cisco fades out Flip camera (www.consumerreports.com)
- Why Cisco’s Flip Flopped in the Camera Business (www.wired.com/gadgetlab)
“Could Apple be opening up the platform more?” he asked. “What happens to NVIDIA? Why support for cards that aren’t in Macs yet? Will the 2011 Sandy Bridge iMacs contain one or more of these new 6xxx cards?”
This is an interesting tidbit of news. A Macintosh hacker has discovered within the most recent update of Mac OS X 10.6 a number of hardware drivers for ATI graphics cards that do not ship with, and are currently ‘unsupported’ on, the Mac. Anyone who has attempted to buy aftermarket, third-party OEM graphics cards for Macs knows this is a treacherous minefield to navigate. The principal problem is that Apple absolutely, positively does not want people sticking any old graphics card in a Mac Pro tower, or even in the old legacy towers going back to the first PowerPC/PCI-based Macs. No, you must buy the bona fide supported hardware direct from Apple, with the drivers they supply. In a pinch you might be able to fake it with a PC graphics card that has had its BIOS flashed to make it appear to be a genuine Apple part.
But now, if Apple is bundling up drivers for various and sundry graphics cards (albeit from one supplier: ATI), is it possible you could finally buy any card you wanted and have it work? That would be big news indeed for any owner of an end-user-upgradeable Mac Pro, and welcome news at that. I’m hoping this story continues to develop and Apple comes out with a policy or strategy statement heralding a change in its past posture towards peripheral manufacturers. More devices being supported would be a great thing.
- 10.6.7 has new AMD video card support, perhaps for new iMacs (9to5mac.com)
- Mac OS X may natively support “PC” Radeon graphics cards (arstechnica.com)
Apple’s Xserve was born in the spring of 2002 and is scheduled to die in the winter of 2011, and I now step up before its mourners to speak the eulogy for Apple’s maligned and misunderstood server product.
Chuck Goolsbee’s eulogy is spot on, and every point rings true even in my limited experience. I’ve purchased two different Xserves since they were introduced. One is a 2nd-generation G4 model, the other a 2006 Intel model (thankfully I skipped the G5 altogether). Other than a weird bug in the Intel-based Xserve (a strange blue video screen), there have been no bumps or quirks to report. I agree the form factor of the housing is way too long; even in the rack I used (a discarded Sun Microsystems unit), the thing was really inelegant. The drive bays are a sore point for me too. I have wanted dearly to re-arrange, reconfigure and upgrade the drive bays on both the old and newer Xserve, but the expense of acquiring new units was prohibitive at best, and they went out of manufacture very quickly after being introduced. If you neglected to buy your Xserve fully configured with the maximum storage available when it shipped, you were more or less left to fend for yourself. You could troll eBay and bulletin boards to score a bona fide Apple drive bay, but the supply was so limited it drove up prices and created a black market. The Xserve RAID didn’t help things either, as drive bays were not consistently swappable between the Xserve and the Xserve RAID box. Given the limited time most sysadmins have to research purchases like this to upgrade an existing machine, it was a total disaster: a big fail, and an unsurprising one.
I will continue to run my Xserve units until the drives or power supplies fail. It could happen any day, any time, and hopefully I will have sufficient warning to get a new Mac mini server as a replacement. Until then I too, along with Chuck Goolsbee and the rest of the Xserve sysadmins, will wonder what could have been.
Last year, Samsung told the world it had teamed with Intrinsity on a 1GHz ARM chip known as the Hummingbird, and Samsung manufactures the ARM chips underpinning the Apple iPhone, a smaller version of the iPad. This has led many to assume that the Hummingbird architecture is the basis for the A4.
I am sure that Apple’s ability to act quickly and independently won them not just design expertise but an actual, nearly finished CPU in the form of the Hummingbird project. There does now seem to be a smartphone megahertz war similar to the bad old days of desktop computing, when AMD and Intel fought it out one gigahertz at a time. We will see what comes of this when the new iPhones come out this summer. The A4 may not translate into a handheld CPU form factor, but looking at the iFixit teardown of the iPad makes me think the iPad motherboard is almost the size of a cell phone! So who knows, maybe the A4 is scalable down to the iPhone as well. We’ll find out in June, I’m sure, when Apple hosts its Worldwide Developers Conference (WWDC) in San Francisco, CA.