Posts Tagged ‘apple’
“Could Apple be opening up the platform more?” he asked. “What happens to NVIDIA? Why support for cards that aren’t in Macs yet? Will the 2011 Sandy Bridge iMacs contain one or more of these new 6xxx cards?”
This is an interesting tidbit of news. A Macintosh hacker has discovered, within the most recent update of Mac OS X 10.6, a number of hardware drivers for ATI graphics cards that do not ship in any Mac and are currently ‘unsupported’. Anyone who has attempted to buy aftermarket, third-party OEM graphics cards for Macs knows this is a treacherous minefield to navigate. The principal problem is that Apple absolutely, positively does not want people sticking any old graphics card in a Mac Pro tower, or even in old legacy towers going back to the first PowerPC/PCI-based Macs. No, you must buy the bona fide supported hardware direct from Apple, with drivers they supply. In a pinch you might be able to fake it with a PC graphics card that has had its BIOS flashed to make it appear to be a genuine Apple part.
But now, if Apple is just bundling up a bunch of drivers for various and sundry graphics cards (albeit from one supplier: ATI), is it possible you could finally buy any card you wanted and it would work? That would be big news indeed for any owner of an end-user-upgradeable Mac Pro, and welcome news at that. I’m hoping this story continues to develop and Apple comes out with a policy or strategy statement heralding a change in its past stance toward peripheral manufacturers. More devices being supported would be a great thing.
- 10.6.7 has new AMD video card support, perhaps for new iMacs (9to5mac.com)
- Mac OS X may natively support “PC” Radeon graphics cards (arstechnica.com)
Apple’s Xserve was born in the spring of 2002 and is scheduled to die in the winter of 2011, and I now step up before its mourners to speak the eulogy for Apple’s maligned and misunderstood server product.
Chuck Goolsbee’s eulogy is spot on, and every point rings true even in my limited experience. I’ve purchased two different Xserves since they were introduced. One is a 2nd-generation G4 model, the other a 2006 Intel model (thankfully I skipped the G5 altogether). Other than one bug in the Intel-based Xserve (a weird blue video screen), there have been no bumps or quirks to report. I agree that the form factor of the housing is way too long. Even in the rack I used (a discarded Sun Microsystems unit), the thing was really inelegant. The drive bays too are a sore point for me. I have wanted dearly to re-arrange, reconfigure, and upgrade the drive bays on both the old and newer Xserve, but the expense of acquiring new units was prohibitive at best, and they went out of manufacture very quickly after being introduced. If you neglected to buy your Xserve fully configured with the maximum storage available when it shipped, you were more or less left to fend for yourself. You could trawl eBay and bulletin boards to score a bona fide Apple drive bay, but the supply was so limited it drove up prices and created a black market. The Xserve RAID didn’t help things either, as drive bays were not consistently swappable between the Xserve and the Xserve RAID box. Given the limited time most sysadmins have for researching purchases like this to upgrade an existing machine, it was a total disaster, a big fail, and an unsurprising one.
I will continue to run my Xserve units until the drives or power supplies fail. That could happen any day, at any time, and hopefully I will have sufficient warning to get a new Mac mini server to replace them. Until then I too, along with Chuck Goolsbee and the rest of the Xserve sysadmins, will wonder what could have been.
Last year, Samsung told the world it had teamed with Intrinsity on a 1GHz ARM chip known as the Hummingbird, and Samsung manufactures the ARM chips underpinning the Apple iPhone, a smaller version of the iPad. This has led many to assume that the Hummingbird architecture is the basis for the A4.
I am sure that Apple’s ability to act quickly and independently won it not just design expertise, but an actual nearly finished CPU in the form of the Hummingbird project. There does now seem to be a smartphone Megahertz War similar to the bad old days of desktop computing, when AMD and Intel fought it out one gigahertz at a time. We will see what comes of this when the new iPhones come out this summer. The A4 may not translate into a handheld CPU form factor. But looking at the iFixit teardown of the iPad makes me think the iPad motherboard is almost the size of a cell phone’s! So who knows, maybe the A4 is scalable down to the iPhone as well. We’ll find out in June, I’m sure, when Apple hosts its Worldwide Developers Conference (WWDC) in San Francisco, CA.
The custom A4 processor in the iPad is in reality a castrated Cortex A8 ARM design, say several sources.
This is truly interesting, and really shows an attempt to optimize the chip around ‘known’ working designs. Covering the first announcement of the A4 chip by Brightside of News, I tried to argue that customizing a chip by licensing a core design from ARM Holdings Inc. isn’t all that custom. Following this, Ashlee Vance wrote in the NYTimes that the cost of developing the A4 ‘could be’ upwards of $1 billion. And now just today MacNN/Electronista is saying Apple used the ARM A8. By this I mean the ARM Cortex A8 is a licensed core already being used in the Apple iPhone 3GS. It is a proven, known CPU core that engineers at Apple are familiar with. Given that level of familiarity, it’s a much smaller step to optimize that same CPU core for speed and integration with other functions; the GPU or memory controllers, for instance, can be tightly bound into the final CPU. Add a dose of power management and you’ve got good performance and good battery life. It’s not cutting edge to be sure, but it is more guaranteed to work right out of the gate. That’s a bloodthirsty step in the right direction of market domination.

However, the market hasn’t yet shown itself to be so large and self-sustaining that slate devices are a sure thing in the casual/auxiliary/secondary computing device market. You may have an iPhone and you may have a laptop, but this device is going to be purchased IN ADDITION TO, not INSTEAD OF, those two existing devices. So anyone who can afford a third device is probably the target market for the iPad, as opposed to people who want to substitute an iPad for either the iPhone or the laptop.
In bypassing a traditional chip maker like Intel and creating its own custom ARM-based processor for the iPad, Apple has likely incurred an investment of about $1 billion, a new report suggests.
After reading the NYTimes article linked to within this article I can only conclude it’s a very generalized statement that it costs $1Billion to create a custom chip. The exact quote from the NYTimes article author Ashlee Vance is: “Even without the direct investment of a factory, it can cost these companies about $1 billion to create a smartphone chip from scratch.”
Given that is one third the full price of building a chip fabrication plant, why so expensive? What is the breakdown of those costs? Apple did invest money in PA Semi to get some chip-building expertise (PA Semi primarily designed chips that were fabricated at overseas contract manufacturing plants). Given that Qualcomm has created the Snapdragon CPU using similar CPU cores from ARM Holdings Inc., must it have $1 billion to throw around too? Qualcomm was once dominant in the cell phone market, licensing its CDMA technology to the likes of Verizon, but its financial success is nothing like the old days. So how does Qualcomm come up with $1 billion to develop the Snapdragon CPU for smartphones? Does that seem possible?
Qualcomm and Apple are licensing the biggest building blocks and core intellectual property from ARM; all they need to do is place, route, and verify the design. Where does the $1 billion figure come into it? Is it the engineers? Is it the masks for exposing the silicon wafers? I argue now, as I did in my first posting about the Apple A4 chip, that the chip is an adaptation of intellectual property, a license to a CPU design provided by ARM. It’s not literally created ‘from scratch’, starting with no base design or using completely new proprietary intellectual property from Apple. This is why I am confused. Maybe ‘from scratch’ means different things to different people.
I remember reading announcements of the 64GB SDXC card format coming online from Toshiba. And just today Samsung has announced it’s making a single-chip 64GB flash memory module with a built-in memory controller. Apple’s iPhone design group has been a big fan of the single-chip, large-footprint flash memory from Toshiba; Apple bought up all of Toshiba’s supply of 32GB modules before it released the iPhone 3GS last summer. Samsung too was providing 32GB modules to Apple prior to the launch. Each summer, newer, bigger modules make for insanely great things that the iPhone can do.

Between the new flash memory recorders from Panasonic/JVC/Canon and the iPhone, what will we do with the doubling of storage every year? Surely there will be a point of diminishing returns, where the chips cannot be made any thinner and stacked any higher to make these huge single-chip modules. I think back to the slow evolution and radical incrementalism of the iPod’s history: 5GB of storage to start, then 30GB and video! Remember that? The video iPod at 30GB was dumbfounding at the time. Eventually it would top out at 120 and now 160GB on the iPod classic. At the rate of change in the flash memory market, the memory modules will double in density again by this time next year, achieving 128GB for a single-chip module with embedded memory controller. At that density, a single SDHC-sized memory card will be able to hold that amount of storage as well. At the 128GB mark we are fast approaching the optimal size for any amount of video recording we could ever want to do and still edit. At that size we’ll be able to record upwards of 20 hours of 1080p video on today’s video cameras. Who wants to edit, much less watch, 20 hours of 1080p video? But for the iPhone, things are different: more apps means more fun.
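The 20-hour figure checks out on the back of an envelope. Here is a quick sketch of the arithmetic; the ~13 Mbps bitrate is my assumption (typical of consumer 1080p camcorders of this era), not a number from any announcement.

```python
# Rough capacity math: hours of 1080p video per flash module.
# The 13 Mbps bitrate is an assumed example, not a vendor figure.

def recording_hours(capacity_gb: float, bitrate_mbps: float) -> float:
    """Hours of video that fit in capacity_gb at bitrate_mbps."""
    total_bits = capacity_gb * 1e9 * 8          # flash makers count decimal GB
    seconds = total_bits / (bitrate_mbps * 1e6)
    return seconds / 3600

for gb in (64, 128, 256):
    print(f"{gb} GB at 13 Mbps -> {recording_hours(gb, 13):.1f} hours")
```

At 128GB and 13 Mbps that works out to roughly 22 hours, which is where the "upwards of 20 hours" estimate comes from.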
And at 128GB of storage you never have to delete an app, a single song from your iTunes library, or a single picture or video; just keep everything. Similarly for those folks using GPS: you could keep all the maps you ever wanted right onboard rather than downloading them all the time, providing continuous navigation like you would get with a dedicated GPS unit. I can only imagine the functionality of the iPhone increasing as a result of the storage that 64GB flash memory modules would provide. Things can only get better. And speaking of better, The Register just reported today on some future directions.
There could be a die process shrink in the next gen flash memory products. There are also some opportunities to use slightly denser memory cells in the next gen modules. The combination of the two refinements might provide the research and design departments at Toshiba and Panasonic the ability to double the density of the SDXC and Flash memory modules to the point where we could see 128GBytes and 256GBytes in each successive revision of the technology. So don’t be surprised if you see a Flash memory module as standard equipment on every motherboard to hold the base Operating System with the option of a hard drive for backup or some kind of slower secondary storage. I would love to see that as a direction netbook or full-sized laptops might take.
http://www.electronista.com/articles/09/04/27/toshiba.32nm.flash.early/ (Toshiba) Apr 27, 2009
http://www.electronista.com/articles/09/05/12/samsung.32gb.movinand.ship/ (Samsung) May 13, 2009
http://www.theregister.co.uk/2010/01/14/samsung_64gbmovinand/ (Samsung) Jan 14, 2010
The answer to the question in the picture to the left is a resounding NO! All bets are being placed on Apple using a custom processor for its version of a Tablet PC. This is interesting in that everyone in the technology/computer/gizmo/gadget news circles has pursued this as a story starting last week.
The China Times Daily is apparently hinting that a new Mac tablet is being manufactured for release in October of this year. I’m surprised they decided to go with a custom processor for the tablet. But given the ultra-competitiveness of the Wintel netbook market, CPUs are the next big thing in product differentiation. There are manufacturers now using Google’s Android cell phone OS paired with cell phone processors from ARM and Motorola for a new generation of battery-conserving netbooks. Most of those products are targeted at the Pacific Rim market and will never see the American market at all, which makes me sad, because I would love to have a netbook with extra-long battery run times.
I have adapted many of my computing needs to what can be delivered through a network, a web browser, and web apps. So the netbook to me is a nice analog to a cell phone, and I’ve been waiting to jump into the market until some bigger innovations occurred. Maybe this product will help shift the market the way the iPhone did for cell phones. And with its newfound CPU designs, maybe product differentiation will be even easier for Apple.
However, the hairy eyeball of experience rears its ugly head and takes the shine off this buzzing hive of technology press bees. Enter Mark Sigal at O’Reilly.com. Mark doesn’t think the tablet is the real story; he thinks the iPod Touch IS the Mac tablet, right here, right now. Given his earlier survey of the mobile computing landscape, Mark Sigal proposes a unified matrix of Apple computing products rather than phone vs. computer. So the longer one waits, the less we have to worry about whether we should be buying a tablet or an iPhone. Personally I think the bigger screen and the new CPU from PA Semi do warrant some extra attention. I think we’re going to see either longer battery run times, or maybe a mix of iApps, iLife, and iWork on the same happy device. But who knows? We all have to wait until October.
The VentureBeat note says that Apple divided the PA-Semi designers between two projects: ARM-based mobile phone processors on the one hand and a tablet processor, possibly ARM-based as well, on the other.
So we are looking at an Apple CPU-powered Mac tablet with touchscreen functionality and an October launch. The timing is said to be suitable for sales in the lead up to Christmas. Neither the manufacturers nor Apple are saying anything.
Futurists are all alike. You have your 20th-century types, like the Italian Futurists who celebrated war. You have Hitler’s architect Albert Speer. You have guys like George Gilder hand-waving and making big pronouncements. And all of them use terms like ‘paradigm’ and ‘cusp’ as a warning to you slackers, trailers, and Luddite ne’er-do-wells. Make another entry in your list of predictions for Apple’s Worldwide Developers Conference (WWDC). Everyone feels like Apple has to really top what it has achieved since last year with the iPhone, the iPhone OS, and the App Store. Mark Sigal, writing for O’Reilly Radar, believes there’s so much untapped juice within the iPhone that an update to the OS will become the next cusp/paradigm shift.
From today’s O’Reilly Radar article by Mark Sigal:
Flash forward to the present, and we are suddenly on the cusp of a game-changing event; one that I believe kicks the door open for 3D and VR apps to become mainstream. I am talking about the release of iPhone OS version 3.0.
I’m not so certain. One can argue that even the average desktop 3D accelerator doesn’t really do what Sigal would ‘like’ to see in the iPhone. Data overlays are nice for a 3D-glasses kind of application, sure, but they’re not virtual reality. They’re more like a glorified heads-up display, which the military has had going back to the Korean War. So enter me into the column of the hairy eyeball, critical and suspicious of claims that an OS update will change things. In fact, OSes don’t change things; the way people think about things, that’s what changes things. The move of the World Wide Web from an information-sharing utility to a medium for commerce, that was a cusp/paradigm shift. And so it goes with the iPhone and the View-Master viewer. They’re fun, yes. But do they really make us change the way we think?
A co-worker has been working on a reporting tool to let a Mac user get reports from Time Machine whenever there’s a failure in the backup. Failure messages occasionally come up when Time Machine runs, but it never says what folder, what file, or really what kind of failure occurred. Which is not what you want if you are absolutely depending on the data being recoverable via Time Machine. It’s not bulletproof, and it will lull you into complacency once you have it up and running. I tend to agree that a belt-and-suspenders approach is best. I’ve read countless articles saying disk clones are the best and, on the other side, that incremental backups are most accurate (in terms of having the latest version of a file) and more efficient with disk space (no need to duplicate the System folder again, right?). With the cost of Western Digital My Books dropping all the time, you could purchase two separate USB2 My Books, use a disk-cloning utility on one drive and Time Machine on the other. Then you would have a bulletproof backup scheme. One reader commented on the article that off-site backup is necessary as well, so include that as the third leg of your backup triad.
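The kind of report my co-worker is after can be roughed out by scanning the system log for Time Machine’s backupd messages. This is only a sketch: the log path and the “com.apple.backupd”/“Error” message shapes are my assumptions based on how 10.6-era Time Machine logs, and the sample lines are invented for illustration.

```python
# Sketch: pull Time Machine (backupd) failure lines out of the system log.
# LOG_PATH and the message patterns are assumptions for 10.6-era systems.

LOG_PATH = "/var/log/system.log"   # where backupd messages typically land

def backupd_errors(lines):
    """Return log lines where backupd reported an error or failure."""
    return [
        line.rstrip()
        for line in lines
        if "com.apple.backupd" in line
        and any(word in line for word in ("Error", "Failed", "failed"))
    ]

# In real use you would pass open(LOG_PATH); canned sample lines here:
sample = [
    "Mar  1 02:00:01 mac com.apple.backupd[71]: Starting standard backup",
    "Mar  1 02:00:09 mac com.apple.backupd[71]: Error: (-36) SrcErr:YES Copying /Users/me/big.file",
    "Mar  1 02:00:10 mac com.apple.backupd[71]: Backup completed successfully.",
]
print(backupd_errors(sample))
```

Even this crude filter surfaces the file and error code that the Time Machine UI hides, which is exactly the complaint above.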
Since errors and failure can happen in any backup system, we recommend that if you have the available resources (namely, spare external hard drives) that you set up dual, independent backups, and, in doing so, take advantage of more than one way of backing up your system. This will prevent any errors in a backup system from propagating to subsequent backups.
One strongly recommended solution is to have both a snapshot-based system such as Time Machine and a bootable clone system using a software package such as SuperDuper or Carbon Copy Cloner. Doing this will ensure you can both boot and access your most recently changed files in the event of either data loss or hardware failure.
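The clone leg of that dual-backup scheme can be sketched with rsync. To be clear, this is only an illustration of the mirroring idea, not a substitute for SuperDuper or Carbon Copy Cloner, which handle the Mac-specific metadata a truly bootable clone needs; the destination path is hypothetical.

```python
# Sketch of the clone leg of a dual-backup setup, via rsync.
# Illustration only -- dedicated cloning tools are still recommended
# for a bootable Mac clone. Paths are hypothetical examples.
import subprocess

def clone_command(source: str, dest: str) -> list:
    """Build an rsync invocation that mirrors source onto dest."""
    return [
        "rsync",
        "-aE",        # archive mode; -E keeps extended attributes on Apple's rsync
        "--delete",   # make dest an exact mirror, pruning files removed from source
        source.rstrip("/") + "/",
        dest,
    ]

cmd = clone_command("/", "/Volumes/CloneBackup")
print(" ".join(cmd))
# To actually run it (as root): subprocess.run(cmd, check=True)
```

Run one leg like this nightly and let Time Machine handle the incremental snapshots on the second drive, and an error in either system leaves the other intact.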
It’s no secret that Robert X. Cringely follows the strategic directions of Apple’s laptop/desktop design teams:
Robert X. Cringely’s recent posting on PBS.org brings up the topic of Apple’s attempt to incorporate H.264 into its product line. New buyers of the most recently introduced Mac laptops have rushed to measure the CPU load of their machines while playing back HD TV and movie content downloaded from the iTunes Store. CPUs now idle along at 20% capacity versus the 100%+ experienced on the previous generation of Mac desktops and laptops. Where is the secret sauce?
Cringely expected NTT of Japan to provide a special custom-made encoder/decoder chip specifically geared for the H.264 codec. However, nowhere in the current teardowns of the MacBook and MacBook Pro has anyone identified a free-standing chip offloading H.264 decoding. Now he’s speculating the logic might have been licensed as a ‘core’ by NVIDIA and incorporated into the new fully integrated chipset that drives all the I/O on the motherboard. Somewhere in there, maybe even in the 16 cores of the video processor, some kind of H.264 decoding acceleration is going on. But it’s not being touted very widely by the Apple marketing machine.
Cringely suspects there’s a reason to soft-pedal H.264 acceleration on the new Macintoshes. While iTunes has in the past been nothing more than a means to an end (you want to sell iPods? Well, get the content to play on them first!), the burgeoning field of online content distribution may be the next big end in itself. Netflix has shown that even with a snail-mail distribution network, there is a profit to be made. But as I’ve heard coworkers repeat in the past, where’s the profit in letting someone OWN the content? There is a feeling among a number of internet bloggers, consultants, and insiders that Hollywood wants to rent you, not let you own, the creative output of its studios. Whether it be music, TV, or film, you have to pay in order to play. A one-time ownership fee is a hard way to make a living. But future payments for each viewing, now that’s a guaranteed revenue stream.
What’s standing in the way of the stream is the series of tubes. The interwebs as they exist in the U.S. today make the Netflix distribution network far more workable and profitable than any attempt to push a 5GB HD version of Spider-Man 3 into your Apple TV. The network will not allow this to work at any scale right now. So the first step in the plan is to get H.264 decoding to work effortlessly on Mac products, then sit back, wait, and hope the network somehow evolves to the level Steve Jobs thinks it should.
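The series-of-tubes problem is easy to quantify. A quick sketch of how long a 5GB HD movie takes to move at various connection speeds; the speeds are assumed examples of the era’s broadband tiers, not measured U.S. averages.

```python
# Rough transfer-time math for a 5GB HD movie download.
# The link speeds are assumed example tiers, not measured figures.

def download_hours(size_gb: float, mbps: float) -> float:
    """Hours to move size_gb of data over an mbps link."""
    return (size_gb * 1e9 * 8) / (mbps * 1e6) / 3600

for mbps in (1.5, 5, 25):
    print(f"5 GB at {mbps:>4} Mbps -> {download_hours(5, mbps):.1f} hours")
```

At a typical mid-2000s DSL speed of 1.5 Mbps, that 5GB movie is an overnight download; even at 5 Mbps it takes over two hours, which is why mailing DVDs still wins.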
What would lead Steve Jobs to think the network is going to rush in and save the day? How many articles do you read on Slashdot about how far behind the U.S. is when it comes to Internet infrastructure? Why does anyone at Apple think this is going to work? It’s quite a stretch, and I don’t see it happening in my lifetime. Good luck, Apple.