Reading through all the hubbub and hand-waving from the technology ‘teardown’ press outlets, one would have expected a bigger leap from Apple’s chip designers. What Apple actually came up with is a fairly large chip with an enormous graphics processor integrated into the die, there to drive the higher-resolution (so-called Retina) display. The design rule is still a pretty conservative 45nm, rather than pushing the envelope to 32nm or smaller to bring down the power requirements. Apple likewise had to nearly double the battery capacity over the first-gen iPad to feed this power-hungry pixel demon. So for roughly the ‘same’ effective battery life (10 hours of reserve power), you get the higher-resolution display. But a bigger chip and a higher-resolution display add up to some extra heat being generated, generally speaking. Which leads us to a controversy.
Given this, there has been a recent back-and-forth argument over the thermal design of the 3rd-generation iPad. Consumer Reports published an online article saying the power/heat dissipation was much higher than in previous-generation iPads, and included thermal photographs indicating the hot spots on the back of the device and their relative temperatures. While the iPad doesn’t run hotter than a lot of other handheld devices (say, Android tablets), it does run hotter than, say, an iPod Touch. But as Apple points out, that has ALWAYS been the case. So you gain some things, you give up some things, and Apple is still the market leader in this form factor, years ahead of the competition. And now the tempest in the teapot is winding down, as Consumer Reports (via LATimes.com) has rated the 3rd-gen iPad as its No. 1 tablet on the market (big surprise). So while they aren’t willing to retract their original claim of high heat, they are willing to say it doesn’t count as a ‘cause for concern’. So be the judge yourself when you try out the iPad in the Apple Store. Run it through its paces; a full-screen video or two should heat up the GPU and CPU enough to get the electrons really racing through the device.
In the bad old days of 1996, when Apple’s market share hit rock bottom, everyone fled to Windows 95 en masse. Disparaging the Mac OS, every single one of the ‘professional’ technical press outlets predicted the end of Apple. Oh, how wrong they were, and the loyal Mac fan base crowed and shouted with joy as Apple achieved a terrific comeback. But whither that loyal fan base from the pre-Steve Dark Ages of 1996? They will all become part of the iOS collective; they too will be assimilated. Read on:
Rumors of an ARM-based MacBook Air are not new. In May, one report claimed that Apple had built a test notebook featuring the same low-power A5 processor found in the iPad 2. The report, which came from Japan, suggested that Apple officials were impressed by the results of the experiment.
Following up on an article they did back on May 27th, and one prior to that on May 6th, AppleInsider does a bit of prediction and prognosticating about the eventual fusion of iOS and Mac OS X. What they see triggering this is an ARM chip able to execute 64-bit binaries across all of the product lines (the fabled ARM A6). How long would it take to do this consolidation and interweaving? How many combined updaters, security patches, and Pro App updaters would it take to get OS X 10.7 to be ‘more’ like iOS than it is today? Software development is going to take a while, and it’s not just a matter of cross-compiling software built for Intel chips over to an ARM chip.
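To make the cross-compiling point concrete, here is a toy illustration (my own sketch, not Apple’s actual porting problem): the same nominal data layout has different sizes and padding under different ABIs, so code that quietly hard-codes one architecture’s layout breaks on another even after a clean recompile.

```python
import struct

# "<li": explicit little-endian, standard sizes -- a 4-byte long plus a
# 4-byte int, always 8 bytes regardless of the machine running this.
fixed = struct.calcsize("<li")

# "@li": native sizes and native alignment -- on a typical 64-bit host the
# long alone is 8 bytes, so the total grows past 8. Any file format or
# in-memory assumption baked against the 32-bit layout now misreads data.
native = struct.calcsize("@li")

print(fixed, native)
```

The deeper porting costs (frameworks, drivers, hand-tuned assembly, SIMD intrinsics) are of course beyond a toy like this, but it shows why “just recompile for ARM” understates the work.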
Given that 64-bit Intel Atom chips are already running in the new SeaMicro SM10000 (x64), I’m sure it won’t be long before the ARM equivalent, the Cortex-A15, hits full stride. The designers have been aiming for a 4-core ARM design encompassed by the Cortex-A15 release, coming real soon now (RSN). The next step, after that chip is licensed, piloted, tested, and put into production, will be a 64-bit clean design. I’m curious to see whether 64-bit will be applied across ALL the different product lines within Apple. Especially when power usage and thermal design power (TDP) are considered: will 64-bit ARM chips be as battery friendly? I wonder. True, Intel jumped the 64-bit divide on the desktop with the Core 2 Duo line some time ago and made those chips somewhat battery friendly. But they cannot compare at all to the 10+ hours one gets today from a 32-bit ARM chip in the iPad.
Lastly, app developers will also need to keep their Xcode environments up to date and merge in new changes constantly, right up to the big cutover to ARM x64. No telling what that’s going to be like, apart from the two problems I have already raised here. In the run-up to 10.7 Lion, Apple was very late in providing the support and tools developers needed to get their apps ready. I will say, though, that in the history of hardware/software migrations, Apple has done more of them, more successfully, than any other company. So I think they will pull it off, no doubt, but there will be much wailing and gnashing of teeth. And hopefully we as end-users will see something better out of it, something better than a much bigger profit margin for Apple (though that seems to be the prime mover in most recent cases, as Steve Jobs has done the long slow fade into obscurity).
If ARM x64 is inevitable, and iOS on everything too, then I’m hoping things don’t change so much that I can’t do things on the desktop similarly to the way I do them now. Currently on OS X 10.7 I am completely ignoring:
App Store (well, not entirely, since I had to use it to download Lion)
Let’s hope this roster doesn’t get even longer over time as iOS becomes the de facto OS on all Apple products, because I was sure hoping the future would be brighter than this. And as AppleInsider reported on May 6th,
“In addition to laptops, the report said that Apple would ‘presumably’ be looking to move its desktop Macs to ARM architecture as well. It characterized the transition to Apple-made chips for its line of computers as a ‘done deal’.”
After the release of the iPad, Tom’s Hardware posted an article by Wolfgang Gruener about the history of computing as it relates to the iPad. We get to meet Alan Kay, the computer scientist who proposed the “Dynabook”, an intellectual predecessor of the iPad. Alan Kay and Steve Jobs are in fact friends and get along very well. Until last week, that is, when a big problem occurred: Scratch, built on another of Alan Kay’s inventions, the Squeak programming language, ran into the Apple App Store. How did this happen? Read on:
I wonder: Is there an opportunity for Alan Kay’s Dynabook? An iPad with a Squeak implementation that enables any user to write his or her own applications, rather than resorting to purchasing an app?
Apple earlier this month instituted a new rule that also effectively blocks meta-platforms: clause 3.3.1, which stipulates that iPhone apps may only be made using Apple-approved programming languages. Many have speculated that the main target of the new rule was Adobe, whose CS5 software, released last week, includes a feature to easily convert Flash-coded software into native iPhone apps.
Some critics expressed concern that beyond attacking Adobe, Apple’s policies would result in collateral damage potentially stifling innovation in the App Store. Scratch appears to be a victim despite its tie to Jobs’ old friend.
What a difference three days makes, right? Tom’s Hardware did a great retrospective on the ‘originality’ of the iPad and learned a heck of a lot of computer history along the way. At the end of the article they plug Scratch, Alan Kay’s Squeak-based programming environment. It is a free application used to create new graphical programs, and it serves as a means of teaching mathematical problem-solving through writing programs in Scratch. The iPad was the next logical step in the distribution of the program, giving kids free access to it whenever, and on whatever platform, was available. But two days later came the announcement that the Apple App Store, the only venue through which to purchase or even download software onto the iPhone or the iPad, had roundly rejected Scratch. The App Store will not allow it to be downloaded, and that’s the end of that. The reasoning: Scratch (which is really a programming tool) has a built-in interpreter that executes the programs written within its environment. Java does this, Adobe Flash does this; it’s common to almost anything that works like a programming tool. But Apple has forbidden anything that looks, sounds, or smells like a potential way of hijacking or hacking into their devices. So Scratch and Adobe Flash are now both forbidden to run on the Apple iPad. How quickly things change, don’t they, especially if you read the whole Tom’s Hardware article, in which Alan Kay and Steve Jobs are presented as really friendly towards one another.
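For readers wondering what a “built-in interpreter” actually means, here is a deliberately tiny sketch of the pattern Apple’s rule targets (my own hypothetical mini-language, nothing to do with Scratch’s real block semantics): the program is plain data, and the host application walks that data and executes it.

```python
def run(program, env=None):
    """Interpret a tiny list-based language of [op, name, value] steps.

    Supported ops (hypothetical): "set" assigns a variable, "add" adds to it.
    """
    env = {} if env is None else env
    for op, name, value in program:
        if op == "set":
            env[name] = value
        elif op == "add":
            env[name] += value
        else:
            raise ValueError(f"unknown op: {op}")
    return env

# The 'program' below is just data the host chooses to execute -- which is
# exactly why app-store rules treat interpreters as downloadable-code risks.
result = run([["set", "score", 0], ["add", "score", 5], ["add", "score", 2]])
print(result["score"])
```

Scratch, Java, and Flash are vastly more sophisticated, but they share this shape: a runtime that executes programs it did not ship with.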
AppleInsider calls a few strikes against hyperbole and supposition found in articles written about the Apple iPad A4 processor. Here now is a more likely accounting of what Apple’s chip design mergers and acquisitions really bought for the iPad development team.
Another report, appearing in The New York Times in February, stated that Apple, Nvidia and Qualcomm were all working to develop their own ARM-based chips before noting that “it can cost these companies about $1 billion to create a smartphone chip from scratch.” Developing an SoC based on licensed ARM designs is not “creating a chip from scratch,” and does not cost $1 billion, but the article set off a flurry of reports that said Apple has spent $1 billion on the A4.
Thank you, AppleInsider, for trying to set the record straight. I doubted the veracity of the NYTimes article when I saw that $1 billion figure thrown around (it seems more like the price of an Intel chip development project, which usually IS from scratch). And knowing now from this article (link to a PA Semi historical account) that PA Semi made a laptop version of a dual-core G5 chip leads me to believe power efficiency is something they would be brilliant at engineering solutions for (the G5 was a heat monster, meaning its electrical power use was large). PA Semi set out to make the G5 power-efficient enough to fit into a laptop, and they did it, but by then Apple had already migrated its laptops to Intel chips.
Intrinsity + P.A. Semi + Apple = A4. Learning that Intrinsity is an ARM developer knits a nice neat picture of a team of chip designers, QA folks, and validation folks who would all team up to make the A4 a resounding success. No truer mark of accomplishment can be shown for this effort than Walt Mossberg and David Pogue both stating, in their iPad reviews yesterday, that they got over 10 hours of run time from their iPads. Kudos to Apple: you may not have made a unique chip, but you sure as hell made a well-optimized one. Score, score, score.
The iPad is all anyone will be talking about for a while. And since I’ve tried to write about its innovations, or lack of ‘true’ innovations (jacking up clock speed is not an innovation), now comes the time when real people get to weigh in. But that’s not me; I haven’t used an iPad. So I’ll try to aggregate the stories of people who have.
Apropos of the big Easter weekend, Apple is releasing the iPad to the U.S. market. David Pogue of the NYTimes has done two reviews in one: rather than anger his technophile readers or alienate his average readers, he gave each audience its own review based on real hands-on time with an iPad. Where’s Walt Mossberg on this topic? (Walt likes it.) Pogue more or less says the lack of a physical keyboard is a showstopper for many; users who need a keyboard should get a laptop of some sort instead. Otherwise, for what it accomplishes through finger gestures and software design, the iPad is a pretty incredible end-user experience. Whether your personality and demeanor are compatible with the iPad is up for debate. But try before you buy: hands-on will tell you much more than placing a web order and hoping for the best, and given the price, that’s a wise choice. Walt Mossberg too feels you had better actually try it before you buy. It is, in his own words, not like any other computer but in a different class all its own. So don’t trust other people to tell you whether or not it will work for you.
One thing David Pogue is also very enthused by: the data plan seems less onerous than the first- and second-generation iPhone contracts with AT&T. The dam is about to burst on mandatory data plans; in the iPad universe you can subscribe and lapse, then re-subscribe and lapse again, depending on your needs. So don’t pay for a long-term contract if you don’t need it. That addresses a long-standing problem I have had with the iPhone as it is currently marketed by Apple and AT&T. Battery life is another big win. The review models that Mossberg and Pogue used had ‘longer’, read that again, LONGER run times than stated by Apple. Both tried heavy network use and video playback on the devices and still went over the 10-hour battery life claimed by Apple. Score a big win for the iPad in that category.
Lastly, Pogue hinted at maps looking and feeling like real maps on the bigger display. Mossberg points out that the hardware isn’t what’s really important; no, it’s what’s going to show up on the App Store specifically for the iPad. I think I’ve heard a few M.I.T. types say this before: it’s unimportant what it does, the question is what ‘else’ it does. And that ‘else’ is the software developer’s coin of the realm. Without developers these products have no legs, no markets outside the loyal fan base. What may come no one can tell, but these will be interesting times for iPad owners, that’s for sure.
Following further articles published on the Apple iPad CPU, new reports are surfacing that the A4, the custom CPU Apple created, may be an ARM Cortex A8 single-core CPU with an integrated graphics GPU and controller logic.
The custom A4 processor in the iPad is in reality a castrated Cortex A8 ARM design, say several sources.
This is truly interesting, and really shows some attempt to optimize the chips using ‘known’ working designs. Covering the first announcement of the A4 chip by Bright Side of News, I tried to argue that customizing a chip by licensing a core design from ARM Holdings Inc. isn’t all that custom. Following this, Ashlee Vance wrote in the NYTimes that the cost of developing the A4 ‘could be’ upwards of $1 billion. And now, just today, MacNN/Electronista is saying Apple used the ARM Cortex A8, a licensed core already running in the Apple iPhone 3GS. It is a proven, known CPU core that engineers at Apple are familiar with. Given that level of familiarity, it’s a much smaller step to optimize that same CPU core for speed and for integration with other functions; the GPU or memory controllers, for instance, can be tightly bound into the final CPU. Add a dose of power management and you get good performance and good battery life. It’s not cutting edge, to be sure, but it is far more guaranteed to work right out of the gate. That’s a bloodthirsty step in the right direction of market domination. However, the market hasn’t yet shown itself to be so large and self-sustaining that slate devices are a sure thing as casual/auxiliary/secondary computing devices. You may have an iPhone and you may have a laptop, but this device is going to be purchased IN ADDITION TO, not INSTEAD OF, those two. So anyone who can afford a third device is probably the target market for the iPad, as opposed to people who want to substitute an iPad for either the iPhone or the laptop.
What’s the point of licensing computer chip designs from another company if it still costs about 1/3 the price of building them yourself? That seems like a rhetorical question, but I always assumed that people who licensed technology from ARM Holdings were aiming to save tons of money compared to designing and fabricating the chips themselves. So how does the Apple iPad’s A4 CPU figure into this? Well, it’s a custom CPU, and according to the NYTimes, creating a new CPU is serious business.
In bypassing a traditional chip maker like Intel and creating its own custom ARM-based processor for the iPad, Apple has likely incurred an investment of about $1 billion, a new report suggests.
After reading the NYTimes article linked within this article, I can only conclude that “it costs $1 billion to create a custom chip” is a very generalized statement. The exact quote from NYTimes author Ashlee Vance is: “Even without the direct investment of a factory, it can cost these companies about $1 billion to create a smartphone chip from scratch.”
Given that is one third the full price of building a chip fabrication plant, why so expensive? What is the breakdown of those costs? Apple did invest money in P.A. Semi to get some chip-building expertise (they primarily designed chips that were fabricated at overseas contract manufacturing plants). Given that Qualcomm has created the Snapdragon CPU using similar CPU cores from ARM Holdings Inc., they must have $1 billion to throw around too? Qualcomm was once dominant in the cell phone market, licensing its CDMA technology to the likes of Verizon, but its financial success is nothing like the old days. So how does Qualcomm come up with $1 billion to develop the Snapdragon CPU for smartphones? Does that seem possible?
Qualcomm and Apple are licensing the biggest building blocks and core intellectual property from ARM; all they need to do is place, route, and verify the design. So where does the $1 billion figure come in? Is it the engineers? Is it the masks for exposing the silicon wafers? I argue now, as I did in my first posting about the Apple A4 chip, that the chip is an adaptation of intellectual property, a license to a CPU design provided by ARM. It’s not literally created from ‘scratch’, starting with no base design or with completely new proprietary intellectual property from Apple. This is why I am confused. Maybe ‘from scratch’ means different things to different people.