computers macintosh technology

Apple A4 SOC unveiled – It’s an ARM CPU and the GPU! – Bright Side Of News*

Getting back to the Apple A4: Steve Jobs incorrectly referred to the A4 as a CPU. We’re not sure whether this was to keep the mainstream press enthused, but the A4 is not a CPU. Or rather, it’s not just a CPU. Nor did PA Semi/Apple have anything to do with the creation of the CPU component.

via Apple A4 SOC unveiled – It’s an ARM CPU and the GPU! – Bright Side Of News*.

Apple's press release image of the A4 SoC

Interesting info on the Apple A4 System on Chip used by the recently announced iPad tablet. The world of mobile, low-power processors is dominated by the designs of ARM Holdings Inc., and according to this article ARM is providing the graphics processor intellectual property as well. So in the commodity CPU/GPU and System on Chip (SoC) market, ARM is the only way to go. You buy the license, lay out the chip with all the core components you have licensed, and shop that design around to a chip foundry. Samsung has a lot of expertise fabricating these made-to-order chips using ARM designs, but apparently a competitor, GlobalFoundries, is shrinking its design rules (meaning lower power and higher clock speeds) and may become the foundry of choice. Unfortunately, outfits like iFixit can only figure out which chips and components go into an electronics device. They cannot reverse engineer the components inside the A4, and anyone who did spill the beans on the A4’s exact layout and components would probably be sued by Apple. But because everyone is working from the same set of Lego blocks for the CPUs and GPUs, assembling them into full Systems on a Chip, some similarities are bound to occur.

The heart of the new Apple A4 System on Chip

One thing pointed out in this article is the broad adoption of the same clock speed across all these ARM-derived SoCs. 1GHz is the clock speed across the board despite differences in manufacturers and devices, because everyone is using the same ARM CPU cores and those cores are designed to run optimally at that rate. So the more things change (faster and faster time to market for more earth-shaking designs), the more they stay the same (people adopt commodity CPU designs and converge in performance). It would take a big investment for Apple and PA Semi to truly differentiate themselves with a unique, proprietary CPU of any type. They just don’t have the time, though they may have the money. So when Jobs tells you something is exclusive to Apple, that may be true for industrial design. But for the CPU/GPU/SoC… Don’t Believe the Hype surrounding the Apple A4.

Also check out AppleInsider’s coverage of this same topic.


The NYTimes weighs in on the Apple A4 chip and what it means for the iPad maintaining its competitive advantage. The NYTimes gives Samsung more credit than Apple because Samsung manufactures the chip. What they will not speculate on or guess at is ARM Holdings Inc.’s sale of a Cortex-A9 license to Apple. They do hint that the nVidia Tegra CPU is going to compete directly against Apple’s A4-based iPad. However, as Steve Jobs has pointed out more than once, “Great Products Ship.” Anyone else in the market who has licensed the Cortex-A9 from ARM had better get going: you’ve got 60 to 90 days, depending on your sales/marketing projections, to compete directly with the iPad.

computers science & technology technology

Moore’s Law to take a breather • The Register

Back in the days when Byte magazine was still being published, there was a lot of talk and speculation about new technologies for creating smaller microchips. Some manufacturers were touting extreme UV; some thought X-rays would be necessary. In the years since, a small modification of existing manufacturing methods was adopted instead.

“Immersion” lithography, exposing the wafer through a layer of water rather than air, was widely adopted to shrink features further. Water’s higher refractive index lets the lens resolve finer features than it could through air or a simple vacuum. So immersion became widespread, adding years to the old technology. Now even the old-style UV processes are hitting the end of their useful lifetimes, and Intel is at last touting extreme UV as the next big thing.
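The benefit of immersion can be sketched with the standard Rayleigh resolution criterion (feature size ≈ k1·λ/NA, where NA = n·sin θ). None of these numbers come from the article; they are just typical values for 193nm ArF lithography:

```python
# Rough sketch of why immersion helps, using the Rayleigh resolution
# criterion: smallest feature ~ k1 * wavelength / NA, with NA = n * sin(theta).
# All constants below are assumed, illustrative values, not from the article.

K1 = 0.3             # process-dependent factor (assumed)
WAVELENGTH_NM = 193  # ArF excimer laser wavelength
SIN_THETA = 0.93     # lens acceptance half-angle term (assumed)

def min_feature_nm(refractive_index: float) -> float:
    """Smallest printable feature for a given medium between lens and wafer."""
    numerical_aperture = refractive_index * SIN_THETA
    return K1 * WAVELENGTH_NM / numerical_aperture

dry = min_feature_nm(1.00)   # air between lens and wafer
wet = min_feature_nm(1.44)   # water's refractive index at 193 nm

print(f"dry: {dry:.1f} nm, immersion: {wet:.1f} nm")
# → dry: 62.3 nm, immersion: 43.2 nm
```

Same light source, same lens, roughly a third finer resolution just from swapping air for water; that is the "small modification" that stretched the old technology for years.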

Note this article from April 22, 2008. Intel was not at all confident in how cost-effective extreme UV would be for making chips on its production lines. The belief is that EUV will allow chips to shrink from 32 nanometers down to the next process node. According to the article that would be the 22nm level, and it would require all kinds of tricks to achieve: double-patterning, phase-shifting, and pixellated exposure masks in addition to immersion litho. They might be able to tweak the lens material for the exposure source, or the refractive index of the immersion liquid. However, the cost of the production lines and masks to make the chips is going to skyrocket. Brand-new chip fab plants are still on the order of $1 billion+ to construct, and the number of years over which that cost can be spread out (amortization) is not going to be long enough. So it looks like the commoditization of microchips will finally settle in. We will buy chips for less and less per 1,000 until they are like lightbulbs. It is very near the end of an era as Moore’s Law finally hits the wall of physics.
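A back-of-the-envelope sketch of that amortization squeeze, with every figure assumed purely for illustration (not taken from the article or from iSuppli):

```python
# Back-of-the-envelope amortization sketch (all figures assumed):
# per-chip capital cost rises when a fab's price tag grows faster
# than its useful life and output.

def cost_per_chip(fab_cost_usd: float, years: int,
                  wafers_per_year: int, good_chips_per_wafer: int) -> float:
    """Spread the fab's capital cost over every good chip it produces."""
    total_chips = years * wafers_per_year * good_chips_per_wafer
    return fab_cost_usd / total_chips

# Hypothetical: a $1B fab amortized over 5 years vs. a $3B fab that is
# obsolete in 3, with identical output of 500k wafers/yr, 400 chips/wafer.
old_node = cost_per_chip(1e9, 5, 500_000, 400)
new_node = cost_per_chip(3e9, 3, 500_000, 400)

print(f"old node: ${old_node:.2f}/chip, new node: ${new_node:.2f}/chip")
# → old node: $1.00/chip, new node: $5.00/chip
```

Even with output held constant, the capital cost baked into each chip jumps fivefold; unless each shrink packs in proportionally more (or more valuable) chips, there is no economic reason to build the next fab.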

Diminishing Returns of process shrinks

iSuppli is not talking about these problems, at least not today. But what the analysts at the chip watcher are pondering is the cost of each successive chip-making generation, and chip makers’ desire not to go broke just to prove Moore’s Law right.

via iSuppli: Moore’s Law to take a breather • The Register.