Category: blogroll

This is what I subscribe to myself.

  • Apple A4 SOC unveiled – It’s an ARM CPU and the GPU! – Bright Side Of News*

    Getting back to Apple A4, Steve Jobs incorrectly addressed Apple A4 as a CPU. We’re not sure whether this was to keep the mainstream press enthused, but the A4 is not a CPU. Or we should say, it’s not just a CPU. Nor did PA Semi/Apple have anything to do with the creation of the CPU component.

    via Apple A4 SOC unveiled – It’s an ARM CPU and the GPU! – Bright Side Of News*.

    Apple's press release image of the A4 SoC

    Interesting info on the Apple A4 system-on-chip being used by the recently announced iPad tablet computer. The world of mobile, low-power processors is dominated by the designs of ARM Holdings, and according to this article ARM is providing the graphics processor intellectual property too. So in the commodity CPU/GPU and system-on-chip (SoC) market, ARM is the only way to go: you buy the license, lay out the chip with all the core components you've licensed, and shop that around to a chip foundry. Samsung has a lot of expertise fabricating these made-to-order chips using ARM designs, but apparently another competitor, Global Foundries, is shrinking its design rules (meaning lower power and higher clock speeds) and may become the foundry of choice.

    Unfortunately, outfits like iFixit can only figure out which chips and components go into an electronics device. They cannot reverse engineer the components inside the A4, and anyone who did spill the beans on the A4’s exact layout and components would probably be sued by Apple. But because everyone is working from the same set of Lego blocks for the CPUs and GPUs and forming them into full systems-on-chip, some similarities are bound to occur.

    The heart of the new Apple A4 System on Chip

    One thing pointed out in this article is the broad adoption of the same clock speed for all these ARM-derived SoCs: 1GHz across the board, despite differences in manufacturers and devices. The reason is that everyone is using the same ARM CPU cores, and they are designed to run optimally at that 1GHz clock rate. So the more things change (faster and faster time to market for more earth-shaking designs), the more they stay the same (people adopt commodity CPU designs and converge in performance). It would take a big investment for Apple and PA Semi to truly differentiate themselves with a unique, proprietary CPU of any type. They just don’t have the time, though they may have the money. So when Jobs tells you something is exclusive to Apple, that may be true for industrial design. But for the CPU/GPU/SoC… don’t believe the hype surrounding the Apple A4.

    Also check out AppleInsider’s coverage of this same topic.

    Update: http://www.nytimes.com/2010/02/02/technology/business-computing/02chip.html

    The NYTimes weighs in on the Apple A4 chip and what it means for the iPad maintaining its competitive advantage. The NYTimes gives Samsung more credit than Apple, because Samsung manufactures the chip. What they will not speculate on is ARM Holdings' sale of Cortex-A9 licenses to Apple. They do hint that the nVidia Tegra is going to compete directly against Apple’s A4-based iPad. However, as Steve Jobs has pointed out more than once, “Great Products Ship.” Anyone else in the market who has licensed the Cortex-A9 from ARM had better get going: you've got 60 or 90 days, depending on your sales and marketing projections, to compete directly with the iPad.

  • Some people are finding Google Wave useful

    Posterous Logo

    I use google wave every single day. I start off the day by checking gmail. Then I look at a few news sites to see if anything of interest happened. Then I open google wave: because thats where my business lives. Thats how I run a complicated network of collaborators, make hundreds of decisions every day and organise the various sites that made me $14.000 in december.
    On how Google Wave surprisingly changed my life – This is so Meta.

    I’m glad some people are making use of Google Wave. After the first big spurt of interest and sending invites out to people, interest tapered off quickly. I would log in and see no activity whatsoever. No one was coming back to see what people had posted. So, like everyone else, I stopped coming back too.

    Compare this also to the Facebook ebb and flow. I notice NYTimes.com occasionally slagging Facebook with an editorial in their Tech News section. Usually the slagging is conducted by someone I would classify as a pseudo technology enthusiast (the kind that doesn’t back up their files, then writes an article complaining about it). Between iPhone upgrades and write-ups of the latest free web service, they occasionally rip Facebook in order to get some controversy going.

    But as I’ve seen, Facebook has a rhythm of lulls followed by periods of intense participation. Sometimes it’s lonely; people don’t post or read for months and months. It makes me wonder what interrupts their lives long enough that they stop reading or writing posts. I would assume Google Wave might suffer the same kind of ebb and flow, even when used for ‘business’ purposes.

    So the question is, does anyone besides this lone individual on Posterous use Google Wave on a daily basis for work purposes?

    logo
    Google Wave
  • Intel linked with HPC boost buy • The Register

    Comment With Intel sending its “Larrabee” graphics co-processor out to pasture late last year – before it even reached the market – it is natural to assume that the chip maker is looking for something to boost the performance of high performance compute clusters and the supercomputer workloads they run. Nvidia has its Tesla co-processors and its CUDA environment. Advanced Micro Devices has its FireStream co-processors and the OpenCL environment it has helped create. And Intel has been relegated to a secondary role.

    via Intel linked with HPC boost buy • The Register.

    Larrabee is Intel’s long-term graphics accelerator project, and it's an unfortunate side effect of losing all that money to schedule delays that Intel is now forced to reuse the processor as a component in High Performance Computers (so-called supercomputers). The competition has been providing hooks into their CPUs and motherboards for auxiliary processors or co-processors for a number of years. AMD notably created a CPU socket with open specs that FPGAs could slide into. Field Programmable Gate Arrays are big general-purpose chips with all kinds of ways to reconfigure the circuits inside them, so huge optimizations can be made in hardware that were previously done in machine code/assembler by the compilers for a particular CPU. Moving from a high-level programming language to an optimized hardware implementation of an algorithm can speed a calculation up by several orders of magnitude (1,000 times in some examples; a rough software analogy follows below).

    AMD has had a number of wins in some small niches of the High Performance Computing market. But not all algorithms are created equal, and not all of them lend themselves to implementation in hardware (FPGA or its cousin, the ASIC). So co-processors are a very limited market for any manufacturer trying to sell into HPC, and Intel isn't going to garner a lot of extra sales by throwing development versions of Larrabee out to HPC developers. Another strike is the dependence on a PCI Express bus for communications to the Larrabee chipset. While PCI Express is more than fast enough for graphics, an HPC setup would prefer a CPU socket adjacent to the general-purpose CPUs; the way AMD designs its motherboards, all sockets can communicate directly with one another instead of going over the PCI Express bus. Thus Intel loses again trying to market Larrabee to HPC. One can only hope that other secret code-named projects, like the 80-core CPU, see the light of day soon enough to make a difference, rather than suffer the opportunity costs of a very delayed launch like Larrabee's.
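
    As a rough software analogy of my own (not a benchmark from the article): timing a plain Python loop against the same arithmetic handed off to NumPy's compiled vector routines gives a feel for the kind of order-of-magnitude gains described above when an algorithm moves closer to the hardware. The sizes and timings here are illustrative only.

    ```python
    # Rough analogy only: a naive interpreted loop vs. the same arithmetic
    # pushed down into NumPy's compiled (vectorized) routines. The gap hints
    # at why moving an algorithm into hardware (FPGA/ASIC/co-processor) can
    # pay off by orders of magnitude.
    import time
    import numpy as np

    N = 2_000_000
    a = np.random.rand(N)
    b = np.random.rand(N)

    # "High-level language" version: one multiply-add at a time in Python.
    start = time.perf_counter()
    total = 0.0
    for i in range(N):
        total += a[i] * b[i]
    loop_time = time.perf_counter() - start

    # "Closer to the hardware" version: the same dot product in compiled code.
    start = time.perf_counter()
    total_np = float(np.dot(a, b))
    vector_time = time.perf_counter() - start

    print(f"python loop: {loop_time:.3f} s")
    print(f"numpy dot:   {vector_time:.3f} s")
    print(f"speedup:     {loop_time / vector_time:.0f}x")
    ```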

  • Buzz Bombs in the News – Or the Wheel Reinvented

    Slashdot just posted this article for all to read on the Interwebs:

    penguinrecorder writes: “The Thunder Generator uses a mixture of liquefied petroleum, cooking gas, and air to create explosions, which in turn generate shock waves capable of stunning people from 30 to 100 meters away. At that range, the weapon is relatively harmless, making people run in panic when they feel the sonic blast hitting their bodies. However, at less than ten meters, the Thunder Generator is capable of causing permanent damage or killing people.”

    I went directly to the article itself and read it in full. It was very straightforward, more or less indicating this new shockwave gun is an adaptation of the propane-powered “scarecrow” cannons used to scare birds out of farm fields in Israel.

    http://www.defensenews.com/story.php?i=4447499&c=FEA&s=TEC

    TEL AVIV – An Israeli-developed shock wave cannon used by farmers to scare away crop-threatening birds could soon be available to police and homeland security forces around the world for nonlethal crowd control and perimeter defense.

    I think Mark Pauline and Survival Research Labs beat the Israelis to the punch in inventing the so-called cannon:

    http://srl.org.nyud.net:8090/srlvideos/machinetests/bigpulsejetQT300.mov

    Prior to Mark Pauline and Survival Research Labs, the German military in WWII adapted the pulse jet for the V-1 buzz bomb. In short, a German terror weapon has indirectly become the product of an Israeli defense contractor. Irony explodes. The V-1 buzz bomb was itself influenced by the French inventor Georges Marconnet. Everything old is new again in the war on terror. Some good ideas never die; they just get reinvented, like the wheel.

  • 64GBytes is the new normal (game change on the way)

    Panasonic SDXC flash memory card
    Flash memory chips are getting smaller and denser

    I remember reading announcements of the 64GB SDXC card format coming online from Toshiba. And just today Samsung has announced it’s making a single-chip 64GB flash memory module with a built-in memory controller. Apple’s iPhone design group has been a big fan of single-chip, large-footprint flash memory from Toshiba; they bought up all of Toshiba’s supply of 32GB modules before they released the iPhone 3GS last summer, and Samsung was providing 32GB modules to Apple prior to the launch as well. Each summer, newer and bigger modules make for insanely great things the iPhone can do. Between the new flash memory recorders from Panasonic/JVC/Canon and the iPhone, what will we do with the doubling of storage every year?

    Surely there will be a point of diminishing returns, where the chips cannot be made any thinner and stacked any higher to make these huge single-chip modules. I think back to the slow evolution and radical incrementalism of the iPod’s history: 5GB of storage to start, then 30GB and video. Remember that? The video iPod at 30GB was dumbfounding at the time. Eventually it topped out at 120 and now 160GB on the iPod classic. At the current rate of change in the flash memory market, the memory modules will double in density again by this time next year, reaching 128GB for a single-chip module with an embedded memory controller. At that density a single SDHC-sized memory card will be able to hold the same amount.

    We are fast approaching the optimal size for any amount of video recording we could ever want to do, and still edit, once we reach the 128GB mark. At that size we’ll be able to record upwards of 20 hours of 1080p video on today’s video cameras (see the quick arithmetic sketch at the end of this entry). Who wants to edit, much less watch, 20 hours of 1080p video? But for the iPhone things are different: more apps means more fun. With 128GB of storage you never have to delete an app, a single song from your iTunes library, or a single picture or video; just keep everything. Similarly for those folks using GPS, you could keep all the maps you ever wanted right onboard rather than downloading them all the time, providing continuous navigation like you would get with a dedicated GPS unit. I can only imagine the functionality of the iPhone increasing as a result of the storage that 64GB flash memory modules would provide. Things can only get better. And speaking of better, The Register just reported today on some future directions.

    There could be a die process shrink in the next generation of flash memory products, and there are also opportunities to use slightly denser memory cells in the next-gen modules. The combination of the two refinements might give the research and design departments at Toshiba and Panasonic the ability to double the density of SDXC cards and flash memory modules, to the point where we could see 128GB and 256GB in successive revisions of the technology. So don’t be surprised if you see a flash memory module as standard equipment on every motherboard, holding the base operating system, with the option of a hard drive for backup or some kind of slower secondary storage. I would love to see netbooks or full-sized laptops take that direction.

    http://www.electronista.com/articles/09/04/27/toshiba.32nm.flash.early/ (Toshiba) Apr 27, 2009

    http://www.electronista.com/articles/09/05/12/samsung.32gb.movinand.ship/ (Samsung) May 13, 2009

    http://www.theregister.co.uk/2010/01/14/samsung_64gbmovinand/ (Samsung) Jan 14, 2010
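
    A quick back-of-the-envelope check on the recording-time claim above. This is my own rough arithmetic with an assumed AVCHD-class bitrate, not a figure from any of the linked articles:

    ```python
    # Back-of-the-envelope: hours of 1080p video that fit on a flash module.
    # The 17 Mbit/s figure is an assumed AVCHD-class bitrate; real cameras vary.
    def recording_hours(capacity_gb: float, bitrate_mbps: float) -> float:
        capacity_bits = capacity_gb * 1e9 * 8      # decimal gigabytes to bits
        seconds = capacity_bits / (bitrate_mbps * 1e6)
        return seconds / 3600

    for cap in (32, 64, 128, 256):
        print(f"{cap:>3} GB at 17 Mbit/s ≈ {recording_hours(cap, 17):.1f} hours of 1080p")

    # 128 GB at 17 Mbit/s works out to roughly 16.7 hours; drop the bitrate
    # to about 13 Mbit/s and you clear the 20-hour mark mentioned above.
    ```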

  • The Eternal Value of Privacy by Bruce Schneier

    Two proverbs say it best: Quis custodiet custodes ipsos? (“Who watches the watchers?”) and “Absolute power corrupts absolutely.”

    via The Eternal Value of Privacy.

    Nobody is the final authority when it comes to monitoring and privacy. There is no surer example than this: when Stalin died, the rules changed. When the East German state ended, the Stasi went away. When the U.S. invaded Iraq, Saddam Hussein fled from power. Those in power try to cleanse their country of all who oppose them (the wrong-thinkers). Then their power evaporates, they vanish, and all the rules change again. The same is true of Bush 43.

    George W. Bush was here; now he’s gone. So why not dismantle all that surveillance gear the NSA put into the network facilities at AT&T and Sprint? The rules have changed. You don’t need to acquiesce to the current administration, because it’s not the same people making the same demands. Yet as the events of Christmas day proved, there’s always a Jaws-like shark fin rising and falling out there in the ocean. The threat is very close by and we have to be ever vigilant. So the watchers’ claim of authority is re-established with each and every tragic episode. Still, is a single incident cause for the continued erosion of our right to privacy? Given the hair-trigger responses and instant reprisals we try to architect, it’s obvious to me that under the current structure this can never end. So in order to stop the erosion, we need to change our thinking about the threat. True, no one wants to be fearful of flying wherever they may go. And when they go, they don’t want to be faced with having to kill a fellow passenger in order to save themselves, but that’s the situation we have mentally put ourselves in.

    The only way out is to change our thinking. Change how we think about the danger and you change how much of our freedom we are willing to give up in response to the threat. And maybe we can get back to where we once belonged.

  • Revolutionise computer memory – New Scientist

    So where is the technology that can store our high-definition home cinema collection on a single chip? Or every book we would ever want to read or refer to? Flash can’t do that. In labs across the world, though, an impressive array of technologies is lining up that could make such dreams achievable.

    via Five ways to revolutionise computer memory – tech – 07 December 2009 – New Scientist.

    Memory Chips on the decrease
    RAM memory used to reign supreme in Dual Inline Packages (DIPS)

    I used to follow news stories about new computer memory technology on the IEEE website. I didn’t always understand all the terms and technologies, but I did want to know what might be coming to market in a couple of years. Magnetic RAM seemed like a game changer, as did ferroelectric RAM. Both of them, like flash, could hold their contents with the power off. And in some ways they were superior to flash in that the read/write cycle didn’t wear out the memory over time. Flash is known to have a fixed useful lifespan before it wears out; according to the postscript in this New Scientist article, flash memory can sustain between 10,000 and 100,000 write cycles before it fails (see the rough endurance arithmetic at the end of this entry). Despite this, flash memory doesn’t seem to be going away anytime soon, which begs the question: where are my MRAM and FeRAM chips?

    Maybe my faith in MRAM or Magnetic RAM was misplaced. I had great hopes for it exactly because so much time had been spent working on it. Looks like they couldn’t break the 32MB barrier in terms of the effective density of the MRAM chips themselves. And FeRAM is also stuck at 128MB effectively for similar reasons. It’s very difficult to contain or restrict the area over which the magnetism acts on the bits running through the wires on the chip. It’s all about too much crosstalk on the wires.

    This article also mentions something called racetrack memory. It reminds me a lot of what I’ve read about the old Sperry Univac computers that used mercury delay lines to store 512 bits at a time. Only now, instead of acoustic waves, it’s storing magnetic domains along a nanowire and reading them out in series as needed. Cool stuff, and if I had to vote on which one is going to win, phase-change and racetrack look like the best prospects right now. I hope both of them see the light of day real soon now.
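
    To put those 10,000-100,000 cycle figures in perspective, here is a rough lifespan estimate of my own. It assumes ideal wear levelling and a guessed-at daily write volume, so treat it as an illustration rather than a spec:

    ```python
    # Rough flash-endurance estimate. Assumes ideal wear levelling spreads
    # writes evenly across all cells; real devices also suffer write
    # amplification, so actual lifetimes are shorter. Numbers are illustrative.
    def lifetime_years(capacity_gb: float, cycles: int, writes_gb_per_day: float) -> float:
        total_writes_gb = capacity_gb * cycles
        return total_writes_gb / writes_gb_per_day / 365

    for cycles in (10_000, 100_000):
        years = lifetime_years(capacity_gb=64, cycles=cycles, writes_gb_per_day=10)
        print(f"64 GB module, {cycles:>6} cycles, 10 GB written/day ≈ {years:,.0f} years")
    ```

    Even at the low end of the endurance range the estimate runs to decades, which goes some way toward explaining why flash isn't going away despite its fixed lifespan.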

  • Intel Gets Graphic with Chip Delay – Bits Blog – NYTimes.com

    Intel’s executives were quite brash when talking about Larrabee even though most of its public appearances were made on PowerPoint slides. They said that Larrabee would roar onto the scene and outperform competing products.

    via Intel Gets Graphic with Chip Delay – Bits Blog – NYTimes.com.

    And so the NY Times finally nails the coffin shut on Intel’s Larrabee saga. To refresh your memory, this is the second attempt by Intel to create a graphics processor. The first, failed attempt came in the late 1990s, when 3dfx (later bought by nVidia) was tearing up the charts with its Voodoo 1 and Voodoo 2 PCI-based 3D accelerator cards. The age of Quake and Quake 2 was upon us, and everyone wanted smoother frame rates. Intel wanted to show its prowess by designing a low-cost graphics card running on the brand-new AGP slot it had just invented (remember AGP?). What followed was a similar story of delays and poor performance as engineering samples came out of the development labs. Given the torrid pace of products released by nVidia and eventually ATI, Intel couldn’t keep up. Its benchmarks were surpassed by the time its graphics card saw the light of day, and they couldn’t give the cards away. (See Wikipedia: Intel i740.)

    Intel i740 AGP graphics card
    1998 saw the failure of the Intel i740 AGP graphics card

    The Intel740, or i740, is a graphics processing unit using an AGP interface released by Intel in 1998. Intel was hoping to use the i740 to popularize the AGP port, while most graphics vendors were still using PCI. Released with enormous fanfare, the i740 proved to have disappointing real-world performance, and sank from view after only a few months on the market

    Enter Larrabee, a whole new ball game at Intel, right?! The trend toward larger numbers of parallel processors on GPUs from nVidia and ATI/AMD led Intel to believe it might leverage some of its production lines to make a graphics card again. But this time it was different: nVidia had moved from single-purpose GPUs to general-purpose GPUs in order to create a secondary market for its cards as compute-intensive co-processors. They called it CUDA and provided a few development tools in the early stages. Intel latched onto this idea of the general-purpose GPU and decided it could do better. What’s more general-purpose than an Intel x86 processor, right? And what if you could provide the libraries and hardware abstraction layer to turn a large number of processor cores into something that looked and smelled like a GPU? (A toy sketch at the end of this entry illustrates the idea of doing graphics work in software on ordinary cores.)

    For Intel it seemed like a win/win/win; everybody wins. The manufacturing lines using older 45nm design rules could be used for production, making the graphics card pure profit. They could put 32 processors on a card and program them to do multiple duties for the OS (graphics for games, co-processing for transcoding videos to MP4). But each time they presented a white paper or a demo at a trade show, it became obvious the schedule was slipping. They had benchmarks to show, great claims to make, future projections of performance to declare. Roadmaps were the order of the day. But just last week the rumors started to set in.

    As in its graphics card foray of the past, Intel couldn’t beat its time-to-market demons. The Larrabee project was going to be very late and was still using 45nm manufacturing design rules. Given that Intel’s top-of-the-line production lines moved to 32nm this year, and that nVidia and AMD are doing process shrinks on their current products, Intel was at a disadvantage. Rather than scrap the thing and lose face again, they decided to recover somewhat and put Larrabee out there as a free software/hardware development kit, to see if that was enough to get people to bite. I don’t know what benefit, if any, development on this platform would bring. It would rank right up there with the Itanium and the i740 as hugely promoted dead-end products with zero to negative market share. Big Fail – Do Not Want.

    And for you armchair Monday-morning technology quarterbacks, here are some links to enjoy leading up to the NYTimes article today:

    Tim Sweeney Laments Intel Larrabee Demise (Tom’s Hardware Dec. 7)

    Intel Kills Consumer Larrabee Plans (Slashdot Dec. 4)

    Intel delays Larrabee GPU, aims for developer “kit” in 2010 (MacNN Dec. 4)

    Intel condemns tardy Larrabee to dev purgatory (The Register Dec.4)
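
    For a taste of the Larrabee idea of doing graphics work on general-purpose cores in software, here is a toy sketch of my own. It has nothing to do with Intel's actual renderer; it just shades a tiny framebuffer pixel-by-pixel across a pool of ordinary CPU cores:

    ```python
    # Toy "software GPU": shade every pixel of a small framebuffer on ordinary
    # CPU cores using a process pool. Larrabee's real renderer was far more
    # sophisticated; this only illustrates graphics as data-parallel software.
    from multiprocessing import Pool

    WIDTH, HEIGHT = 256, 256

    def shade_row(y: int) -> list:
        """A trivial 'pixel shader': a two-axis colour gradient for one row."""
        row = []
        for x in range(WIDTH):
            r = (x * 255) // (WIDTH - 1)
            g = (y * 255) // (HEIGHT - 1)
            b = 128
            row.append((r, g, b))
        return row

    if __name__ == "__main__":
        with Pool() as pool:  # one worker process per CPU core by default
            framebuffer = pool.map(shade_row, range(HEIGHT))
        print(f"rendered {len(framebuffer)}x{len(framebuffer[0])} pixels in software")
    ```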

  • Google Shrinks Another Market (and I’m not talkin’ DNS)

    Brady Forrest writes: Google has announced a free turn-by-turn navigation system for Android 2.0 phones such as the Droid.

    via Google Shrinks Another Market With Free Turn-By-Turn Navigation – O’Reilly Radar.

    And with that, a killer app enters the cell phone market, and the end begins for single-purpose personal navigation devices. Everyone is desperate to get a sample of the Motorola Droid to see how well the mix of features works on the phone. Consumer Reports has tried out a number of iPhone navigation apps to see how they measure up to the purpose-built navigators. For people who don’t need specific features, or generally aren’t connoisseurs of turn-by-turn directions, they are passable. But for anyone who bought early and often from Magellan, Garmin and TomTom, the repurposed iPhone apps will come up short.

    It's big and heavy but it's got an OS that won't quit

    The Motorola Droid, however, is trying to redefine the market by keeping most of the data in the cloud at Google’s datacenters and doing the necessary lookups as needed over the cell phone data network. This is the exact opposite of most personal navigation devices, where all the mapping and point-of-interest data are kept on the device and manually updated through huge, slow downloads of new data purchased online on an annual basis (at least for me). Depending on the results Consumer Reports gets, I’ll reserve judgment. This is not likely to shift the current paradigm of personal navigation, except that the devices are going to have to become even more multipurpose than Garmin has made them, and unwillingly at that. The Garmin Nuviphone was supposed to be a big deal, but it’s a poor substitute for a much cheaper phone plus a more feature-filled navigation device. I think the inclusion of Google Maps and Google Street View is the next big thing in navigation, just as lane assistance differentiated TomTom from Garmin about a year and a half ago. So radical incrementalism is still the order of the day in personal GPS devices. But with an open platform for developing navigation services, who knows what the future may hold. I’m hoping the current duopoly of Garmin and TomTom starts to crumble and someone starts to eat away at the low end or even the high end of the market. Something has got to give.

  • 7 Things You Should Know About Google Wave | EDUCAUSE

    Wave challenges us to reevaluate how communication is done, stored, and shared between two or more people.

    logo
    Google Wave

    via 7 Things You Should Know About Google Wave | EDUCAUSE.

    Point taken. Since I watched the video of the demo last spring, I too have been smitten with the potential uses of Google Wave. First and foremost it is a communication medium. Second, unlike email, there are no local, unsynced copies of the text/multimedia threads. Instead everything is central, like an old-style bulletin board, newsgroup or collaborative wiki. And like a wiki, revisions are kept and can be “played back” to see how things have evolved over time. For people recognizing the limits of emailing attachments back and forth for group editing, the benefits far outweigh the barriers to entry. I was hoping to get an invitation to Google Wave, but haven’t yet received one. Of course, if I do get invited, the fax machine problem will crop up: I will need to find someone else I know well enough to collaborate with in order to try it out. And hopefully there will be a ready and willing audience when I do finally get an invite.

    As far as how much better Wave is than email, it depends very much on how you manage your communications already. Are you a telephone person, an email person or a face-to-face person? All these things affect how you will perceive the benefits of a persistent central store of all the blips and waves you participate in. I think Google could do more to explain things even to us mid-level, technologically capable folks who are still kind of bewildered by what went on in the demos at Google Developer Day. But this PDF Educause has compiled will help considerably. The analogy I’m using now is the bulletin board/wiki/collaborative document example. Sometimes it’s just easier to understand something by comparison to something you already know, use and understand.

    a list of Google Waves with participant icons
    Waves can start to add up

    PS: I finally got an invite to Google Wave about two weeks ago and went hog wild inviting people to join in. If you want to include me in a Wave, add me to your list as: carpetbomberz@googlewave.com. Early returns from sending invites and participating in some experimental Waves have shown the wild popularity dying down quite a bit. At one point we had 8 participants in a single Wave. Trying out some of the add-on tools was interesting too, but the universe of add-ons is pretty small at this point. Hopefully Google will get that third-party development effort going in high gear. As for the utility of Google Wave, it is still too much like a super-charged, glorified bulletin board. It doesn’t have any easy hooks in or out to other social media infrastructure. Someone has to make it seamless with Facebook/Twitter/Gmail, either through RSS hooks or by making the whole framework/interface embeddable or linkable in other websites. As always, we’ll see how this goes. They need to keep up a torrid pace of development, like Facebook achieved from 2005-2007, improving Google Wave and adding to its membership.