Category: mobile

  • ARM vet: The CPU's future is threatened • The Register

    Harkening back to when he joined ARM, Segars said: “2G, back in the early ’90s, was a hard problem. It was solved with a general-purpose processor, DSP, and a bit of control logic, but essentially it was a programmable thing. It was hard then – but by today’s standards that was a complete walk in the park.”

    He wasn’t merely indulging in “Hey you kids, get off my lawn!” old-guy nostalgia. He had a point to make about increasing silicon complexity – and he had figures to back it up: “A 4G modem,” he said, “which is going to deliver about 100X the bandwidth … is going to be about 500 times more complex than a 2G solution.”

    via ARM vet: The CPU's future is threatened • The Register.

    A very interesting look at the state of the art in microprocessor manufacturing: The Register talks with one of the principals at ARM, the folks who license their processor designs to almost every cell phone manufacturer worldwide. Looking at the trends in manufacturing, Simon Segars predicts that sustained performance gains will be much harder to come by in the near future. Most of the advancement, he feels, will come from integrating more kinds of processing and coordinating the I/O between those processors on the same die. Which is roughly what Intel is attempting by integrating graphics cores, memory controllers and CPU all on one slice of silicon. But the software integration is the trickiest part, and Intel still sees fit to just add more general-purpose CPU cores to keep new sales coming. Processor clocks remain stuck near the 3GHz boundary and have not shifted significantly since the end of the Pentium 4 era.

    Note, too, the difficulty of scaling up manufacturing as well as designing the next generation of chips. Referring back to my article from Dec. 21, 2010, on 450mm wafers (commentary on an Electronista article), Intel is about the only company rich enough to scale up to the next size of wafer. Every step in the manufacturing process has become so specialized that the motivation to create new equipment for fabrication and test just isn't there: the total number of manufacturers who could scale up to the next largest silicon wafer size is probably four companies worldwide. That's a measure of how exorbitantly expensive large-scale chip manufacturing has become. More and more it seems a plateau is being reached, both in clock speeds and in the size of wafers finished in manufacturing. With those limits, Simon Segars' thesis becomes even stronger.

  • AppleInsider | Apple seen merging iOS, Mac OS X with custom A6 chip in 2012

    Rumors of an ARM-based MacBook Air are not new. In May, one report claimed that Apple had built a test notebook featuring the same low-power A5 processor found in the iPad 2. The report, which came from Japan, suggested that Apple officials were impressed by the results of the experiment.

    via AppleInsider | Apple seen merging iOS, Mac OS X with custom A6 chip in 2012.

    Following up on an article they did back on May 27th, and one prior to that on May 6th, AppleInsider does a bit of prediction and prognostication about the eventual fusion of iOS and Mac OS X. What they see triggering it is an ARM chip able to execute 64-bit binaries across all of the product lines (the fabled A6). How long would that consolidation and interweaving take? How many combined updaters, security patches and Pro App updaters would it take to get OS X 10.7 to be 'more' like iOS than it is today? Software development is going to take a while, and it's not just a matter of cross-compiling software built for Intel chips over to an ARM chip.

    Given that 64-bit Intel Atom chips are already running in the new SeaMicro SM10000 (x64), it won't be long, I'm sure, before the ARM equivalent, the Cortex-A15, hits full stride. The designers have been aiming for a 4-core ARM design to be encompassed by the Cortex-A15 release, real soon now (RSN). The next step, after that chip is licensed, piloted, tested and put into production, will be a 64-bit-clean design. I'm curious to see whether 64-bit will be applied across ALL the different product lines within Apple. Especially once power usage and thermal design power (TDP) are considered, will 64-bit ARM chips be as battery friendly? I wonder. True, Intel jumped the 64-bit divide on the desktop with the Core 2 Duo line some time ago and made those chips somewhat battery friendly. But they cannot compare at all to the 10+ hours one gets today from a 32-bit ARM chip in the iPad.

    Lastly, app developers will also need to keep their Xcode environments up to date and merge in new changes constantly, right up to the big cutover to 64-bit ARM. No telling what that's going to be like, on top of the two problems I have already raised here. In the run-up to 10.7 Lion, Apple was very late in providing the support and tools developers needed to get their apps ready. I will say, though, that in the history of hardware/software migrations, Apple has done more of them, more successfully, than any other company. So I think they will be able to pull it off, no doubt, but there will be much wailing and gnashing of teeth. And hopefully we, the end users of the technology, will see something better out of it, something better than a much bigger profit margin for Apple (though that seems to be the prime mover in most recent cases, as Steve Jobs does his long, slow fade into obscurity).

    If 64-bit ARM is inevitable, and iOS-on-everything too, then I'm hoping things don't change so much that I can't keep doing things roughly the way I do them now on the desktop. Currently on OS X 10.7 I am completely ignoring:

    1. Gestures
    2. Mission Control
    3. Launchpad
    4. App Store (not really, since I had to use it to download Lion)

    Let's hope this roster doesn't get even longer over time as iOS becomes the de facto OS on all Apple products. Because I was sure hoping the future would be brighter than this. And as AppleInsider quoted back on May 6th:

    “In addition to laptops, the report said that Apple would ‘presumably’ be looking to move its desktop Macs to ARM architecture as well. It characterized the transition to Apple-made chips for its line of computers as a ‘done deal’.”

  • Google confirms Maps with local map downloads as iOS lags | Electronista

    Google Maps gets map downloads in Labs beta. After a brief unofficial discovery, Google on Thursday confirmed that Google Maps 5.7 has the first experimental support for local maps downloads.

    via Google confirms Maps with local map downloads as iOS lags | Electronista.

    Google Maps for Android is starting to show a level of maturity previously seen only on dedicated GPS units. True, there is still no offline routing (you need access to Google's servers for that functionality), but you at least get a downloaded map that you can zoom in and out of without incurring heavy data charges. Overseas you can rack up some big charges navigating live maps via the Google Maps app on Android; this is now partially solved by downloading, in advance, the immediate area you will be visiting (within a few miles' radius). It's an incremental improvement, to be sure, and it makes Android phones a little more self-sufficient without making you regret the data charges.
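
    To get a feel for what "download the immediate area in advance" means in practice, here is a rough sketch of the standard slippy-map tile arithmetic a client could use to decide which tiles cover a few-mile radius around a point. This is not Google's actual implementation, just an illustration of the idea; the zoom level and radius are assumptions.

    ```python
    import math

    def deg2tile(lat_deg, lon_deg, zoom):
        """Convert latitude/longitude to slippy-map tile coordinates at a zoom level."""
        lat_rad = math.radians(lat_deg)
        n = 2 ** zoom
        x = int((lon_deg + 180.0) / 360.0 * n)
        y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
        return x, y

    def tiles_for_radius(lat, lon, radius_km, zoom=14):
        """List the (x, y) tiles covering roughly +/- radius_km around a point."""
        # Rough degrees-per-km conversion; good enough for a small pre-cache area.
        dlat = radius_km / 111.0
        dlon = radius_km / (111.0 * math.cos(math.radians(lat)))
        x_min, y_min = deg2tile(lat + dlat, lon - dlon, zoom)  # north-west corner
        x_max, y_max = deg2tile(lat - dlat, lon + dlon, zoom)  # south-east corner
        return [(x, y) for x in range(x_min, x_max + 1)
                       for y in range(y_min, y_max + 1)]

    # Example: how many zoom-14 tiles cover roughly 8 km around central Tokyo?
    print(len(tiles_for_radius(35.68, 139.77, 8)))
    ```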

    Apple, on the other hand, is behind. They are largely ceding third-party GPS development to folks like Navigon and TomTom, both of whom charge somewhat hefty fees for their downloadable content. Apple's Maps app doesn't compare to Navigon or TomTom, much less Google, for actual usefulness in a wide range of situations. And Apple isn't currently using the downloadable vector-based maps introduced with this revision, Google Maps for Android 5.7, so it will keep struggling with large JPEG images as you pan and scan around the map to find your location.

  • Apple patents hint at future AR screen tech for iPad | Electronista

    Apple may be working on bringing augmented reality views to its iPad thanks to a newly discovered patent filing with the USPTO.

    via Apple patents hint at future AR screen tech for iPad | Electronista. (Originally posted at AppleInsider; see the link below.)

    Original Article: Apple Insider article on AR

    Just a very brief look at a couple of patent filings by Apple, with some descriptions of potential applications. They seem to want to use the technology for navigation, via the onboard video camera. One half of the screen shows the live video feed; the other half is a 'virtual' 3D rendition of the same scene, to help you find a path, or maybe a parking space, in between all those buildings.

    The second filing mentions a see-through screen whose opacity can be regulated by the user. The information display takes precedence over the image seen through the LCD panel, and it defaults to totally opaque when no voltage is applied (an in-plane switching design for the LCD).

    However, the most intriguing part of the story as told by AppleInsider is the use of the device's sensors to determine angle, direction and bearing, which are then sent over the network. Why the network? Because the rendering of the 3D scene described in the first patent filing is done somewhere in the cloud and spit back to the iOS device: no onboard 3D rendering needed, or at least not at that level of detail. Maybe those data centers in North Carolina are really cloud-based 3D rendering farms?
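
    To make that pipeline a little more concrete, here is a minimal sketch of what "package the pose, let the cloud render" could look like. The endpoint, field names and field of view are all hypothetical, purely for illustration; nothing here is from Apple's filings.

    ```python
    import json
    import urllib.request

    def request_remote_render(heading_deg, pitch_deg, roll_deg, lat, lon,
                              server="https://render.example.com/scene"):
        """Send device pose + position to a (hypothetical) cloud renderer and
        get back a pre-rendered view of the surrounding 3D scene."""
        payload = json.dumps({
            "heading": heading_deg,   # compass bearing from the magnetometer
            "pitch": pitch_deg,       # tilt from the accelerometer/gyroscope
            "roll": roll_deg,
            "lat": lat, "lon": lon,   # position from GPS
            "fov": 60,                # assumed camera field of view, in degrees
        }).encode("utf-8")
        req = urllib.request.Request(server, data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return resp.read()        # e.g. an image of the rendered half-screen view

    # The device would call this whenever the pose changes appreciably and
    # composite the returned image next to the live camera feed.
    ```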

  • A cocktail of AR and social marketing | Japan Pulse

    Though the AR element is not particularly elegant, merely consisting of a blue dot superimposed on your cell phone screen that guides the user through Tokyo’s streets, we think it’s nevertheless a clever marketing gimmick.

    via A cocktail of AR and social marketing | Japan Pulse.

    Augmented reality (AR) is in the news this week, being used for a marketing campaign in Tokyo, Japan. It's mostly geared towards getting people out to visit bars and restaurants to collect points; whoever earns enough points can cash them in for Chivas Regal memorabilia. But hey, it's something, I guess. I just wish the navigation interface were a little more sophisticated.

    I also wonder how many different phones can be used as personal navigators to find the locations awarding points. GPS seems like an absolute requirement, and so does a Foursquare or Livedoor client.

  • Macintouch Reader Reports: User Interface Issues iOS/Lion

    Anyways, I predict a semi-chaos, where – for example – a 3 fingers swipe from left to right means something completely different in Apple than in any other platform. We are already seeing signs of this in Android, and in the new Windows 8. Also, users will soon need “cheat sheets” to remember the endless possible combinations. Would be interesting to hear other people’s thoughts.

    via User Interface Issues.

    After the big WWDC keynote presentation by Steve Jobs et al., the question I have too is: what's up with all the finger combos for swiping? In the bad old days people needed wire-bound notebooks to tell them all the commands to run their IBM PC. And who can forget the WordPerfect users with keyboard template overlays to remind them of the 'menu' of possible key combos (Ctrl/Alt/Shift)? Now we are faced with endless and seemingly arbitrary combinations of finger swipes, pinches, flicks and so on.

    Like other readers who responded to this question on the MacInTouch message boards, I ask: what about the bad old days of Apple's one-button mouse? Remember when Apple finally capitulated and provided two mouse buttons? (No?) Well, they did it through software. Just before the Magic Mouse hit town, Apple provided a second mouse button (at long last), bringing the Mac in line for the first time with the Windows PC convention of left and right mouse buttons. How recently did this happen? Just a few years ago, when Apple introduced the wired and wireless versions of the Mighty Mouse, and even then it was virtual, not a literal two-button experience. Now we have the Magic Mouse with no buttons and no clicking: one rounded-over trackpad that accepts the Lionized gestures. To quote John Wayne, “It's gettin' to be Ri-goddamn-diculous”.

    So whither the haptic touch interface conventions of the future? Who is going to win the gesture arms race? Who is going to figure out less is more when it comes to gestures? It ain’t Apple.

  • SPDY: An experimental protocol for a faster web – The Chromium Projects

    As part of the “Let’s make the web faster” initiative, we are experimenting with alternative protocols to help reduce the latency of web pages. One of these experiments is SPDY (pronounced “SPeeDY”), an application-layer protocol for transporting content over the web, designed specifically for minimal latency.  In addition to a specification of the protocol, we have developed a SPDY-enabled Google Chrome browser and open-source web server. In lab tests, we have compared the performance of these applications over HTTP and SPDY, and have observed up to 64% reductions in page load times in SPDY. We hope to engage the open source community to contribute ideas, feedback, code, and test results, to make SPDY the next-generation application protocol for a faster web.

    via SPDY: An experimental protocol for a faster web – The Chromium Projects.

    Google wants the World Wide Web to go faster. I think we would all like that as well. But what kind of heavy lifting is it going to take? The transition of the Arpanet to the TCP/IP protocol took a very long time and required some heavy-handed shoving to accomplish the cutover on January 1, 1983. We can all thank Vint Cerf for making that happen so that we could continue to grow and evolve as an online species (tip of the hat). But now what? There has been a move to evolve from IP version 4 to version 6 to accommodate the increase in the number of networked devices, but speed really wasn't a consideration in that revision. I don't know how SPDY integrates with IPv6, but I hope it can be pursued on a parallel course with that big migration.
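
    To make the quoted description a bit more concrete: the core idea in SPDY is multiplexing many request/response streams over a single connection, each tagged with a stream ID, instead of opening a separate TCP connection per resource. The toy framing below is only a sketch of that concept, not the real SPDY wire format (which also carries flags, priorities and compressed headers).

    ```python
    import struct

    def frame(stream_id: int, payload: bytes) -> bytes:
        """Toy frame: 4-byte stream ID + 4-byte length, then the payload."""
        return struct.pack("!II", stream_id, len(payload)) + payload

    def demultiplex(wire: bytes):
        """Split a byte stream of toy frames back into (stream_id, payload) pairs."""
        frames, offset = [], 0
        while offset < len(wire):
            stream_id, length = struct.unpack_from("!II", wire, offset)
            offset += 8
            frames.append((stream_id, wire[offset:offset + length]))
            offset += length
        return frames

    # Three 'requests' interleaved over one connection instead of three TCP handshakes.
    wire = (frame(1, b"GET /index.html") +
            frame(3, b"GET /style.css") +
            frame(5, b"GET /logo.png"))
    print(demultiplex(wire))
    ```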

    The worst thing that could happen would be the creation of another Facebook/Twitter/Apple Store/Google/AOL cul-de-sac that only benefits account holders loyal to Google. Yes, it would be nice if Google Docs and all the other attendant services provided through Google got aboard the SPDY accelerator train, and I would stand to benefit. But things like this should be pushed further out into the wider Internet so that everyone, everywhere, gets the same benefits. Otherwise it is just an attempt to steal away user accounts and create churn in competitors' account databases.

  • Bye, Flip. We’ll Miss You | Epicenter | Wired.com

    Cisco killed off the much-beloved Flip video camera Tuesday. It was an unglamorous end for a cool device that just a few years earlier shocked us all by coming to dominate the video-camera market, utterly routing established players like Sony and Canon.

    via Bye, Flip. We’ll Miss You | Epicenter | Wired.com.

    I don't usually write about consumer electronics per se. This particular product got my attention because of its long gestation and its overwhelming domination of a market category that didn't exist until the Flip created it: the pocket video camera with a built-in flip-out USB connector. Like a USB flash drive with an LCD screen, a lens and one big red button, the Flip pared everything down to the absolute essentials, including the absolute immediacy of online video sharing via YouTube and Facebook. Now the revolution has ended, devices have converged, and many are telling the story of why this happened. Wired.com's Robert Capps claims the Flip lost its way after Cisco did the same with the Flip 2 revision, trying to get a WiFi-connected camera out there for people to record their 'Lifestream'.

    Prior to Robert Capps, different writers for different pubs all repeated the conclusion of Cisco's own media relations folks: Cisco's Flip camera was the victim of inevitable convergence, pure and simple. Smartphones, in particular Apple's iPhone, kept adding features once available only on the Flip: easy recording, easy sharing, higher resolution, a bigger LCD screen, and it could play Angry Birds too! I don't cotton to that conclusion as fed to us by Cisco. It's too convenient, and the convergence myth does not account for the one thing the Flip had that the iPhone doesn't have, has never had, and WILL never have: a simple, industry-standard connector. Folks, convergence is not simply lifting cherry-picked features from one device and incorporating them into yours. True convergence is picking up all that is BEST about one device and incorporating it, so that fewer and fewer compromises must be made. Which brings me to the Apple multi-pin dock connector that has been with us since the iPod's early years.

    See, the Flip didn't have a proprietary connector; it just had a big old ugly USB connector, just as big and ugly as the one your mouse and keyboard use to connect to your desktop computer. The beauty of that choice was that the Flip could connect to just about any computer manufactured after 1998 (when USB was first hitting the market). The second thing was that all the apps for playing back the videos you shot, or cutting them down and editing them, were sitting on the Flip itself, which mounted just like a hard drive, waiting for you to install them on whichever random computer you wanted to use. It didn't matter whether the machine already had the software installed; it COULD be installed directly from the Flip itself. Isn't that slick?! You didn't have to search for the software online, download it and install it; it was right there, just double-click and go.

    Compare this to the Apple iOS cul-de-sac we all know as iTunes. Your iPhone, iPod touch, iPad or iPod doesn't get to know your computer simply by communicating through its USB connector. You must first have iTunes installed AND have your proprietary Apple-to-USB cable to link up. Then and only then can your device 'see' your computer and the Internet. This gated community provided through iTunes allows Apple to see what you are doing, market directly to you, and watch as you connect to YouTube to upload your video, all with the intention of one day acting on that information, maintaining full control at each step along the pathway from shooting your video to sharing it. If this is convergence, I'll keep my old Flip Mino (non-HD), thank you very much. Freedom (as in choice) is a wonderful thing, and giving it up in the name of convergence (misrecognized as convenience) is no compromise at all. It is a racket, and everyone wants to sell you on the 'good' points of the racket. I am not buying it.

  • Calxeda boasts of 5 watt ARM server node • The Register

    Calxeda is not going to make and sell servers, but rather make chips and reference machines that it hopes other server makers will pick up and sell in their product lines. The company hopes to start sampling its first ARM chips and reference servers later this year. The first reference machine has 120 server nodes in a 2U rack-mounted format, and the fabric linking the nodes together internally can be extended to interconnect multiple enclosures together.

    via Calxeda boasts of 5 watt ARM server node • The Register.

    SeaMicro and now Calxeda are going gangbusters for the ultra-dense, low-power server market. Unlike SeaMicro, Calxeda wants to create reference designs it licenses to manufacturers, who will build machines with 120 server nodes in a 2U chassis. SeaMicro's record right now is 512 cores per 10U of rack space, or roughly 102 cores per 2U. The difference is that the SeaMicro product uses Intel's low-power Atom CPU, whereas Calxeda is using a processor family found more often in smartphones and tablets. SeaMicro has hinted they are not wedded to the Intel architecture, but they are more interested in shipping real, live product than in coming up with generic designs others can license. In the long run it's entirely possible SeaMicro may switch to a different CPU; they have indicated previously that their servers are designed with enough flexibility to swap the processor out for any other CPU if necessary. It would be really cool to see an apples-to-apples comparison of a SeaMicro server using Intel CPUs versus ARM-based CPUs.
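
    For a rough sense of how those density claims line up, here is the arithmetic behind the comparison, using only the figures quoted above. Note the units differ: the Calxeda figure is quoted in server nodes, the SeaMicro figure in cores, so this is a back-of-the-envelope sketch rather than an apples-to-apples comparison.

    ```python
    # Density arithmetic for the comparison above.
    seamicro_cores, seamicro_units = 512, 10   # 512 Atom cores in a 10U SM10000
    calxeda_nodes, calxeda_units = 120, 2      # 120 ARM server nodes in a 2U reference box

    print(seamicro_cores / seamicro_units)                  # 51.2 cores per rack unit
    print(seamicro_cores / seamicro_units * calxeda_units)  # ~102 cores per 2U
    print(calxeda_nodes / calxeda_units)                    # 60 nodes per rack unit
    ```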

  • AppleInsider | Inside Mac OS X 10.7 Lion: Auto Save, File Versions and Time Machine

    However, Windows’ Shadow Copy is really intended for creating a snapshot of an entire volume for backup purposes; users can’t trigger the creation of a new version of an individual file in Windows. This makes Lion’s Versions a very different beast: it’s more akin to a versioning file system that works like Time Machine, but local to the user’s own disk.

    via AppleInsider | Inside Mac OS X 10.7 Lion: Auto Save, File Versions and Time Machine [Page 2].

    Reading this article from AppleInsider's series of previews of Mac OS X 10.7 has been an education in both the iOS-based universe and the good ol' desktop universe I already know and love. At first I was apprehensive about the desktop OS taking such a back seat to the mobile devices Apple has been introducing at an increasingly fast pace. From iPods to iPhones to the iPod touch and now the iPad, there's no end to the permutations iOS-based devices can take.

    Prior to the iPhone and iPod touch, Apple was using an embedded OS with none of the sophistication and capability of a real desktop operating system. That was both a frugal and a conservative approach: media players, while having real CPUs inside, were never intended to have network stacks, garbage-collected UI servers and the like. There was always enough there to present a user interface of some sort, with access to a local file system and the ability to sync files between a host-based iTunes client and the device, whichever generation of iPod it might be. Each hardware generation most likely varied by degrees, too, as video playback became a touted feature of newer iPods with bigger internal hard drives (the so-called video iPods). I can imagine that got complicated quickly, as CPUs, video chips and media playback capabilities ranged widely up and down the product line. As each device required its own tweaks to the embedded OS, and iTunes was tweaked to accommodate those local variations, I'm sure the all-seeing eye of Steve Jobs began to wince at the increasing complexity of the iPod product line.

    Enter iOS: a smaller, cleaner OS fully optimized for low-power mobile devices. It has everything a desktop OS has, without any of the legacy-device (backward compatibility) concerns of a typical desktop OS. That allowed Apple to build 'just enough' capability into the networking stack, the UI server and the local storage. Apps written for iOS were unique to that environment, even if they started out as Mac OS X apps: by taking the original code base, refactoring it and doing complete low-level rewrites from top to bottom, you got a version of the Safari web browser on a mobile device that could display ANY web page and even do some display optimization on the fly. And there were plenty of developers rushing to get apps running on the new devices. So whither Mac OS X?

    Well, in the rush to create an iOS app universe, the iOS development team added many features along the way. One great gap was the missing cut-and-paste long enjoyed on desktop OSes; eventually that feature made it in, and others like it slowly got integrated. Apple's custom A4 chip, built around an ARM Cortex-A8 core, was tearing up the charts as iOS out-competed every other mobile phone OS on the market. Similarly, the iPad took the same approach of getting out there with new features and becoming a more desktop-like mobile device.

    A year has passed since the original iPad hit the market, the Mac OS is due for a change, and the big question is: what does Steve Jobs think? There were hints and rumors that he wanted everyone to enjoy the clean-room design of iOS and dump the legacy messiness of old Mac OS X. Dan Lyons gave voice to these concerns quite clearly in his June 8 article in Newsweek, and Steve Jobs eventually replied directly to him, stating emphatically that he was wrong. Still, actions speak louder than words, and Apple's Worldwide Developers Conference in 2010 seemed to hard-sell the advantages of developing for the new iOS. Conversely, Microsoft has proven over and over again that legacy support in an OS is a wonderful source of income once you have established your monopoly.

    However, Apple has navigated the legacy-hardware seas before, first with its big migration from Motorola 68000 processors to the PowerPC chip, and then with the migration from PowerPC to Intel. From a software standpoint, attrition does the rest as people dump their legacy hardware anyway (it's not uncommon for Apple users to eventually get rid of their older machines). So, to help deliver the benefits of newer software, requirements are now in place such that certain first-generation Intel-based Macs won't be able to run the newest Mac OS X (that's the word now). Similarly, legacy support for PowerPC-native apps running on Intel in emulation (using the Rosetta software) will also go away. Which brings us to the point of this whole blog posting: where's the beef?

    The beef, dear reader, is not in the computers but in ourselves. As Macintosh OSes evolve, so do our workflows, and the new paradigm being foisted upon us through mobile devices is that there is no longer any need to go to the File menu and choose Save or Save As… That's what the new iOS-influenced design portends. The same goes for open, in-progress documents: everything is finally done for you. The computer does what you always thought it did, and what Microsoft eventually built into Word (not into the OS itself): autosave. Newly developed versions of TextEdit, built by Apple to run under OS X 10.7, were tried out to see how they work under the new Auto Save and Versions architecture. Now you just create a new document, and the computer (safely) assumes you will most likely want to save it as you work, and that you may want to go back and undo some changes you made. After all these years of using desktop computers, this is at long last built right in.

    So from the command line to the GUI and now to the mobile OS, computer architects and UI engineers have a good idea of what you might want to do before you choose to do it, and it's finally built in at the lowest level of the OS. All of this is coming in the next version of Mac OS X, due for release in July 2011. After reading these articles from AppleInsider and looking at the screenshots, I'm much more enthused and willing to change and adapt the way I work to the new regime of hybrid iOS and Mac OS X going forward.
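
    For readers who want a feel for what an Auto Save plus Versions design does under the hood, here is a small, hypothetical sketch of the pattern: every edit is persisted automatically, and earlier snapshots are kept so you can step back. This is not Apple's implementation (Lion does this down at the framework and file-system level); it is just the general idea.

    ```python
    import time

    class VersionedDocument:
        """Toy model of Auto Save + Versions: changes are saved without a
        'Save' command, and prior snapshots are kept for browsing/restoring."""

        def __init__(self, text=""):
            self.text = text
            self.versions = []          # list of (timestamp, snapshot) tuples

        def edit(self, new_text):
            # Snapshot the current state, then apply the change: no explicit Save.
            self.versions.append((time.time(), self.text))
            self.text = new_text

        def restore(self, index):
            """Roll back to an earlier snapshot, keeping the current text as a version."""
            timestamp, snapshot = self.versions[index]
            self.edit(snapshot)
            return timestamp

    doc = VersionedDocument()
    doc.edit("Dear reader,")
    doc.edit("Dear reader, the beef is not in the computers but in ourselves.")
    doc.restore(1)                      # back to "Dear reader," – no Save As… needed
    print(doc.text, len(doc.versions))  # prints the restored text and 3 stored versions
    ```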