Category: wired culture

Those promoters and bandwagoneers of everything in the Fy00tcha!

  • Google shows off Project Glass augmented reality specs • The Register

    Thomas Hawk’s picture of Sergey Brin wearing the prototype of Project Glass

    But it is early days yet. Google has made it clear that this is only the initial stages of Project Glass and it is seeking feedback from the general public on what they want from these spectacles. While these kinds of heads-up displays are popular in films and fiction and dearly wanted by this hack, the poor sales of existing eye-level screens suggests a certain reluctance on the part of buyers.

    via Google shows off Project Glass augmented reality specs • The Register.

    The video of the Google Glass interface is interesting and problematic at the same time. Stuff floats in and out of view, kind of like the organisms that live in the mucus of your eye. And the latency between when you see something and when you issue a command gives interacting with it a halting, staccato cadence. It looks and feels like old-style voice recognition that needed discrete pauses added so it knew when things ended. As a demo it’s interesting, but Google should issue releases very quickly and get this thing up to speed as fast as they possibly can. And I don’t mean having co-founder Sergey Brin show up at a party wearing the thing. According to reports, the ‘backpack’ the glasses are tethered to is not small. Based on that description I think Google has a long way to go yet.

    http://my20percent.wordpress.com/2012/02/27/baseball-cap-head-up-displa/

    And on the smaller-scale tinkerer front, this WordPress blogger fashioned an old-style ‘periscope’ using a cellphone, a mirror and half-mirrored sunglasses to get a cheaper Augmented Reality experience. The cellphone is an HTC unit strapped onto the brim of a baseball hat. The display is then reflected downwards through a hole cut in the brim and bounced off a pair of sunglasses mounted at roughly a 45-degree angle. It’s cheap, it works, but I don’t know how good the voice activation is. Makes me wonder how well it might work with an iPhone Siri interface. The author even mentions that the HTC is a little heavy and an iPhone might work a little better. I wonder if it wouldn’t work better still if the ‘periscope’ mirror arrangement were scrapped altogether: just mount the phone flat onto the bill of the hat with the screen facing downward, so it reflects directly off the sunglasses’ surface. The number of reflecting surfaces would be reduced, the image would be brighter, and so on (see the sketch below). I noticed a lot of people commented on this fellow’s blog, which might get some discussion brewing about the longer-term, value-add benefits of Augmented Reality. There is a killer app yet to be found, and even Google hasn’t captured the flag yet.
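
    To put a rough number on the “fewer surfaces, brighter image” hunch, here is a minimal back-of-the-envelope sketch. The reflectance figures are assumptions (a plain mirror at ~90%, a half-mirrored lens at ~50%), not measurements from the blogger’s rig:

    ```python
    # Back-of-the-envelope brightness through a chain of reflective surfaces.
    # Reflectances below are assumptions; real values depend on the coatings.
    MIRROR = 0.90        # ordinary mirror
    HALF_MIRROR = 0.50   # half-mirrored sunglasses lens

    def relative_brightness(surfaces):
        """Fraction of the screen's light that survives each bounce."""
        result = 1.0
        for reflectance in surfaces:
            result *= reflectance
        return result

    periscope = relative_brightness([MIRROR, HALF_MIRROR])  # mirror, then sunglasses
    direct = relative_brightness([HALF_MIRROR])             # screen straight onto sunglasses

    print(f"periscope rig: {periscope:.0%} of screen brightness")  # ~45%
    print(f"direct mount:  {direct:.0%} of screen brightness")     # ~50%
    ```

    Even with generous numbers the direct mount wins only modestly on brightness; the bigger wins are probably weight and alignment, but the direction of the argument holds.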

    This picture shows the Wikitude World Browser on the iPhone looking at the Old Town of Salzburg. Computer-generated information is drawn on top of the screen. This is an example of location-based Augmented Reality. (Photo credit: Wikipedia)
  • Picture This: Hosted Lifebits in the Personal Cloud | Cloudline | Wired.com

    Jon Udell (Photo credit: Wikipedia)

    It’s not just photos. I want the same for my whole expanding set of digital objects, including medical and financial records, commercial transactions, personal correspondence, home energy use data, you name it. I want all of my lifebits to be hosted in the cloud under my control. Is that feasible? Technically there are huge challenges, but they’re good ones, the kind that will spawn new businesses.

    via Picture This: Hosted Lifebits in the Personal Cloud | Cloudline | Wired.com.

    From Gordon Bell’s MyLifeBits, to Stephen Wolfram’s personal collection of data more recently, and now to Jon Udell: witness the ever-expanding universe of personal data. Thinking about Gordon Bell now, the emphasis from Microsoft Research was always on video and pictures and ‘recollecting’ what happened in any given day. Stephen Wolfram’s emphasis was not so much on collecting the data as on analyzing it after the fact and watching patterns emerge. Now with Jon Udell we get a nice advancing of the art by looking at possible end-game scenarios. So you have collected a mass of lifebits; now what?

    Who’s going to manage this thing? Is anyone going to offer a service that will help manage it? All great questions, because the disparate forms social-networking lifebits take versus other kinds, like the health and ‘performance’ lifebits Stephen Wolfram collects and maintains for himself, point up a big gap in the cloud services sector. Ripe pickings for anyone of an entrepreneurial bent to step in and bootstrap a service like the one Jon Udell proposes. If someone were really smart they could get it up and running cheaply on Amazon Web Services (AWS) until it became too cost- and performance-prohibitive to keep it hosted there. That would allow an initial foray to test the waters, gauge the size and tastes of the market, and adapt the hosted lifebits service to anyone willing to pay up. That might just be a recipe for success.
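
    As a taste of how small that initial foray could be, here is a minimal sketch of a hosted lifebits store on AWS S3 using the boto3 library. The bucket name and the idea of keying objects by category and date are my assumptions, not anything Udell specifies:

    ```python
    import json
    from datetime import datetime, timezone

    import boto3  # AWS SDK for Python

    s3 = boto3.client("s3")
    BUCKET = "my-hosted-lifebits"  # hypothetical bucket, one per customer

    def store_lifebit(category, payload):
        """Store one lifebit (medical record, transaction, reading...) as JSON,
        keyed by category and timestamp so it stays browsable later."""
        now = datetime.now(timezone.utc)
        key = f"{category}/{now:%Y/%m/%d}/{now:%H%M%S%f}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload))
        return key

    def list_lifebits(category):
        """List everything stored under one category prefix."""
        resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=f"{category}/")
        return [obj["Key"] for obj in resp.get("Contents", [])]

    store_lifebit("energy", {"kwh": 12.4, "meter": "home-main"})
    print(list_lifebits("energy"))
    ```

    The point of the sketch is the cost profile: pennies of storage until the market answers back, which is exactly the kind of toe-in-the-water foray Udell’s proposal invites.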

  • ARM Wants to Put the Internet in Your Umbrella | Wired Enterprise | Wired.com

    Image representing Wired Magazine (Image via CrunchBase)

    On Tuesday, the company unveiled its new ARM Cortex-M0+ processor, a low-power chip designed to connect non-PC electronics and smart sensors across the home and office.

    Previous iterations of the Cortex family of chips had the same goal, but with the new chip, ARM claims much greater power savings. According to the company, the 32-bit chip consumes just nine microamps per megahertz, an impressively low amount even for an 8- or 16-bit chip.

    via ARM Wants to Put the Internet in Your Umbrella | Wired Enterprise | Wired.com.

    Lower power means a very conservative power budget, especially for devices connected to the network. And 32 bits is nothing to sneeze at, considering most manufacturers would pick a 16- or 8-bit chip to bring down the cost and the power budget too. According to the article, the power savings are so great that in sleep mode the chip consumes almost no power at all. For this market Moore’s Law is paying big dividends, especially given the bonus of a 32-bit core: not only do you get a very small, low-power CPU, you get a much more diverse range of software that can run on it and take advantage of a larger memory address space as well. I think non-PC electronics could include things as simple as webcams or cellphone cameras. Can you imagine a CMOS camera chip with a whole 32-bit CPU built in? Makes you wonder not just what it could do, but what ELSE it could do, right?
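
    To make that nine-microamps-per-megahertz figure concrete, here’s a rough battery-life estimate. The clock speed, duty cycle, sleep current and coin-cell capacity are illustrative assumptions, not figures from ARM:

    ```python
    # Rough battery life for a Cortex-M0+-class sensor node.
    # All figures below are illustrative assumptions, not ARM's numbers.
    UA_PER_MHZ = 9.0    # active current draw per MHz (from the article)
    CLOCK_MHZ = 48.0    # assumed clock speed
    SLEEP_UA = 0.5      # assumed near-zero sleep-mode draw
    DUTY_CYCLE = 0.01   # awake 1% of the time (sense, transmit, sleep)
    BATTERY_MAH = 220.0 # typical CR2032 coin cell

    active_ua = UA_PER_MHZ * CLOCK_MHZ  # 432 µA while awake
    avg_ua = DUTY_CYCLE * active_ua + (1 - DUTY_CYCLE) * SLEEP_UA
    hours = BATTERY_MAH * 1000.0 / avg_ua

    print(f"average draw: {avg_ua:.1f} µA")
    print(f"battery life: {hours / 24 / 365:.1f} years on one coin cell")
    ```

    The sleep-mode figure dominates the math, which is exactly why ARM’s near-zero-sleep claim matters for always-attached devices.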

    The term ‘Internet of Things‘ is bandied about quite a bit as people dream about CPUs and networks connecting ALL the things. And what would be the outcome if your umbrella was connected to the Internet? What if ALL the umbrellas were connected? You could log all kinds of data: whether it was opened or closed, what the ambient temperature is. Potentially it would act like a portable weather station for anyone aggregating all the logged data (a sketch of that idea follows below). And the list goes on and on. Instead of tire-pressure monitors alone, why not also capture video of the tire as it is being used commuting to work? It could help measure tire wear and set up an appointment when you need a wheel alignment. It could determine how many times you hit potholes and suggest smoother alternate routes. That’s the kind of blue-sky, wide-open conjecture that a 32-bit low/no-power CPU enables.
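
    For fun, here’s a minimal sketch of what that umbrella-as-weather-station aggregation might look like. The record fields and sample readings are invented for illustration:

    ```python
    # Toy aggregation of umbrella telemetry into a crowd-sourced rain map.
    # Fields and sample readings are invented for illustration.
    from dataclasses import dataclass
    from collections import defaultdict
    from statistics import mean

    @dataclass
    class UmbrellaReading:
        umbrella_id: str
        city: str
        is_open: bool
        temperature_c: float

    readings = [
        UmbrellaReading("u1", "Portland", True, 11.5),
        UmbrellaReading("u2", "Portland", True, 11.9),
        UmbrellaReading("u3", "Phoenix", False, 31.0),
    ]

    by_city = defaultdict(list)
    for r in readings:
        by_city[r.city].append(r)

    for city, rs in by_city.items():
        open_pct = sum(r.is_open for r in rs) / len(rs)
        avg_temp = mean(r.temperature_c for r in rs)
        print(f"{city}: {open_pct:.0%} umbrellas open, {avg_temp:.1f} °C")
    ```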

    Moore’s Law, The Fifth Paradigm. (Photo credit: Wikipedia)
  • Accidental Time Capsule: Moments from Computing in 1994 (from RWW)

    Byte Magazine is one of the reasons I’m here today, doing what I do. Every month, Byte set its sights on the bigger picture, a significant trend that might be far ahead or way far ahead. And in July 1994, Jon Udell (to this very day, among the most insightful people ever to sign his name to an article) was setting his sights on the inevitable convergence between the computer and the telephone.

    via Accidental Time Capsule: Moments from Computing in 1994, by Scott Fulton.

    Jon Udell (Photo credit: Wikipedia)

    I also liked Tom Halfhill, Jerry Pournelle, Steve Gillmor, and many other writers at Byte magazine over the years too. I couldn’t agree more with Scott Fulton, as I am still a big fan of Jon Udell and any project he has worked on and documented. I can credit Jon Udell for getting me curious about weblogging, Radio Userland, WordPress, Flickr and del.icio.us (the social bookmarking website). And I’ve been watching his progress on a ‘calendar of public calendars’, the elmcity project. Jon is attempting to catalog and build an aggregated list of calendars that publish subscribable feeds (iCalendar, in elmcity’s case) that anyone can read. No need for automated emails filling a filtered mailbox; you just fire up a browser and read what’s posted. You find out what’s going on and just add the event to your calendar.

    As Jon has discovered, the calendars exist and the events are there; they just aren’t evenly distributed yet (much like the future). So in his analysis of ‘what works’ Jon has found some sterling examples of calendar keeping and maintenance, some of which have popped up in interesting places, like public school systems. However, the biggest downfall of all events calendars is the all-too-common practice of taking Word documents, exporting them as PDF files and posting them to a website. THAT is the calendar for far too many organizations, and it fails utterly as a means of ‘discovering’ what’s going on.

    Suffice it to say elmcity is a long-term organizing and curatorial effort that Jon is attempting to get an informal network of like-minded people involved in. As different cities form calendar ‘hubs’, Jon is collecting them into larger networks so you can search one spot, find out ‘what’s happening’, and add those events to your own calendar in a very seamless and lightweight manner (see the sketch below for the mechanics of reading such a feed). I highly recommend following Jon’s weblog: he still has the same ability to explain and analyze these technologies that he excelled at while at Byte, and he continues to follow his bliss and curiosity about computers, networks and, more generally, technology.
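
    To show how lightweight the subscription side is, here’s a minimal sketch of pulling events from a published iCalendar feed, using the requests and icalendar Python packages. The feed URL is a placeholder, not an actual elmcity hub:

    ```python
    # Read a public iCalendar (.ics) feed and list its events.
    # Requires: pip install requests icalendar
    import requests
    from icalendar import Calendar

    FEED_URL = "https://example.org/city-events.ics"  # placeholder feed URL

    resp = requests.get(FEED_URL, timeout=10)
    resp.raise_for_status()

    cal = Calendar.from_ical(resp.content)
    for event in cal.walk("VEVENT"):
        start = event.get("DTSTART").dt      # datetime or date
        summary = str(event.get("SUMMARY"))
        print(f"{start}  {summary}")
    ```

    An aggregator hub is, in essence, just this loop run over many such URLs, with the results merged and republished.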

  • Stephen Wolfram Blog : The Personal Analytics of My Life

    Publicity photo of Stephen Wolfram. (Photo credit: Wikipedia)

    One day I’m sure everyone will routinely collect all sorts of data about themselves. But because I’ve been interested in data for a very long time, I started doing this long ago. I actually assumed lots of other people were doing it too, but apparently they were not. And so now I have what is probably one of the world’s largest collections of personal data.

    via Stephen Wolfram Blog : The Personal Analytics of My Life.

    Gordon Bell (Photo credit: Wikipedia)

    In a project similar in spirit to Stephen Wolfram’s, Gordon Bell at Microsoft has been attempting to record his “lifebits”, using a wearable computer to record video and capture what goes on in his life. In my opinion, Stephen Wolfram has done Gordon Bell one better by collecting data over a much longer period, and of a much wider range, than Gordon Bell accomplished within the scope of MyLifeBits. Reading Wolfram’s summary of all his data plots is as interesting as seeing the plots themselves. There can be no doubt that Stephen Wolfram has always thought, and will continue to think, differently than most folks, and dare I say most scientists. Bravo!

    The biggest difference between MyLifeBits and Wolfram’s personal data collection is Wolfram’s emphasis on non-image-based data. The goal for the Microsoft Research group, it seems, is to fulfill the promise of Vannevar Bush’s old article “As We May Think”, printed in The Atlantic in July 1945. In that article Bush proposed a prototype of a more ‘visual computer’ that would act as a memory-recall and analytic thinking aid. He named it the Memex.

    Gordon Bell and Jim Gemmell of Microsoft Research seemed to be focused on the novelty of a carried camera automatically taking pictures of the area immediately in front of it. This log of ‘what was seen’ was meant to help cement visual memory and recall. Gordon Bell had spent a long period of time digitizing “articles, books, cards, CDs, letters, memos, papers, photos, pictures, presentations, home movies, videotaped lectures, and voice recordings” and storing them digitally. This heavy emphasis on visual data might, if used properly, be useful to some, but I think it is more a product of Gordon Bell’s own personal interest in seeing how much he could capture and then catalog after the fact.

    Stephen Wolfram’s data wasn’t even necessarily based on a ‘wearable computer‘ the way MyLifeBits seems to be. Wolfram built a logging/capture system into the things he did daily on a computer, and even included data collected by a digital pedometer to measure the steps he took in a day. The plots of the data are most interesting in comparison to one another, especially given the length of time over which they were collected (a much bigger set than Gordon Bell’s lifebits, I dare say). So maybe this points to another step forward in the evolution of lifebits. Wolfram’s data seems more useful in a lot of ways, since he’s not as focused on memory and recall of any given day. But a synthesis of Wolfram’s data collection and analysis methods with MyLifeBits’ capture of image data might be useful to a broader range of people, if someone wanted to embrace and extend these two scientists’ personal data projects.
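
    Wolfram’s pattern-hunting style of analysis is easy to approximate at small scale. Here’s a minimal sketch, assuming a CSV log of timestamped events (sent email, pedometer steps, whatever you capture), using the pandas and matplotlib libraries:

    ```python
    # Plot daily counts from a personal event log, Wolfram-style.
    # Assumes a CSV with one timestamp per line, e.g. "2012-03-08T14:02:11"
    import pandas as pd
    import matplotlib.pyplot as plt

    log = pd.read_csv("events.csv", names=["timestamp"], parse_dates=["timestamp"])

    # Resample into one bucket per day and count events in each bucket.
    daily = log.set_index("timestamp").resample("D").size()

    daily.plot(title="Events per day")
    plt.ylabel("count")
    plt.savefig("events_per_day.png")
    ```

    Run over years of data, buckets like these are exactly where the daily-rhythm patterns in Wolfram’s plots come from.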

  • Hope for a Tool-Less Tomorrow | iFixit.org

    I’ve seen the future, and not only does it work, it works without tools. It’s moddable, repairable, and upgradeable. Its pieces slide in and out of place with hand force. Its lid lifts open and eases shut. It’s as sleek as an Apple product, without buried components or proprietary screws.

    via Hope for a Tool-Less Tomorrow | iFixit.org.

    HP Z1 workstation

    Oh how I wish this were true today for Apple. I say this as a recent purchaser of an Apple refurbished 27″ iMac. My logic and reasoning for going with a refurbished unit over new was based on a few bits of knowledge gained reading Macintosh weblogs. The rumors I read included the idea that Apple-repaired items are strenuously tested before being re-sold; in some cases returned items are not even broken, but came back due to buyer’s remorse or cosmetic problems. So there’s a good chance the logic board and LCD have no problems. Reading back this past summer, just after the launch of Mac OS X 10.7 (Lion), I saw lots of reports of crashes on 27″ iMacs, so I figured a safer bet would be to get a 21″ iMac. But then I started thinking about flash-based Solid State Disks, and looking at the prohibitively high prices Apple charges for its installed SSDs, I decided I needed something I could upgrade myself.

    But as you may know, iMacs have never been, and continue not to be, user-upgradable. That’s not to say people haven’t tried, or succeeded, in upgrading their own iMacs over the years; enter the aftermarket for SSD upgrades. Apple has attempted to zig and zag as the hobbyists swap in newer components like larger hard drives and SSDs. Witness the temperature sensor Apple puts on the boot drive in the 27″ iMac: a sensor wire measures the internal heat of the hard drive, and as the Mac monitors this signal it revs up the internal fans. Any iMac hobbyist attempting to swap a 3TByte or 4TByte drive in for the stock Apple 2TByte drive will suffer the inevitable panic mode of the iMac, which cannot see its temperature sensor (the replacement drives don’t have the sensor built in) and assumes the worst. They say the noise is deafening when those fans speed up, and they never, EVER slow down. This is Apple’s attempt to ensure sanctity through obscurity: no one is allowed to mod or repair, and that includes anyone foolish enough to attempt to swap the internal hard drive on their iMac.

    But there’s a workaround, thank goodness: the 27″ iMac’s internal case is just large enough to install a secondary hard drive. You can slip a 2.5″ SSD into that chassis; you just gotta know how to open it up. And therein lies the theme of this essay: user-upgradable, user-friendly computer case design. The antithesis of this idea IS the 27″ iMac, if you read the teardown steps from iFixit and the photographer Brian Tobey. Both websites make clear the excruciating minutiae of finding and disconnecting the myriad miniature cables that connect the logic board to the computer. Without going through those steps one cannot gain access to the spare SATA connectors facing toward the back of the iMac case. I decided to go through these steps to add an SSD to my iMac right after it was purchased. I thought Brian Tobey’s directions were slightly better and had more visuals pertinent to the way I was working on the iMac as I opened up the case.

    It is, in a word, a non-trivial task. You need the right tools, the right screwdrivers; in fact you even need suction cups (thank you, Apple). However, there is another way, even for so-called All-in-One computer designs like the iMac. It’s a new product from Hewlett-Packard targeted at the desktop engineering and design crowd: an All-in-One workstation that is user-upgradable, and it’s all done without any tools at all. Let me repeat that last bit: it is a ‘tool-less’ design. What, you may ask, is a tool-less design? I hadn’t heard of it either until I read this article on iFixit. After following the links to the NewEgg.com website to see what other items were tagged as ‘tool-less’, I began to remember some hints and stabs at this I had seen in some Dell OptiPlex desktops years back: the ‘carrier’ brackets for the CD/DVD and HDD drive bays were green plastic rails that simply ‘pushed’ into the sides of the drive (no screws necessary).

    And when I consider that my experience working on the 27″ iMac actually went pretty well (it booted up the first time, no problems) after all I had done to it, I count myself very lucky. But it could have been better, and there’s no reason it cannot be better for EVERYONE. It also made me think of the XO laptop (the One Laptop Per Child project), and I wondered how tool-less that laptop might be. How accessible are any of these designs? It also made me recall the Facebook story I recently commented on, about how Facebook is designing its own hard drive storage units to make them easier to maintain (no little screws to get lost, dropped onto a fully powered motherboard, and short things out). So I have much more hope than when I first embarked on the do-it-yourself journey of upgrading my iMac. Tool-less design today, tool-less design tomorrow and tool-less design forever.

    Image representing Hewlett-Packard (Image via CrunchBase)
  • The PC is dead. Why no angry nerds? :: The Future of the Internet — And How to Stop It

    Famously proprietary Microsoft never dared to extract a tax on every piece of software written by others for Windows—perhaps because, in the absence of consistent Internet access in the 1990s through which to manage purchases and licenses, there’d be no realistic way to make it happen.

    via The PC is dead. Why no angry nerds? :: The Future of the Internet — And How to Stop It.

    While it’s true that Microsoft didn’t tax software developers who sold products running on the Windows OS, a kind of tax levy did exist for hardware manufacturers creating desktop PCs with Intel chips inside. But message received; I get the bigger point: cul-de-sacs don’t make good computers. They do, however, make good appliances. And as the author Jonathan Zittrain points out, we are becoming less aware of the distinction between a computer and an appliance, and have lowered our expectations accordingly.

    In fact this points to a bigger trend than just computers becoming silos of information/entertainment consumption, no, not by a long shot. This trend was preceded by the wild popularity of MySpace, followed quickly by Facebook and now Twitter: all ‘platforms’, as described by their owners, with some amount of API publishing and hooks to let in third-party developers (like the game maker Zynga). But so what if I can play Scrabble or Farmville with my ‘friends’ on a social networking ‘platform’? Am I still getting access to the Internet? Probably not; you are most likely reading whatever filters into or out of the central, all-encompassing data store of the Social Networking Platform.

    Like the old world maps in the days before Columbus: there be Dragons, and the world ends HERE, even though the platform owners might say otherwise. It is an Intranet pure and simple, a gated community that forces unique identities on all participants. Worse yet, it is a big-brother-like panopticon where each step and every little movement is monitored and tallied. You take quizzes, you like, you share; all of these are collection points, checkpoints to gather more data about you. And that is the TAX levied on anyone who voluntarily participates in a social networking platform.

    So long live the Internet, even though its frontier, wildcatting days are nearly over. There will be books and movies like How the Cyberspace Was Won, and the pioneers will all be noted and revered. We’ll remember when we could go anywhere we wanted and do lots of things we never dreamed of. But those days are slipping away as new laws get passed under very suspicious pretenses, all in the name of Commerce. As for me, I much prefer Freedom over Commerce, and you can log that in your stupid little database.

    Cover of "The Future of the Internet--And...
    Cover via Amazon
  • Samsung: 2 GHz Cortex-A15 Exynos 5250 Chip

    Samsung also previewed a 2 GHz dual-core ARM Cortex-A15 application processor, the Exynos 5250, also designed on its 32-nm process. The company said that the processor is twice as fast as a 1.5 GHz A9 design without having to jump to a quad-core layout.

    via Samsung Reveals 2 GHz Cortex-A15 Exynos 5250 Chip.

    Official logo of the ARM processor architecture (Image via Wikipedia)

    More news on the release dates and details of Samsung’s version of the ARM Cortex-A15 CPU for mobile devices. Samsung is helping ramp up performance by shrinking the design rule down to 32nm, and in this A15 part it has dropped two of the four possible cores, a choice that makes room for the integrated graphics processor. It’s a deluxe system-on-a-chip that will no doubt give any A9-equipped tablet a run for its money. Indications from Samsung at this point are that the A15 will be a tablet-only CPU, not adapted for smartphone use.

    Early in the fall there were some indications that the memory addressing of the Cortex-A15 would be enhanced to allow larger memories (greater than 4GBytes) to be added to devices: the A15 brings 40-bit Large Physical Address Extensions (LPAE), beyond the 32-bit physical addressing of the current-generation Cortex-A9. The instructions are still the same 32-bit instruction set longtime users of the ARM architecture are familiar with and, as always, remain backward compatible with previous-generation software. It would appear the biggest advantages of moving to Cortex-A15 are the potential for higher clock rates, decent power management, room on the die for embedded graphics, and that larger physical address space (see the arithmetic below).
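
    For the curious, the addressing arithmetic is straightforward; a quick worked example:

    ```python
    # How much physical memory do 32 vs 40 address bits reach?
    for bits in (32, 40):
        bytes_addressable = 2 ** bits
        print(f"{bits}-bit physical addressing: {bytes_addressable / 2**30:.0f} GiB")

    # 32-bit: 4 GiB; 40-bit (LPAE): 1024 GiB. Each process still sees a
    # 32-bit virtual space; LPAE grows the *physical* memory the SoC can hold.
    ```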

    Apple, in its designs using Cortex processors, has stayed one generation behind the rest of the manufacturers and used all possible knowledge and brute force to eke out a little more power savings. Witness how the iPad’s battery life still tops most other devices on the market: by fully customizing its Cortex-based designs, Apple has absolutely set the bar for power management on the die, and on the motherboard as well. If Samsung goes the route of pure power and clock, but sacrifices two cores to keep the power level down, I just hope they can justify that effort with equally amazing advancements in the software that runs on this new chip. Whether it’s a game or, better yet, a snazzy user interface, they need to differentiate themselves and show off their new CPU.

  • Augmented Reality Maps and Directions Coming to iPhone

    iOS logo (Image via Wikipedia)

    Of course, there are already turn-by-turn GPS apps for iOS, Android and other operating systems, but having an augmented reality-based navigational system that’s native to the phone is pretty unique.

    via Augmented Reality Maps and Directions Coming to iPhone.

    In the navigation battle to the death between Google Android and Apple iOS, a new front is forming: Augmented Reality. Apple has shown that it’s driven to create its own duplicate of the Google Maps app for iOS, in an attempt to maintain its independence from the Googleplex by all means possible. Though Apple may be re-inventing the wheel (of network-available maps), you may be pleasantly surprised by what other bells & whistles get thrown in as well.

    Enter the value-added feature of Augmented Reality. Apple is now filing patents on AR as it relates to handheld device navigation. And maybe this time ’round the Augmented Reality features will be a little more useful than marked-up geo-locations. To date Google Maps hasn’t quite approached this level of functionality, but Google does have the most valuable dataset (Street View) that would allow it to add an Augmented Reality component too. The question is: who will get to market first with the most functional, most useful version of Augmented Reality maps?

  • Apple patents hint at future AR screen tech for iPad | Electronista

    Structure of a liquid crystal display (Image via Wikipedia)

    Apple may be working on bringing augmented reality views to its iPad thanks to a newly discovered patent filing with the USPTO.

    via Apple patents hint at future AR screen tech for iPad | Electronista. (Originally posted at AppleInsider; see the link below.)

    Original Article: Apple Insider article on AR

    Just a very brief look at a couple of patent filings by Apple, with some descriptions of potential applications. Apple seems to want to use this for navigation, using the onboard video camera. One half of the screen would show the live video feed; the other half would be a ‘virtual’ rendition of that scene in 3D, to let you find a path, or maybe a parking space, in between all those buildings.

    The second filing mentions a see-through screen whose opacity can be regulated by the user. The information display takes precedence over the image seen through the LCD panel, and it defaults to totally opaque using no voltage whatsoever (an in-plane switching design for the LCD).

    However, the most intriguing part of the story as told by AppleInsider is the use of sensors on the device to determine angle, direction and bearing, which are then sent over the network. Why the network? Because the whole rendering of the 3D scene described in the first patent filing is done somewhere in the cloud and spit back to the iOS device; no onboard 3D rendering is needed, or at least not at that level of detail. Maybe those datacenters in North Carolina are really cloud-based 3D rendering farms?
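
    To make that round trip concrete, here’s a minimal sketch of what the client half of such a scheme might look like. The endpoint URL and payload fields are my guesses at the patent’s description, purely hypothetical:

    ```python
    # Hypothetical client half of a cloud-rendered AR view:
    # send the device pose, receive a rendered frame back.
    import requests

    RENDER_ENDPOINT = "https://example.com/render"  # hypothetical cloud renderer

    pose = {
        "lat": 35.84,          # device GPS position
        "lon": -78.64,
        "heading_deg": 112.0,  # compass bearing
        "pitch_deg": -5.0,     # accelerometer/gyro tilt
        "fov_deg": 60.0,       # camera field of view
    }

    resp = requests.post(RENDER_ENDPOINT, json=pose, timeout=5)
    resp.raise_for_status()

    # The heavy 3D work happened server-side; the device just displays the result.
    with open("rendered_scene.png", "wb") as f:
        f.write(resp.content)
    ```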