blogtools entertainment google media surveillance

Audrey Watters: The Future of Ed-Tech is a Reclamation Project #DLFAB

Audrey Watters Media Predicts 2011
Audrey Watters Media Predicts 2011 (Photo credit: @Photo.)

We can reclaim the Web and more broadly ed-tech for teaching and learning. But we must reclaim control of the data, content, and knowledge we create. We are not resources to be mined. Learners do not enter our schools and in our libraries to become products for the textbook industry and the testing industry and the technology industry and the ed-tech industry to profit from. 

via The Future of Ed-Tech is a Reclamation Project #DLFAB. (by Audrey Watters)

Really philosophical article about what Higher Ed is trying to do here. It’s not just about student portfolios, it’s Everything. It is the books you check out, the seminars you attend, the videos you watch, the notes you take, all the artifacts of learning. And currently they are all squirreled away and stashed inside data silos like Learning Management Systems.

The original World Wide Web was like the Wild, Wild West, an open frontier without visible limit. Cloud services and commercial offerings have fenced in the frontier in a series of waves of fashion. Whether it was AOL, Geocities, Friendster, MySpace, or Facebook, the web grew in the form of gated communities and cul-de-sacs for “members only”. True, the democracy of it all was that membership was open and free; practically anyone could join. All you had to do was hand over control, the keys to YOUR data. That was the bargain: by giving up your privacy, you gained all the rewards of socializing with long-lost friends and acquaintances. From that little spark the surveillance and “data mining” operation hit full speed.

Reclaiming ownership of all this data, especially the part generated over one’s lifetime of learning, is a worthy cause. Audrey Watters references Jon Udell with an example of the kind of data we would want to own, and limit access to, over our whole lives. From the article:

Udell then imagines what it might mean to collect all of one’s important data from grade school, high school, college and work — to have the ability to turn this into a portfolio — for posterity, for personal reflection, and for professional display on the Web.

Indeed. And at the same time, though this data may live on the Internet somewhere, access is restricted to those to whom we give explicit permission. That’s partly a project unto itself: this mesh of data could be text or other data objects that might need to be translated, converted to future-readable formats so it doesn’t grow old and obsolete in an abandoned file format. All of this stuff could be given a very fine level of access control: individuals you have approved could read parts and pieces, or maybe be given wholesale access. You would make that decision, and maybe share only the absolute minimum necessary. So instead of seeing a portfolio of your whole educational career, you give out the relevant links and just those links.

That’s what Jon Udell is pursuing now through the Thali Project. Thali is a much more generalized way to share data from many devices, presented in a holistic, rationalized manner to whomever you define as a trusted peer. It’s not just about educational portfolios, it’s about sharing your data. But first and foremost you have to own the data, or attempt to reclaim it from the wilds and wilderness of the social media enterprise, the educational enterprise, all these folks who want to own your data while giving you free services in return.
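To make the idea concrete, here is a minimal sketch (my own illustration, not Thali’s actual design) of that “just the relevant links” model: every artifact in a personal archive carries its own access list, and a viewer sees only what was explicitly granted. The artifact names and addresses are invented.

```python
# Illustrative only: per-artifact access grants in a personal archive.
class PersonalArchive:
    def __init__(self):
        self.artifacts = {}   # artifact_id -> content
        self.grants = {}      # artifact_id -> set of approved viewers

    def add(self, artifact_id, content):
        self.artifacts[artifact_id] = content
        self.grants[artifact_id] = set()

    def share(self, artifact_id, viewer):
        """Grant one person access to one artifact, nothing more."""
        self.grants[artifact_id].add(viewer)

    def links_for(self, viewer):
        """Only the relevant links, and just those links."""
        return [aid for aid, who in self.grants.items() if viewer in who]

archive = PersonalArchive()
archive.add("thesis-2004", "...")
archive.add("seminar-notes", "...")
archive.share("thesis-2004", "employer@example.com")
```

The point of the sketch is that sharing decisions live with the owner, artifact by artifact, instead of defaulting to whatever a platform decides.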

Audrey uses the metaphor “Data is the new oil”, and that is at the heart of the problem. Given the free oil, those who invested in holding onto and storing the oil are loath to give it up. And like credit reporting agencies with their duplicate and sometimes incorrect datasets, those folks will give access to that unknown quantity to the highest bidder for whatever reason. Whether it’s campaign staffers, private detectives, or vengeful spouses doesn’t matter, as they own the data and set the rules as to how it is shared. However, in the future when we’ve all reclaimed ownership of our piece of the oil field, THEN we’ll have something. And when it comes to the digital equivalent of the old manila folder, we too will truly own our education.

blogtools google media technology wired culture

Jon Udell on filter failure

Jon Udell
Jon Udell (Photo credit: Wikipedia)

It’s time to engineer some filter failure

Jon’s article points out his experience of the erosion of serendipity, or at least of opposing viewpoints, that social media enforces (somewhat) accidentally. I couldn’t agree more. One of the big promises of the Internet was that it was unimaginably vast and continuing to grow. The other big promise was that it was open in the way people could participate. There were no diktats or prescribed methods per se, just etiquette at best. There were FAQs to guide us, and rules of thumb to prevent us from embarrassing ourselves. But the Internet was something so vast one could never know or see everything that was out there, good or bad.

But as in the Wild West, search engines began fencing in the old prairie, at once allowing us to get to the good stuff and waste less time finding the important stuff. But therein lies the bargain of the “filter”: giving up control to an authority to help you do something with data or information. All the electrons/photons whizzing back and forth on the series of tubes exist all at once, available (more or less) all at once. But now with Social Networks, like AOL before them, we suffer from the side effects of the filter.

I remember being an AOL member, finally caving in and installing the app from one of the free floppy disks I would get in the mail at least once a week. I registered my credit card for the first free 20 hours (can you imagine?). And just like people who ‘try’ Netflix, I never unregistered. I lazily stayed the course and tried to get my money’s worth by spending more time online. At the same time, ISPs, small mom-and-pop type shops, were renting out parts of a Fractional T-1 leased line they owned, putting up modem pools, and selling access to the “Internet”. Nobody knew why you would want to do that with all teh kewl thingz one could do on AOL. Shopping, Chat Rooms, News, Stock quotes. It was ‘like’ the Internet. But not open and free and limitless like the Internet. And that’s where the failure begins to occur.

AOL had to police its population, enforce some codes of conduct. They could kick you off, stop accepting your credit card payments. One could not be kicked off the ‘Internet’ in the same way, especially in those early days. But getting back to Jon’s point about filters that fail and allow you to see the whole world, discover an opposing viewpoint, or better, multiple opposing viewpoints: that is the promise of the Internet, and we’re seeing less and less of it as we corral ourselves into our favorite brand-name social networking community. I skipped MySpace, but I did jump on Flickr, and eventually Facebook. And in so doing I gave up a little of that wildcat freedom and frontier-like experience of dial-up over a PPP or SLIP connection to a modem pool, doing a search first on Yahoo, then AltaVista, and then Google to find the important stuff.

Enhanced by Zemanta
media technology wired culture

How Yahoo Killed Flickr and Lost the Internet

Image representing Flickr as depicted in Crunc...
Image via CrunchBase

But moreover, Yahoo needed to leverage this thing that it had just bought. Yahoo wanted to make sure that every one of its registered users could instantly use Flickr without having to register for it separately. It wanted Flickr to work seamlessly with Yahoo Mail. It wanted its services to sing together in harmony, rather than in cacophonous isolation. The first step in that is to create a unified login. That’s great for Yahoo, but it didn’t do anything for Flickr, and it certainly didn’t do anything for Flickr’s (extremely vocal) users.

via How Yahoo Killed Flickr and Lost the Internet.

Gizmodo article on how Yahoo first bought Flickr then proceeded to let it erode. As the old cliche sez’, the road to hell is paved with good intentions. Personally, I didn’t really mind the issue others had with the Yahoo login. I was allowed to use my Flickr login for a long time after the takeover. But I still had to create a Yahoo account, even if I never used it for anything other than accessing Flickr. Once I realized this was the case, I dearly wished Google had bought them, as I WAS already using GMail and other Google services.

Most recently there’s been a lot of congratulation spread around following the release of a new Flickr uploader. I always had to purchase an add-on to my Apple iPhoto in order to streamline the cataloging, annotating, and arranging of picture sets. Doing the uploads one at a time through the Web interface was not on; I needed bulk uploads, but I refused to export picture sets out of iPhoto just to get them into Flickr. So an aftermarket arose for people like me invested heavily in iPhoto. And these add-on programs worked great, but they would go out of date or become incompatible with newer versions of iPhoto. So you would have to go back and drop another $10 USD on a newer version of your iPhoto/Flickr exporter.

And by this time Facebook had so taken over the social networking aspects of picture sharing that no one could see the point of a single-medium service (just picture sharing). When Facebook allowed you to converse, play games, and poke your friends, why would you log out and open Flickr just to manage your photos? The lack of integration and the added friction were too much for the bulk of Internet users. Facebook had gained the mindshare, reduced the friction, and made everything seamless and work the way everyone thought it should. And it is hard to come back from a defeat like that, with the millions of sign-ups Facebook was enjoying. Yahoo should have had an app for that early on and let people share their Flickr sets with similar access controls and levels of security.

I would have found Flickr a lot more useful if it had been well bridged into the Facebook universe during the critical period of 2008-2010. For me that would have been just the time when things were really chaotically ramping up in terms of total new Facebook account creations. The addition of an insanely great Flickr App for Facebook could have made a big difference in helping grow community awareness, and possibly garnered a few new Flickr accounts along the way. However, agendas are so often blinders, in the way they close you off from the environment in which you operate. Flickr and Yahoo’s merger, and the agenda of ‘integration’, more or less became the single most important thing going on during the giant Facebook ramp-up. And so it goes: Yahoo stumbles more than once and takes a perfectly good Web 2.0 app and lets it slowly erode, like Friendster and MySpace before it. So long Flickr, it’s been good to know yuh.

Image representing Yahoo! as depicted in Crunc...
Image via CrunchBase
blogtools media technology wired culture

Owning Your Words: Personal Clouds Build Professional Reputations | Cloudline |

My first blogging platform was Dave Winer’s Radio UserLand. One of Dave’s mantras was: “Own your words.” As the blogosphere became a conversational medium, I saw what that could mean. Radio UserLand did not, at first, support comments. That turned out to be a constraint well worth embracing. When conversation emerged, as it inevitably will in any system of communication, it was a cross-blog affair. I’d quote something from your blog on mine, and discuss it. You’d notice, and perhaps write something on your blog referring back to mine.

via Owning Your Words: Personal Clouds Build Professional Reputations | Cloudline |

I would love to be able to comment on an article or a blog entry by passing a link to a blog entry within my own WordPress instance. However, rendering that ‘feed’ back into the comments section of the originating article/blog page doesn’t seem to be common. At best I could drop a permalink into the comments section so people might be tempted to follow the link to my blog. But it’s kind of unfair to an unsuspecting reader to force them to jump, and in a sense redirect, to another website just to follow a commentary. So I fully agree there needs to be a pub/sub style way of passing my blog entry by reference back into the comments section of the originating article/blog. Better yet, that gives me some ability to amend and edit my poor choice of words the first time I publish a response. Too often silly mistakes get preserved in the ‘amber’ of the comments fields in the back-end MySQL databases of the content management systems housing many online web magazines. So there’s plenty of room for improvement, and RSS could easily embrace and extend this style of commenting if someone were driven to develop it.
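The mechanics could be very small. Here’s a hypothetical sketch of the ping half of that pub/sub flow: my blog notifies the original article’s endpoint that a reply exists at my URL, and the article renders the reply by fetching it (so my later edits show up too). The endpoint URL and field names are invented for illustration; this is not an existing WordPress API.

```python
# Illustrative linkback ping: "my post at `source` replies to `target`".
from urllib.parse import urlencode
from urllib.request import Request

def build_ping(source, target, endpoint):
    """Build a form-encoded POST notifying `endpoint` of a reply-by-reference."""
    body = urlencode({"source": source, "target": target}).encode("ascii")
    return Request(endpoint, data=body,
                   headers={"Content-Type": "application/x-www-form-urlencoded"})

# Hypothetical URLs:
req = build_ping("https://myblog.example/reply-42",
                 "https://magazine.example/article",
                 "https://magazine.example/linkback-endpoint")
```

On the receiving side, the magazine would verify that `source` really links to `target` before displaying it, to keep spam out of the comment stream.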

media technology wired culture

Accidental Time Capsule: Moments from Computing in 1994 (from RWW)

Byte Magazine is one of the reasons I’m here today, doing what I do. Every month, Byte set its sights on the bigger picture, a significant trend that might be far ahead or way far ahead. And in July 1994, Jon Udell (to this very day, among the most insightful people ever to sign his name to an article) was setting his sights on the inevitable convergence between the computer and the telephone.

via Accidental Time Capsule: Moments from Computing in 1994, by Scott Fulton

Jon Udell
Jon Udell (Photo credit: Wikipedia)

I also liked Tom Halfhill, Jerry Pournelle, Steve Gillmor, and many other writers at Byte magazine over the years too. I couldn’t agree more with Scott Fulton, as I am still a big fan of Jon Udell and any projects he works on and documents. I can credit Jon Udell for getting me curious about weblogging, Radio UserLand, WordPress, Flickr, and social bookmarking. And I’ve been watching his progress on a ‘Calendar of Public Calendars’, the elmcity project. Jon is attempting to catalog and build an aggregated list of calendars with RSS-style feeds that anyone can subscribe to. No need for automated emails filling a filtered email box. No, you just fire up a browser and read what’s posted. You find out what’s going on and just add the event to your calendar.

As Jon has discovered, the calendars exist and the events are there; they just aren’t evenly distributed yet (much like the future). So in his analysis of ‘what works’, Jon has found some sterling examples of calendar keeping and maintenance, some of which have popped up in interesting places, like public school systems. However, the biggest downfall of all events calendars is the all too common practice of taking Word documents and exporting them as PDF files which get posted to a website. THAT is the calendar for far too many organizations, and it fails utterly as a means of ‘discovering’ what’s going on.

Suffice it to say elmcity is a long-term organizing and curatorial effort that Jon is attempting to get an informal network of like-minded people involved in. And as different cities form calendar ‘hubs’, Jon is collecting them into larger networks, so that you can search one spot, find out ‘what’s happening’, and add those events to your own calendar in a very seamless and lightweight manner. I highly recommend following Jon’s weblog, as he has the same ability to explain and analyze these technologies that he excelled at while at Byte magazine. And he continues to follow his bliss and curiosity about computers, networks, and more generally technology.
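The reason machine-readable calendars beat PDFs is that a hub can merge them mechanically. Here’s a rough sketch (my own, not elmcity’s actual code) of aggregating published iCalendar (.ics) feeds into one chronological list; the feed URLs would be whatever a hub curates.

```python
# Illustrative calendar-hub aggregation over iCalendar text.
from urllib.request import urlopen

def parse_ics(text):
    """Pull (DTSTART, SUMMARY) pairs out of raw iCalendar text."""
    events, current = [], {}
    for line in text.splitlines():
        if line.startswith("BEGIN:VEVENT"):
            current = {}
        elif line.startswith("DTSTART"):
            current["start"] = line.split(":", 1)[1].strip()
        elif line.startswith("SUMMARY"):
            current["summary"] = line.split(":", 1)[1].strip()
        elif line.startswith("END:VEVENT") and "start" in current:
            events.append((current["start"], current.get("summary", "")))
    return events

def aggregate(feed_urls):
    """Merge several feeds into one chronologically sorted event list."""
    merged = []
    for url in feed_urls:
        with urlopen(url) as resp:
            merged.extend(parse_ics(resp.read().decode("utf-8", "replace")))
    return sorted(merged)  # DTSTART timestamps sort lexically
```

A real hub would handle time zones, recurrence rules, and folded lines (RFC 5545 covers all three), but even this toy shows why a feed is ‘discoverable’ in a way a posted PDF never is.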

media surveillance technology wired culture

Stephen Wolfram Blog : The Personal Analytics of My Life

Publicity photo of en:Stephen Wolfram.
Publicity photo of en:Stephen Wolfram. (Photo credit: Wikipedia)

One day I’m sure everyone will routinely collect all sorts of data about themselves. But because I’ve been interested in data for a very long time, I started doing this long ago. I actually assumed lots of other people were doing it too, but apparently they were not. And so now I have what is probably one of the world’s largest collections of personal data.

via Stephen Wolfram Blog : The Personal Analytics of My Life.

Gordon Bell
Gordon Bell (Photo credit: Wikipedia)

In some ways similar to Stephen Wolfram, Gordon Bell at Microsoft has engaged in an attempt to record his “LifeBits”, using a ‘wearable’ computer to record video and capture what goes on in his life. In my opinion, Stephen Wolfram has done Gordon Bell one better by collecting data over a much longer period and of a much wider range than Gordon Bell accomplished within the scope of LifeBits. Reading Wolfram’s summary of all his data plots is as interesting as seeing the plots themselves. There can be no doubt that Stephen Wolfram has always thought, and will continue to think, differently than most folks, and dare I say most scientists. Bravo!

The biggest difference between MyLifeBits and Wolfram’s personal data collection is Wolfram’s emphasis on non-image-based data. The goal, it seems, for the Microsoft Research group is to fulfill the promise of Vannevar Bush’s old article “As We May Think”, printed in The Atlantic, July 1945. In that article Bush proposes a prototype of a more ‘visual computer’ that would act as a memory-recall and analytic-thinking aid. He named it the Memex.

Gordon Bell and Jim Gemmell of Microsoft Research seemed to be focused on the novelty of a carried camera automatically taking pictures of the area immediately in front of it. This log of ‘what was seen’ was meant to help cement visual memory and recall. Gordon Bell had spent a long period of time digitizing “articles, books, cards, CDs, letters, memos, papers, photos, pictures, presentations, home movies, videotaped lectures, and voice recordings” and storing them digitally. This emphasis on visual data might, if used properly, be useful to some, but it is more a product of Gordon Bell’s own personal interest in seeing how much he could capture and then catalog after the fact.

Stephen Wolfram’s data wasn’t necessarily based on a ‘wearable computer’ the way MyLifeBits seems to be. Wolfram built a logging/capture system into the things he did daily on a computer. This even included data collected by a digital pedometer to measure the steps he took in a day. The plots of the data are most interesting in comparison to one another, especially given the length of time over which they were collected (a much bigger set than Gordon Bell’s LifeBits, I dare say). So maybe this points to another step forward in the evolution of LifeBits? Wolfram’s data seems more useful in a lot of ways, as he’s not as focused on memory and recall of any given day. But maybe a synthesis of Wolfram’s data collection methods and analysis with Gordon Bell’s MyLifeBits capture of image data might be useful to a broader range of people, if someone wanted to embrace and extend these two scientists’ personal data projects.

computers media surveillance technology

RE: Eric’s Archived Thoughts: Vigilance and Victory

Eric’s Archived Thoughts: Vigilance and Victory.

While I agree there might be a better technical solution than the DNS blocking adopted by the SOPA and PIPA bills, less formal networks are in essence filling the gap. By this I mean the MegaUpload takedown that occurred yesterday at the order of the U.S. Justice Department. Without even the benefit of SOPA or PIPA, they ordered investigations, arrests, and takedowns of the whole MegaUpload enterprise. But what is interesting is the knock-on effect social networks had in the vacuum left by the DNS blocking. Within hours DNS was replaced by its immediate precursor: that’s right, folks were sending the IP addresses of available MegaUpload hosts in plain text in Tweets the world ’round. And given the announcement today that Twitter is closing in on its 500 millionth account, I’m not too worried about a technical solution to DNS blocking. That too is already moot, by virtue of social networking and simple numeric IP addresses. Long live IPv4 and the quadruple octets.
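Why does a tweeted IP address defeat DNS blocking? Because DNS only maps a name to a number; once you have the number, you can speak HTTP to it directly and supply the name yourself in the Host header. A sketch of that workaround (the address and host below are placeholders, not MegaUpload’s real ones):

```python
# Illustrative DNS-bypass fetch: connect by raw IPv4, name goes in the header.
import socket

def build_request(host, path="/"):
    """Plain HTTP/1.1 GET with an explicit Host header for virtual hosting."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            "Connection: close\r\n\r\n")

def fetch_by_ip(ip, host, path="/"):
    """HTTP GET against a raw IPv4 address, no name resolution involved."""
    with socket.create_connection((ip, 80), timeout=10) as s:
        s.sendall(build_request(host, path).encode("ascii"))
        chunks = []
        while chunk := s.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks)

# Usage (placeholder address): fetch_by_ip("203.0.113.7", "blocked-site.example")
```

Note the Host header: many servers host several sites on one address, so the name still matters to the server even when the resolver is cut out of the loop.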

computers entertainment gpu h.264 media

AnandTech – AMD Radeon HD 7970 Review: 28nm And Graphics Core Next, Together As One

Image representing AMD as depicted in CrunchBase
Image via CrunchBase

Quick Sync made real-time H.264 encoding practical on even low-power devices, and made GPU encoding redundant at the time. AMD of course isn’t one to sit idle, and they have been hard at work at their own implementation of that technology: the Video Codec Engine VCE.

via AnandTech – AMD Radeon HD 7970 Review: 28nm And Graphics Core Next, Together As One.

Intel’s QuickSync helped speed up the realtime encoding of H.264 video. AMD is striking back with Hybrid Mode VCE operations that will speed things up EVEN MORE! The key to having this hit the market and get widely adopted, of course, is the compatibility of the software with a wide range of AMD video cards. The original CUDA software environment from nVidia took a while to disperse into the mainstream, as it supported a limited number of graphics cards when it rolled out. Now it’s part of the infrastructure, provided gratis whenever you buy ANY nVidia graphics card. AMD has to push this semi-forced adoption of the technology as fast as possible to deliver the benefit quickly. At the same time, the user interface to this VCE software had better be a great design and easy to use. Any configuration file dependencies and tweaking through preference files should be eliminated, to the point where you merely move a slider up and down a scale (Slower->Faster). And that should be it.

And if need be, AMD should commission an encoder app, or a plug-in to an open source project like HandBrake, that utilizes the VCE capability upon detecting the graphics chip in the computer. Make it ‘just happen’, without the tempting early-adopter approach of making a tool available and forcing people to ‘build’ a version of an open source encoder to utilize the hardware properly. Hands-off approaches that favor early adopters are going to consign this technology to the margins for years if AMD doesn’t take a more activist role. QuickSync hasn’t been widely touted by Intel either, so maybe it’s moot to urge anyone to treat their technology as an insanely great offering. But I think there’s definitely brand loyalty that could be brought into play if the performance gains to be had with a discrete graphics card far outpace Intel’s integrated QuickSync solution. If you can achieve a 10x boost, you should be pushing that to all potential computer purchasers from this announcement forward.
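The ‘just happen’ behavior amounts to a tiny decision table: probe the GPU, silently pick the hardware path if one exists, fall back to software otherwise. A hypothetical sketch; the encoder names here are invented for illustration and are not real HandBrake or driver identifiers.

```python
# Illustrative auto-selection of an H.264 encoder path. All names invented.
def pick_encoder(gpu_vendor, hw_supported):
    """Pick a hardware encode path when available; never ask the user."""
    if hw_supported and gpu_vendor == "AMD":
        return "vce_h264_hw"      # hypothetical VCE hardware path
    if hw_supported and gpu_vendor == "Intel":
        return "quicksync_h264"   # hypothetical QuickSync path
    return "x264_software"        # universal software fallback
```

The user-facing surface stays a single speed/quality slider; which silicon does the work is the application’s problem, not the user’s.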

media science & technology technology

MIT boffin: Salted disks hold SIX TIMES more data • The Register

Close-up of a hard disk head resting on a disk...
Image via Wikipedia

This method shows, Yang says, that “bits can be patterned more densely together by reducing the number of processing steps”. The HDD industry will be fascinated to understand how BPM drives can be made at a perhaps lower-than-anticipated cost.

via MIT boffin: Salted disks hold SIX TIMES more data • The Register.

Moore’s Law applies to semiconductors built on silicon wafers, and to a lesser extent it has had some application to hard disk drive storage as well. When IBM created its GMR (Giant Magneto-Resistive) read/write head technology and developed it into a shipping product, a real storage arms race began. Densities increased, prices dropped, and before you knew it hard drives went from 1GByte to 10GBytes practically overnight. Soon a 30GByte drive was the default boot and data drive for every shipping PC, when just a few years before a 700MByte drive was the norm. That was a greater than 10X improvement with the adoption of a new technology.

I remember a lot of those touted technologies being added on at the same time. PRML (Partial Response Maximum Likelihood) and Perpendicular Magnetic Recording (PMR) both helped keep the ball rolling in terms of storage density. IBM even did some pretty advanced work layering magnetic layers between magnetically insulating layers (using thin layers of Ruthenium) to help create even stronger magnetic recording media for the newer higher-density drives.

However, each new incremental advance has now run its course, and the advances in storage technology are slowing down again. But there’s still one shining hope: Bit-Patterned Media (BPM). And in all the speculation about which technology is going to keep the storage density ball rolling, this new announcement is sure to play its part. A competing technique using lasers to heat the disk surface before writing data is also being researched and discussed, but it would likely force a lot of storage vendors to agree to transition to that technology simultaneously. BPM, on the other hand, isn’t so different and revolutionary that it must be rolled out en masse by every drive vendor at once to ensure compatibility. Better yet, BPM may be a much lower-cost and more immediate way to increase storage densities without incurring big equipment and manufacturing upgrade costs.

So I’m thinking we’ll be seeing BPM much more quickly and we’ll continue to enjoy the advances in drive density for a little while longer.

media mobile

Augmented Reality Start-Up Ready to Disrupt Business – Tech Europe – WSJ

Image representing Layar as depicted in CrunchBase
Image via CrunchBase

“We have added to the platform computer vision, so we can recognize what you are looking at, and then add things on top of them.”

via Augmented Reality Start-Up Ready to Disrupt Business – Tech Europe – WSJ.

I’ve been a fan of Augmented Reality for a while, following the announcements from Layar over the past two years. I’m hoping something comes out of this work beyond another channel for selling, advertising, and marketing. But innovation always follows where the money is, and artistic, creative pursuits are NOT it. Witness the evolution of Layar from a toolkit to a whole package of brand-loyalty add-ons ready to be sent out whole to any smartphone owner unwitting enough to download a Layar-created App.

The emphasis in this WSJ article, however, is not on how Layar is trying to market itself. Instead they are more worried about how Layar is creating a ‘virtual’ space where meta-data is tagged onto a physical location. So a Layar Augmented Reality squatter can set up a very mundane virtual T-shirt shop (say, like Second Life) in the same physical location as a high-class couturier on a high street in London or Paris. What right does anyone have to squat in the Layar domain? Just like the Domain Name System squatters of today, they have every right, by being there first. Which brings to mind how this will evolve into a game of technical one-upmanship whereby each Augmented Reality domain will be subject to the market forces of popularity. Witness the chaotic evolution of social networking, where AOL, Friendster, MySpace, Facebook, and now Google+ all usurp market mindshare from one another.

While the Layar squatter has his T-shirt shop today, the question is: who knows this other than other Layar users? And who knows whether anyone else will ever know? This leads me to conclude this is a much bigger deal to the WSJ than it is to anyone who might be sniped at or squatted upon within an Augmented Reality cul-de-sac. Though those stores and corporations may not be able to budge the Layar squatters, they can at least lay claim to the rest of their empire and prevent any future miscreants from owning their virtual space. But as I say, in one-upmanship there is no real end game, only the NEXT game.