Audrey Watters: The Future of Ed-Tech is a Reclamation Project #DLFAB

Audrey Watters Media Predicts 2011 (Photo credit: @Photo.)

We can reclaim the Web and more broadly ed-tech for teaching and learning. But we must reclaim control of the data, content, and knowledge we create. We are not resources to be mined. Learners do not enter our schools and our libraries to become products for the textbook industry and the testing industry and the technology industry and the ed-tech industry to profit from.

via The Future of Ed-Tech is a Reclamation Project #DLFAB. (by Audrey Watters)

A really philosophical article about what it is Higher Ed is trying to do here. It’s not just about student portfolios, it’s Everything: the books you check out, the seminars you attend, the videos you watch, the notes you take, all the artifacts of learning. And currently they are all squirreled away and stashed inside data silos like Learning Management Systems.

The original World Wide Web was like the Wild, Wild West, an open frontier without visible limit. Cloud services and commercial offerings have fenced in the frontier in a series of waves of fashion. Whether it was AOL, Tripod.com, Geocities, Friendster, MySpace, or Facebook, the web grew in the form of gated communities and cul-de-sacs for “members only”. True, the democracy of it all was that membership was open and free; practically anyone could join. All you had to do was hand over control, the keys to YOUR data. That was the bargain: by giving up your privacy, you gained all the rewards of socializing with long-lost friends and acquaintances. From that little spark the surveillance and “data mining” operation hit full speed.

Reclaiming ownership of all this data, especially the portion generated over one’s lifetime of learning, is a worthy cause. Audrey Watters references Jon Udell in an example of the kind of data we would want to own, and limit access to, our whole lives. From the article:

Udell then imagines what it might mean to collect all of one’s important data from grade school, high school, college and work — to have the ability to turn this into a portfolio — for posterity, for personal reflection, and for professional display on the Web.

Indeed, and though this data may live on the Internet somewhere, access is restricted to those to whom we give explicit permission. That’s in part a project unto itself: this mesh of data could be text or other data objects that might need to be translated, converted to future-readable formats, so it doesn’t grow old and obsolete in an abandoned file format. All of this stuff could be given a very fine level of access control: individuals you have approved could read parts and pieces, or maybe even get wholesale access. You would make that decision, and maybe share just the absolute minimum necessary. So instead of seeing a portfolio of your whole educational career, you give out the relevant links and just those links. That’s what Jon Udell is pursuing now through the Thali project. Thali is a much more generalized way to share data from many devices, presented in a holistic, rationalized manner to whomever you define as a trusted peer. It’s not just about educational portfolios, it’s about sharing your data. But first and foremost you have to own the data, or attempt to reclaim it from the wilds and wilderness of the social media enterprise, the educational enterprise, all these folks who want to own your data while giving you free services in return.
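The per-artifact sharing described above can be sketched in a few lines of Python: each learning artifact gets its own access list, and a viewer sees only the links explicitly granted to them. This is a minimal sketch with entirely hypothetical names — it is not the Thali API, just the shape of the idea.

```python
class Portfolio:
    """A lifetime of learning artifacts, shared link by link."""

    def __init__(self, owner):
        self.owner = owner
        self.artifacts = {}   # artifact_id -> content
        self.grants = {}      # artifact_id -> set of approved peers

    def add(self, artifact_id, content):
        self.artifacts[artifact_id] = content
        self.grants[artifact_id] = set()   # private by default

    def share(self, artifact_id, peer):
        # The owner decides, per artifact, who may read it.
        self.grants[artifact_id].add(peer)

    def links_for(self, peer):
        # A peer sees only what was granted -- "the relevant links
        # and just those links", never the whole portfolio.
        return [a for a, peers in self.grants.items() if peer in peers]

p = Portfolio("alice")
p.add("thesis-2014", "...")
p.add("grade-school-essay", "...")
p.share("thesis-2014", "employer")
print(p.links_for("employer"))   # ['thesis-2014']
```

The point of the sketch is the default: everything starts private, and access is granted outward, rather than starting public and clawing privacy back.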

Audrey uses the metaphor “data is the new oil,” and that is the heart of the problem. Having been handed the oil for free, those who invested in holding onto and storing it are loath to give it up. And like credit reporting agencies with their duplicate and sometimes incorrect datasets, those folks will give access to that unknown quantity to the highest bidder for whatever reason. Whether it’s campaign staffers, private detectives, or vengeful spouses doesn’t matter, as they own the data and set the rules as to how it is shared. However, in the future, when we’ve all reclaimed ownership of our piece of the oil field, THEN we’ll have something. And when it comes to the digital equivalent of the old manila folder, we too will truly own our education.


Jaunt – Meet the Crazy Camera That Can Make Movies for the Oculus Rift (Jordan Kushins-Gizmodo)

Oculus Rift (Photo credit: Digitas Photos)

If Facebook buying Oculus for a cool $2 billion is a step towards democratizing the currently-niche platform, Jaunt seems like an equally monumental step towards making awesome virtual reality content that appeals to folks beyond the gaming community. VR movies, in addition to VR games.

via Meet the Crazy Camera That Can Make Movies for the Oculus Rift.

Amazing story about a stealthy little company with a 3D video recording rig. This isn’t James Cameron-like motion capture for 3D rendering. This is just 2D video captured in real time and stitched together. No modeling, texture-mapping, or animating required. Just run the video cameras, capture the footage, bring it back to the studio, and stitch it all together. Watch the production on your Oculus Rift headset. If you can produce 3D movies with this without having to invest in James Cameron’s high-end, ultra-expensive virtual sets, you’ve just lowered the barriers to entry.
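The reason a ring of ordinary cameras can be stitched at all is overlap: adjacent lenses share a slice of the scene, which gives the stitching software common features to align on. A back-of-the-envelope sketch — the figures (16 cameras, 120-degree lenses) are illustrative assumptions, not Jaunt’s actual specs:

```python
def ring_coverage(num_cameras, fov_deg):
    """For a ring of cameras, return the angular spacing between
    camera centers and the overlap each camera shares with its
    neighbor (the region available for feature matching)."""
    spacing = 360.0 / num_cameras   # angle between adjacent camera centers
    overlap = fov_deg - spacing     # view shared with each neighbor
    return spacing, overlap

spacing, overlap = ring_coverage(16, 120.0)
print(spacing)   # 22.5 -- degrees between cameras
print(overlap)   # 97.5 -- degrees of shared view per neighbor pair
```

As long as the overlap stays comfortably positive, the stitcher has redundant imagery to blend seams with; that redundancy, not exotic capture hardware, is what replaces the modeling and texture-mapping pipeline.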

I’m also kind of disappointed that in the article the author keeps insisting that you “had to be there”. Telling us words cannot express the experience is like telling me in writing that the “dog ate my homework”. I guess I “had to be there” for that too. Any way you put it, telling me more about the company and the premises and the prototypes means you’re writing for a venture capital audience, not someone who might make work using the camera, or consume the work made by the artists working with it. I say just cave in to the temptation and TRY expressing the experience in words. Don’t worry if you fail; you’ve just increased the comment rate on your story, engaging people long after the initial date the story was published. In spite of the lack of daring to describe the experience, I picked up enough detail, extrapolated, and read between the lines in a way that indicates this camera rig might well be the killer app, or authoring app, for the Oculus Rift platform. Let’s hope it sees the light of day and makes it to market quicker than the Google Glass prototypes floating around these days.

Enhanced by Zemanta

Nvidia Pulls off ‘Industrial Light and Magic’-Like Tools | EE Times

Image via CrunchBase

The president of VMware said after seeing it (and not knowing what he was seeing), “Wow, what movie is that?” And that’s what it’s all about — dispersion of disbelief. You’ve heard me talk about this before, and we’re almost there. I famously predicted at a prestigious event three years ago that by 2015 there would be no more human actors, it would be all CG. Well I may end up being 52% or better right (phew).    – Jon Peddie

via Nvidia Pulls off ‘Industrial Light and Magic’-Like Tools | EE Times. Jon Peddie has covered the 3D animation, modeling, and simulation market for YEARS. And when you can get a rise out of him like the quote above from EE Times, you have accomplished something. Between NVidia’s hardware and now its GameWorks suite of software modeling tools, you have, in a word, created Digital Cinema. Jon goes on to talk about how the digital simulation demo convinced a VMware exec it was real live actors on a set. That’s how good things are getting.

And the comparison of NVidia’s off-the-shelf toolkits to ILM is also telling. No longer does one need to have on staff the computer scientists, physicists, and mathematicians to help model and simulate things like particle systems and hair. It’s all there, along with ocean waves and smoke, in the toolkit, ready to use. Putting these tools into the hands of users will herald a new era of less esoteric, less high-end, less exclusive access to the best algorithms and tools.

nVidia GameWorks by itself will be useful to some people, but re-packaging it in a way that embeds it in an existing workflow will widen the level of adoption, whether that’s for a casual user or a student in a 3D modeling and animation course at a university. The follow-on to this is getting the APIs published to tap into this through current off-the-shelf tools like AutoCAD, 3D Studio Max, Blender, Maya, etc. Once the favorite tools can bring up a dialog box and start adding a particle system or full ray tracing to a scene at this level of quality, things will really start to take off. The other possibility is to flesh out GameWorks into more of a standalone, easily adopted, brand new package creatives could pick up and eventually migrate to over time. That would be another path to using GameWorks as an end-to-end digital cinema creation package.


Cargo-culting [managers are awesome / managers are cool when they’re part of your team] (tecznotes|Mike Migurski)

English: Code for America Logo (Photo credit: Wikipedia)

This is incidentally what’s so fascinating about the government technology position I’m in at Code for America. I believe that we’re in the midst of a shift in power from abusive tech vendor relationships to something driven by a city’s own digital capabilities. The amazing thing about GOV.UK is that a government has decided it has the know-how to hire its own team of designers and developers, and exercised its authority. That it’s a cost-saving measure is beside the point. It’s the change I want to see in the world: for governments large and small to stop copy-pasting RFP line items and cargo-culting tech trends (including the OMFG Ur On Github trend) and start thinking for themselves about their relationship with digital communication.

via managers are awesome / managers are cool when they’re part of your team (tecznotes).

My apologies to the original article’s author, Mike Migurski. He was only mentioning cargo-culting in passing while he developed the greater thesis of different styles of managers. But the term cargo-culting was just too good to pass up, because it’s so descriptive and so useful for questioning the fundamental beliefs and arguments people make for wanting some new, New thing.

Cargo-culting. Yeah baby. Now that’s what I’m talking about. I liken this to “fashion” and trends coming and going. For instance, where I work, digital signage is the must-have technology that everyone is begging for. Giant displays with capacitive touch capability, like 70″ iPads strapped motionless, monolithically to a wall. That’s progress. And better yet, when they are unattended and not being used, they are digital advertising, yay! We win! It’s a win-win-win situation.

Sadly the same is true in other areas that indirectly affect where I work. Trends in Instructional Technology follow the same cargo-culting pattern, like flipping the classroom. Again, people latch onto something and they have to have it, regardless of the results or the benefits. None of the outcomes really enter into the decision to acquire the “things” people want. Flipping a classroom is a non-trivial task: first you have to restructure how you teach the course. That’s a pretty steep requirement alone, but the follow-on item is to then record all your lectures in advance of the class meetings, where you will then work with students to find the gaps in their knowledge. Nobody does the first part, or they rarely do it, because what they really want is the seemingly less difficult task they can delegate: order up someone to record all my lectures, THEN I’ll flip my classroom. It’s a recipe for wasted effort and potential disaster.

Don’t let yourself fall victim to cargo-culting in the workplace. Know the difference between that which is new and that which is useful. Everyone will benefit when you can at least cast a hairy eyeball at the new, new thing and ask simply, Why? Don’t settle for an Enron-like “Ask Why”, no. Keep working at the fundamental assumptions and arguments, justifications and rationalizations for wanting the New, new thing. If it’s valid, worthy, and beneficial, it will stand up to the questioning. Otherwise it will dodge, skirt, shirk, bob, and weave around the questions and try to subvert the process of review (accelerated, fast-tracked).


Google shows off Project Glass augmented reality specs • The Register

Thomas Hawk's picture of Sergey Brin wearing the prototype of Project Glass

But it is early days yet. Google has made it clear that this is only the initial stages of Project Glass and it is seeking feedback from the general public on what they want from these spectacles. While these kinds of heads-up displays are popular in films and fiction and dearly wanted by this hack, the poor sales of existing eye-level screens suggests a certain reluctance on the part of buyers.

via Google shows off Project Glass augmented reality specs • The Register.

The video of the Google Glass interface is kind of interesting and problematic at the same time. Stuff floats in and out of view, kind of like the organisms that live in the mucus of your eye. And the latency between seeing something and issuing a command gives interaction a kind of halting, staccato cadence. It looks and feels like old-style voice recognition that needed discrete pauses to know when things ended. As a demo it’s interesting, but they should issue releases very quickly and get this thing up to speed as fast as they possibly can. And I don’t mean having co-founder Sergey Brin show up at a party wearing the thing. According to reports, the ‘back pack’ that the glasses are tethered to is not small. Based on the description, I think Google has a long way to go yet.

http://my20percent.wordpress.com/2012/02/27/baseball-cap-head-up-displa/

And on the smaller-scale tinkerer front, this WordPress blogger fashioned an old-style ‘periscope’ using a cellphone, a mirror, and half-mirrored sunglasses to get a cheaper Augmented Reality experience. The cellphone is an HTC unit strapped onto the brim of a baseball hat. The display is then reflected downwards through a hole cut in the brim, and then reflected off a pair of sunglasses mounted at roughly a 45-degree angle. It’s cheap, it works, but I don’t know how good the voice activation is. Makes me wonder how well it might work with an iPhone Siri interface. The author even mentions that the HTC is a little heavy and an iPhone might work a little better. I wonder if it wouldn’t work better still if the ‘periscope’ mirror arrangement were scrapped altogether: just mount the phone flat onto the bill of the hat and let the screen face downward. The screen would then reflect off the sunglasses’ surface. The number of reflecting surfaces would be reduced, the image would be brighter, etc. I noticed a lot of people also commented on this fellow’s blog, which might get some discussion brewing about the longer-term value-add benefits of Augmented Reality. There is a killer app yet to be found, and even Google hasn’t captured the flag yet.
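The “fewer reflections means a brighter image” point is just multiplication of reflectances: every surface in the optical chain passes along only a fraction of the light. A rough sketch — the reflectance figures below are illustrative guesses, not measurements of the actual build:

```python
def brightness(reflectances):
    """Fraction of the screen's light that survives a chain of
    reflections, given each surface's reflectance (0.0-1.0)."""
    surviving = 1.0
    for r in reflectances:
        surviving *= r
    return surviving

# Periscope build: plain mirror (~90%?) then half-mirrored lens (~30%?).
periscope = brightness([0.90, 0.30])
# Direct build: phone on the bill, one bounce off the sunglasses.
direct = brightness([0.30])

print(round(periscope, 2))  # 0.27 -- two bounces
print(round(direct, 2))     # 0.3  -- one bounce, brighter
```

Even with a generous 90% figure for the plain mirror, dropping it from the chain buys back roughly a tenth of the brightness, on top of removing an alignment-sensitive part.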

This picture shows the Wikitude World Browser on the iPhone looking at the Old Town of Salzburg. Computer-generated information is drawn on top of the screen. This is an example for location-based Augmented Reality. (Photo credit: Wikipedia)

Buzzword: Augmented Reality

Augmented Reality in the Classroom Craig Knapp (Photo credit: caswell_tom)

What it means. “Augmented reality” sounds very “Star Trek,” but what is it, exactly? In short, AR is defined as “an artificial environment created through the combination of real-world and computer-generated data.”

via Buzzword: Augmented Reality.

A nice little survey from the people at Consumer Reports, with specific examples from the Consumer Electronics Show this past January. Whether it’s software or hardware, there are a lot of things that can be labeled and marketed as ‘Augmented Reality’. On this blog I’ve concentrated more on the apps running on smartphones with integrated cameras, accelerometers, and GPS. Those pieces are important building blocks for an integrated Augmented Reality-like experience. But as this article from CR shows, your experience may vary quite a bit.

In my commentary on stories posted by others on the Internet, I have covered mostly the examples of AR apps on mobile phones. Specifically, I’ve concentrated on the toolkit provided by Layar to add metadata to existing map points of interest. The idea of ‘marking up’ the existing landscape holds a great deal of promise for me, as the workload is shifted off the creator of the 3D world and onto the people traveling within it. The same could hold true for Massively Multiplayer Games, and some worlds do allow members to do that kind of building and marking up of the environment itself. But Layar provides a set of data that you can call up by merely pointing the cell phone camera in a compass direction.
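Concretely, a Layar “layer” of that era was just a web service the Layar servers queried for points of interest near the user, returning them as JSON “hotspots”. The sketch below follows my reading of the old Layar developer docs (coordinates in millionths of a degree); treat the exact field names and schema as assumptions rather than the authoritative API:

```python
import json

def get_pois(user_lat, user_lon):
    """Hypothetical point-of-interest service for a Layar-style layer.
    A real service would filter hotspots by distance from the user."""
    hotspots = [{
        "id": "library-main",
        "title": "Main Library",
        "line2": "Open until 9pm",                # caption shown under the title
        "lat": int(40.4443 * 1_000_000),          # microdegrees
        "lon": int(-79.9436 * 1_000_000),
    }]
    return json.dumps({
        "layer": "campus-tour",   # hypothetical layer name
        "hotspots": hotspots,
        "errorCode": 0,
        "errorString": "ok",
    })

response = json.loads(get_pois(40.444, -79.944))
print(len(response["hotspots"]))          # 1
print(response["hotspots"][0]["title"])   # Main Library
```

The appeal is exactly the workload shift described above: anyone who can serve JSON can annotate the landscape, without touching the 3D rendering side at all.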

It’s a sort of hunt for information, and it works well when the metadata mark-up is well done. But like many crowd-sourced efforts, some amount of lower-quality work, or worse, vandalism, occurs. That shouldn’t keep anyone from trying to enhance the hidden data that can be discovered through a Layar-enhanced Real World. I’m hoping the mobile phone-based AR applications grow and find a niche, if not a killer app. It’s still early days, and mobile phone AR is not being adopted very quickly, but I think there’s still a lot of untapped potential there. I don’t think we have discovered all the possible applications of mobile phone AR.

AnandTech – AMD Radeon HD 7970 Review: 28nm And Graphics Core Next, Together As One

Image via CrunchBase

Quick Sync made real-time H.264 encoding practical on even low-power devices, and made GPU encoding redundant at the time. AMD of course isn’t one to sit idle, and they have been hard at work at their own implementation of that technology: the Video Codec Engine VCE.

via AnandTech – AMD Radeon HD 7970 Review: 28nm And Graphics Core Next, Together As One.

Intel’s QuickSync helped speed up the realtime encoding of H.264 video. AMD is striking back and has Hybrid Mode VCE operations that will speed things up EVEN MORE! The key to having this hit the market and get widely adopted, of course, is the compatibility of the software with a wide range of video cards from AMD. The original CUDA software environment from nVidia took a while to disperse into the mainstream, as it supported only a limited number of graphics cards when it rolled out. Now it’s part of the infrastructure, more or less provided gratis whenever you buy ANY nVidia graphics card. AMD has to push this kind of semi-forced adoption of the technology as fast as possible to deliver the benefit quickly. At the same time, the user interface to this VCE software had better be a great design and easy to use. Any configuration-file dependencies and tweaking through preference files should be eliminated, to the point where you merely move a slider up and down a scale (Slower->Faster). And that should be it.
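The one-slider interface suggested above amounts to collapsing every encoder knob into a single mapping from slider position to a preset. A minimal sketch — the preset names are illustrative placeholders, not actual VCE parameters:

```python
# Hypothetical encoder presets, ordered best-quality-first.
PRESETS = ["quality", "balanced", "speed"]

def preset_for_slider(position, steps=10):
    """Map a slider position (0 = Slower/best quality .. steps = Fastest)
    onto one of a handful of encoder presets, hiding every other knob."""
    if not 0 <= position <= steps:
        raise ValueError("slider out of range")
    index = position * len(PRESETS) // (steps + 1)
    return PRESETS[index]

print(preset_for_slider(0))    # quality
print(preset_for_slider(5))    # balanced
print(preset_for_slider(10))   # speed
```

Everything else (bitrate targets, reference frames, profile levels) would be baked into each preset by the vendor, so the user only ever answers one question: how long am I willing to wait?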

And if need be, AMD should commission an encoder app, or a plug-in to an open source project like HandBrake, to utilize the VCE capability upon detection of the graphics chip in the computer. Make it ‘just happen’, without the tempting early-adopter approach of making a tool available and forcing people to ‘build’ a version of an open source encoder to utilize the hardware properly. A hands-off approach that favors early adopters is going to consign this technology to the margins for a number of years if AMD doesn’t take a more activist role. QuickSync on Intel hasn’t been widely touted either, so maybe it’s a moot point to urge anyone to treat their technology as an insanely great offering. But I think there’s definitely brand loyalty that could be brought into play if the performance gains to be had with a discrete graphics card far outpace the integrated graphics solution of QuickSync provided by Intel. If you can achieve a 10x boost, you should be pushing that to all the potential computer purchasers from this announcement forward.