“While 50 percent of MOOC registrants dropped off within a week or two of enrolling, attrition rates decreased substantially after that window.”
So with a 50% attrition rate, everyone has to keep in mind that those overwhelmingly large enrollment numbers are not representative of the typical definition of the word “student”. They are shopping. They are consumers who, once they find something is not to their taste, whisk away to the next most interesting thing. It's hard to say what impact this has on people “waiting in line” if there's a cap on total enrollees. Typically, though, unlimited enrollment seems to be the norm for this style of teaching, as does being unlimited in length of time: you can enroll/register after the course has completed. That, however, throws off the measurements of dropping out, since the registration occurs outside the time the class is actively being conducted. So there are still a lot of questions that need to be answered, and more experiments designed to factor out the idiosyncrasies of these open online fora.
There is an interesting Q&A interview after the opening summary in this article with one of the primary researchers on MOOCs, Andrew Ho, from the Harvard Graduate School of Education. It's hard to gauge “success” or to get accurate demographic information to help analyze the behavior of some MOOC enrollees. The second year of the experiments will hopefully yield better results; something like conclusions should be possible after the second round. But Ho emphasizes that we need more data from a wider sampling than just Harvard and MIT to confirm or help guide further research on the Massive Open Online Course (MOOC) at large scale. As the cliché goes, the jury is still out on the value add of offering real college courses in the MOOC format.
id Software has formally announced that John Carmack has left the building. Prior to this week he was on sabbatical from id, doing consulting/advisory work for the folks putting the Oculus Rift together. The work being done now is to improve the speed of the refresh on the video screens. That's really the last big hurdle to clear before this set of VR goggles and motion sensors gets out on the open market. The beta units are still out there, and people are experimenting with the Oculus versions of some First Person Shooters, but the revolution is not here… yet.
Oculus will need to pull off some optimizations for the headset. Outstanding questions include not just the refresh rate but also which display technology is going to be chosen. OLED is still under consideration over backlit LCDs, though it may be a last resort to solve the refresh problem. Latency in the frame rates on the video displays is causing motion sickness among the current crop of beta testers of the Oculus Rift VR headset. The improvement they're attempting amounts to roughly half of the fastest frame refresh they can achieve now. Hopefully this problem can be engineered out of the next revision of the beta units.
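To put rough numbers on why that is hard: the arithmetic below is illustrative only (the refresh rates are common display figures, not anything Oculus has committed to), but it shows how thin the per-frame budget gets.

```python
# Back-of-the-envelope frame timing only; the refresh rates below are common
# display figures used for illustration, not numbers from Oculus.

def frame_time_ms(refresh_hz: float) -> float:
    """Time budget for a single frame at a given refresh rate, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 75, 90, 120):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):5.2f} ms per frame")

# At 60 Hz each frame already takes ~16.7 ms before you count head-tracker
# reads, rendering and the panel's own pixel response. Halving the end-to-end
# delay pushes the budget toward single-digit milliseconds, which is why the
# OLED-vs-LCD decision matters so much.
```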
Bill Atkinson—creator of MacPaint—painted in MacPaint (Photo credit: Daniel Rehn)
“I missed the mark with HyperCard,” Atkinson lamented. “I grew up in a box-centric culture at Apple. If I’d grown up in a network-centric culture, like Sun, HyperCard might have been the first Web browser.”
Bill Atkinson‘s words on HyperCard and what could have been are kind of sad in a way. But Bill is a genius by any measure of Computer Science and programming ability. Without QuickDraw, the Mac would not have been much of a graphical experience for those attempting to write software for it. Bill's drawing routines took advantage of all the assembly language routines available on the old Motorola 68000 chip and eked out every last bit of performance to make the Mac what it was in the end: Insanely Great.
I write this in reference also to my experience of learning and working with HyperCard. It acts as the opening parenthesis to my last 16 years working for my current employer. Educational Technology has existed in various forms going all the way back to 1987, when Steve Jobs was attempting to get universities to buy Macs and create great software to run on those same computers. There was an untapped well of creativity and energy in Higher Education, and Jobs tried to get the Macintosh into any school that would listen.
That period is long since gone. The ideas of educational software, interactive hypermedia, and CD-ROMs have all given way to the web and mobile devices. It's a whole new world now, and the computer of choice is the mobile phone you pick up on a 2-year contract with some telecom carrier. That's the reality. So designers and technologists are now having to change to a “mobile first” philosophy and let all other platforms and form factors follow from that. And it makes sense, as desktop computer sales still erode a few percentage points each year. It's just a matter of time before we reach peak Desktop; it has likely already happened, we just haven't accepted it as gospel.
Every technology is a stepping stone, or a shoulder to stand on, leading to the next one. Evolutionary steps are the rule of the day; revolution has passed us by. We're in for the long slog of putting things into production and making them do useful work. Who has time to play and discover when everyone has a pre-conceived notion of the brand of device and the use it will serve? I want X to do Y; there's no time to advise or consult, to fit and match things based on their essential quality, the essence of what they are good at accomplishing. This is the brand and this is how I'm going to use it. That's what Educational Technology has become these days.
This tells me my job with foursquare is to be “driven” like a calf into a local business. Of course, this has been the assumption from the start. But I had hoped that somewhere along the way foursquare could also evolve into a true QS app, yielding lat-lon and other helpful information for those (like me) who care about that kind of thing. (And, to be fair, maybe that kind of thing actually is available, through the foursquare API. I saw a Singly app once that suggested as much.) Hey, I would pay for an app that kept track of where I’ve been and what I’ve done, and made that data available to me in ways I can use.
What Doc Searls is describing, I think, is foursquare as a kind of Lifebits: a form of self-tracking à la Stephen Wolfram or Gordon Bell. Instead, foursquare is the carrot being dangled to lure you into giving your business to a particular retailer. After that you accumulate points for numbers of visits and possibly unlock rewards for your loyalty. But foursquare no doubt accumulates a lot of other data along the way that could be used for the very purpose Doc Searls was hoping for.
Gordon Bell's work at Microsoft Research bootstrapping the MyLifeBits project is a form of memory enhancement, but also a logging of personal data that can be analyzed later. The collection, or ‘instrumentation’, of one's environment is what Stephen Wolfram has accomplished by counting things over time. Not to say it's simpler than MyLifeBits, but it is in some ways lighter-weight data (mouse clicks, tallies of email activity, times of day, etc., instead of videos and pictures). There is no doubt that foursquare could make a for-profit service for paying users, collecting this location data and serving it up to subscribers, letting them analyze the data after the fact.
I firmly believe a form of MyLifeBits could be aggregated across a wide range of free and paid services, along with personal instrumentation and data collecting of the kind Stephen Wolfram does. If there's one thing I've learned reading stories about inventions like these from MIT's Media Lab, it's that it's never an either/or proposition. You don't have to adopt just Gordon Bell's technology, or Stephen Wolfram's techniques, or even foursquare's own data. You can do all of them, or pick and choose the ones that suit your personal data collection needs. Then you get to slice, dice and analyze to your heart's content. What you do with it after that is completely up to you, and should be considered as personal as any legal documents or health records you already have.
Which takes me back to an article I wrote some time ago in reference to Jon Udell calling for a federated LifeBits type of service. It wouldn't be constrained to one kind of data: potentially all of your LifeBits could be aggregated, with new repositories for the stuff that must be locked down and private. So add Doc Searls to the list of bloggers and long-time technology writers who see an opportunity. Advocacy (in the case of Doc's experience with foursquare) for sharing unfiltered data with the users on whom the data is collected is one step in that direction. I feel Jon Udell is also an advocate for users gaining access to all that collected and aggregated data. But as Jon Udell asks, who is going to be the first to offer this as a pay-for service in the cloud, where for a fee you can access your lifebits aggregated into one spot (foursquare, twitter, facebook, gmail, flickr, photostream, mint, eRecords, etc.), so that you don't spend your life logging on and off from service to service to service? Aggregation could be a beautiful thing.
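To make the aggregation idea concrete, here is a minimal sketch of what a personal lifebits timeline might look like. Every name in it (LifeBit, fetch_foursquare_checkins, fetch_tweets) is hypothetical, not a real API; the point is only that each service's records get normalized into one timestamped format that you own.

```python
# Hypothetical sketch of a personal "lifebits" aggregator; none of these names
# (LifeBit, fetch_foursquare_checkins, fetch_tweets) come from a real service.
# The idea is only that each service's records are normalized into one
# timestamped format that you own and can analyze later.

from dataclasses import dataclass
from datetime import datetime
from typing import Callable, Iterable, List

@dataclass
class LifeBit:
    timestamp: datetime
    source: str      # "foursquare", "twitter", "flickr", "mint", ...
    kind: str        # "checkin", "post", "photo", "transaction", ...
    summary: str
    payload: dict    # the raw record, kept as private as any legal document

def fetch_foursquare_checkins() -> Iterable[LifeBit]:
    # Placeholder: in practice this would call the service's API with your own
    # credentials and yield one LifeBit per check-in (venue, lat/lon, time).
    return []

def fetch_tweets() -> Iterable[LifeBit]:
    return []

def aggregate(sources: List[Callable[[], Iterable[LifeBit]]]) -> List[LifeBit]:
    """Pull from every source you opted into and merge into a single timeline."""
    timeline: List[LifeBit] = []
    for fetch in sources:
        timeline.extend(fetch())
    return sorted(timeline, key=lambda bit: bit.timestamp)

timeline = aggregate([fetch_foursquare_checkins, fetch_tweets])
```

The slicing, dicing and analysis then happens on your copy of the timeline, not on whatever a retailer-facing service chooses to show you.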
But moreover, Yahoo needed to leverage this thing that it had just bought. Yahoo wanted to make sure that every one of its registered users could instantly use Flickr without having to register for it separately. It wanted Flickr to work seamlessly with Yahoo Mail. It wanted its services to sing together in harmony, rather than in cacophonous isolation. The first step in that is to create a unified login. That’s great for Yahoo, but it didn’t do anything for Flickr, and it certainly didn’t do anything for Flickr’s (extremely vocal) users.
Gizmodo has an article on how Yahoo first bought Flickr and then proceeded to let it erode. As the old cliché sez, the road to hell is paved with good intentions. For me personally, I didn't really mind the issue others had with the Yahoo login; I was allowed to use the Flickr login for a long time after the takeover. But I still had to create a Yahoo account, even if I never used it for anything other than accessing Flickr. Once I realized this was the case, I dearly wished Google had bought them instead, as I was already using Gmail and other Google services.
Most recently there have been a lot of congratulations spread around following the release of a new Flickr uploader. I always had to purchase an add-on to Apple's iPhoto in order to streamline the cataloging, annotating, and arranging of picture sets. Doing the uploads one at a time through the Web interface was not on; I needed bulk uploads, but I refused to export picture sets out of iPhoto just to get them into Flickr. So an aftermarket arose for people like me invested heavily in iPhoto. These add-on programs worked great, but they would go out of date or be incompatible with newer versions of iPhoto, so you would have to go back and drop another $10 USD on a newer version of your iPhoto-to-Flickr exporter.
And by this time Facebook had so taken over the social networking aspects of picture sharing that no one could see the point of a single-medium service (just picture sharing). When Facebook let you converse, play games, and poke your friends, why would you log out and open Flickr just to manage your photos? The friction was too high, and the integration too shallow, for the bulk of Internet users. Facebook had gained the mindshare, reduced the friction and made everything seamless, working the way everyone thought it should. And it is hard to come back from a defeat like that, given the millions of sign-ups Facebook was enjoying. Yahoo should have had an app for that early on, letting people share their Flickr sets with similar access controls and levels of security.
I would have found Flickr a lot more useful if it had been well bridged into the Facebook universe during the critical period of 2008-2010. That was just the time when things were chaotically ramping up in terms of total new Facebook account creations. An insanely great Flickr app for Facebook could have made a big difference in growing community awareness and possibly garnering a few new Flickr accounts along the way. However, agendas so often act as blinders, closing you off to the environment in which you operate, and the Yahoo-Flickr merger and its agenda of ‘integration’ more or less consumed all the attention during the giant Facebook ramp-up. And so it goes: Yahoo stumbles more than once, takes a perfectly good Web 2.0 app, and lets it slowly erode like Friendster and MySpace before it. So long, Flickr, it's been good to know yuh.
Paul Otellini, CEO of Intel (Photo credit: Wikipedia)
During Intel's annual investor day on Thursday, CEO Paul Otellini outlined the company's plan to leverage its multi-billion-dollar chip fabrication plants, thousands of developers and industry sway to catch up in the lucrative mobile device sector, reports Forbes.
But what you are seeing is a form of Fear, Uncertainty and Doubt (FUD) being spread about to sow the seeds of mobile Intel processor sales. The doubt is not as obvious as questioning the performance of ARM chips, or the ability of manufacturers like Samsung to meet their volume targets and reject rates for each new mobile chip. No, it's more subtle than that, and only noticeable to people who know details like which design rule Intel is currently using versus the one used by Samsung or TSMC (Taiwan Semiconductor Manufacturing Company). Intel is just now releasing its next-gen 22nm chips while companies like Samsung are still trying to recoup their investment in 45nm and 32nm production lines. Apple is just beginning to sample some 32nm chips from Samsung in iPad 2 and Apple TV products; its current flagship iPad and iPhone both use a 45nm chip produced by Samsung. Intel is trying to say that the older-generation technology, while good, doesn't carry the weight of Intel's massive investment in next-generation chip technology. The new chips will be smaller, more energy efficient, and less expensive, all the things needed to make a higher profit on the consumer devices using them. However, Intel doesn't do ARM chips; it has Atom, and that is the one thing that has hampered any big design wins in cellphone or tablet designs to date. At any given design rule, ARM chips almost always use less power than a comparably sized Atom chip from Intel. So whether it's really an attempt to spread FUD can be debated one way or another, but the message is clear: Intel is trying to fight back against ARM. Why? Let's turn back the clock to March of this year and a previous article also appearing in Apple Insider:
This article is referenced in the original article quoted at the top of the page, and it points out why Intel is trying to get Apple to take notice of its own mobile chip commitments. Apple designs its own chips and contracts the manufacturing out to a foundry. To date Samsung has been the sole source of the A-series processors used in iPhone/iPod/iPad devices, while Apple tries to get TSMC up to speed as a second source. Meanwhile, sales of the Apple devices continue to grow handsomely in spite of these supply limits. More important to Intel is that blistering growth in spite of Apple being on older foundry technology and design rules. Intel has a technological and investment advantage over Samsung now, but it does not have a chip that is BETTER than Apple's in-house-designed ARM chip. That's why the underlying message for Intel is that it has to make its Atom chip so much better than an A4, A5, or A5X at ANY design rule that Apple cannot ignore Intel's superior design and manufacturing capability. Apple will still use Intel chips, but not in its flagship products, until Intel achieves that much greater level of technical capability and sophistication in its mobile microprocessors.
Intel is planning a two-pronged attack on the smartphone and tablet markets, with dual Atom lines going down to 14 nanometers and Android providing the special sauce to spur sales.
Lastly, Ian Thomson from The Register weighs in on what the underlying message from Intel really is. It's all about the future of microprocessors for the consumer market. The emphasis in this article, however, is that Android OS devices, whether they be phones, tablets or netbooks, will be the way to compete AGAINST Apple. But again, it's not Apple as such; it's the microprocessor Apple is using in its best-selling devices that scares Intel the most. Intel has since its inception been geared towards the ‘mainstream’ market, selling into enterprises and the consumer space. It has milked the desktop PC revolution it helped create, more or less starting with its forays into integrated microprocessors and chipsets. It reminds me a little of the old steel plants that existed in the U.S. during the 1970s, as Japan was building NEW steel plants that used a much more energy-efficient design and a steel-making technology that created a higher quality product. Less expensive, higher quality steel was only possible in brand new steel plants, and the old-line U.S. plants couldn't justify the expense, so they wrapped up and shut down operations all over the place. Intel, while it is able to make that type of investment in newer technology, is still not able to create the energy-saving mobile processor that will outperform an ARM-core CPU.
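For a sense of why the design rule looms so large in this argument, here is the first-order geometry and nothing more: a crude density calculation that ignores leakage, yield, architecture and cost, so treat it as illustration rather than analysis.

```python
# First-order geometry only: a crude look at why the design rule matters,
# ignoring everything real chips care about (leakage, yield, architecture,
# cost per wafer).

def relative_density(old_nm: float, new_nm: float) -> float:
    """Roughly how many more transistors fit in the same area after a shrink."""
    return (old_nm / new_nm) ** 2

print(f"45nm -> 32nm: {relative_density(45, 32):.1f}x")  # ~2.0x
print(f"45nm -> 22nm: {relative_density(45, 22):.1f}x")  # ~4.2x
print(f"32nm -> 14nm: {relative_density(32, 14):.1f}x")  # ~5.2x

# On paper Intel's 22nm (and planned 14nm) lines dwarf the 45nm/32nm processes
# the A-series parts have been fabbed on; the catch is that at any given node
# an ARM core still tends to draw less power than a comparable Atom.
```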
Profile shown on Thefacebook in 2005 (Photo credit: Wikipedia)
Codenamed “Knox,” Facebook’s storage prototype holds 30 hard drives in two separate trays, and it fits into a nearly 8-foot-tall data center rack, also designed by Facebook. The trick is that even if Knox sits at the top of the rack — above your head — you can easily add and remove drives. You can slide each tray out of the rack, and then, as if it were a laptop display, you can rotate the tray downwards, so that you’re staring straight into those 15 drives.
Nice article about Facebook's own data center design and engineering efforts. I think their approach is going to advance the state of the art far more than Apple's, Google's, or Amazon's protected and secretive data center efforts. Although those companies have money and resources to plow into custom-engineered bits for their data centers, Facebook can at least show off what it's learned as it scaled up to a huge number of daily users. Not the least of which is expressed best by their hard drive rack design, a tool-less masterpiece.
This article emphasizes the physical aspects of the racks in which the hard drives are kept. It's a tool-less design, not unlike what I talked about in this article from a month ago: HP has adopted a tool-less design for its all-in-one (AIO) engineering workstation, see Introducing the HP Z1 Workstation. The video link demonstrates the idea of a tool-less design for what is arguably not the easiest device to build without proprietary connectors, fasteners, etc. I use my personal experience of attempting to upgrade my 27″ iMac as the foil for what is presented in the HP promo video. If Apple adopted a tool-less design for its iMacs, there's no telling what kind of aftermarket might spring up for the hobbyist or even the casually interested Mac owner.
I don't know how much of Facebook's decision-making regarding their data center designs is driven by the tool-less methodology. But I can honestly say that any large outfit like Facebook or HP attempting to go tool-less is, in some ways, a step in the right direction. Outfits like O'Reilly's Make: magazine and iFixit.org are readily providing a path for anyone willing to put in the work to learn how to fix the things they own. Also throw into that mix the less-technological, more home-maintenance-style outfits like Repair Clinic; while not as sexy technologically, I can vouch for their ability to teach me how to fix a fan in my fridge.
Borrowing the phrase, “If you can't fix it, you don't own it,” let me say I wholeheartedly agree. And also borrowing from the old Apple commercial: here's to the crazy ones, because they change things. They have no respect for the status quo. So let's stop throwing away those devices, appliances, and automobiles, and let's start by fixing some things.
Unsung Heroes of Tech: Back in the late 1970s you wouldn't have guessed that this shy young Cambridge maths student named Wilson would be the seed for what has now become the hottest-selling microprocessor in the world.
This is an amazing story of how a small computer company in Britain was able to jump into the chip design business and accidentally create a new paradigm in low-power chips. Astounding what seemingly small groups can come up with, complete product categories unto themselves. The BBC Micro was the single most important project that kept the company going; it was produced as a learning aid for the BBC television show The Computer Programme, part of the BBC Computer Literacy Project. From that humble beginning of making the BBC Micro, Furber and Wilson's ability to engineer a complete computer was well demonstrated.
But whereas the BBC Micro used an off-the-shelf MOS 6502 CPU, a later computer used a custom (bespoke) chip designed in house by Wilson and Furber. This is the vaunted Acorn RISC Machine (ARM) used in the Archimedes desktop computer. And that one chip helped launch a revolution unto itself: the very first time they powered up a sample chip, the multimeter hooked up to it registered no power draw. At first one would think this was a flaw and ask, “What the heck is happening here?” But further inspection showed the multimeter was correct; the engineers discovered that the whole CPU was running off power leaking from the logic circuits within the chip itself. Yes, that first sample of the ARM CPU in 1985 ran on a tenth of a watt of electricity. And that ‘bug’ then went on to become a feature in later generations of the ARM architecture.
Today we know the ARM CPU cores as a bit of licensed Intellectual Property that any chip maker can acquire and implement in their mobile processor designs. It has come to dominate mobile designs from manufacturers as diverse as Qualcomm and Apple Inc. But none of it would have happened were it not for that somewhat surprising discovery of how power-efficient that first sample chip really was when it was plugged into a development board. So thank you, Sophie Wilson and Steve Furber: the designers and engineers of today are able to stand upon your shoulders the way you once stood on the shoulders of the people who designed the MOS 6502.
MOS 6502 microprocessor in a dual in-line package, an extremely popular 8-bit design (Photo credit: Wikipedia)
Sebastian Thrun, Associate Professor of Computer Science at Stanford University. (Photo credit: Wikipedia)
Google X (formerly Labs) founder Sebastian Thrun debuted a real-world use of his latest endeavor, Project Glass, during an interview on the syndicated Charlie Rose show which aired yesterday, taking a picture of the host and then posting it to Google+, the company's social network. Thrun appeared to be able to take the picture through tapping the unit, and posting it online via a pair of nods, though the project is still at the prototype stage at this point.
You may remember Sebastian Thrun the way I do. He was spotlighted a few times on the PBS TV series NOVA in their coverage of the DARPA Grand Challenge follow-up competition in 2005. That was the year Carnegie Mellon University battled Stanford University in a race of driverless vehicles in the desert. The previous year CMU had been the favorite to win, but their vehicle didn't finish the race. By the following year's competition, the stakes were much higher. Stanford started its effort in the summer of 2004, just months after the March Grand Challenge race, and by October 2005 the second race was held, with CMU and Stanford battling it out. Sebastian Thrun was the head of the Stanford team, and had previously been at CMU as a colleague of the Carnegie race team head, Red Whittaker. In 2001 Thrun took a sabbatical year from CMU and spent it at Stanford. Eventually Thrun left Carnegie Mellon altogether and moved to Stanford in July 2003.
Thrun also took a graduate student of his and Red Whittaker's with him to Stanford, Michael Montemerlo. That combination of CMU experience, plus a grad student to boot, helped accelerate the pace at which Stanley, the driverless vehicle, was developed to compete in October of 2005. Now move forward to another academic sabbatical, this time from Stanford to Google Inc. Thrun took a group of students with him to work on Google Street View, and eventually this led to another driverless car, funded completely internally by Google. Thrun's accomplishments have continued to accrue at regular intervals, so much so that he has now given up his tenure at Stanford to join Google as a kind of entrepreneurial research scientist, helping head up the Google X Labs. X Labs is a kind of internal skunkworks that Google funds to work on various and sundry technologies, including the Google Driverless Car. Add to this Sebastian Thrun's other big announcement this year of an open education initiative titled Udacity (attempting to ‘change’ the paradigm of college education). The list, as you can see, goes on and on.
So where does that put the Google Project Glass experiment? Sergey Brin showed off a prototype of the system at a party very recently, and now Sebastian Thrun has shown it off as well. Google Project Glass is a prototype, as most online websites have reported. Sebastian Thrun's interview on Charlie Rose attempted to demo what the prototype is able to do today. According to the article quoted at the top of this blogpost, Google Glass can respond to gestures and voice (though voice was not demonstrated). Questions still remain as to what is included in the package to make it all work. Yes, the glasses do appear ‘self-contained’, but a wireless connection (as pointed out by Mashable.com) would not be visible to anyone not specifically shown all the components that make it go. That little bit of visual indirection (like a magician's) would lead one to believe that everything resides in the glasses themselves. Well, so much the better then for Google to let everyone draw their own conclusions. As for the concept video of Google Glass, I'm still not convinced it's the best way to interact with a device:
As the video shows, it's more centered on voice interaction, very much like Apple's own Siri technology. And that, as you know, requires two things:
1. A specific iPhone that has a noise-cancelling microphone array
2. A broadband cellphone connection back to the Apple mothership data center in North Carolina to do the Speech-to-Text recognition and responses
So while the glasses will certainly appear self-contained to an untrained observer, doing the heavy lifting shown in the concept video is going to require the Google Glasses plus two additional items:
1. A specific Android phone with the Google Glass spec’d microphone array and ARM chip inside
2. A broadband cellphone connection back to the Google motherships, wherever they may be, to do some amount of off-phone processing and, obviously, data retrieval for all the Google Apps included.
It would be interesting to know what passes over the personal area network between the Google Glasses and the cellphone data uplink that a real set of glasses is going to require. The devil is in those details, and they will be the limiting factor on how inexpensively this product could be manufactured and sold.
Thomas Hawk’s photo of Sergey Brin wearing Google Glasses
My first blogging platform was Dave Winer’s Radio UserLand. One of Dave’s mantras was: “Own your words.” As the blogosphere became a conversational medium, I saw what that could mean. Radio UserLand did not, at first, support comments. That turned out to be a constraint well worth embracing. When conversation emerged, as it inevitably will in any system of communication, it was a cross-blog affair. I’d quote something from your blog on mine, and discuss it. You’d notice, and perhaps write something on your blog referring back to mine.
I would love to be able to comment on an article or a blog entry by passing a link to a post within my own WordPress instance on WordPress.com. However, rendering that ‘feed’ back into the comments section on the originating article/blog page doesn't seem to be common. At best I could drop a permalink into the comments section so people might be tempted to follow the link to my blog. But it's kind of unfair to force an unsuspecting reader to jump, and in a sense redirect, to another website just to follow a commentary. So I fully agree there needs to be a pub/sub-style way of passing my blog entry by reference back into the comments section of the originating article/blog. Better yet, that would give me some ability to amend and edit my poor choice of words the first time I publish a response. Too often silly mistakes get preserved in the ‘amber’ of the comments fields in the back-end MySQL databases of the content management systems housing many online web magazines. So there's plenty of room for improvement, and I think RSS could easily embrace and extend this style of commenting if someone were driven to develop it.
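Along those lines, here is a rough sketch of how an originating site might pull a response in by reference, assuming it knows (or is told, via something like a trackback form) which feed to watch. The feed URL, article URL and storage details are all hypothetical; only the feedparser library is a real, existing Python package.

```python
# A rough sketch of comment-by-reference over RSS, in the spirit described
# above. The feed URL, article URL and storage details are all hypothetical;
# only the feedparser library (pip install feedparser) is real. A CMS could
# re-poll the feed so later edits to my post replace the stale excerpt.

import feedparser

ARTICLE_URL = "https://example.com/2012/04/some-article/"   # hypothetical
RESPONDER_FEED = "https://myblog.wordpress.com/feed/"        # hypothetical

def find_responses(feed_url: str, article_url: str):
    """Return entries from the responder's feed that link back to the article."""
    feed = feedparser.parse(feed_url)
    responses = []
    for entry in feed.entries:
        texts = [entry.get("title", ""), entry.get("summary", "")]
        texts += [link.get("href", "") for link in entry.get("links", [])]
        if any(article_url in text for text in texts):
            responses.append({
                "title": entry.get("title", ""),
                "permalink": entry.get("link", ""),
                "excerpt": entry.get("summary", "")[:280],
            })
    return responses

# The originating site would render these as comments that point back to the
# canonical post on my blog, rather than freezing my words in its own database.
for response in find_responses(RESPONDER_FEED, ARTICLE_URL):
    print(response["title"], "->", response["permalink"])
```

The design choice that matters here is that the comment always points back to the canonical post, so my words stay mine to edit; the host site only ever caches an excerpt.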