I’ve seen some claims that newer SSDs coming out are implementing the SATA TRIM command. This development heralds a new era in SSD performance, something we have all wished for since the introduction of SSDs back in 2005. In the last 4 years, performance gains have usually been obtained by using RAID controllers within the SSDs. Worse yet, some SATA disk controllers on SSDs were known to be total dogs when it comes to performance. Enter the hero of our story: Indilinx.
Indilinx decided, after multiple requests, to enter the market and show that SSDs are worthy of some real product development. Patriot is one of the first manufacturers to adopt the Indilinx disk controller. Given Microsoft’s recent announcements of full OS support for the SATA TRIM command, and now the Indilinx controller,…
One can only hope that Windows 7 will allow SSDs to finally equal or surpass their HDD counterparts. Fingers crossed, hoping Indilinx takes the market by storm and Microsoft will fully embrace and improve its support for the TRIM command.
Despite the huge performance gains, two major things plague SSDs:
I am a fan of David Lynch. I saw the movie Blue Velvet once, on MTV of all places. It wasn’t shown in its entirety, but it did have all the adult content. It was more frightening than any horror movie I had seen before or since, because its horror is so palpable. It is as real as me sitting here typing, or getting up to go to the bathroom, or driving to work. Its plainness and realness are what raise my level of paranoia 100 percent.
As a person David Lynch seems very mild; he’s pretty happy generally and kind of nostalgic. He put up a website some years back to allow fans to contribute to his causes. Meditation is a big deal for him, and he’s trying to set up a large-scale school for teaching transcendental meditation. So it’s always a shock, or slightly unsettling, to see him speak about something he hates. David Lynch hates product placement, and he knows that watching a movie on a telephone is much worse than seeing it on a big movie screen. I hadn’t really thought about David Lynch very recently. But a link to an NPR website reviewing new music from the artist Moby caught my attention.
Moby
NPR.org, June 15, 2009 – Moby has just made his best record in 10 years — at least I think so. The new record by the DJ, singer, bassist, keyboardist, guitarist and all-around renaissance man, Wait for Me, is filled with beauty, sadness and celebration.
Moby said in an interview that he was inspired by a talk David Lynch gave for BAFTA’s David Lean Lecture series. Moby felt Lynch was saying that being creative is more important than the market for the work being created. That led me to find the original video and transcript:
“Everybody probably knows that success is just as dangerous as failure, maybe more. You second guess yourself from then on because you’re afraid to fall. Failure? Terrible at first but then, oh man, total freedom. There is nowhere to go but up, and it’s a very good thing.”
Moby asked David Lynch to make a video for one of the tracks. Here’s the link to the video on Pitchfork:
So given this interesting combination of thoughts, ideas and inspiration, all I can say is I’m so happy the web allows people to find those little seeds that start big fires burning. Lynch is right. Creativity is the thing. Or, as Lynch likes to say, the little fish are what allow you to catch the really deep, abstract big fish. I too have received inspiration from finding the original album posted on NPR.org. I listened to the whole thing all the way through rather than a track at a time. Moby designed this to be an old-style ‘album’ experience, and he handcrafted it; it’s a very personal work. I like it. I like it a lot. It’s fantastic. Run out and buy it, or download it or something. Do it. Do it now!
I was raised on the most successful initiatives from Public Television, or ETV as it was previously known (E standing for Educational of course). Sesame Street, 3-2-1 Contact, Mister Rogers’ Neighborhood, and Reading Rainbow were my bread and butter as a kid.
I couldn’t agree more; I too grew up with Educational Television as a child. In fact, in the northeast corner of South Dakota there was a huge transmitter just outside our little town. It was a PBS tower, and sometimes that was the only station we could get. In between days at school and dinner time I watched re-runs of Gilligan’s Island or old Hanna-Barbera cartoons on Captain 11 on KELO-TV. Those were the days. I used to thoroughly hate the adult shows my parents watched, like Masterpiece Theatre. They must have seen every episode of Upstairs, Downstairs three times. But then I too loved watching repeats of Mister Rogers’ Neighborhood and Sesame Street. I was one of the chief beneficiaries of Newton Minow’s speech to the National Association of Broadcasters back in 1961. For me television might have been a vast wasteland, but there were some bright shining spots along the way.
A few weeks ago, my laptop suffered a fall onto linoleum that made its congenitally nervous hard drive more nervous even than usual. Fortunately, days later, the drive turned miraculously tranquil, efficient. Its anxieties disappeared, as if by magic. There was no freezing or whirring. I wrote some e-mail messages, surfed the Web and organized some photos before shutting things down.
There is no sadder admission by someone who considers himself a competent IT professional than to say, “I resemble that.” I too suffered a hard drive mishap, caused by my own ignorance, while upgrading my Mac from OS X 10.3 to OS X 10.4. The problem lay in an article I read on a Mac enthusiast website indicating there was a new user account migration utility built into the new installer on 10.4. So rather than run the Archive and Install option, which would leave the old operating system and all its files in place, I chose Erase and Install. Why? My misreading of the article led me to believe I could Erase and Install and then watch the user migration utility magically launch itself. It would pull over my user folder and all the applications originally installed on the machine, leaving me with much less work to do once the OS was installed. Past experience proved that reinstalling all your old software takes forever, and I was trying to avoid that.
The key to this new way of thinking is Migration Assistant (the same tool that Apple provides to facilitate moving files from an old Mac to a new one). You don’t have to run this program separately; all its capabilities are integrated into Setup Assistant under the auspices of “File Transfer.”
So you can imagine my horror as the Erase and Install progressed and the Migration Assistant never popped up asking me what I wanted to do. By then it was too late. The erasure was already wiping the drive, or at least setting the flags on all the files so that they appeared to be open, write-enabled sectors on the hard drive. And I didn’t have a full backup of the drive contents before the install. That was my biggest mistake, considering I’m now very familiar with disk cloning. I too have learned the hard lessons of self-inflicted hard drive mishaps. You should take heed of all these warnings too. Put down that iPhone, turn off that TV, get on Amazon, and buy yourself an external hard drive and backup, backup, backup.
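Since I’m preaching backup, backup, backup, here’s a minimal sketch of the habit in Python, using only the standard library. The `backup` helper and its paths are hypothetical names of my own, and a plain copy like this is no substitute for a real bootable clone, but it’s the difference between my Erase and Install disaster and a mild inconvenience.

```python
import shutil
import time
from pathlib import Path


def backup(source: str, dest_root: str) -> Path:
    """Copy `source` into a fresh timestamped folder under `dest_root`.

    A crude stand-in for a real cloning tool: it won't produce a
    bootable clone, but it beats having no backup at all when the
    installer starts eating the disk.
    """
    stamp = time.strftime("%Y%m%d-%H%M%S")
    target = Path(dest_root) / f"backup-{stamp}"
    shutil.copytree(source, target)
    return target
```

Point it at your home folder and an external drive, put it on a schedule, and at worst you lose a day’s work instead of everything.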
Did anyone watch the demo video from Google Australia? A number of key members from Google Maps set out to address the task of communication and collaboration. Lars and Jens Rasmussen decided that now that Google Maps is a killer, mash-up-enabled web app, it’s time to design the Next Big Thing. Enter Google Wave: the be-all, end-all, paradigm-shifting cloud application of all time. It combines all the breathless handwaving and fits of pique that Web 2.0 encompassed 5 years ago. I consider Web 2.0 to have really started in the summer of 2004, with some blogging and podcasting efforts going on and slow but widespread adoption of RSS publishing and subscribing. So first I’ll give you the big link to the video of the demo by Lars Rasmussen and company:
It is 90 minutes long, full of every little UI tweak and webapp nicety, along with rip-roaring collaboration examples and “possible uses” for Google Wave. If you cannot or will not watch a 90-minute video, just let me say pictures do speak louder than words; I would have to write a 1,000-page manual to describe everything that’s included in Google Wave. So let’s start the list of what Google Wave is ‘like’.
It’s like email: you can send and receive messages with a desktop software client. It’s like chat: you can chat live with anyone who is also on Google Wave. It’s like webmail: you can also run it without a client and see the same data store. It’s like social bookmarking: you find something, you copy it, you keep it, you annotate it, you share it. It’s like picture-sharing websites: you take a picture, you upload it, you annotate it, you tag it, you share it. It’s like video-sharing websites: same as before: upload, annotate, tag, share. It’s like WebEx: you give a presentation, everyone can see the desktop presentation as you give it and comment on it through a chat back-channel. It’s like SharePoint: you can check documents in and out, revise them, see the revisions, and share them with others. It’s like a word processor: it has live spell checking as you type, and it can even translate into other languages on the fly. It’s like all those Web 2.0 mash-ups: you take parts from one webapp and combine them with another, so you can have Twitter embedded within your Google Waves. There are no documents as such, only text streams associated with authors, editors, recipients, etc. You create waves, you share waves, you store waves, you edit waves, you embed waves, you mash up waves. One really compelling example given toward the end is using Waves as something like a content management system, where multiple authors work on, comment on, and revise a single text document (a wave) and then collapse it down into a single new revision that gets shared out, until a fully edited document is the final product. Whether that is a software spec, a user manual or a website article doesn’t matter; the collaboration mechanism is the same.
So that’s the gratuitous list of what I think Google Wave is. There is some question as to whether Gmail and Google Docs & Spreadsheets will go away in favor of this new protocol and architecture. Management at Google has indicated that is not the case, but that the current Google suite will adopt Google Wave-like functionality. I think the collaboration capability would pump up the volume on the cloud-based software suite. Microsoft will have to address something like this being made freely available, or even leasable for private business the way Gmail is today. And thinking even farther ahead for universities using course management systems today,… a lot of the functionality in Google Wave duplicates 90% of the pay-for, fully licensed course management software. Any university already using Gmail for student email and wanting to dip its toes into course management systems should consider Google Wave as a possibility. Better yet, any company that repackages and leverages Google Wave in a new course management system would likely compete very heavily with the likes of Microsoft/Blackboard.
About a year ago I wrote an article about nVidia’s attempt to use its video graphics cards to accelerate transcoding. H.264 was fast becoming the gold standard for desktop video, for video sharing through social networking websites, and for viewing on handheld devices. In the time since then, Badaboom entered the market and has gone through a revision of its original GPU-accelerated transcoding software. Apple is now touting OpenCL as the API through which any software can tap all those graphics pipelines to accelerate parallel operations off the CPU. nVidia is supporting OpenCL whole hog, and I think there is some hope Microsoft won’t try to undermine it too much, though it stands firm behind DirectX as the preferred API for anything that talks to a graphics card for any reason.
So where does AMD, with its ATI cards, fit into the universe of GPU-accelerated software? According to AnandTech, it doesn’t fit in at all. Its first attempts at GPU-accelerated transcoding have proved a big fail, with Badaboom outclassing it at every turn in the transcoded video it produces. Hopefully OpenCL can be abstracted enough to cover both AMD’s and nVidia’s product offerings with a single unified interface, allowing acceleration to happen much more easily as a citizen of the OS. Talking directly to the metal is only going to cause headaches down the road as OSes are updated and drivers change. But even with that level of support, it looks like AMD hasn’t quite got the hang of this yet. Hopefully they can spare a few engineers and a few clock cycles, take Avivo out of the alpha prototype stage, and show off what they can do. The biggest disappointment of all is that even the commercial transcoder from CyberLink using the ATI card didn’t match up to Badaboom on nVidia.
A few months ago, we tested AMD’s AVIVO Video Converter. AMD had just enabled video transcode acceleration on the GPU, and they wanted to position their free utility as competition to CUDA enabled (and thus NVIDIA only) Badaboom. Certainly, for a free utility, we would not expect the same level of compatibility and quality as we would from a commercial application like Badaboom. But what we saw really didn’t even deliver what we would expect even from a free application.
Did you know that Wikipedia recently banned editing of articles on the Church of Scientology? This reminded me of a project where Jon Udell showed an animation of the edits made to a Wikipedia page. Only through animating and visualizing the process does one really understand what happens to a Wikipedia article over time. Each bit of phrasing, verbiage and linking goes back and forth, with paragraphs and sentences disappearing and then reappearing. We don’t think of editing words as inherently visual. Compared to film or music recording, writing prose or technical writing is a mental exercise, not a visual one. Yet when shown a compelling example like Jon Udell’s, we inherently just ‘get it’.
After that article was published and the wikiAnimate example coursed its way through the Internet, there hasn’t been much noticeable follow-up. Lots of good ideas are left to wither in the Internet Archive; I don’t see a lot of Slashdot activity on visualizing wiki edits. The biggest problem Jon points out with the original wikiAnimate solution is that it performed a round-trip HTTP GET for every step shown in the animation. That loads down the network far too much and hits Wikipedia with too many HTTP GET requests. Jon Udell, ever the vigilant writer/researcher, decided to revisit the original idea. Jon is the kind of pragmatist who readily adapts what already exists, and he suggests a couple of ways existing projects could be adapted to visualizing changes in text as it is written.
The Wave toolkit from Google is one example. Google Wave has the ability to “play back” conversations over a period of time; maybe that playback feature could be reused by an enterprising developer using the Wave APIs. Another possible solution Jon Udell gives is FeedSync, which is implemented in the Windows Live web service. My assumption is there is some kind of flight-recorder-like ability to track each step and then play it back. I don’t write or develop software; I barely do scripting. However, Jon Udell is big on prototyping and showing full examples, like how a social bookmarking service such as del.icio.us could be adapted to aggregating community calendars and transforming their contents into multiple output formats for re-consumption. And he’s willing to write just enough middleware and glue code to make it work. It’s a kind of rampant re-use-ism. I would characterize the philosophy like this: there are enough good ideas and products out there; one must only decompose the problem to the point where you see the pattern fit an existing solution. That’s the true genius of a guy like Jon Udell.
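To make the idea concrete, here’s a hedged sketch (my own, not Jon Udell’s code) of the core step a wikiAnimate-style tool needs: classifying each word of two successive revisions as kept, deleted, or added, using Python’s standard difflib. Fetch the revision history once in bulk up front, then run this per pair of revisions, so there’s no round trip per animation frame.

```python
import difflib


def revision_delta(old: str, new: str) -> list:
    """Compare two revisions of a text word by word.

    Returns a list of (op, word) pairs, where op is 'keep', 'del',
    or 'add'. This is the raw material an animation would render
    frame by frame: words fading out, words fading in.
    """
    old_words, new_words = old.split(), new.split()
    ops = []
    matcher = difflib.SequenceMatcher(None, old_words, new_words)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "equal":
            ops += [("keep", w) for w in old_words[i1:i2]]
        if tag in ("delete", "replace"):
            ops += [("del", w) for w in old_words[i1:i2]]
        if tag in ("insert", "replace"):
            ops += [("add", w) for w in new_words[j1:j2]]
    return ops
```

A driver would walk the full revision list pairwise and hand each delta to whatever is drawing the frames; the diffing itself is cheap enough to do for hundreds of revisions locally.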
In the netbook manufacturing and product development industry, the next big thing is always Intel’s latest rev of the CPU and chipset. Cue the entry of the Pine Trail CPU and its partner I/O hub chip. Only this year has Intel shown a willingness to combine functions onto the same processor die. I am very interested to see that the CPU combines not just the memory controller, as is the case with the top-of-the-line Core i7 family, but the graphics as well. Talk about a weight reduction, right? The original chipset consisted of no fewer than three chips: a north bridge and a south bridge along with the CPU. Now, with the coming of Pine Trail, it’s one big CPU/GPU/memory-controller combo and a single I/O hub. I’m hoping the power consumption improves and comes much closer to the proposed specs of the Android-based netbooks that will use smartphone CPUs like Motorola’s or ARM-based system-on-chip custom CPUs. If Intel can combine functions and get battery life for a 3-cell unit to average 8+ hours under even heavy CPU loads, then it will have truly accomplished something. I’m looking forward to the first products to market using the Intel N450, but don’t expect to see them until after Christmas of this year, 2009.
The Intel Atom
It should use the technology behind Pineview and would be built on a new 45-nanometer design that merges the memory controller and graphics directly into the processor; accompanying it would be the new-generation Tiger Point chipset, which is needed for and takes advantage of the N450 design.
Remembering that the Intel Itanium was supposed to be a ground-breaking departure from the past, can Larrabee be all that and more for graphics? Itanium is still not what Intel had hoped, and poor early adopters are still buying vastly over-priced, minor incremental revs of the same CPU architecture to this day. Given the delays (2011 is now the release date) and its size (650 mm²), how is Intel ever going to make this project a success? It seems bound for the Big Fail heap of the future, as it bears uncanny resemblances to Itanium and the Intel i740 graphics architecture. The chip is far too big and the release date far too distant to keep up with developments at nVidia and AMD, who are not going to stand still waiting for the behemoth to release to manufacturing. I just don’t know how Larrabee is ever going to be successful. It took so long to release the i740 that the market for low-end graphics GPUs had eroded to the point where Intel could only sell it for a measly $35 per card, and even then no one bought it.
According to current known information, our source indicated that Larrabee may end up being quite a big chip–literally. In fact, we were informed that Larrabee may be close to 650mm square die, and to be produced at 45nm. “If those measurements are normalized to match Nvidia’s GT200 core, then Larrabee would be roughly 971mm squared,” said our source–hefty indeed. This is of course, an assumption that Intel will be producing Larrabee on a 45nm core.
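The normalization in that quote is just die area scaled by the square of the linear process ratio. The quoted 971 mm² works out if the target node is 55 nm, the process of the GT200b shrink rather than the original 65 nm GT200, so I’m assuming that’s what the source meant. A back-of-the-envelope sketch:

```python
def normalize_area(area_mm2: float, from_nm: float, to_nm: float) -> float:
    """First-order die-area scaling: area grows with the square of the
    linear process ratio. Ignores pads, analog blocks, and anything
    else that doesn't shrink with the node."""
    return area_mm2 * (to_nm / from_nm) ** 2


# 650 mm^2 at 45 nm, re-expressed at 55 nm (assumed GT200b process):
print(round(normalize_area(650, 45, 55)))  # prints 971
```

That is, 650 × (55/45)² ≈ 971, matching the source’s figure, which is what makes me think 55 nm was the comparison point.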
Futurists are all alike. You have your 20th-century types like the Italian Futurists, who celebrated war. You have Hitler’s architect, Albert Speer. You have guys like George Gilder hand-waving and making big pronouncements. And all of them use terms like ‘paradigm’ and ‘cusp’ as a warning to you slackers, trailers, and Luddite ne’er-do-wells. Make another entry in your list of predictions for Apple’s Worldwide Developers Conference (WWDC). Everyone feels Apple has to really top what it has achieved since last year with the iPhone, the iPhone OS, and the App Store. Mark Sigal, writing for O’Reilly Radar, believes there’s so much untapped juice within the iPhone that an update to the OS will become the next cusp/paradigm shift.
From today’s O’Reilly Radar article by Mark Sigal:
Flash forward to the present, and we are suddenly on the cusp of a game-changing event; one that I believe kicks the door open for 3D and VR apps to become mainstream. I am talking about the release of iPhone OS version 3.0.
I’m not so certain. One can argue that even the average desktop 3D accelerator doesn’t really do what Sigal would ‘like’ to see in the iPhone. Data overlays are nice for a 3D-glasses kind of application, sure, but that’s not virtual reality. It’s more like a glorified heads-up display, which the military has had going back to the Korean War. So enter me into the column of the hairy eyeball, critical and suspicious of claims that an OS update will change things. In fact, OSes don’t change things; the way people think about things is what changes things. The move of the World Wide Web from an information-sharing utility to a medium for commerce: that was a cusp/paradigm shift. And so it goes with the iPhone and the View-Master viewer. They’re fun, yes. But do they really make us change the way we think?