Clippit asking if the user needs help (Photo credit: Wikipedia)
Agreed. I think insofar as a computer AI can watch what we’re doing and step in to prompt us with some questions, THAT will be the killer app. It won’t be Clippy the assistant from MS Word, but a friendly prompt saying, “I just watched you do something 3 times in a row, would you like some help doing a bunch of them without having to go through the steps yourself?” That way the offer of assistance is timely and non-threatening. You won’t have to turn on a macro recorder to tell the computer what you want to do and let it see the steps; it (the computer) will have already recognized you are doing a repetitive task it can automate. And as Jon points out, it’s just a matter of successive approximations until you get the slam-dunk series of steps that gets the heavy lifting done. Then the human can address the exceptions list: the 20-50 examples that didn’t work quite right or that the AI felt diverged from the pattern. That exception list is what the human should really be working on, not the 1,000 self-similar items that can be handled with the assistance of an AI.
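To make that concrete, here’s a toy sketch (mine, not any real assistant’s API) of the “I just watched you do something 3 times in a row” moment: scan the tail of an action log for the shortest sub-sequence repeated three times, then offer to take over the rest.

```python
# Toy illustration: detect a user action sequence repeated three times in a
# row at the end of an event log, then prompt with an offer to automate.
def find_repeated_pattern(actions, min_repeats=3, max_len=5):
    """Return the shortest action sub-sequence repeated min_repeats times
    at the tail of the log, or None if the user isn't looping yet."""
    for size in range(1, max_len + 1):
        window = actions[-size * min_repeats:]
        if len(window) < size * min_repeats:
            break
        chunks = [tuple(window[i:i + size]) for i in range(0, len(window), size)]
        if len(set(chunks)) == 1:          # every chunk identical -> a loop
            return chunks[0]
    return None

log = ["copy", "paste", "format", "copy", "paste", "format",
       "copy", "paste", "format"]
pattern = find_repeated_pattern(log)
if pattern:
    print(f"I noticed you repeating {pattern} -- want help doing the rest?")
```

The items the pattern handles cleanly get automated; anything that diverges goes on the exceptions list for the human.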
My recent post about redirecting a page of broken links weaves together two different ideas. First, that the titles of the articles on that page of broken links can be used as search terms in alternate links that lead people to those articles’ new locations. Second, that non-programmers can create macros to transform the original links into alternate search-driven links.
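As a minimal sketch of that second idea (the destination site and search URL below are illustrative placeholders, not the actual ones from the post), the macro’s core could look something like this in Python:

```python
# Sketch: turn a dead link's article title into a search-driven alternate
# link. The search pattern and target site are illustrative placeholders.
from urllib.parse import quote_plus

def search_link(title, site="example.org"):
    # Search the destination site for the article title instead of
    # pointing at the (now broken) original URL.
    return f"https://www.google.com/search?q={quote_plus(title)}+site:{site}"

print(search_link("Redirecting a page of broken links"))
```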
There was lots of useful feedback on the first idea. As Herbert Van de Sompel and Michael Nelson pointed out, it was a really bad idea to discard the original URLs, which retain value as lookup keys into one or more web archives. Alan Levine showed how to do that with the Wayback Machine. That method, however, leads the user to sets of snapshots that don’t consistently mirror the original article, because (I think) Wayback’s captures happened both before and after the breakage.
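Keeping the original URL as a lookup key can be done against the Wayback Machine’s public availability API. Here’s a minimal sketch (not necessarily the method Alan Levine showed); the timestamp parameter asks for the snapshot closest to a given date, which might help steer around captures made after the breakage:

```python
# Look up an archived snapshot of a broken URL via the Wayback Machine's
# "availability" API, asking for the capture closest to a chosen date.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

def wayback_snapshot(url, near="20120101"):
    query = urlencode({"url": url, "timestamp": near})
    with urlopen(f"https://archive.org/wayback/available?{query}") as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap else None

print(wayback_snapshot("http://example.com/old-article"))
```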
Back in the day, MIT and WGBH produced films together. Imagine that. Well, imagine no longer. This was an actual production done at NASA facilities in Virginia and California surveying research on Supersonic Transports (SST) from 1966, a few years before Robert McNamara shut it all down to save money to spend on the War in Vietnam.
I too am a big believer in doing some amount of testing when the opportunity comes along. Most recently I had to crunch down some video to smaller file sizes. I decided to use Handbrake, as that’s the hammer I use for every nail. And in the time since I first started using it, a number of options have cropped up, all surrounding the use of the open source x264 encoding libraries. There are now more command-line tweaks and options than you could ever imagine. Thankfully the maintainers of Handbrake have simplified some of the settings through the GUI-based version of the software. Now, I wasn’t going for quality but for file size, and I got it using the “constant quality” output option as opposed to my classic fave, “constant bitrate”. Let’s just say that after a few hours of doing different tweaks on the same file, I got bit rates way down without using constant bitrate. And it seems to work no matter what the content is (static shots or fast-moving action). So kudos to Spreadys for giving a head-to-head comparison. Much appreciated.
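For anyone wanting to replicate the comparison, HandBrakeCLI (the command-line cousin of the GUI) exposes the same two modes. The flags below are the ones I believe map to constant quality (-q, an RF value) versus average bitrate (-b, in kbps), but check `HandBrakeCLI --help` on your build before trusting them:

```python
# Rough sketch of a constant-quality vs. average-bitrate comparison using
# HandBrakeCLI with the x264 encoder. Flag names assumed per the note above.
import subprocess

def encode_constant_quality(src, dst, rf=22):
    # Lower RF means higher quality and a bigger file; ~20-23 is common.
    subprocess.run(["HandBrakeCLI", "-i", src, "-o", dst,
                    "-e", "x264", "-q", str(rf)], check=True)

def encode_average_bitrate(src, dst, kbps=1500):
    subprocess.run(["HandBrakeCLI", "-i", src, "-o", dst,
                    "-e", "x264", "-b", str(kbps)], check=True)

encode_constant_quality("input.mov", "cq_rf22.mp4")
encode_average_bitrate("input.mov", "abr_1500k.mp4")
```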
After another long phone call, I decided to repeat a test previously conducted 3 years ago.
The conversation centered on a small experiment on transcoders and players. It highlighted an issue: any documented process must include what software was used, the settings, and then a comparison of the results. It originally proved that just specifying a player type and/or container format was useless, as the video file itself could have been created in a million different ways. I was asked, “Would the same issues happen today?” With updated software and higher-spec PCs, would issues still arise? I said yes… but then thought I had better check!
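One easy way to document “what software was used and the settings” is to record the actual stream parameters of each file rather than just its container or the player it happened to open in. A minimal sketch using ffprobe (part of FFmpeg), which is not what the original test used but illustrates the point:

```python
# Pull the real encoding details out of a video file with ffprobe, so a
# documented process captures codec and resolution, not just the container.
import json
import subprocess

def describe(path):
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True).stdout
    info = json.loads(out)
    video = next(s for s in info["streams"] if s["codec_type"] == "video")
    return {"container": info["format"]["format_name"],
            "codec": video["codec_name"],
            "resolution": f"{video['width']}x{video['height']}"}

print(describe("exhibit_original.avi"))
```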
Disclaimer!! – This is by no means scientific. I have replicated the real world and not dug too deep into encoding parameters or software settings, and the PC used is mid-range. I have posted this information in order to…
That’s right, the fault, dear reader, is not in our stars but in ourselves. We have slow internet speeds, ‘Cuz Business. That’s the briefest synopsis yet that I’ve written. Whether it’s carriers allowing each other’s traffic to run across their networks, or peering arrangements, or whatever, each business is trying to mess with the other guy’s traffic. And the consumers, the customers, all lose as a result.
I’m always fascinated by these one-off, one-of-a-kind clustered systems like this Raspberry Pi rig. Kudos for doing the assembly and getting it all running. As the comments mention, it may not be practical in terms of price. But still, it’s pretty cool for what it is.
Photoshop is the only application from Adobe’s suite that’s getting the streaming treatment so far, but the company says it plans to offer other applications via the same tech soon. That doesn’t mean it’s planning to phase out its on-premise applications, though.
Back in 1997 and 1998 I spent a lot of time experimenting and playing with Netscape Communicator “Gold”. It had a built-in web page editor that more or less gave you WYSIWYG rendering of the HTML elements live as you edited. It also had an email client and news reader built in. I also spent a lot of time reading Netscape white papers on their Netscape Communications server and LDAP server, and this whole universe of Netscape trying to re-engineer desktop computing in such a way that the Web Browser was the THING. Instead of a desktop with apps, you had some app-like behavior resident in the web browser. And from there you would develop your JavaScript/ECMAScript web applications that did other useful things. Web pages with links in them could take the place of PowerPoint. Netscape Communicator Gold would take the place of Word and Outlook. This is the triumvirate that Google would assail some 10 years later with its own Google Apps and the benefit of AJAX-based web app interfaces and programming.
Turn now to this announcement by Adobe and Google in a joint effort to “stream” Photoshop through a web browser. A long-time stalwart of desktop computing, Adobe Photoshop (prior to being bundled with EVERYTHING else) required a real computer in the early days (ahem, meaning a Macintosh) and has continued to do so even more (as the article points out) since CS4 attempted to use the GPU as an accelerator for the application. I used to keep up with new releases of the software each passing year, but around 1998 I feel like I stopped learning new features, and my “experience” more or less cemented itself in the pre-CS era (let’s call that Photoshop 7.0). Since then I do 3-5 things at most in Photoshop, ever. I scan. I layer things with text. I color balance things or adjust exposures. I apply a filter (usually unsharp mask). I save to a multitude of file formats. That’s it!
Given that there’s even a possibility of streaming Photoshop on a Google Chromebook-based device, I think we’ve now hit that which Netscape had discovered long ago: the web browser is the desktop, pure and simple. It was bound to happen, especially now with the erosion into different form factors and mobile OSes. iOS and Android have shown that what we are willing to call an “app” is, most times, nothing more than a glorified link to a web page, really. So if they can manage to wire up enough of the codebase of Photoshop to make it work in real time through a web browser without tons and tons of plug-ins and client-side JavaScript, I say all the better. Because this means, architecturally speaking, good old Outlook Web Access (OWA) can only get better and become more like its desktop cousin, Outlook 2013. Microsoft too is eroding the distinction between Desktop and Mobile. It’s all just a matter of more time passing.
The software ecosystem for ARM servers “is still shaky, there needs to be a lot more software development going on and it will take time,” says Gwennap.
Previous generations of multi-core, massively parallel, ARM-based servers were one-off designs from manufacturers with their own toolsets and Linux distros. HP’s attempt to really market to this segment will hopefully be substantial enough to get an Ubuntu distro with enough libraries and packages to make it function right out of the box. The article says companies are using the ProLiant ARM-based system as a memcached server. I would speculate that if that’s what people want, the easier you can make that happen from an OS and app server standpoint, the better. There’s a reason folks like to buy Synology and BuffaloTech NAS products, and that’s the ease with which you spin them up and get a lot of storage attached in a short amount of time. If the ProLiant can do that for people needing quicker and more predictable page loads on their web apps, then HP should optimize it for memcached performance and make it easy to configure and put into production.
Now what, you may ask, is memcached? If you’re running a web server or a web application that requires a lot of speed, so that purchases or other transactions complete and show some visual cue that they were successful, the easiest way to do that is through caching. The web page contents are kept in a high-speed storage location (memory, in memcached’s case) separate from the slower back end that actually builds the page, and when required the application redirects to, or points at, the copy sitting over in that high-speed location. By serving the high-speed copy instead of rebuilding everything from the slower store, you get a really good experience, with the web page refreshing automagically, showing your purchases in a shopping cart or that your tax refund is on its way. The web site world is built on caching so we don’t see spinning watches or other indications that processing is going on in the background.
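The pattern looks roughly like this with one of the Python memcached clients (pymemcache); the render_page_from_database helper is hypothetical, standing in for whatever slow path actually builds the page:

```python
# Cache-aside pattern against a memcached server: check the cache first,
# fall back to the slow path on a miss, then store the result with a TTL.
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))

def get_page(page_id):
    key = f"page:{page_id}"
    html = cache.get(key)
    if html is None:
        html = render_page_from_database(page_id)  # hypothetical slow path
        cache.set(key, html, expire=300)           # keep it hot for 5 minutes
    return html
```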
To date, different software packages have done this type of caching, first for Apache web servers, but now in the world of social media it’s being done for any type of web server. Whether it’s Amazon, Google or Facebook, memcached or a similar caching server is sending you that actual web page as you click, submit and wait for the page to refresh. And if a data center owner like Amazon, Google or Facebook can lower the cost of each of its memcached servers, they can lower their operating cost for each of these cached web pages and keep everyone happy with the speed of their respective websites. Whether or not ARM-based servers see wider application depends on apps being written specifically for that chip architecture. But at least now people can point to memcached and web page acceleration as a big first win that might see wider adoption longer term.
Hats off and kudos to Consumer Reports for getting on this story as soon as they could. Measurement trumps anecdotes any and all days of the week. Here now is some data and measurement regarding the bendy iPhone 6 and 6 Plus.
Interesting to finally see this form factor hit the market. These cards now are as big as or bigger than the typical laptop hard drive. That’s a big deal, in that any computer fortified with an SDXC card slot can have a flash-based backup store. I keep my Outlook mail archives on a drive like this. And occasionally I use it to transfer files the way I would with a reliable USB flash drive. And this is on a laptop that already has an SSD, so I’ve got 2 tiers of this kind of storage. We’re reaching a kind of singularity in flash-based storage where the chips and packaging are allowing for such small form factors that hard drives become moot. If I can stuff something this small into a slot roughly the size of a U.S. postage stamp, then why do I need a SATA or even an M.2-sized interface? Is it just for the sake of throughput and performance? That may be the only real argument.
I hope Oculus can get a shipping product out on the market soon. Perfectionism is not helping launch this market. The longer they wait, the more chance there’ll be a cheaper, equally capable competitor. Pleez, Oculus, release the Rift.