Blog

  • links for 2009-06-05

  • Suspenders and a Belt: Tips for backin’ up your Mac

    The Mac has Time Machine built right in

A co-worker has been working on a reporting tool to let a Mac user get reports from Time Machine whenever there’s a failure in the backup. Failure messages occasionally come up when Time Machine runs, but it never says what folder, what file, or really what kind of failure occurred. Which is not what you want if you are absolutely depending on the data being recoverable via Time Machine. It’s not bulletproof, and it will lull you into complacency once you have it up and running. I tend to agree that a belt-and-suspenders approach is best. I’ve read countless articles saying disk clones are the best, and on the other side, that incremental backups are more accurate (in terms of having the latest version of a file) and more efficient with disk space (no need to duplicate the system folder again, right?). With the cost of Western Digital My Books dropping all the time, you could purchase two separate USB2 My Books, use a disk cloning utility for one drive and Time Machine for the other. Then you would have a bulletproof backup scheme. One reader commented on the article that off-site backup is necessary as well, so include that as the third leg of your backup triad.

Since errors and failures can happen in any backup system, we recommend that if you have the available resources (namely, spare external hard drives) you set up dual, independent backups, and, in doing so, take advantage of more than one way of backing up your system. This will prevent any errors in a backup system from propagating to subsequent backups.

One strongly recommended solution that we advocate is to have both a snapshot-based system such as Time Machine and a bootable clone system created with a software package such as SuperDuper or Carbon Copy Cloner. Doing this will ensure you can both boot and access your most recently changed files in the event of either data loss or hardware failure.

    via MacFixIt

  • Consumer Reports: Sharpen your mower blade

    How to change a lawnmower blade

I own a very large metal file, and I use it to sharpen my lawnmower blade. Everything I’ve ever read about sharpening a mower blade indicated you should never use power equipment. By that I mean something that spins at a high rate of speed with a sharpening wheel attached. The reason given is always the same: the steel used in lawnmower blades is hardened along the edges to hold the sharpness longer. When you try to resharpen the blade using a high-speed spinning sharpening stone, the metal in the blade heats up. Sometimes it can heat up so much the metal changes color. When you see that color, you have effectively removed the hardening of the steel. It will now be just as soft as a wire coat hanger and won’t hold the sharp edge for very long. However, today I read in the Consumer Reports blog that they use a Dremel tool with a blade sharpening attachment. How is this different from your average cheap bench grinder? It’s hand-held, but other than that, does it heat up the blade any less? I also have one of the electric drill attachments they mention in the article. I don’t use that tool for the same reason I probably wouldn’t use the Dremel attachment: it’s going to heat up the blade as the sharpening progresses.

Another good sharpening option is Dremel’s Lawn Mower & Garden Tool Sharpener attachment, about $8. Peter Sawchuk, our outdoor-power-equipment expert, uses this attachment at our mower/tractor-testing site in Fort Myers, Florida, where we check out several dozen models every year. “I see value in the attachment for homeowners,” says Sawchuk, noting that the nylon guide holds the blade at the right angle for maximum sharpness. In Sawchuk’s experience, the only drawback to the attachment is that it can’t grind out major nicks. You can also get similar drill attachments for sharpening a mower blade. Properly clamping the blade in a stationary position and using two hands to guide the tool will help you get a uniformly sharp cutting edge.

from Consumer Reports

  • Qualcomm shows Eee PC running Android OS

    Asus Eee PC

    You wouldn’t think Asus could keep up the blistering pace of development. But here they are once again first to market with a device that uses less power, costs less for the CPU and uses a free OS from none other than Google Inc. What could be better for a bona fide addict of the Google Desktop? Check out the specs on the Snapdragon CPU here.

    If I purchase a netbook at all, this is going to be the one I want just for reading blogs and posting to WordPress. This IS my next computer as far as I’m concerned.

“The new laptop — which Qualcomm calls a smartbook — is thinner and lighter than current members of Asustek’s Eee PC netbook lineup because the 1GHz Snapdragon processor that it uses does not require a heat sink or a cooling fan, said Hank Robinson, vice president of business development at Qualcomm.

    Qualcomm’s Snapdragon includes a 1GHz Arm processor core, a 600MHz digital-signal processor and hardware video codecs. Currently, Asustek’s Eee PC line of netbooks relies on Intel processors, in particular the low-cost, low-power Atom chip, which has an x86 processor core.”

    via Good Gear Guide.

  • links for 2009-05-23

  • CSS Web Site Design Hands-On Training

    CSS Web Site Design by Eric Meyer

Specifically, what is it about CSS that drives me crazy? I am the most evil of ancient dead wood. I am an HTML table man. I got by using tables and got on with my life. I never questioned what I did. A co-worker began beating the drum of usability and web standards about 5 years ago and eliminated tables from his web designs and re-designs. I clung to tables every time somebody pitifully asked me, “Could you make a website for me?”. Usually it was no more than a two-cell table to enforce a two-column layout. Picture and text, each in their respective cells. Nothing complicated or out of control.

The day of reckoning is now: I am finally “learning” how to use CSS to enforce the old two-column web page I always got asked to make for people. That’s right, simplicity in its most essential form. But here’s the rub: you absolutely need to know what you are doing. You need a guide to help you navigate the idiosyncrasies and vagaries of the CSS box model and CSS layout algorithms. What’s that?!

Even at this late stage in the evolution of HTML and CSS (CSS2, XHTML, CSS3 and beyond), there’s no tag, no markup, no code that will let you express this simple idea:

    Create 2 columns (picture goes on left, text goes on right)

No, in fact what I’m learning from Eric Meyer’s book, written under the aegis of Lynda.com, is that you need to know how the CSS box model, including margins and padding, interacts to force the web page to render as you intend it to. While creating a DIV element is straightforward, and having DIVs sit comfortably next to one another (like those old two-cell tables) is easy enough, getting them to render correctly as the web browser grows and shrinks is tricky.

On page 131 of CSS Web Site Design, Eric Meyer states (pull quote indicating use of padding in the #content DIV and the negative margin for the #sidebar DIV). This kind of precision with rendering the DIVs is absolutely necessary to replicate the old HTML table formatting. But the workaround, or magic, or trick to it all is reading or learning from someone else the fact that a DIV’s width is determined by its stated width + the margin. If you set the margin to be negative (which is allowed), the browser effectively subtracts that amount from the stated width. When the two numbers add up to -1, the web browser ‘ignores’ the DIV and lets it do whatever it wants. In the example Eric Meyer uses, the sidebar DIV pops up to the same height as the content DIV. Previously, the sidebar had to sit all the way to the right, until the browser got too small, and then it would suddenly pop down below its neighbor. With a negative effective width, the browser doesn’t see the sidebar DIV and leaves it alone; it doesn’t pop down, it stays right where it is.

The other half of this equation is to leave generous amounts of padding in the content DIV. The giant field of padding allows the ignored sidebar DIV to occupy some nice open whitespace without stomping on any text in the content DIV. So: extra padding on the left-hand DIV, a negative margin on the right-hand DIV, and you have the functional CSS equivalent of the two-cell table of yore. Wow. I understand the necessity of doing this, and I know all the benefits of adhering to accessibility requirements and maintaining web standards. Nobody should be discriminated against because a website wasn’t designed the right way.
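The padding-plus-negative-margin idea described above can be sketched roughly like this. The IDs and pixel values are my own stand-ins, not the actual listing from Meyer’s book, and the box-sizing line is a modern convenience the book predates:

```html
<!-- Sketch of the negative-margin two-column layout; IDs and pixel
     values are stand-ins, not the listing from the book. -->
<style>
  #content {
    float: left;
    width: 100%;
    box-sizing: border-box;   /* keeps the padding inside the 100% width */
    padding-right: 220px;     /* the open field the sidebar will sit in */
  }
  #sidebar {
    float: right;
    width: 200px;
    margin-left: -201px;      /* 200px + (-201px) = -1px effective width,
                                 so the browser never drops it below */
  }
</style>
<div id="content">Main text goes here…</div>
<div id="sidebar">Sidebar links go here…</div>
```

The two halves the text describes are both visible here: the sidebar’s effective width is below zero, so the float never wraps, and the content’s padding gives it open whitespace to sit in.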

My own conclusion, however, is that while CSS does a lot, it suffers the same problems we had under all previous web design regimes. Accomplishing the simple stuff requires deep, esoteric knowledge of the workarounds. That is what differentiates the Dreamweaver jockeys from the real web designers: they know the tricks and employ them to get the website to look like the Photoshop comps. And it’s all searchable and readable using screen reading software. But what a winding, twisty road to get there. It’s not rocket science, but it ain’t trivial either.

  • links for 2009-04-16

  • Nvidia pitches OpenCL as ‘market builder’ • The Register

I used to participate pretty heavily in the old Byte Magazine online forums. One particular thread I was actively involved in was Reconfigurable Computing. The premise we followed was that of Field Programmable Gate Arrays becoming so powerful they could be used as CPUs on a desktop computer. Most people felt this was doable but inefficient; more likely an FPGA as a reconfigurable co-processor would be better. Enter OpenCL, the way of parceling out tasks to the right tool. In some ways I see a strong correlation to the old reconfigurable CPU discussion, where you used the best tool for the job. In the FPGA world, you would reconfigure cores to match a particular workload on demand. So if you were playing a game, you might make the CPU into a GPU until you were done with the game. If you were recording audio, you would reconfigure the FPGA into a DSP, and so on.

OpenCL seems much more lightweight and less risky on the implementation side because it just takes advantage of what’s there. Nothing like the earth-shaking changes in architecture we had in mind (using an FPGA instead of a CPU). Reading what OpenCL might allow in a diverse multiprocessor desktop computer makes me want to strike up the argument for co-processors at least. In an OpenCL world you could easily have an FPGA available as a co-processor and a nice robust nVidia GPU chugging away, without discriminating architecturally against either. OpenCL would help parcel out the tasks. Mix in some FPGA-level support in the OS as a reconfigurable processor, and voila, you get a DSP or whatever else you might want at any point in the clock cycle.

    Given Intel’s drive towards multi-core CPUs, nVidia’s parallel processors, and somewhat less impressive gains on the FPGA front, you could have an AI right on your desktop. Now someone had better get started on those OpenCL drivers and hooks for the kernel! I wish sometimes I could be that person. But it’s too far out of my ability to make it happen.

  • College Music Service Ruckus.com Shuts Down

Now that Ruckus has shut down, a number of universities in the U.S. have been caught unawares. No one had forewarning that their attempt to provide a legal alternative to file sharing was going to be taken away. Worse yet, the RIAA, who said back in August that they weren’t starting any new legal suits against students, has continued filing new suits since that time. So what is one to do?

I was asked recently to do a quick rundown of legal music services and retail outfits. I think information gathering at this point is the only sane approach to such a sudden cut in service. At the university where I work, we posted an announcement that Ruckus had shut down and we were looking into it. What I noticed at two other universities that had Ruckus was that they put up notices indicating that while Ruckus was shut down, there are still other legal alternatives to peer-to-peer file sharing. So the question becomes: does an institution still bear the financial responsibility to provide a legal alternative to file sharing?

In these tight financial times, university provosts and chancellors are going to have to really reconsider how much legal protection they need from the RIAA. There may have been a vain hope that students would be encouraged to use the legal music services. But most stayed away and either continued file sharing or bought music from iTunes. So in this second phase of universities dealing with peer-to-peer file sharing, I think we are at the point of advocacy ONLY. Providing a legal alternative at reduced cost and with little to no choice is now too costly (in terms of Full Time Equivalent users). It is also too costly in terms of broken contracts when your music service goes out of business overnight. I did find one university website, a former Ruckus customer, with a listing pointing to our old friend Wikipedia.

What better way to provide an alternative than to let the wisdom of crowds prevail? Rather than write up a review of all the possible alternatives myself, why not let the contributors to Wikipedia collect and edit all the reviews/stats on all the services. Then each and every individual can make the rational choice as to whether they want to buy music or steal it over the Internet.