Blog

  • 25 years of HyperCard—the missing link to the Web | Ars Technica

    Bill Atkinson—creator of MacPaint—painted in MacPaint (Photo credit: Daniel Rehn)

    “I missed the mark with HyperCard,” Atkinson lamented. “I grew up in a box-centric culture at Apple. If I’d grown up in a network-centric culture, like Sun, HyperCard might have been the first Web browser.”

    via 25 years of HyperCard—the missing link to the Web | Ars Technica.

    Bill Atkinson’s words on HyperCard and what could have been are kind of sad in a way. But Bill is a genius by any measure of Computer Science and programming ability. Without QuickDraw, the Mac would not have been much of a graphical experience for those attempting to write software for it. Bill’s drawing routines took advantage of all the assembly language routines available on the old Motorola 68000 chip and eked out every last bit of performance to make the Mac what it was in the end: insanely great.

    I write this in reference also to my experience of learning and working with HyperCard. It acts as the opening parenthesis to my last 16 years working for my current employer. Educational Technology has existed in various forms going all the way back to 1987 when Steve Jobs was attempting to get Universities to buy Macs and create great software to run on those same computers. There was an untapped well of creativity and energy that Higher Education represented and Jobs tried to get the Macintosh computer in any school that would listen.

    That period is long since gone. The idea of educational software, interactive hypermedia, CD-ROMs: all gone the way of the web and mobile devices. It’s a whole new world now, and the computer of choice is the mobile phone you pick up on a 2-year contract from some telecom carrier. That’s the reality. So now designers and technologists are having to change to a “mobile first” philosophy and let all other platforms and form factors follow that design philosophy. And it makes sense, as desktop computer sales still erode a few percentage points each year. It’s just a matter of time before we reach peak desktop. It’s likely already happened; we just haven’t accepted it as gospel.

    Every technology is a stepping stone or shoulder to stand on leading to the next stepping stone. Evolutionary steps are the rule of the day. Revolution has passed us by. We’re in for the long slog, putting things into production and making them do useful work. Who has time to play and discover when everyone has a pre-conceived notion of the brand of device and the use it will serve? I want X to do Y; there’s no time to advise or consult, to fit and match things based on their essential quality, the essence of what they are good at accomplishing. This is the brand and this is how I’m going to use it. That’s what Educational Technology has become these days.

  • My current line of work

    There is no end to the amount of stuff I get asked to do. I like the technical aspects and not so much the other bits. There is a lot of communications and expectation setting. And therein lies the rub. (more…)

  • 2012 in review

    The WordPress.com stats helper monkeys prepared a 2012 annual report for this blog.

    Here’s an excerpt:

    4,329 films were submitted to the 2012 Cannes Film Festival. This blog had 17,000 views in 2012. If each view were a film, this blog would power 4 film festivals.

    Click here to see the complete report.

  • Attempting to create an autounattend.xml file for work

    Image via CrunchBase


    Starting with this website tutorial, I’m attempting to create a working config file that will let me perform new Windows 7 Professional installs without having to interact or click any buttons.


    http://sergeyv.com/blog/archive/2009/12/17/unattended-install-of-windows-7.aspx


    Seems pretty useful so far, as Sergey provides an example autounattend file that I’m using as a template for my own. I particularly like his RunOnce registry additions. This makes it so much more useful than simply being an answer file for the base OS install. True, it is annoying that questions come up through successive reboots during the specialize pass on a Windows 7 fresh install. But this autounattend file does a whole lot of default presetting behind the scenes, and that’s what I want when I’m trying to create a brand new WIM image for work. I’m going to borrow those most definitely.
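
    As a sketch of what a RunOnce-style addition looks like in the answer file: the settings pass and component names below follow the standard Windows 7 unattend schema (this is a fragment, so the wcm namespace declaration is omitted), but the command line, order number, and description are hypothetical examples of mine, not Sergey’s actual entries.

    ```xml
    <settings pass="oobeSystem">
      <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64"
                 publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
        <FirstLogonCommands>
          <!-- Hypothetical example: write a RunOnce registry value so a task
               fires on the first interactive logon after setup completes. -->
          <SynchronousCommand wcm:action="add">
            <Order>1</Order>
            <Description>Example RunOnce addition</Description>
            <CommandLine>reg add HKLM\Software\Microsoft\Windows\CurrentVersion\RunOnce /v ExampleTask /t REG_SZ /d "cmd /c echo done" /f</CommandLine>
          </SynchronousCommand>
        </FirstLogonCommands>
      </component>
    </settings>
    ```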


    I also discovered an interesting sub-section devoted to joining a new computer to a Domain. Ever heard of djoin.exe?


    http://technet.microsoft.com/en-us/library/Dd392267.aspx


    Very interesting stuff: you can join the computer without first having to log in to the domain controller and create a new account in the correct OU (which is what I do currently), saving a little time putting the computer on the Domain. Sweet. I’m gonna hafta check this out further and get the syntax down just so… Looks like there’s also a switch to ‘reuse’ an existing account, which would be really handy for computers that I rebuild and add back using the same machine name. That would save time too. Looks like it might be Win7/Server 2008 specific and may not be available widely where I work. We have not moved our Domains to Server 2008 as far as I know.


    djoin /provision /domain &lt;domain to be joined&gt; /machine &lt;machine name&gt; /savefile blob.txt


    http://technet.microsoft.com/en-us/library/dd391977(v=WS.10).aspx (What’s new in Active Directory Domain Services in Windows Server 2008 R2: Offline Domain provisioning).
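
    Putting the two halves of an offline domain join together looks roughly like this. The domain, machine name, OU, and file paths below are hypothetical placeholders of mine; the switches themselves come from the djoin.exe documentation.

    ```bat
    REM Step 1: on a machine that can reach the domain, pre-create the computer
    REM account and save the provisioning blob. /reuse re-provisions an existing
    REM account of the same name; /machineou picks the target OU.
    djoin /provision /domain contoso.com /machine WS-LAB01 /machineou "OU=Workstations,DC=contoso,DC=com" /reuse /savefile C:\blob.txt

    REM Step 2: on the new (offline) machine, consume the blob; the machine
    REM completes the join on the next reboot.
    djoin /requestodj /loadfile C:\blob.txt /windowspath %SystemRoot% /localos
    ```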


    Also, you want to be able to specify the path in AD where the computer account is going to be created. That requires knowing the full syntax of the LDAP:// path in AD.


    http://serverfault.com/questions/22866/how-can-i-determine-my-user-accounts-ou-in-a-windows-domain


    There’s also a script you can download and run to get similar info that is Win 2000 era AD compliant: http://www.joeware.net/freetools/tools/adfind/index.htm


    Random thoughts just now: I could create a generic WIM with a single folder added each time and appended to the original WIM that included the Windows CAB file for that ‘make/model’ from Dell. Each folder could then have DPInst copied into it and run as a synchronous command during the OOBE pass each time the WIM is applied with ImageX. I just need to remember which number to use for each model’s set of drivers. But the description field for each of those appended driver setups could be descriptive enough to make it user friendly. Or we could opt to just include the 960 drivers as a base set covering most machines, and then provide links to the CAB files over \\fileshare\j\deviceDrivers\ and let DPInst recurse its way down the central store of drivers to do the cleanup phase.
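
    That DPInst step could be sketched as another synchronous command in the answer file. The paths, order number, and description below are hypothetical, and the /s, /sw, and /path switches are my reading of DPInst’s silent-install options, so verify them against DPInst /? before relying on them.

    ```xml
    <FirstLogonCommands>
      <SynchronousCommand wcm:action="add">
        <Order>2</Order>
        <Description>Install Optiplex 960 driver set via DPInst</Description>
        <!-- /s = silent, /sw = suppress the wizard, /path = driver folder root;
             DPInst recurses the folder and installs any matching drivers. -->
        <CommandLine>C:\Drivers\960\DPInst.exe /s /sw /path C:\Drivers\960</CommandLine>
      </SynchronousCommand>
    </FirstLogonCommands>
    ```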


    OK, got a good autounattend.xml formulated. Should auto-activate and register the license key no problem-o. Can’t wait to try it out tomorrow when I get home on the test computer I’ve got set up. It’s an Optiplex 960, and I’m going to persist all the device drivers after I run sysprep /generalize /shutdown /oobe and capture the WIM file. Got a ton of customizing yet to do on the Admin profile before it gets copied to the Default Profile in the sysprep step. So maybe this time round I’ll get it just right.


    One big thing I have to remember is to set IE 8 to pass all logon information for the Trusted Sites zone within the security settings. If I get that embedded into the thing once and for all, I’ll have a halfway decent image that mirrors what we’re using now in Ghost. The next step, once this initial setup from a Win7 setup disk is perfected, is to tweak the Administrator’s profile and then set CopyProfile=true when I run Sysprep /generalize /oobe /unattend:unattend.xml (that answer file is another attempt to filter which settings get kept and what is auto-run before the final OOBE phase of Windows setup). That will be the last step in the process.
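
    The CopyProfile flag lives in the specialize pass of that unattend.xml. A minimal sketch, with component attributes per the standard Windows 7 unattend schema, might look like:

    ```xml
    <settings pass="specialize">
      <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64"
                 publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
        <!-- Copies the customized built-in Administrator profile over the
             Default profile during sysprep, so new users inherit it. -->
        <CopyProfile>true</CopyProfile>
      </component>
    </settings>
    ```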


  • Apple, Google Just Killed Portable GPS Devices | Autopia | Wired.com

    Note: this is a draft of an article I wrote back in June when Apple announced it was going to favor its own Maps app over Google Maps and take Google Maps out of the App Store altogether. This blog went on hiatus just 2 weeks after that. And a whirlwind of staff changes occurred at Apple as a result of the debacle of the iOS Maps product. Top people have been let go, not the least of which was Scott Forstall, in some people’s view the heir apparent to Steve Jobs. He was not popular, very much a jerk by many accounts, and when asked by Tim Cook to co-sign the mea culpa Apple put out over the lack of performance and quality of iOS Maps, Scott wouldn’t sign it. So goodbye Scott, hello Google Maps. Somehow Google and Apple are in a period of détente over Maps, and Google Maps has now returned to the App Store. Who knew so much could happen in 6 months, right?

    Garmin told Wired in a statement: “We think that there is a market for smartphone navigation apps, PNDs [Personal Navigation Devices] and in-dash navigation systems, as each of these solutions has its own advantages and use case limitations, and ultimately it’s up to the consumer to decide what they prefer.”

    via Apple, Google Just Killed Portable GPS Devices | Autopia | Wired.com.

    That’s right: mapping and navigation are just one more app in a universe of software you can run on your latest-generation iPod Touch or iPhone. I suspect that Maps will only be available on the iPhone, as that was a requirement previously placed on the first-gen Maps app on iOS. It would be nice if there were a lower-threshold entry point for participation in the Apple Maps universe.

    But I do hear one or two criticisms regarding Apple’s attempt to go its own way. Google has a technology and data set lead (you know, all those cars driving around photographing everything?). Apple has to buy that data from others; it isn’t going to start from scratch and attempt to re-create Google’s Street View data set. Which means Street View won’t be a feature Maps has, probably for quite some time. Android’s own Google Maps app includes turn-by-turn navigation AND Street View built right in. It’s just there. How cool is that? You get the same experience on the mobile device as the one you get working in a web browser on a desktop computer.

    In this battle between Google and Apple, the pure-play personal navigation device (PND) manufacturers are losing share. I glibly suggested in a tweet yesterday that Garmin needs to partner up with Apple and help out with its POI and map datasets so that potentially both can benefit. It would be cool if a partnership could be struck that allowed Apple to have a feature that didn’t necessarily steal market share from the PNDs, but could somehow raise all boats equally. Maybe a partnership to create a Street View-like add-on for everyone’s mapping datasets would be a good start. That would help level the playing field between Google and the rest of the world.

  • End of the hiatus

    I am now at a point in my daily work where I can begin posting to my blog once again. It’s not so much that I’m catching up, but more like I don’t care as much about falling behind. Look forward to more desktop-related posts, as that is now my full-time responsibility where I work.

    Posted from WordPress for Windows Phone

  • A Hiatus is being announced (carpetbomberz.com is taking a pause)

    Sadly, my job has changed. I was forcibly re-assigned to a different part of the same organization I have worked in for the last 16 years. Luckily I still HAVE a job, which is more than some people who have suffered through these last 4 years of recession can say. So I thank my lucky stars that I can continue to pay bills for the foreseeable future. I am lucky; there’s no other word for it.

    As for my commentary on technology news, that will have to wait a while until I can sort out my daily schedule. It may take a little while to develop a good work/life balance again and be able to follow tech news closely enough to project what future trends may emerge. So I’m glad to have had a good, consistent run for a while. And hopefully I can get back to a regular twice-weekly schedule again real soon. In the meantime, enjoy the archive of older articles (there are literally hundreds of them) and try throwing in some comments on some of them. I’ll respond, no problem with that at all. And on that happy suggestion, I bid you adieu!

  • The wretched state of GPU transcoding – ExtremeTech

    The spring 2005 edition of ExtremeTech magazine (Photo credit: Wikipedia)

    For now, use Handbrake for simple, effective encodes. Arcsoft or Xilisoft might be worth a look if you know you’ll be using CUDA or Quick Sync and have no plans for any demanding work. Avoid MediaEspresso entirely.

    via The wretched state of GPU transcoding by Joel Hruska | ExtremeTech.

    Joel Hruska does a great survey of GPU-enabled video encoders. He even goes back to the original Avivo and Badaboom encoders put out by AMD and nVidia when they were promoting GPU-accelerated video encoding. Sadly the results don’t live up to the hype. Even Intel’s most recent competitor in the race, Quick Sync, is left wanting. HandBrake appears to be the best option for most people, and the most reliable and repeatable in the results it gives.

    Ideally the maintainers of the HandBrake project might get a boost from a fork of the source code that adds Intel Quick Sync support. There’s no indication that everyone is interested in proprietary Intel technology like Quick Sync, as expressed in this article from AnandTech; OpenCL seems like a more attractive option for the Open Source community at large. So the OpenCL/HandBrake development is at least a little encouraging. Still, as Joel Hruska points out, the CPU is the best option for encoding high quality at smaller frame sizes; it just beats the pants off all the GPU-accelerated options available to date.

  • Facebook smacks away hardness, sticks MySQL stash on flash • The Register

    Image representing Fusion-io as depicted in Cr...
    Image via CrunchBase

    Does Fusion-io have a sustainable competitive advantage or will it get blown away by a hurricane of other PCIe flash card vendors attacking the market, such as EMC, Intel, Micron, OCZ, TMS, and many others? 

    via Facebook smacks away hardness, sticks MySQL stash on flash • The Register.

    More updates on the data center uptake of PCI SSD cards, in the form of two big wins at Facebook and Apple. Price/performance for database applications seems to be skewed heavily toward Fusion-io versus the big guns in large-scale SAN roll-outs. Due to its smaller scale and faster speed, a PCI SSD outstrips the resources needed to build an equally fast disk-based storage array (including the power and square feet taken up by all the racks). Typically a large rack of spinning disks can be aggregated using RAID drive controllers and caches to look like a very large, high-speed hard drive. The Fibre Channel connections add yet another layer of aggregation on top of all that, so that you can start splitting the underlying massive disk array into virtual logical drives that fit the storage needs of individual servers and OSes along the way. But to get speed equal to a Fusion-io style PCI SSD, say to speed up JUST your MySQL server, the number of equivalent drives, racks, RAID controllers, caches and Fibre Channel host bus adapters is so large and costs so much, it isn’t worth it.

    A single PCI SSD won’t quite have the same total storage capacity as that larger-scale SAN. But for a single, one-off speed-up of a MySQL database you don’t need the massive storage so much as the massive speed-up in I/O. And that’s where the PCI SSD comes into play. With the newest PCIe 3.0 interfaces and x8 (eight PCI lane) connectors, the current generation of cards is able to maintain 2 GB/sec of throughput on a single card. To achieve that using the older SAN technology is not just cost-prohibitive but seriously SPACE-prohibitive in all but the largest of data centers. The race now is to see how dense and energy-efficient a data center can be constructed. So it comes as no surprise that Facebook and Apple (who are attempting to lower costs all around) are the ones leading this charge toward higher density and higher power efficiency as well.
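
    For perspective, a back-of-the-envelope check (assuming PCIe 3.0’s 8 GT/s per lane and 128b/130b encoding, figures from the PCIe spec rather than this article) shows 2 GB/sec of sustained throughput sits comfortably under an x8 link’s theoretical ceiling:

    ```python
    # Rough PCIe 3.0 x8 bandwidth estimate (assumed spec figures, not measured).
    transfers_per_sec = 8e9   # 8 GT/s raw signaling rate per lane
    encoding = 128 / 130      # 128b/130b line coding: ~1.5% overhead
    lanes = 8                 # an x8 connector

    bytes_per_lane = transfers_per_sec * encoding / 8   # bits -> bytes
    ceiling_gb_s = bytes_per_lane * lanes / 1e9

    print(round(ceiling_gb_s, 2))  # ~7.88 GB/s theoretical ceiling
    ```

    So the cards’ sustained 2 GB/sec uses only about a quarter of what the slot can theoretically carry, leaving the flash controllers, not the bus, as the bottleneck.
    
    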

    Don’t get me wrong when I tout the PCI SSD so heavily. Disk storage will never go away in my lifetime; it’s just too cost-effective, and it is fast enough. But for the SysOps in charge of deploying production apps and hitting performance brick walls, the PCI SSD is going to really save the day. And if nothing else it will act as a bridge until a better solution can be designed and procured in any given situation. That alone, I think, would make the cost of trying out a PCI SSD well worth it. Longer term, which vendor will win is still a toss-up. I’m not well versed in the scale of sales into Enterprises by the big vendors in the PCI SSD market. But Fusion-io is doing a great job keeping their name in the press and marketing to some big, identifiable names.

    But I also give OCZ some credit with their Z-Drive R5, though it’s not quite considered an Enterprise data center player. Design-wise, the OCZ R5 is helping push the state of the art by trying out new controllers and new designs attempting to raise the total number of I/Os and bandwidth on a single card. I’ve seen one story so far about a test sample at Computex (AnandTech) where a brand-new, clean R5 hit nearly 800,000 IOPS in benchmark tests. That peak performance eventually eroded as the flash chips filled up and fell to around 530,000 IOPS, but the trend is clear. We may see 1 million IOPS on a single PCI SSD before long. And that, my readers, is going to be an Andy Grove-style 10X difference that brings changes we never thought possible.

    Andy Grove: Only the Paranoid Survive
    In this book Grove describes a 10X change: when things improve or grow at the rate of a whole order of magnitude, reaching a new equilibrium.
    Very cool tips for monitoring GPU usage on Win8. Didn’t know you could do this without having to install a card manufacturer’s own utility software.

    McAkins Online

    To analyze your device’s GPU usage by Windows 8 please follow these simple steps:

    1. Download Process Explorer from Microsoft and install as usual.

    2. Run Process Explorer as Admin by right-clicking its icon on the Start Screen and choosing to run as Admin from the app bar below. You need to do this to see the GPU readouts.


    3. Click on any of the Graphs on the toolbar, System Information opens:


    4. Select the GPU tab and Click the Engine button. GPU Engine History screen opens:


    Run any Graphic intensive application or just use Windows 8 as usual and come back to see your GPU usage history.

    View original post