Blog

  • 2012 in review

    The WordPress.com stats helper monkeys prepared a 2012 annual report for this blog.

    Here’s an excerpt:

    4,329 films were submitted to the 2012 Cannes Film Festival. This blog had 17,000 views in 2012. If each view were a film, this blog would power 4 Film Festivals.
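    The stats monkeys’ math checks out; a quick sanity check (pure arithmetic, nothing WordPress-specific):

```python
# 2012 stats from the report: views here vs. Cannes submissions.
views = 17_000
films_submitted = 4_329

festivals = views / films_submitted  # about 3.9
print(round(festivals))  # prints 4
```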

    Click here to see the complete report.

  • Attempting to create an autounattend.xml file for work

    Starting with this website tutorial, I’m attempting to create a working config file that will let me perform new Windows 7 Professional installs without having to interact or click any buttons.


    http://sergeyv.com/blog/archive/2009/12/17/unattended-install-of-windows-7.aspx


    Seems pretty useful so far, as Sergey provides an example autounattend file that I’m using as a template for my own. I particularly like his RunOnce registry additions, which make this so much more useful than simply an answer file for the base OS install. True, it is annoying that questions come up through successive reboots during the specialize pass on a fresh Windows 7 install. But this autounattend file does a whole lot of default presetting behind the scenes, and that’s what I want when I’m trying to create a brand new WIM image for work. I’m going to borrow those, most definitely.
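    For reference, commands like those RunOnce additions ride along in the answer file itself. A minimal sketch of the general shape (the pass, component and attribute names are the stock Windows 7 ones, but the CommandLine and key values here are placeholders, not Sergey’s actual entries):

```xml
<!-- autounattend.xml fragment: run a command at first logon that seeds
     a RunOnce registry entry. The CommandLine content is a placeholder. -->
<settings pass="oobeSystem">
  <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="x86"
             publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
    <FirstLogonCommands>
      <SynchronousCommand wcm:action="add">
        <Order>1</Order>
        <Description>Seed an example RunOnce entry (placeholder)</Description>
        <CommandLine>reg add "HKLM\Software\Microsoft\Windows\CurrentVersion\RunOnce" /v ExampleTweak /t REG_SZ /d "cmd /c echo placeholder" /f</CommandLine>
      </SynchronousCommand>
    </FirstLogonCommands>
  </component>
</settings>
```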


    I also discovered an interesting sub-section devoted to joining a new computer to a Domain. Ever heard of djoin.exe?


    http://technet.microsoft.com/en-us/library/Dd392267.aspx


    Very interesting stuff: you can join the computer to the domain without first having to log in to the domain controller and create a new account in the correct OU (which is what I do currently), saving a little time putting the computer on the Domain. Sweet. I’m gonna hafta check this out further and get the syntax down just so… Looks like there’s also a switch to ‘reuse’ an existing account, which would be really handy for computers that I rebuild and add back using the same machine name. That would save time too. Looks like it might be Win7/Server 2008 specific and may not be widely available where I work. We have not moved our Domains to Server 2008 as far as I know.


    djoin /provision /domain <domain to be joined> /machine <machine name> /savefile blob.txt


    http://technet.microsoft.com/en-us/library/dd391977(v=WS.10).aspx (What’s new in Active Directory Domain Services in Windows Server 2008 R2: Offline Domain provisioning.)
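    Going by the TechNet pages, the offline join is a two-step dance: provision a blob on a machine that can reach the domain, then feed the blob to the new computer. A sketch with made-up domain and machine names; /reuse is the account-reuse switch, and /machineou picks the OU the account lands in:

```bat
rem Step 1 - on a machine with domain access, provision the computer account
rem and save the metadata blob (names here are placeholders).
djoin /provision /domain corp.example.com /machine PC-0960 ^
      /machineou "OU=Workstations,DC=corp,DC=example,DC=com" /reuse /savefile blob.txt

rem Step 2 - on the new (still offline) computer, consume the blob;
rem the join completes on the next reboot.
djoin /requestodj /loadfile blob.txt /windowspath %SystemRoot% /localos
```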


    Also, you want to be able to specify the path in AD where the computer account is going to be created. That requires knowing the full syntax of the LDAP:// path in AD.


    http://serverfault.com/questions/22866/how-can-i-determine-my-user-accounts-ou-in-a-windows-domain


    There’s also a script you can download and run to get similar info that is Win 2000 era AD compliant: http://www.joeware.net/freetools/tools/adfind/index.htm
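    One quick way to get that full LDAP-style distinguished name, on a box with the AD admin tools installed, is dsquery (the computer name below is a placeholder):

```bat
rem Print the distinguished name of an existing computer account.
dsquery computer -name PC-0960
rem Example output: "CN=PC-0960,OU=Workstations,DC=corp,DC=example,DC=com"
```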


    Random thoughts just now: I could create a generic WIM and, for each Dell ‘make/model’, append to the original WIM a single added folder containing the Windows driver CAB file for that model. Each folder could then have DPInst copied into it and run as a synchronous command during the OOBE pass each time the WIM is applied with ImageX. I just need to remember which image number to use for each model’s set of drivers. But the description field for each of those appended driver setups could be descriptive enough to make it user friendly. Or we could opt to just include the 960 drivers as a base set covering most bases, then provide links to the CAB files over \\fileshare\j\deviceDrivers\ and let DPInst recurse its way down the central store of drivers to do the cleanup phase.
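    The append-a-folder-per-model idea might look something like this with the Win7-era tools (all paths, image names and indexes here are made up for illustration):

```bat
rem Capture the base image once, then append a variant per Dell model.
imagex /capture C: D:\images\win7pro.wim "Win7 Pro base"
imagex /append  C: D:\images\win7pro.wim "Win7 Pro + Optiplex 960 drivers" "960 CAB + DPInst"

rem At deploy time, apply the index matching the model...
imagex /apply D:\images\win7pro.wim 2 C:

rem ...then let DPInst sweep the driver folder quietly (e.g. as a
rem synchronous command during the OOBE pass).
dpinst.exe /q /path C:\Drivers\Optiplex960
```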


    OK, got a good autounattend.xml formulated. Should auto-activate and register the license key, no problem-o. Can’t wait to try it out tomorrow when I get home on the test computer I’ve got set up. It’s an Optiplex 960, and I’m going to persist all the device drivers after I run sysprep /generalize /shutdown /oobe and capture the WIM file. Got a ton of customizing yet to do on the Admin profile before it gets copied to the Default Profile in the sysprep step. So maybe this time round I’ll get it just right.


    One big thing I have to remember is to set IE 8 to pass all logon information for the Trusted Sites zone within the security settings. If I get that embedded into the thing once and for all, I’ll have a halfway decent image that mirrors what we’re using now in Ghost. The next step, once this initial setup from a Win7 setup disk is perfected, is to tweak the Administrator’s profile and then set CopyProfile=true when I run Sysprep /generalize /oobe /unattend:unattend.xml (that answer file is another attempt to filter which settings get kept and what is auto-run before the final OOBE phase of Windows setup). That will be the last step in the process.
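    For the profile-copy step, the setting lives in the specialize pass of the answer file handed to Sysprep; a minimal sketch (stock Windows 7 component and attribute names):

```xml
<!-- unattend.xml fragment used with: sysprep /generalize /oobe /unattend:unattend.xml -->
<settings pass="specialize">
  <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="x86"
             publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS">
    <!-- Copy the customized built-in Administrator profile over the Default profile -->
    <CopyProfile>true</CopyProfile>
  </component>
</settings>
```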


  • Apple, Google Just Killed Portable GPS Devices | Autopia | Wired.com

    Note: this is a draft of an article I wrote back in June when Apple announced it was going to favor its own Maps app over Google Maps and take G-Maps out of the App Store altogether. This blog went on hiatus just 2 weeks after that, and a whirlwind of staff changes occurred at Apple as a result of the iOS Maps debacle. Top people have been let go, not least Scott Forstall, the heir apparent to Steve Jobs in some people’s views. He was not popular, very much a jerk, and when asked by Tim Cook to co-sign the mea culpa Apple put out over their embarrassment at the poor performance and quality of iOS Maps, Scott wouldn’t sign it. So goodbye Scott, hello Google Maps. Somehow Google and Apple are in a period of détente over Maps, and Google Maps has now returned to the App Store. Who knew so much could happen in 6 months, right?

    Garmin told Wired in a statement: “We think that there is a market for smartphone navigation apps, PNDs [Personal Navigation Devices] and in-dash navigation systems as each of these solutions has their own advantages and use case limitations and ultimately it’s up to the consumer to decide what they prefer.”

    via Apple, Google Just Killed Portable GPS Devices | Autopia | Wired.com.

    That’s right: mapping and navigation are just one more app in a universe of software you can run on your latest-generation iPod Touch or iPhone. I suspect that Maps will only be available on the iPhone, as that was a requirement previously placed on the first-gen Maps app on iOS. It would be nice if there were a lower threshold entry point for participation in the Apple Maps app universe.

    But I do hear one or two criticisms regarding Apple’s attempt to go its own way. Google has the technology and data set lead (you know, all those cars driving around and photographing?). Apple has to buy that data from others; it isn’t going to start from scratch and attempt to re-create Google’s Street View data set. Which means Street View won’t be something Maps has as a feature, probably for quite some time. Android’s own Google Maps app includes turn-by-turn navigation AND Street View built right in. It’s just there. How cool is that? You get the same experience on the mobile device as the one you get working in a web browser on a desktop computer.

    In this battle between Google and Apple, the pure-play personal navigation device (PND) manufacturers are losing share. I glibly suggested in a tweet yesterday that Garmin needs to partner up with Apple and help out with its POI and map datasets so that potentially both can benefit. It would be cool if a partnership could be struck that allowed Apple to have a feature that didn’t necessarily steal market share from the PNDs, but could somehow raise all boats equally. Maybe a partnership to create a Street View-like add-on for everyone’s mapping datasets would be a good start. That would help level the playing field between Google and the rest of the world.

  • End of the hiatus

    I am now at a point in my daily work where I can begin posting to my blog once again. It’s not so much that I’m catching up, but more like I don’t care as much about falling behind. Look forward to more desktop-related posts, as that is now my fulltime responsibility where I work.

    Posted from WordPress for Windows Phone

  • A Hiatus is being announced (carpetbomberz.com is taking a pause)

    Sadly, my job has changed. I was forced to be re-assigned to a different part of the same organization I have worked for over the last 16 years. Luckily I GOT a job, and still have one. Which is more than some people who have suffered through these last 4 years of recession can say. So I thank my lucky stars that I can continue to pay bills for the foreseeable future. I am lucky, there’s no other word for it.

    As for my commentary on technology news, that will have to wait a while until I can sort out my daily schedule. It may take some time before I develop a good work/life balance again and am able to follow tech news a little more closely and try to project what future trends may emerge. So I’m glad to have had a good, consistent run for a while. And hopefully I can get back to a regular twice-weekly schedule again real soon. In the meantime, enjoy the archive of older articles (there are literally hundreds of them) and try throwing in some comments on some older articles. I’ll respond, no problem with that at all. And on that happy suggestion, I bid you adieu!

  • The wretched state of GPU transcoding – ExtremeTech

    The spring 2005 edition of ExtremeTech magazine (Photo credit: Wikipedia)

    For now, use Handbrake for simple, effective encodes. Arcsoft or Xilisoft might be worth a look if you know you’ll be using CUDA or Quick Sync and have no plans for any demanding work. Avoid MediaEspresso entirely.

    via Joel Hruska @ ExtremeTech: The wretched state of GPU transcoding – Slideshow | ExtremeTech.

    Joel Hruska does a great survey of GPU-enabled video encoders. He even goes back to the original Avivo and Badaboom encoders put out by AMD and nVidia when they were promoting GPU-accelerated video encoding. Sadly, the results don’t live up to the hype. Even Intel’s most recent entrant in the race, Quick Sync, is left wanting. HandBrake appears to be the best option for most people, and the most reliable and repeatable in the results it gives.

    Ideally the maintainers of the HandBrake project might get a boost from someone starting a fork of the source code with Intel Quick Sync support. There’s no indication right now that everyone is interested in proprietary Intel technology like Quick Sync, as expressed in this article from AnandTech. OpenCL seems like a more attractive option for the open source community at large, so the OpenCL/HandBrake development is at least a little encouraging. Still, as Joel Hruska points out, the CPU is the best option for encoding high quality at smaller frame sizes; it just beats the pants off all the GPU-accelerated options available to date.

  • Facebook smacks away hardness, sticks MySQL stash on flash • The Register


    Does Fusion-io have a sustainable competitive advantage or will it get blown away by a hurricane of other PCIe flash card vendors attacking the market, such as EMC, Intel, Micron, OCZ, TMS, and many others? 

    via Facebook smacks away hardness, sticks MySQL stash on flash • The Register.

    More updates on the data center uptake of PCI SSD cards, in the form of two big wins from Facebook and Apple. Price/performance for database applications seems to be skewed heavily toward Fusion-io versus the big guns in large-scale SAN roll-outs. Thanks to its smaller scale and faster speed, a PCI SSD outstrips the resources needed to get an equally fast disk-based storage array (including power, and the square feet taken up by all the racks). Typically a large rack of spinning disks can be aggregated using RAID drive controllers and caches to look like a very large, high-speed hard drive. The Fibre Channel connections add yet another layer of aggregation on top of all that, so you can start splitting the underlying massive disk array into virtual logical drives that fit the storage needs of individual servers and OSes along the way. But to get speed equal to a Fusion-io-style PCI SSD, say to speed up JUST your MySQL server, the number of equivalent drives, racks, RAID controllers, caches and Fibre Channel host bus adapters is so large, and costs so much, that it isn’t worth it.

    A single PCI SSD won’t quite have the same total storage capacity as a larger-scale SAN. But for a single, say one-off, speed-up of a MySQL database you don’t need the massive storage so much as the massive speed-up in I/O. And that’s where the PCI SSD comes into play. With the newest PCIe 3.0 interfaces and 8x (eight-lane) connectors, the current generation of cards is able to maintain 2GB/sec throughput on a single PCI card. To achieve that using the older SAN technology is not just cost prohibitive but seriously SPACE prohibitive in all but the largest of data centers. The race now is to see how dense and energy efficient a data center can be constructed. So it comes as no surprise that Facebook and Apple (who are attempting to lower costs all around) are the ones leading this charge toward higher density and higher power efficiency as well.

    Don’t get me wrong when I tout the PCI SSD so heavily. Disk storage will never go away in my lifetime. It’s just too cost-effective, and it is fast enough. But for the SysOps in charge of deploying production apps and hitting performance brick walls, the PCI SSD is going to really save the day. And if nothing else it will act as a bridge for most until a better solution can be designed and procured in any given situation. That alone I think would make the cost of trying out a PCI SSD well worth it. Longer term, which vendor will win is still a toss-up. I’m not well versed in the scale of sales into enterprises by the big vendors in the PCI SSD market. But Fusion-io is doing a great job keeping their name in the press and marketing to some big, identifiable names.

    But I also give OCZ some credit, with their Z-Drive R5, though it’s not quite considered an enterprise data center player. Design-wise, the OCZ R5 is helping push the state of the art by trying out new controllers and new designs, attempting to raise the total number of I/Os and the bandwidth on a single card. I’ve seen one story so far, about a test sample at Computex (AnandTech), where a brand new, clean R5 hit nearly 800,000 IOPS in benchmark tests. That peak performance eventually eroded as the flash chips filled up, falling to around 530,000 IOPS, but the trend is clear. We may see 1 million IOPS on a single PCI SSD before long. And that, my readers, is going to be an Andy Grove-style 10X difference that brings changes we never thought possible.
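    The jump from IOPS to raw bandwidth is just multiplication; a quick back-of-the-envelope, assuming the benchmarks used 4 KiB transfers (the story doesn’t say):

```python
KIB = 1024

def iops_to_gbps(iops: int, xfer_bytes: int = 4 * KIB) -> float:
    """Convert an IOPS figure to GB/s for a given transfer size."""
    return iops * xfer_bytes / 1e9

# OCZ R5 figures from the Computex story
print(f"fresh drive:  {iops_to_gbps(800_000):.2f} GB/s")  # prints 3.28
print(f"steady state: {iops_to_gbps(530_000):.2f} GB/s")  # prints 2.17
```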

    Andy Grove: Only the Paranoid Survive
    In this book Grove describes a 10x change as when things are improving, growing at the rate of one whole order of magnitude, and reaching a new equilibrium.
  • Very cool tips for monitoring the GPU usage on Win8. Didn’t know you could do this without having to install a card manufacturer’s own utility software.

    Reblogged from McAkins Online:

    To analyze your device’s GPU usage by Windows 8 please follow these simple steps:

    1. Download Process Explorer from Microsoft and install as usual.

    2. Run the Process Explorer as Admin by right-clicking on the Icon on Start Screen and on the App bar below elect to run as Admin. You need to do this to see the GPU readouts.


    3. Click on any of the Graphs on the toolbar, System Information opens:


    4. Select the GPU tab and Click the Engine button. GPU Engine History screen opens:


    Run any Graphic intensive application or just use Windows 8 as usual and come back to see your GPU usage history.

    View original post

  • My colleagues over at the MedCenter are announcing the big upgrade this weekend. Let’s hope this downtime goes smoothly and everything just works come Monday (fingers crossed)

    Reblogged from URMC Learn (Catherine Delia):

    We will be updating the Blackboard Learning Management system from 9.1 Service Pack 5 to 9.1 Service Pack 7 on Saturday, June 9 between midnight and 10AM. The Blackboard Learning System functions will be unavailable. Portal services including UR ePay, Student/Instructor/Advisor Access and email will be available.

    Most of the changes are performance improvements and enabling some new data features.

    • Performance Improvements
    • SCORM Integration
    • Browser Compatibility updates

    Additionally, the Learning Objects Blog and Wiki tools will no longer be available. You can find out more about the Blackboard native Blog and Wiki tools and exporting your Learning Objects data before June 30, 2012.

    If you have questions please contact Blackboard Support via phone at 275-6865 or via email at blackboard@urmc.rochester.edu.

    View original post

  • Doc Searls Weblog · Won and done

    Doc Searls (Photo credit: Wikipedia)

    This tells me my job with foursquare is to be “driven” like a calf into a local business. Of course, this has been the assumption from the start. But I had hoped that somewhere along the way foursquare could also evolve into a true QS app, yielding lat-lon and other helpful information for those (like me) who care about that kind of thing. (And, to be fair, maybe that kind of thing actually is available, through the foursquare API. I saw a Singly app once that suggested as much.) Hey, I would pay for an app that kept track of where I’ve been and what I’ve done, and made that data available to me in ways I can use.

    via Doc Searls Weblog · Won and done.

    foursquare as a kind of Lifebits, I think, is what Doc Searls is describing: a form of self-tracking a la Stephen Wolfram or Gordon Bell. Instead, foursquare is the carrot being dangled to lure you into giving your business to a particular retailer. After that you accumulate points for numbers of visits and possibly unlock rewards for your loyalty. But foursquare no doubt accumulates a lot of other data along the way that could be used for the very purpose Doc Searls was hoping for.

    Gordon Bell’s work at Microsoft Research bootstrapping the MyLifeBits project is a form of memory enhancement, but also a logging of personal data that can be analyzed later. The collection, or ‘instrumentation’, of one’s environment is what Stephen Wolfram has accomplished by counting things over time. Not to say it’s simpler than MyLifeBits, but it is in some ways lighter-weight data (instead of videos and pictures: mouse clicks, tallies of email activity, times of day, etc.). There is no doubt that foursquare could make a for-profit service for paying users, collecting this location data and serving it up to subscribers to analyze after the fact.

    I firmly believe a form of MyLifeBits could be aggregated across a wide range of free and paid services, along with personal instrumentation and data collecting like the kind Stephen Wolfram does. If there’s one thing I’ve learned reading stories about inventions like these from MIT’s Media Lab, it’s that it’s never an either/or proposition. You don’t have to adopt just Gordon Bell’s technology, or Stephen Wolfram’s techniques, or even foursquare’s own data. You can do all, or just pick and choose the ones that suit your personal data collection needs. Then you get to slice, dice and analyze to your heart’s content. What you do with it after that is completely up to you and should be considered as personal as any legal documents or health records you already have.

    Which takes me back to an article I wrote some time ago in reference to Jon Udell’s call for a federated LifeBits-type service. It wouldn’t be constrained to one kind of data: potentially all the LifeBits would be aggregated, with new repositories for stuff that must be locked down and private. So add Doc Searls to the list of bloggers and long-time technology writers who see an opportunity. Advocacy (in the case of Doc’s experience with foursquare) on behalf of sharing unfiltered data with the users on whom data is collected is one step in that direction. I feel Jon Udell is also an advocate for users gaining access to all that collected and aggregated data. But as Jon Udell asks, who is going to be the first to attempt to offer this up as a pay-for service in the cloud, where for a fee you can access your lifebits aggregated into one spot (foursquare, twitter, facebook, gmail, flickr, photostream, mint, eRecords, etc.) so that you don’t spend your life logging on and logging off from service to service to service? Aggregation could be a beautiful thing.
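    Mechanically, that aggregation boils down to merging per-service event streams into one timeline; a toy sketch (the services and events here are invented for illustration):

```python
from heapq import merge
from datetime import datetime

# Each service exports its own pre-sorted (timestamp, source, description) stream.
foursquare = [
    (datetime(2012, 6, 1, 12, 5), "foursquare", "check-in: coffee shop"),
]
twitter = [
    (datetime(2012, 6, 1, 9, 30), "twitter", "tweet about PND makers"),
    (datetime(2012, 6, 1, 18, 0), "twitter", "tweet about autounattend.xml"),
]

# One chronological lifebits timeline across services.
timeline = list(merge(foursquare, twitter))
for ts, source, what in timeline:
    print(ts.isoformat(), source, what)
```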
