May the SandForce be with you
Nice writeup from AnandTech regarding the press release from LSI about its new 3rd-generation flash memory controllers. The 3000 series takes over from the 2200 and 1200 series that preceded it as the era of SSDs was just beginning to dawn (remember those heady days of 32GB SSD drives?). Like the frontier days of old, things are starting to consolidate and find an equilibrium of price vs. performance. Commodity pricing rules the day, but SSDs, much less PCIe flash interfaces, are only just creeping into the high end of the market with Apple laptops and soon Apple desktops (apologies to the iMac, which has already adopted the PCIe interface for its flash drives, but the Mac Pro is still waiting in the wings).
Things continue to improve in terms of future-proofing the interfaces. From SATA to PCIe, little was done to force a migration to one interface or the other, as each market had its own peculiarities. SATA SSDs were for the price-conscious consumer market, and PCIe was pretty much enterprise-only. You had to pick and choose your controller very wisely in order to maximize the return on a new device design. According to AnandTech, LSI did some heavy lifting by refactoring and redesigning the whole controller, allowing a manufacturer to buy one controller and use it either way: as a SATA SSD controller or as a PCIe flash memory controller. The speeds of each interface bear this out at the theoretical-throughput end of the scale. LSI reports the PCIe throughput is not too far off the theoretical max (in the ~1.45GB/sec range). Not bad for a chip that can also be used as a SATA SSD controller at 500MB/sec throughput as well. This is going to make designers, and hopefully consumers, happy as well.
On a more technical note, as written about in earlier articles mentioning the great peak-flash memory density/price limit, LSI is fully aware of current memory architectures and the failure rates and error rates they accumulate over time.
Desktop Support in the raw (boring!)
One of the many things I set about doing at my old job was getting things updated to install new Windows/Office disk images for rebuilt or newly built desktops and laptops. One of the first tasks was to build a from-scratch WIM from the base install media (the Win7 install.wim and the Office 2010 Pro .iso disk images). Next I wanted to customize the OCT (Office Customization Tool) and get the Office 2010 install just right for a first install on a newly rebuilt system. I've played with the OCT in the past, but there's also an unattend.xml file one could use instead. I might go that route now that I've got the Win7 setup running under autounattend.xml (no more OCT).
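For anyone who hasn't used the OCT: it ships on volume-license Office 2010 media and is launched straight from the setup program. A minimal sketch, with purely hypothetical server and folder names:

```shell
REM Paths here are hypothetical; point them at your Office 2010 install source.
REM Launch the Office Customization Tool; it saves an .msp file that you
REM drop into the Updates folder of the install point so setup applies it.
\\server\installs\Office2010\setup.exe /admin

REM Alternatively, drive setup with an XML config file instead of an OCT .msp:
\\server\installs\Office2010\setup.exe /config ProPlus.WW\config.xml
```

Either way the customization rides along with the plain setup.exe run, so the same install source works for hand-built and imaged machines.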
One thing that is making this XML learning process easier, the edit-the-xml, build a test install, observe the failures, and re-edit round trip, is an Oracle VirtualBox VM I've got installed. I'm using the Win7 setup .iso as a mount point within VirtualBox; it is the first CD drive. Then I have the autounattend.xml sitting in another .iso, which I mount as the secondary CD drive. That combo forces the Win7 setup to 'see' the autounattend.xml file and start customizing the install as it goes along. One of the cool utilities included with the Windows Automated Install Kit (WAIK 3.1) is a command-line program called oscdimg, which will create a .iso out of any folder you choose. That's what I use to create that secondary CD mount point in VirtualBox. And I never once have to change it; all I have to do is create a new .iso every time I edit the autounattend.xml file (even just adding or deleting a comma) and I can start all over again without reconfiguring the VM! This has saved me countless hours over attempting to do this on real hardware (which is absolutely unnecessary in this case) until I can get it just right.
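The whole round trip boils down to one command. A sketch, assuming WAIK 3.1 is installed and the answer file sits alone in a staging folder (the folder and .iso names below are hypothetical):

```shell
REM Build a secondary-CD .iso containing just the edited autounattend.xml.
REM -n allows long file names; re-run this after every edit of the XML.
oscdimg -n C:\answer C:\isos\answer.iso

REM VirtualBox keeps the same answer.iso attached as the secondary CD
REM drive, so no VM reconfiguration is needed between test installs.
```

Because the output path never changes, the VM's storage configuration is set once and the edit/rebuild/retest loop is just "save the XML, rerun oscdimg, reboot the VM."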
And let me say there's a lot of ground to cover and barriers to entry before you can get it 'just right'. Some of the features provided in autounattend.xml simply don't work. Hands down, attempting to add a trusted zone in IE during the Win7 setup process doesn't work. And better yet, there are discussion board entries that CONFIRM it doesn't work. I'm so glad people participate in these company-sponsored fora for the whole world to see. I'm so glad I didn't beat my head and heart out trying to get this one 'feature', nay 'bug', to work properly. There are a multitude of other ways to achieve the same goal, so I'm pursuing the CopyProfile = true route: add the trusted zone URLs to my admin profile and let that become the default profile on the machine. Then capture the whole kitten-ka-boodle using Sysprep/GImageX on that idealized Dell Optiplex 960 with all drivers persisted. That's going to be my universal WIM to start out with. We'll see how close I can hit that mark.
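The capture step above can be sketched in two commands. Paths and image names are hypothetical, and the CopyProfile=true setting lives in the answer file handed to Sysprep:

```shell
REM Generalize the reference Optiplex 960; the answer file passed here
REM carries CopyProfile=true so the admin profile becomes the default.
C:\Windows\System32\sysprep\sysprep.exe /generalize /oobe /shutdown /unattend:C:\Windows\System32\sysprep\unattend.xml

REM Then boot into WinPE and capture the volume. GImageX is a GUI front
REM end over the same WIM API as the imagex command line shown here:
imagex /capture C: D:\images\optiplex960-base.wim "Win7 universal base" /compress fast
```

The /generalize pass strips machine-specific state, which is what lets one Optiplex-built WIM serve as a starting point for other hardware.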
As it turns out, I did go through a number of revisions of this disk image until I perfected it by Feb. 2013. Then I updated the drivers and patches and so forth in June to come out with a grand final WIM for doing all the desktops and laptops at my old office. Since then I've had to turn this work over to a contractor who just got hired full-time. He's now got an updated WIM file, using VirtualBox as a kind of virtual build lab for updating, creating, and applying the WIM images. That work then allows us to put it onto a WinPE flash drive and apply it as needed for a full-touch manual image of a computer. I'm still holding out hope that this can be improved and made less manual, less high-touch than in the past. One further refinement along those lines was adding a 'drivers path' to the Windows unattend.xml file. That allowed us to robocopy the drivers for a particular machine into a known folder path on the newly imaged machine (no matter which one it was), and it would just suck up all the drivers during the OOBE steps on the first/second reboots after the machine was imaged for the first time. Heady stuff, and I have to say once you start tweaking, it speeds stuff up a lot and it just works! It never breaks or wrecks the process. So it's very reliable to make those single changes that each take a step out of the (re)imaging process.
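The driver staging step is a one-liner once the answer file points at the known folder. A sketch with hypothetical share and folder names (the unattend.xml names C:\Drivers as the device path in this example):

```shell
REM Stage the machine-specific drivers into the folder the answer file
REM already points at; Windows scans it for drivers during the OOBE
REM passes on the first reboots after imaging. Names are hypothetical.
robocopy \\server\drivers\optiplex960 C:\Drivers /E /R:2 /W:5
```

Because only the source share changes per model, the same imaging flow covers every machine type with one robocopy command swapped in.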
I’m still playing around with these ideas and have trained my replacement on how to use them. Next steps are Windows ADK 8.1, WinPE 5 builds, and coming up with an image with Office 2013 and Win8.1. I think that will become our next reference standard before long.
Most of us consider our posts to be the fundamental elements of our blog's content. In their never-ending path towards our growing archive, they receive the vast majority of our energy, and most of our readers' attention.
If we think of our posts as a renewable, fresh stream of content, pages, on the other hand, are often treated as no more than stagnant puddles of old information.
How Amazon is building substations, laying fiber and generally doing everything to keep cloud costs down
If there's anyone still left wondering how it is that large cloud providers can keep on rolling out new features and lowering their prices even when no one is complaining about them, Amazon Web Services Vice President and Distinguished Engineer James Hamilton spelled out the answer in one word during a presentation Thursday at the company's re:Invent conference: Scale.
Scale is the enabler of everything at AWS.