Batteries take the lithium for charge boost • The Register


To do that, the researchers coated a lithium anode with a layer of hollow carbon nanospheres, to prevent the growth of the dendrites.

via Batteries take the lithium for charge boost • The Register.

While most research on Lithium Ion batteries chases incremental improvements, the occasional bigger discovery still gets made. In this instance, the anode is being switched to pure lithium, with a coating to protect the very reactive metal surface. The problem with using pure lithium is the growth of microcrystalline “dendrites”, kind of like stalagmites/stalactites in caves, along the whole surface. As the dendrites build up, the anode loses its efficiency and the battery slowly loses its ability to charge all the way. This research has shown how to coat a pure lithium anode with a layer of hollow carbon nanospheres that acts as a permeable barrier between the electrolytic liquid in the battery and the pure lithium anode.

In past articles on Carpetbomberz.com we’ve seen announcements of other possible battery technologies like Zinc-Air, Lithium-Air and the possible use of carbon nanotubes as an anode material. This announcement is promising in that its added costs might be somewhat smaller than a wholesale change in battery chemistry. Similarly, the article points out how much lighter elemental Lithium is than the current anode materials (carbon and silicon). If the process of coating the anode is sufficiently inexpensive and can be done on an industrial production line, you will see this get adopted. But as with most experiments like these, scaling up and lowering costs is the hardest part. Hopefully this is one that will make it into shipping products and see the light of day. The back-of-envelope numbers below show why a pure lithium anode is so tempting in the first place.
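
To put rough numbers on the weight argument, here is a minimal sketch comparing the commonly cited theoretical specific capacities of the anode materials mentioned. These are textbook ideal-material figures I'm supplying myself, not numbers from the Register article, and real cells land well below them:

# Back-of-envelope comparison of theoretical anode specific capacities,
# in mAh per gram. Ideal-material numbers, not real-cell figures.
anodes_mah_per_g = {
    "lithium metal": 3860,      # pure Li anode, as in the Stanford work
    "graphite (carbon)": 372,   # the conventional Li-ion anode
    "silicon": 4200,            # theoretical max; silicon swells badly in practice
}

baseline = anodes_mah_per_g["graphite (carbon)"]
for name, cap in anodes_mah_per_g.items():
    print(f"{name:18s} {cap:5d} mAh/g  ({cap / baseline:4.1f}x graphite)")

Run that and lithium metal comes out at roughly ten times graphite per gram, which is the whole reason researchers keep wrestling with the dendrite problem instead of giving up on the metal anode.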


MIT Puts 36-Core Internet on a Chip | EE Times

Partially connected mesh topology (Photo credit: Wikipedia)

Today many different interconnection topologies are used for multicore chips. For as few as eight cores direct bus connections can be made — cores taking turns using the same bus. MIT’s 36-core processors, on the other hand, are connected by an on-chip mesh network reminiscent of Intel’s 2007 Teraflop Research Chip — code-named Polaris — where direct connections were made to adjacent cores, with data intended for remote cores passed from core-to-core until reaching its destination. For its 50-core Xeon Phi, however, Intel settled instead on using multiple high-speed rings for data, address, and acknowledgement instead of a mesh.

via MIT Puts 36-Core Internet on a Chip | EE Times.

I commented some time back on a similar article on the same topic. It appears the MIT research group now has working silicon of the design. As mentioned in the pull-quote, the Xeon Phi (which has made some news in the Top 500 supercomputer stories recently) is a massively multicore architecture but uses a different interconnect that Intel designed on its own. Stories like these get filed into the category of massively multicore or low-power CPU developments: most times the CPUs add cores without drawing significantly more power and thus provide a net increase in compute ability. Tilera, Calxeda and yes, even SeaMicro were all working towards those ends. Whether through mergers or cuts in funding, each one seemed to trail off without reaching its original goal (massively multicore, low-power designs). Along the way Intel has also done everything it can to dull and dent the novelty of the new designs by revising an Atom-based or Celeron-based CPU to provide much lower power at the scale of maybe 2 cores per CPU. A quick sketch below shows how a mesh like MIT's moves data from core to core.
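
To make the mesh idea concrete, here is a minimal sketch of dimension-ordered ("XY") routing on a 6x6 grid of cores, the hop-by-hop forwarding style the pull-quote describes. The routing policy is illustrative only; MIT's actual router logic is more sophisticated than this:

# Minimal sketch of XY routing on a 6x6 mesh (36 cores numbered 0..35).
# A packet travels along the X axis until the column matches, then
# along Y, being handed from core to core at each step.

def xy_route(src, dst, width=6):
    """Return the list of (x, y) cores a packet traverses."""
    sx, sy = src % width, src // width
    dx, dy = dst % width, dst // width
    path = [(sx, sy)]
    while sx != dx:                  # move along X first
        sx += 1 if dx > sx else -1
        path.append((sx, sy))
    while sy != dy:                  # then along Y to the target row
        sy += 1 if dy > sy else -1
        path.append((sx, sy))
    return path

hops = xy_route(src=0, dst=35)       # corner to corner on the 6x6 grid
print(len(hops) - 1, "hops:", hops)  # 10 hops in the worst case

The worst case on a 6x6 mesh is 10 hops, corner to corner. That is exactly the latency a shared bus avoids at small core counts, and that a ring design like the Xeon Phi's bounds in a different way.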

Like the chip MIT just announced, Tilera too was originally an MIT research project spun off the university campus. Its principals were the PI and a research associate, if I remember correctly. Now that MIT has the working silicon they’re going to benchmark, test and verify their design. Once they’ve completed their own study, the researchers will release the Verilog hardware description of the chip for anyone to use, research or verify for themselves. It will be interesting to see how much of an incremental improvement this design provides; it could possibly be the launch of another Tilera-style product out of MIT.

AnandTech | Intel SSD DC P3700 Review: The PCIe SSD Transition Begins with NVMe

We don’t see infrequent blips of CPU architecture releases from Intel, we get a regular, 2-year tick-tock cadence. It’s time for Intel’s NSG to be given the resources necessary to do the same. I long for the day when we don’t just see these SSD releases limited to the enterprise and corporate client segments, but spread across all markets – from mobile to consumer PC client and of course up to the enterprise as well.

via AnandTech | Intel SSD DC P3700 Review: The PCIe SSD Transition Begins with NVMe.

Big news in the SSD/Flash memory world at Computex in Taipei, Taiwan. Intel has entered the fray with Samsung and SandForce, issuing a fully NVMe-compliant set of drives running on PCIe cards. Throughputs are amazing, and the prices are surprisingly competitive: you can enter the market for as low as $600 for a 400GB PCIe card running as an NVMe-compliant drive. On Windows Server 2012 R2 and Windows 8.1 you get native support for NVMe drives. This is going to get really interesting, especially considering all the markets and levels of consumers within the market. On the budget side is the SATA Express interface, an attempt to factor out some of the slowness inherent in SSDs attached to SATA bus interfaces. Then there’s M.2, the smaller form factor PCIe-based drive interface being adopted by manufacturers making light and small form factor tablets and laptops. That is a big jump past SATA altogether and carries a speed bump with it, as it communicates directly with the PCIe bus. Last and most impressive of all are the NVMe devices announced by Intel, with yet a further speed bump as they address multiple data lanes on PCI Express. Some concern trolls in the gaming community are quick to point out that those data lanes are being lost to I/O when they are already maxing them out with their 3D graphics boards. The rough bandwidth math below shows why each step up the ladder matters.
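
Here is the back-of-envelope arithmetic for the interface ceilings discussed above. Line rates and encodings are published spec values (SATA Express also has a PCIe 3.0 flavor; I'm using the common 2x PCIe 2.0 case), and real drives land below these numbers:

# Bandwidth ceilings: lanes * transfer rate * encoding efficiency / 8 bits.
GT = 1e9  # transfers per second

interfaces = {
    # name: (lanes, gigatransfers/s per lane, encoding efficiency)
    "SATA III":                   (1, 6.0, 8 / 10),     # 8b/10b encoding
    "SATA Express (2x PCIe 2.0)": (2, 5.0, 8 / 10),     # 8b/10b encoding
    "NVMe / M.2 (4x PCIe 3.0)":   (4, 8.0, 128 / 130),  # the P3700 is PCIe 3.0 x4
}

for name, (lanes, gt_per_lane, eff) in interfaces.items():
    mb_s = lanes * gt_per_lane * GT * eff / 8 / 1e6
    print(f"{name:28s} ~{mb_s:5.0f} MB/s")

That works out to roughly 600 MB/s for SATA III, 1,000 MB/s for SATA Express, and nearly 4,000 MB/s for four lanes of PCIe 3.0, which is the headroom NVMe is built to exploit.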

The route forward, it seems, would be Intel motherboard designs with a PCIe 3.0 interface carrying the equivalent data lanes for two full-speed 16x graphics cards, but devoting that extra 16x lane to I/O instead; or maybe a 1.5x arrangement with one full 16x lane and two more 8x lanes to handle regular I/O plus a dedicated 8x NVMe interface. It’s going to require some re-engineering and BIOS updating, no doubt, to get all the speed out of all the devices simultaneously. That’s why I would also like to remind readers of the Flash-DIMM phenomenon sitting out there on the edges, in the high-speed, high-frequency trading houses in the NYC metro area. We haven’t seen nor heard much since the original product announcement from IBM for the X6-series servers and the options for Flash-DIMMs on that product line. Smart Memory Technology (the prime designer/manufacturer of Flash-DIMMs for SanDisk) has now been bought out by SanDisk, and again, no word on that product line now. The same is true for the Lenovo takeover of IBM’s Intel server product line (of which the X6-series is the jewel in the crown). Mergers and acquisitions have veiled and blunted some of these revolutionary product announcements, but I hope eventually Flash-DIMMs see the light of day, gain full BIOS support and make it into the desktop computer market. As good as NVMe is going forward, I think we also need a mix of Flash-DIMM to see the full speed of the multi-core x86 Intel chips.
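
As a sanity check on that lane arithmetic, here is a hypothetical budget for a high-end desktop CPU of this era, which exposed 40 PCIe 3.0 lanes (mainstream parts had only 16, which is why the trade-off exists at all). The slot allocation is my own illustration, not any shipping board design:

# Hypothetical PCIe lane budget: two full x16 graphics slots plus NVMe.
cpu_lanes = 40  # high-end desktop CPUs of this era; mainstream parts had 16
allocations = {"GPU slot 1": 16, "GPU slot 2": 16, "NVMe SSD": 4}

used = sum(allocations.values())
print(f"used {used} of {cpu_lanes} lanes, {cpu_lanes - used} left for other I/O")
# -> used 36 of 40 lanes, 4 left for other I/O

On a 40-lane part the sums work out; on a 16-lane mainstream part, every lane given to storage comes straight out of the graphics budget, which is what the gamers are complaining about.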

Corning Announces Availability of USB 3.Optical Cables

A TOSLINK fiber optic cable with a clear jacket that has a laser being shone onto one end of the cable. The laser is being shone into the left connector; the light coming out the right connector is from the same laser. (Photo credit: Wikipedia)

Currently available in lengths of 10 meters, Corning will also be releasing USB 3.Optical cables of 15 and 30 meters later this year.  These cables can be purchased online at Amazon and Accu-Tech.

via Corning Announces Availability of USB 3.Optical Cables.

As I’ve had to deal with using webcams stretched across very long distances in classrooms and lecture halls, a 30 meter cable can be a godsend. I’ve used 10 meter long cables with built-in extenders, and even that was a big step up. Here’s hoping prices eventually come down to a reasonable level, say below $100. I’m impressed the power can run across the same cable as the optical fiber. I assume both ends are electrical-optical converters, meaning they need to be powered. Compared to CAT-5 cables with extenders it seems pretty lightweight: no need for outlets to power the extenders on both ends.

Of course, CAT-5 based extenders are still very price competitive and come in so many formats that USB 3.0 over optical will probably only win out in the 30 meter range. And cable runs over CAT5 can reach 50 to 100 meters for data running over TCP/IP on network switches. So CAT-5 with extenders converting to USB will still have the cost and performance advantage for some time to come. The quick comparison below lays out the reach of each option.
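
Here's a minimal side-by-side of the options mentioned. The Corning lengths come from the article; the passive-copper USB 3.0 limit and the CAT5 run lengths are the usual rule-of-thumb figures rather than hard spec limits:

# Reach vs. signaling rate for the long-run options discussed above.
options = [
    # (technology, max practical run in meters, nominal signaling rate)
    ("USB 3.0 passive copper",             3,   "5 Gb/s"),
    ("USB 3.0 w/ active extender cable",   10,  "5 Gb/s"),
    ("Corning USB 3.Optical",              30,  "5 Gb/s"),
    ("CAT5 + USB-over-Ethernet extender",  100, "varies by extender"),
]

for tech, meters, rate in options:
    print(f"{tech:36s} up to {meters:3d} m  ({rate})")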


Battery vendors push ultracapacitor wrappers to give Li-ions more bite • The Register

Lithium ion battery by Varta (Museum Autovision Altlußheim, Germany) (Photo credit: Wikipedia)

A pair of battery vendors are hoping that a new design which incorporates the use of an ultracapacitor material will help to improve and extend the life of lithium-ion battery packs.

via Battery vendors push ultracapacitor wrappers to give Li-ions more bite • The Register.

First, a little background on what an ultracapacitor is: https://en.wikipedia.org/wiki/Ultracapacitor#History

In short, it’s like a very powerful, high-density reservoir for smoothing out the “load” of an electrical circuit: it helps prevent spikes and dips in the electricity as it flows through a device. But with recent work done on ultracapacitors, they can act more like a full-fledged battery that doesn’t lose its charge over time. When they are combined with a real live battery, you can do some pretty interesting things to help the capacitor and the battery work together, allowing longer battery life and higher total charge capacity. Many things can flow from combining ultracapacitors with a really high-end Lithium ion battery.
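
Some worked numbers for the buffering idea: energy stored in a capacitor is E = 1/2 · C · V². The 3000 F / 2.7 V cell below is a typical large commercial supercap, and the Li-ion comparison figure is a ballpark of my own, not from the article:

# Energy stored in a large supercapacitor cell: E = 0.5 * C * V^2
C, V = 3000.0, 2.7           # farads, volts (typical big commercial supercap)
joules = 0.5 * C * V**2      # = 10,935 J
wh = joules / 3600.0         # ~3.0 Wh

print(f"supercap cell: {joules:.0f} J = {wh:.1f} Wh")
# A single 18650 Li-ion cell stores roughly 9-12 Wh, so the battery still
# carries the bulk energy; the capacitor's job is sourcing and sinking the
# brief current spikes that stress and age the battery.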

Any technology, tweak or improvement that promises at minimum a 10% improvement over current Lithium ion battery designs is worth a look, and they’re claiming a full 15% in this story from The Reg. Due to the redesign, it would seem it needs to meet regulatory/safety approval as well. Having seen JAL suffer battery issues on the Boeing 787, I couldn’t agree more.

There will be some heavy lifting to be done between now and when a product like this hits the market. Testing and failure analysis will ultimately decide whether or not this ultracapacitor/Lithium-ion hybrid is safe enough to use in consumer electronics. I’m also hoping Apple and other manufacturer/design outfits are putting some eyes, ears and phone calls on this to learn more. Samsung too might be interested, but it seems more reliant on battery designs from outside the company. That’s where Apple has the upper hand long term: it will design every part if needed in order to keep ahead of the competition.


UW Researchers Create World’s Thinnest LED | EE Times

Boron nitride (Photo credit: Wikipedia)


The researchers harvested single sheets of tungsten selenide (WSe2) using adhesive tape, a technique invented for the production of graphene. They used a support and dielectric layer of boron nitride on a base of silicon dioxide on silicon, to come up with the thinnest possible LED.

via UW Researchers Create World’s Thinnest LED | EE Times.


Wow, it seems like the current research in graphene has spawned at least one other possible application: using adhesive tape to create thin layers of homogeneous materials. This time it’s tungsten selenide, a semiconducting crystal with possible applications in thin/flexible LED displays. As the article says, until now Organic LED (OLED) has been the material of choice for thin and even flexible displays. It’s also reassuring that MIT was able to publish some similar work in the same edition of Nature. Hopefully this will spur some other researchers to put some money and people on pushing this further.
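
A quick sketch of why the material choice matters for an LED: the emission color follows the bandgap via lambda = h·c / E. The ~1.65 eV monolayer WSe2 gap below is a commonly reported literature value, not a number from this article:

# Approximate emission wavelength of a direct-gap emitter from its bandgap.
HC_EV_NM = 1239.84  # h*c expressed in eV*nm

def emission_nm(bandgap_ev):
    """lambda = h*c / E, in nanometers."""
    return HC_EV_NM / bandgap_ev

print(f"monolayer WSe2 (~1.65 eV): ~{emission_nm(1.65):.0f} nm (deep red / near-IR)")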


With all early announcements like this in a fully vetted, edited science journal, we won’t see the products derived from this new technology very soon. However, hope springs eternal for me, and I know that just like with OLED, if this can be further researched and found to be superior in cost/performance, it will eventually compete in the marketplace.

I will say the fabrication steps the researchers used are pretty novel and show some amount of creativity, quickly producing a workable thin film without inordinately expensive fabrication equipment. I’m thinking specifically of the epitaxial electron-beam devices folks have used for nano-material research. Like a 3D printer for atoms, these devices are a must-have for many electronics engineering and materials researchers, and they are notoriously slow (just like 3D printers) and expensive per finished job (also like 3D printers).

The graphene approach to manufacturing devices for research started with making strands of graphite filaments by firing a laser at a highly purified block of carbon until, after so many shots, a shard of a graphene sheet might eventually show up. Using adhesive tape to “shear” a very pure layer of graphite into a graphene sheet, that was the lightning bolt: simple adhesive tape could get a sufficiently homogeneous and workable layer of graphene to do real work. I feel there’s a similar affinity at work here for the researchers who used the same technique to make their tungsten selenide thin films for their thin LEDs.


Adhesive tape (Photo credit: Wikipedia)



Virtual Reality | Oculus Rift – Consumer Reports

Oculus Intel (Photo credit: .michael.newman.)

Imagine being able to immerse yourself in another world, without the limitations of a TV or movie screen. Virtual reality has been a dream for years, but judging by current trends, it may not be just a dream for much longer.

via Virtual Reality | Oculus Rift – Consumer Reports.

I won’t claim that when a technology gets written up in Consumer Reports it has “jumped the shark”, no. Instead I would rather give Consumer Reports kudos for keeping tabs on the others writing up and lauding the Oculus Rift VR headset. The specifications of this device continue to improve even before it hits the market. Hopes are still high that the price will be reasonable (really, it needs to cost no more than a bottom-of-the-line iPad if there’s any hope of it taking off). Whether the price meets everyone’s expectations depends heavily on the sources for the materials going into the headset, and the single most expensive item is the display.

OLED (Organic LED) has been used in mobile phones to great effect: the displays use less power and have somewhat brighter color than backlit LCD panels. But they cost more, and the bigger the display, the higher the cost. The developers of the Oculus Rift have now pressed the cost maybe a little higher by choosing a very high refresh rate and low latency for the OLED screens in the headset. This came after the first wave of user feedback indicated too much lag and subsequent headaches due to the screen not keeping up with head movements (a classic downfall of most VR headsets, no matter the display technology). However, Oculus has continued to work on the lag in the current-generation headset, and by all accounts it’s nearly ready for public consumption. They might truly have fixed the lag issue; most beta testers to date are complimenting the changes in the hardware. This might be the device that launches a thousand 3D headsets. The quick latency arithmetic below shows why the refresh rate matters so much.
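
Here is the rough motion-to-photon budget that makes high refresh rates non-negotiable. The 75 Hz figure matches the DK2-era OLED panels, and the ~20 ms comfort target is the number usually quoted in VR latency discussions, not something from the Consumer Reports piece:

# Rough motion-to-photon latency budget for a VR headset.
refresh_hz = 75
frame_ms = 1000.0 / refresh_hz   # ~13.3 ms just to scan out one frame

budget_ms = 20.0                 # commonly cited comfort threshold
remaining = budget_ms - frame_ms
print(f"frame time: {frame_ms:.1f} ms, leaves ~{remaining:.1f} ms "
      "for sensors, game logic, and rendering")

At 75 Hz, display scan-out alone eats two thirds of the budget, leaving under 7 ms for everything else; at 60 Hz the numbers simply don't close, which is why users got headaches.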

As 3D goes, the market and appeal may be very limited; that has historically been the case. Whether it was used in academia for data visualization or in the military for simulation, 3D Virtual Reality was an expensive niche catering to people with lots of money to spend. Because the Oculus Rift is targeted at a lower price range, but with fantastic visual performance, who knows what market may follow its actual release. So as everyone is whipped up into a frenzy over the final release of the Oculus Rift VR headset, keep an eye out for this. It’s going to be a hot item in limited supply for a while, I would bet. And yes, I do think I would love to try one out myself, not just for gaming but for any of the as-yet-unseen applications it might have (like the next Windows OS or Mac OS?)
