
Tuesday, 23 December 2014

New drug for Ebola may originate from Bangalore, India


India may well be the place where a drug to fight the killer Ebola virus originates: two drug candidates discovered in Bangalore are among the first 20 to be evaluated in the UK for efficacy against the deadly virus. Efforts are under way the world over, including in the UK, to create compounds and molecules that can fight Ebola, and Bangalore’s contribution makes India the only developing nation so far to have done serious work in the area.
The two novel compounds, or drug candidates, have been developed by Dr Jayanta Haldar of the Jawaharlal Nehru Centre for Advanced Scientific Research (JNCASR) and two of his students, Chandra Dhish Ghosh and Mohini Mohan Konai. Haldar has been working on drugs that can fight drug-resistant viruses, bacteria and other parasites.
“The two Indian drug candidates which we’ve codenamed NCK-8 and D-LANA-14 are lead candidates from two classes of peptide mimics (a molecule global researchers have created to fight the virus) with high activity against a range of multi-drug resistant bacteria and malarial parasites, including clinical isolates,” said Haldar.
Haldar would not disclose the exact materials used in NCK-8 and D-LANA-14, given that they are “novel discoveries.” “They are made in three steps with easily available and inexpensive starting materials which do not require any difficult conditions for synthesis,” he said.
“Currently there is a lot of interest in peptide mimics. We hope to work with Public Health England (PHE) over time to evaluate and develop all our anti-microbial mimics, including these two lead candidates that have entered the first round of screening against Ebola,” said Haldar.
“The in-vitro tests will begin mid-January next year. We’ll need about 2.5-3 months to see how it is working, following which the in-vivo tests will begin, which will be followed by clinical studies and talks with pharmaceutical companies,” said Haldar.
Supported by an award from the Wellcome Trust, Public Health England (PHE) scientists are evaluating potential treatment options in their high-containment laboratories at Porton Down, Wiltshire, to determine the most viable candidates for further development.
“We have a track record of scientific innovation and development, and this funding from Wellcome Trust will allow us to utilise our experience and expertise to assist in the fight against Ebola,” said Dr Seshadri Vasan, Senior Business Development Manager, PHE who is one of the co-investigators of the project.
Dr Jayanta Haldar seems to have made it a full-time job to fight drug-resistant viruses, bacteria and parasites, among other agents that spread infections.

Chemical camouflage helps fish hide from predators


A reef fish can hide from predators by adopting the smell of the coral it eats, according to researchers in Australia. This is the first time that chemical crypsis – the ability of an organism to avoid detection by using odour-based camouflage – has been observed in vertebrates.
It has been known for centuries that animals can visually blend into their environment to hide from predators. But given that vertebrates rely on more than just their sight to find prey, researchers have questioned whether other types of camouflage exist.
“With the importance of smell for a lot of animals, it makes sense to be chemically camouflaged but there’s very little evidence for it,” said study author Rohan Brooker, formerly of James Cook University, Australia.
He goes on to explain that until now caterpillars were the only creatures known to hide from predators in this way. The caterpillars eat their plant habitat, take on its smell by absorbing certain chemicals in the plant, and become invisible to predatory ants. “A [coral-eating fish and coral] system is analogous to the chemical system of the caterpillar. So we thought maybe they were doing a similar sort of thing,” explained Brooker.
Brooker and his colleagues looked at the harlequin filefish, a reef-dweller that looks like a coral branch. They wanted to find out if the fish could replicate a reef’s smell through its coral-based diet and whether predators could sniff them out.
The team placed a piece of coral and the filefish at opposite ends of a water tank. A coral-dwelling crab was placed in the centre of the tank and ‘blind-folded’ so it was unable to see which end the coral was at and had to rely on other senses such as smell. The team found that the crab was just as likely to move towards the fish as the coral. “[It] suggests that the smell was a pretty good match. A lot of them did get confused,” said Brooker.
The group carried out a similar test but replaced the crab with cod, a filefish predator, to see if the coral fish could fool the cod’s acute sense of smell. They found that the cod were less interested in the filefish if they were close to coral they had fed on. If the filefish was close to a coral that was a different species to the one it had eaten, or if there was no coral in the tank, the cod became much more active. The team state this is the first evidence of chemical camouflage in a vertebrate, but they do not yet know how the fish achieves it.
Martin Stevens, an ecologist at the University of Exeter, UK, believes the study could have a wide impact on the wildlife community. “I think it could be very important in stimulating work looking at chemical camouflage,” he told Chemistry World. But Stevens adds that more research will need to be carried out on this system before it can be extended to other vertebrates, including mammals. “It’s potentially a really exciting and important study. The question, I think next, is how does [this chemical camouflage] work and how widespread is it,” added Stevens.

Scientists develop alternative to biologics with fewer side effects


Promising treatments known as biologics are on the market and under development for many serious illnesses such as cancer, but some of them come with high risks, even lethal ones. Now scientists have produced a novel class of molecules that could be as effective but without the dangerous side effects. They reported their work on these compounds, which they tested on prostate cancer cells, in ACS’ Journal of the American Chemical Society.
Researcher David A Spiegel and colleagues explained that biologics are protein-based therapies that have revolutionized cancer treatment over the past decade. These compounds work by latching onto malignant cells and then triggering the immune system to destroy them - an approach known as immunotherapy. More than 400 kinds are currently undergoing testing in clinical trials. Although they’re very effective at clearing out cancer cells, biologics have serious drawbacks - including potentially fatal allergic reactions - that are mainly due to their relatively large size. Spiegel’s team wanted to develop an alternative that would be just as effective but without the risks.
The researchers produced a set of molecules that they call synthetic antibody mimics, or SyAMs. These molecules act like biologics by sparking an immune response but are far smaller. In lab tests, a subgroup called SyAM-Ps worked well against prostate cancer cells. Because of their small size, the researchers suggest that SyAMs could avoid many of the pitfalls that have plagued biologics. The compounds could represent an entirely new direction in immunotherapy for treating cancer and other diseases, the researchers concluded.

Laser light on nanoparticle photocells reveals how sunlight is converted into electricity


Four pulses of laser light on nanoparticle photocells in a University of Oregon spectroscopy experiment have opened a window on how captured sunlight can be converted into electricity. The work, which potentially could inspire devices with improved efficiency in solar energy conversion, was performed on photocells that used lead-sulfide quantum dots as the photoactive semiconductor material. The research is detailed in a paper placed online by the journal Nature Communications.
In the process studied, each single photon, or particle of sunlight, that is absorbed potentially creates multiple packets of energy called excitons. These packets can subsequently generate multiple free electrons that generate electricity in a process known as multiple exciton generation (MEG). In most solar cells, each absorbed photon creates just one potential free electron.
Multiple exciton generation is of interest because it can lead to solar cells that generate more electrical current, making them more efficient. The UO work shines new light on the little-understood process of MEG in nanomaterials.
While the potential importance of MEG in solar energy conversion is under debate by scientists, the UO spectroscopy experiment - adapted in a collaboration with scientists at Sweden’s Lund University - should be useful for studying many other processes in photovoltaic nanomaterials, said Andrew H Marcus, professor of physical chemistry and head of the UO Department of Chemistry and Biochemistry.
Spectroscopic experiments previously designed by Marcus to perform two-dimensional fluorescence spectroscopy of biological molecules were adapted to also measure photocurrent. “Spectroscopy is all about light and molecules and what they do together. It is a really great probe that helps to tell us about the reaction pathway that connects the beginning of a chemical or physical process to its end,” said Marcus.
“The approach is similar to looking at how molecules come together in DNA, but instead we looked at interactions within semiconductor materials. Our method made it possible to look at electronic pathways involved in creating multiple excitons. The existence of this phenomenon had only been inferred through indirect evidence. We believe we have seen the initial steps that lead to MEG-mediated photo conductivity,” said Marcus, an affiliate in UO’s Institute of Molecular Biology, Materials Science Institute and Oregon Center for Optics.
The controlled sequencing of laser pulses allowed the seven-member research team to see - in femtoseconds (a femtosecond is one millionth of one billionth of a second) - the arrival of light, its interaction with resting electrons and the subsequent conversion into multiple excitons. The combined use of photocurrent and fluorescence two-dimensional spectroscopy, Marcus said, provided complementary information about the reaction pathway.
UO co-author Mark C Lonergan, professor of physical and materials chemistry, who studies electrical and electrochemical phenomena in solid-state systems, likened the processes being observed to people moving through a corn maze that has one entrance and three exits.
People entering the maze are photons. Those who exit quickly represent absorbed photons that generate unusable heat. People leaving the second exit represent other absorbed photons that generate fluorescence but not usable free electrons. People leaving the final exit signify usable electrical current.
“The question we are interested in is exactly what does the maze look like. The problem is we don’t have good techniques to look inside the maze to discover the possible pathways through it. The techniques that Andy has developed basically allow us to see into the maze by encoding what is coming out of the system in terms of exactly what is going in. We can visualize what is going on, whether two people coming into the maze shook hands at some point and details about the pathway that led them to come out the electricity exit,” said Lonergan.
The project began when Tonu Pullerits, who studies ultrafast photochemistry in semiconductor molecular materials at Lund University, approached Marcus about adapting his spectroscopic system to look at solar materials. Khadga J Karki, a postdoctoral researcher in Pullerits’ lab, then visited the UO and teamed with the Marcus and Lonergan groups to reconfigure the equipment.
UO doctoral student Julia R Widom was a co-leading author on the paper. Other co-authors with Pullerits, Marcus and Lonergan were Joachim Seibt of Lund University and UO graduate student Ian Moody.

Graphene oxide paper can improve rechargeable batteries, finds new study


A Kansas State University engineering team has discovered some of graphene oxide’s important properties that can improve sodium- and lithium-ion flexible batteries. Gurpreet Singh, Assistant Professor of mechanical and nuclear engineering, and Lamuel David, a doctoral student in mechanical engineering from India, published their findings in the Journal of Physical Chemistry in the article “Reduced graphene oxide paper electrode: Opposing effect of thermal annealing on Li and Na cyclability.”
Graphene oxide is an insulating and defective version of graphene that can be converted to a conductor or a semiconductor when it is heated. Singh and his team studied graphene oxide sheets as flexible paper electrodes for sodium- and lithium-ion batteries.
The researchers found that the sodium storage capacity of the paper electrodes depends on the distance between the individual layers, which can be tuned by heating the material in argon or ammonia gas. For example, reduced graphene oxide sheets, or rGO, produced at high temperature have near-zero sodium capacity, while reduced graphene oxide sheets produced at 500 degrees C have the maximum capacity.
“The observation is important because graphite, which is a precursor for making graphene oxide, has negligible capacity for sodium and has long been ruled out as a viable electrode for sodium batteries. Graphite is the material of choice in current lithium-ion batteries because the interlayer spacing is just right for the smaller lithium ions to diffuse in and out,” said Singh.
The researchers are the first to show that a flexible paper composed entirely of graphene oxide sheets can charge and discharge with sodium-ions for more than 1,000 cycles. Sodium perchlorate salt dissolved in ethylene carbonate served as the electrolyte in their cells.
“Most electrode materials that work for lithium cannot last for more than a few tens of charge and discharge cycles with sodium because sodium is much larger than lithium and causes enormous volume changes and damage to the host material. This design is unique because the distance between individual graphene layers is large enough to allow fast insertion and extraction of the sodium ions, thanks to the oxygen and hydrogen atoms that prevent sheets from restacking,” said Singh.
Singh and his team also studied the mechanical behavior of the electrodes made of reduced graphene oxide sheets. The researchers measured the strain required to tear apart the electrodes. Through videography, they showed the ability of the crumpled graphene oxide papers to sustain large strains before failing.
“Such measurements and study of failure mechanisms are important for designing long-life batteries because you want the electrode to be able to expand and contract repeatedly without fracture for thousands of cycles, especially for larger nonlithium metal-ion batteries. These days, almost everyone is using crumpled graphene as either the conducting agent or elastic support or both,” said Singh.
Earlier this year, Singh and his team demonstrated large-scale synthesis of few-layer-thick sheets of molybdenum disulfide. They also showed the molybdenum disulfide/graphene composite paper has potential as a high-capacity electrode for sodium-ion battery. In that research, the scientists used graphene as an electron conductor for the molybdenum disulfide sheets and observed graphene to be largely inactive toward sodium.
Their latest research has shown that, unlike the sodium capacity, the lithium capacity of rGO increases with increasing rGO synthesis temperature, reaching a maximum for the sample produced at 900 degrees C.
“It is only now we realize that sodium capacity of graphene, or rGO, is dependent on its processing temperature. The rGO specimens in our previous study were prepared at 900 degrees C,” said Singh.
Singh said that research into sodium and other nonlithium batteries is important for several reasons. As the focus shifts toward stationary energy storage systems and larger vehicles, batteries need to be cheaper, safe and environmentally benign. Because of its abundance, sodium is a potential candidate for replacing lithium in such batteries.
By focusing on nanotechnology, Singh and his team were able to explore and design materials that can store sodium-ions reversibly and without damage. They found their answer in graphene oxide, which can cycle sodium-ions for more than 1,000 cycles.
Singh and his team will continue exploring new nanomaterials and focus on materials that can be mass-produced in a cost-effective manner.
“We would like to perform fundamental studies to understand the origins of first cycle loss, voltage hysteresis, and capacity degradation that are common to metal-ion battery anodes prepared from 2-D layered crystals such as transition metal chalcogenides, graphene, etc.,” said Singh.
The researchers are also looking at other nanomaterials that have been ruled out as battery electrodes, such as boron nitride sheets and silicon-nitrogen based ceramics.

New carbon-trapping ‘sponges’ could cut greenhouse gases


In the fight against global warming, carbon capture – chemically trapping carbon dioxide before it is released into the atmosphere – is gaining momentum, but standard methods are plagued by toxicity, corrosiveness and inefficiency. Using a bag of chemistry tricks, Cornell materials scientists have invented low-toxicity, highly effective carbon-trapping “sponges” that could lead to increased use of the technology.
A research team led by Emmanuel Giannelis, the Walter R. Read Professor of Engineering in the Department of Materials Science and Engineering, has invented a powder that performs as well or better than industry benchmarks for carbon capture. A paper with their results, co-authored by postdoctoral associates Genggeng Qi and Liling Fu, appeared in Nature Communications.
Used in natural gas and coal-burning plants, the most common carbon capture method today is called amine scrubbing, in which post-combustion, carbon dioxide-containing flue gas passes through liquid vats of amino compounds, or amines, which absorb most of the carbon dioxide. The carbon-rich gas is then pumped away – sequestered – or reused. The amine solution is extremely corrosive and requires capital-intensive containment.
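For context, the basic chemistry behind amine scrubbing (a textbook reaction, not spelled out in the article) is that carbon dioxide reacts reversibly with a primary amine such as monoethanolamine to form a carbamate salt, which releases concentrated CO2 again when the solution is heated:

    CO2 + 2 RNH2  ⇌  RNHCOO− + RNH3+     (carbamate anion plus protonated amine)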
The researchers have been working on a better, safer carbon-capture method since about 2008, and they have gone through several iterations. Their latest consists of a silica scaffold, the sorbent support, with nanoscale pores for maximum surface area. They dip the scaffold into liquid amine, which soaks into the support like a sponge and partially hardens. The finished product is a stable, dry white powder that captures carbon dioxide even in the presence of moisture.
Solid amine sorbents are used in carbon capture, Giannelis said, but the supports are usually only physically impregnated with the amines. Over time some of the amine is lost, decreasing effectiveness and increasing cost. The researchers instead grew their amine onto the sorbent surface, which causes the amine to chemically bond to the sorbents, meaning very little amine loss over time.
Qi said the next steps are to optimize the sorbent and to eventually demonstrate it for industry, possibly at Cornell for retrofitting its power plant. He also said the technology could be used on a smaller scale – for example, in greenhouses, where the captured carbon dioxide can be used to enhance plant growth. KyuJung Whang, Cornell’s vice president for facilities services, heard a presentation by Giannelis on the topic at a board of trustees meeting earlier this year.
“We have made great strides in sustainability, particularly in the energy supply areas of alternative energy sources, and the demand side areas of energy conservation and building design standards. If we are truly to achieve neutrality, though, we also have to consider capturing and offsetting carbon. Emmanuel’s presentation got my attention, and I was hoping to learn more about it and explore ways we might be able to work together,” said Whang.

Turning hydrogen into “graphene”


New work from Carnegie’s Ivan Naumov and Russell Hemley delves into the chemistry underlying some surprising recent observations about hydrogen, and reveals remarkable parallels between hydrogen and graphene under extreme pressures. Their work is the cover story in the December issue of Accounts of Chemical Research.
Hydrogen is the most-abundant element in the cosmos. With only a single electron per atom, it is deceptively simple. As a result, hydrogen has been a testing ground for theories of the chemical bond since the birth of quantum mechanics a century ago. Understanding the nature of chemical bonding in extreme environments is crucial for expanding our understanding of matter over the broad range of conditions found in the universe.
Observing hydrogen’s behavior under very high pressures has been a great challenge for researchers. But recently teams have been able to observe that, at pressures of 2 to 3.5 million times normal atmospheric pressure, it transforms into an unexpected structure consisting of layered sheets, rather than a close-packed metal as had been predicted many years ago.
These hydrogen sheets resemble the carbon compound graphene. Graphene’s layers are each constructed of a honeycomb structure made of six-atom carbon rings. This conventional carbon graphene, first synthesized about a decade ago, is very light, but incredibly strong, and conducts heat and electricity very efficiently. These properties promise revolutionary technology, including advanced optical electronics for screens, high-functioning photovoltaic cells, and enhanced batteries and other energy storage devices.
The new work from Naumov and Hemley shows that the stability of the unusual hydrogen structure arises from the intrinsic stability of its hydrogen rings. These rings form because of so-called aromaticity, which is well understood in carbon-containing molecules such as benzene, as well as in graphene. Aromatic structures take on a ring-like shape that can be thought of as alternating single and double bonded carbons. But what actually happens is that the electrons that make up these theoretically alternating bonds become delocalized and float in a shared circle around the inside of the ring, increasing stability.
Naumov and Hemley’s study also indicates that hydrogen initially becomes a dark, poorly conducting metal like graphite, instead of a conventional shiny metal and good conductor, as was originally suggested in theoretical calculations going back to the 1930s using early quantum mechanical models for solids.
Though the discovery of this layered sheet character of dense hydrogen has come as a surprise to many, chemists 30 years ago - before the discovery of graphene - predicted the structure based on simple chemical considerations. Their work is validated and extended by the new findings.
“Overall, our results indicate that chemical bonding occurs over a much broader range of conditions than people had previously considered. However, the structural effects of that chemical bonding under extreme conditions can be very different than that observed under the ordinary conditions that are familiar to us,” said Hemley.

Research towards making 3-D printing thermoplastic from squids


A research team at Penn State is using squid to make a thermoplastic that can be used in 3-D printing. “Most of the companies looking into this type of material have focused on synthetic plastics. Synthetic plastics are not rapidly deployable for field applications, and more importantly, they are not eco-friendly,” said Melik C Demirel, Professor of engineering science and mechanics, Penn State University.
Demirel and his team looked at the protein complex that exists in the squid ring teeth (SRT). The naturally made material is a thermoplastic, but obtaining it requires a large amount of effort and many squid. “We have the genetic sequence for six squid collected around the world, but we started with the European common squid,” said Demirel, who with his team collected the cephalopods.
The researchers looked at the genetic sequence for the protein complex molecule and tried synthesizing a variety of proteins from the complex. Some were not thermoplastics, but others showed a stable thermal response; the smallest known molecular-weight SRT protein, for example, was a thermoplastic. The results of their work were published in the current issue of Advanced Functional Materials and are featured on its cover.
Most plastics are currently manufactured from fossil fuel sources like crude oil. Some high-end plastics are made from synthetic oils. Thermoplastics are polymer materials that can melt, be formed and then solidify as the same material without degrading materials properties.
This particular material can be processed as a conventional thermoplastic – heated and extruded or molded – or it can be dissolved in a simple solvent such as acetic acid and used in film casting. It can also be used in 3-D printing machines as the source material to create complicated geometric structures.
To manufacture this small, synthetic SRT molecule, the researchers used recombinant techniques. They inserted SRT protein genes into E. coli so that this common, harmless bacterium could produce the plastic molecules as part of its normal activity; the thermoplastic was then recovered from the medium in which the E. coli grew. Wayne Curtis, professor of chemical engineering, and Demirel, together with their students, worked on this aspect of the project.
“The next generation of materials will be governed by molecular composition -- sequence, structure and properties,” said Demirel.
The thermoplastic the researchers created is semi-crystalline and can be rigid or soft. It has a very high tensile strength and is a wet adhesive; it will stick to things even if it is wet. This thermoplastic protein has a variety of tunable properties, which can be adjusted to individual requirements of manufacturing. Because it is a protein, it can be used for medical or cosmetic applications.
“Direct extraction or recombinant expression of protein based thermoplastics opens up new avenues for materials fabrication and synthesis, which will eventually be competitive with the high-end synthetic oil based plastics,” reported the researchers.

Bacterial proteins transform iron and other minerals for energy, growth


Scientists review decades of work into bacterial proteins that transform iron and other minerals for energy and growth. Cleaning up polluted soil and growing crops for biofuels benefit from a deeper understanding of how microbes alter subsurface minerals. Scientists at Pacific Northwest National Laboratory, the University of East Anglia, and University College London assess the state of understanding of a key enzymatic pathway employed by bacteria in these transformations: chains of proteins called multi-heme cytochromes. The proteins perform a variety of tasks, primarily acting as electron conduits, and take multiple forms. The review, which focuses on the microbe Shewanella oneidensis, appears in the Journal of the Royal Society Interface. The article covers more than 150 studies of these proteins, spanning more than three decades.
“The proteins participate in electron transfer reactions that contribute to biogeochemical cycling of nitrogen, sulfur, and iron on the global scale. The properties of multi-heme cytochromes have attracted multidisciplinary interest and contribute beyond environmental sciences to advances in bioenergy and bioelectronic devices,” said Dr Kevin Rosso, Geochemist, PNNL.
Called the subsurface environment by scientists, the soil, rocks, and water that stretch far beneath our feet are far from simple or static. Yet, scientists lack detailed knowledge of the complex processes that can cause local and regional fluxes to drive nutrients away from farms, or spread or immobilize uranium from nuclear waste sites. The review provides a one-stop shop of information on a key part of the complex, dynamic processes and highlights opportunities for additional research.
Another twist is that, one day, multi-heme cytochrome reactions could lead to bio-based batteries. When the microbes use iron, much like humans use oxygen, to burn fuel for energy, the associated flow of electrons can be captured and delivered to targets, such as electrodes in fuel cells. Lead researcher Professor Julea Butt at the University of East Anglia said, “these bacteria can generate electricity for us in the right environment.”
In their 27-page review article, the four researchers discuss the structures, properties, and functions of the multi-heme cytochromes, especially how they channel their electrons. “This is an exciting advance in our understanding of how some bacterial species move electrons from the inside to the outside of a cell and helps us understand their behavior as robust electron transfer modules,” said Butt.
While scientists have answered numerous questions about multi-heme cytochromes, just as many unanswered questions remain. “If we can unravel the relationships between encoding the protein and how it functions as an electron conduit, then we can start to think about customizing aspects of design for device purposes,” said Rosso.

New hybrid sodium ion capacitor from peanut shells


Scientists in Canada have created a hybrid sodium ion capacitor (NIC) from peanut shells in a pioneering study bridging the gap between conventional ion batteries and supercapacitors. A hybrid ion capacitor is capable of storing charge both electrostatically and electrochemically, providing an intermediate in terms of energy and power between traditional batteries and supercapacitors.
“In conventional batteries the cathode often limits performance and so what people are starting to do is swap regular cathodes for supercapacitor cathodes. Ions are adsorbed onto the surface of the cathode in an NIC, which avoids the degradation seen in batteries due to ion absorption into the bulk,” explained David Mitlin, University of Alberta, who led the research. These cathodes can drastically improve the cycling life of such devices. A high surface area cathode material is therefore crucial for achieving energy–power performance to rival other state-of-the-art energy storage devices.
According to Mitlin, peanut shells are easy to source, cheap and have limited commercial use, mostly ending up in landfill sites. However, the shells were hardly chosen at random – the team recognised important structural characteristics of both the inner and outer peanut shells to give desirable anode and cathode materials, respectively. The homogenous inner portion of the shell, primarily consisting of highly cross-linked polymer lignin, lent itself to the fabrication of inter-dilated graphene layers perfect for intercalating large sodium ions as an efficient anode. The cathode, a high surface area graphene-like material, was synthesised from the cellulose-rich heterogeneous outer peanut casing.
This careful precursor selection process was underscored by the poor device performance when the whole peanut shell was used to synthesise both electrode materials. The optimised system, however, achieved 88 per cent capacity retention after 100,000 cycles at 51.2 A/g, a performance that is competitive with lithium ion capacitors.
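As a quick sanity check on what that figure implies (a back-of-the-envelope calculation, not from the paper), 88 per cent retention after 100,000 cycles corresponds to an average loss of only about 0.0001 per cent of capacity per cycle:

    # Average per-cycle retention implied by "88% after 100,000 cycles".
    cycles = 100_000
    final_retention = 0.88
    per_cycle = final_retention ** (1 / cycles)
    print(f"Average retention per cycle: {per_cycle:.7f}")   # ~0.9999987
    print(f"Average loss per cycle: {1 - per_cycle:.2e}")    # ~1.3e-06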
This was surely no easy task as sodium has proven to be notoriously difficult to incorporate into such energy storage devices, relative to lithium, due to its larger ionic radius. Sodium, however, is cheaper and easier to get hold of. Mitlin admits there were difficulties along the way: ‘few people have actually done it, but this was also a challenge as there was limited literature to refer back to.’
Materials experts have praised the work. Yuping Wu from Fudan University in China was impressed by the excellent cycling stability of the electrodes: ‘These data show that this NIC can be a promising choice for applications.’ Chengdu Liang, of Oak Ridge National Laboratory in the US, admires the project but recognises that more investigation is necessary: ‘This research exemplifies the versatility of using biomaterials as the feedstock for energy storage devices. However, every aspect is still under scrutiny, so from laboratory discoveries to real-world applications, there is a long way to go.’

Honeybee hives could hold potential hair loss therapy


Hair loss can be devastating for the millions of men and women who experience it. Now scientists are reporting that a substance from honeybee hives might contain clues for developing a potential new therapy. They found that the material, called propolis, encouraged hair growth in mice. The study appears in ACS’ Journal of Agricultural and Food Chemistry.
Researcher Ken Kobayashi and colleagues noted that propolis is a resin-like material that honeybees use to seal small gaps in their hives. Not only does it work as a physical barrier, but it also contains active compounds that fight fungal and bacterial invasions. People have noticed propolis’ special properties since ancient times and used it to treat tumors, inflammation and wounds. More recently, research has shown that the substance promotes the growth of certain cells involved in hair growth, though no one had yet tested whether that in turn would result in new locks. Kobayashi’s team wanted to find out.
When the researchers tested propolis on mice that had been shaved or waxed, the mice that received the treatment regrew their fur faster than those that didn’t. The scientists also noticed that after the topical application, the number of special cells involved in the process of growing hair increased. Although they tried the material on mice that could grow fur rather than balding mice, the researchers noted that hair loss conditions often result from abnormal inflammation. Propolis contains anti-inflammatory compounds, so they expect it could help treat balding conditions. They added that further testing is needed to see if the beehive material affects human hair follicles.

New method to detect horse meat


UK chemists have developed a method to distinguish horse meat from beef using a benchtop NMR machine. The new test is quicker, cheaper and simpler than the current gold standard used by food safety bodies.
In early 2013, the discovery that some beef burgers contained horse meat caused a Europe-wide meat authenticity crisis. Researchers have since been looking for meat purity tests that can replace standard DNA tests, which are accurate but expensive, slow and don’t give reliable quantitative results.
Now a group of researchers from Norwich and Oxford, UK, has developed a method to distinguish beef from horse meat in only 10 minutes. The team analysed the levels of triglycerides in both horse and beef extracts using a 60MHz benchtop NMR machine. In two labs, the group tested 117 fresh and frozen meat samples, only misidentifying a single one. They now hope to extend their method to spot horse meat in processed foods that are supposed to only have beef in them.


Artificial skin developed using traditional electronics


An artificial skin that wraps around a prosthetic hand and senses touch and warmth has been developed using traditional electronics. The flexible sensor array was demonstrated in a rat model, where signals were transmitted to and sensed in the brain.
Many robots and prosthetic limbs have been created, but their skins cannot sense their environment. This new stretchable prosthetic skin comes equipped with ultra-thin, single crystalline silicon nanoribbon sensors for strain, pressure and temperature, as well as humidity sensors, heaters and stretchable multi-electrode arrays for nerve stimulation. The skin is tuned to stretch as befits its location on the prosthetic.
The group said that their design can dramatically boost perception capabilities in changing environments. Integration of stretchable humidity sensors and heating elements allows for the sensation of skin moisture and body temperature regulation, respectively.
Electrical stimuli can be sent from the prosthetic skin to the body to stimulate peripheral nerves via an ultra-thin multi-electrode array. “The overarching goal is for arrays of stretchable sensors to capture information about the external environment and ultimately interface these devices with peripheral nerves to transmit signals to the brain,” explained study author Roozbeh Ghaffari.
“We built the systems very thin and that allowed for very low bending stiffness, which then allows you to wrap around any surface and to be able to stretch. You need to do some work on the geometry side to bring interconnects between individual sensors,” noted Ghaffari, who is a co-founder of MC10, a company developing wearable sensors based on the same technology. MC10 collaborated with Korean scientists to develop the artificial skin.
“This is a fantastic demonstration from a technology point of view. It is a combination of well-known systems and approaches, but they have made something that looks really impressive. Other groups have shown similar things, but not with the complexity shown here,” said soft matter physicist Siegfried Bauer at Johannes-Kepler University in Linz, Austria.
He pointed to a recent paper by George Whitesides’ team at Harvard University, US, where they developed an ionic skin that senses stimuli using ions – just like natural skin. “I like both approaches. But we’ll have to see which one wins out. This one relies on traditional materials but is extremely well advanced,” said Bauer.
“The authors reported some significant advances and wide-ranging demonstrations that pull together many different types of materials in sensors and actuators for the skin, with additional examples of interfaces to the peripheral nervous systems. The work powerfully illustrates the scalability and wide-ranging capabilities of ideas in ultra-thin, silicon-based, epidermal electronics,” said Materials Scientist John Rogers, University of Illinois.
Ghaffari noted that efforts are underway by others to selectively stimulate nerves. “Interfacing with peripheral nerves and stimulating individual nerve fibres such that you can pinpoint specific sensations is still very much in its infancy. In terms of doing this at high resolution, we are showing how to on the sensor side,” added Ghaffari.

Monday, 22 December 2014

New “high-entropy” metal alloy with higher strength-to-weight ratio


Researchers from North Carolina State University and Qatar University have developed a new “high-entropy” metal alloy that has a higher strength-to-weight ratio than any other existing metal material. High-entropy alloys are materials that consist of five or more metals in approximately equal amounts. These alloys are currently the focus of significant attention in materials science and engineering because they can have desirable properties. The NC State research team combined lithium, magnesium, titanium, aluminum and scandium to make a nanocrystalline high-entropy alloy that has low density, but very high strength.
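For background not spelled out in the article, the “high-entropy” label refers to the ideal configurational entropy of mixing, which for an alloy of n elements in equal proportions is

    ΔS_mix = R ln n,   so for five elements ΔS_mix = R ln 5 ≈ 13.4 J/(mol·K),

an unusually large value that favours a simple solid-solution phase over brittle intermetallic compounds.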
“The density is comparable to aluminum, but it is stronger than titanium alloys. It has a combination of high strength and low density that is, as far as we can tell, unmatched by any other metallic material. The strength-to-weight ratio is comparable to some ceramics, but we think it’s tougher – less brittle – than ceramics,” said Dr Carl Koch, Kobe Steel Distinguished Professor of Materials Science and Engineering, NC State and senior author of a paper on the work.
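As a rough illustration of what “strength-to-weight ratio” (specific strength, strength divided by density) means here, the sketch below compares typical handbook values for a common aluminum alloy and a titanium alloy; these reference numbers are not from the NC State paper, and the new alloy’s own figures are not quoted in the article.

    # Specific strength = strength / density, a simple figure of merit for
    # lightweight structural materials. The values below are typical handbook
    # numbers for reference alloys, not data from the NC State study.
    materials = {
        "Al 7075-T6": {"yield_MPa": 503, "density_g_cm3": 2.81},
        "Ti-6Al-4V":  {"yield_MPa": 880, "density_g_cm3": 4.43},
    }

    for name, m in materials.items():
        # MPa divided by g/cm^3 gives kN*m/kg (equivalently kJ/kg)
        specific = m["yield_MPa"] / m["density_g_cm3"]
        print(f"{name}: ~{specific:.0f} kN*m/kg")

A material with aluminum-like density but strength above that of titanium alloys, as claimed for the new alloy, would sit above both of these reference points.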
There are a wide range of uses for strong, lightweight materials, such as in vehicles or prosthetic devices. “We still have a lot of research to do to fully characterize this material and explore the best processing methods for it. One thing we’ll be looking at is whether scandium can be replaced or eliminated from the alloy,” said Koch.
At this point, the primary problem with the alloy is that it is made of 20 per cent scandium, which is extremely expensive.

Measuring methane emissions from natural gas


A team of researchers from the Cockrell School of Engineering at The University of Texas at Austin and environmental testing firm URS reported that a small subset of natural gas wells are responsible for the majority of methane emissions from two major sources - liquid unloadings and pneumatic controller equipment - at natural gas production sites.
With natural gas production in the United States expected to continue to increase during the next few decades, there is a need for a better understanding of methane emissions during natural gas production. The study team believes this research will help to provide a clearer picture of methane emissions from natural gas production sites.
The UT Austin-led field study closely examined two major sources of methane emissions - liquid unloadings and pneumatic controller equipment - at well pad sites across the United States. Researchers found that 19 per cent of the pneumatic devices accounted for 95 per cent of the emissions from pneumatic devices, and 20 per cent of the wells with unloading emissions that vent to the atmosphere accounted for 65 per cent to 83 per cent of those emissions.
“To put this in perspective, over the past several decades, 10 per cent of the cars on the road have been responsible for the majority of automotive exhaust pollution. Similarly, a small group of sources within these two categories are responsible for the vast majority of pneumatic and unloading emissions at natural gas production sites,” said David Allen, Chemical Engineering Professor, Cockrell School.
Additionally, for pneumatic devices, the study confirmed regional differences in methane emissions first reported by the study team in 2013. The researchers found that methane emissions from pneumatic devices were highest in the Gulf Coast and lowest in the Rocky Mountains.
The study is the second phase of the team’s 2013 study, which included some of the first measurements for methane emissions taken directly at hydraulically fractured well sites. Both phases of the study involved a partnership between the Environmental Defense Fund, participating energy companies, an independent Scientific Advisory Panel and the UT Austin study team.
The unprecedented access to natural gas production facilities and equipment allowed researchers to acquire direct measurements of methane emissions.
Pneumatic devices, which use gas pressure to control the opening and closing of valves, emit gas as they operate. These emissions are estimated to be among the larger sources of methane emissions from the natural gas supply chain. The Environmental Protection Agency reports that 477,606 pneumatic (gas actuated) devices are in use at natural gas production sites throughout the US.
“Our team’s previous work established that pneumatics are a major contributor to emissions. Our goal here was to measure a more diverse population of wells to characterize the features of high-emitting pneumatic controllers,” said Allen.
The research team measured emissions from 377 gas actuated (pneumatic) controllers at natural gas production sites and a small number of oil production sites throughout the US.
The researchers sampled all identifiable pneumatic controller devices at each well site, a more comprehensive approach than the random sampling previously conducted. The average methane emissions per pneumatic controller reported in this study are 17 per cent higher than the average emissions per pneumatic controller in the 2012 EPA greenhouse gas national emission inventory (released in 2014), but the average from the study is dominated by a small subpopulation of the controllers. Specifically, 19 per cent of controllers, with measured emission rates in excess of 6 standard cubic feet per hour (scf/h), accounted for 95 per cent of emissions.
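A minimal sketch of the arithmetic behind that statement, using hypothetical per-controller emission rates (not measurements from the study) chosen so that 19 per cent of controllers produce roughly 95 per cent of the total:

    # Hypothetical fleet of 100 controllers: 19 high emitters and 81 low emitters.
    # The rates below are illustrative only, not values reported by the study.
    high_count, high_rate_scfh = 19, 20.0    # controllers above 6 scf/h
    low_count, low_rate_scfh = 81, 0.25      # remaining controllers

    high_total = high_count * high_rate_scfh
    low_total = low_count * low_rate_scfh
    share = high_total / (high_total + low_total)
    print(f"High-emitter share of total emissions: {share:.0%}")   # ~95%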
The high-emitting pneumatic devices are a combination of devices that are not operating as designed, are used in applications that cause them to release gas frequently or are designed to emit continuously at a high rate.
The researchers also observed regional differences in methane emission levels, with the lowest emissions per device measured in the Rocky Mountains and the highest emissions in the Gulf Coast, similar to the earlier 2013 study. At least some of the regional differences in emission rates can be attributed to the difference in controller type (continuous vent vs. intermittent vent) among regions.
After observing variable emissions for liquid unloadings for a limited group of well types in the 2013 study, the research team made more extensive measurements and confirmed that a majority of emissions come from a small fraction of wells that vent frequently. Although it is not surprising to see some correlation between frequency of unloadings and higher annual emissions, the study’s findings indicate that wells with a high frequency of unloadings have annual emissions that are 10 or more times as great as wells that unload less frequently.
The team’s field study, which measured emissions from unloadings at 107 natural gas production wells throughout the US, represents the most extensive measurement of emissions associated with liquid unloadings in the scientific literature thus far.
A liquid unloading is one method used to clear wells of accumulated liquids to increase production. Because older wells typically produce less gas as they near the end of their life cycle, liquid unloadings happen more often in those wells than in newer wells. The team found a statistical correlation between the age of wells and the frequency of liquid unloadings. The researchers found that the key identifier for high-emitting wells is how many times the well unloads in a given year.
Because liquid unloadings can employ a variety of liquid lifting mechanisms, the study results also reflect differences in liquid unloadings emissions between wells that use two different mechanisms (wells with plunger lifts and wells without plunger lifts). Emissions for unloading events for wells without plunger lifts averaged 21,000 scf (standard cubic feet) to 35,000 scf. For wells with plunger lifts that vent to the atmosphere, emissions averaged 1,000 scf to 10,000 scf of methane per event. Although the emissions per event were higher for wells without plunger lifts, these wells had, on average, fewer events than wells with plunger lifts. Wells without plunger lifts averaged fewer than 10 unloading events per year, and wells with plunger lifts averaged more than 200 events per year. Overall, wells with plunger lifts were estimated to account for 70 per cent of emissions from unloadings nationally.
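A back-of-the-envelope comparison, using the upper ends of the ranges quoted above purely for illustration, shows why plunger-lift wells can dominate annual unloading emissions even though each of their events releases less gas:

    # Rough annual emissions per well from the per-event and frequency figures
    # quoted in the study; pairing the upper ends of the ranges is an
    # illustration, not a per-well measurement.
    no_plunger_per_event_scf = 35_000   # upper end of 21,000-35,000 scf
    no_plunger_events_per_yr = 10       # "fewer than 10" events per year
    plunger_per_event_scf = 10_000      # upper end of 1,000-10,000 scf
    plunger_events_per_yr = 200         # "more than 200" events per year

    print("No plunger lift:", no_plunger_per_event_scf * no_plunger_events_per_yr, "scf/yr")  # 350,000
    print("Plunger lift:   ", plunger_per_event_scf * plunger_events_per_yr, "scf/yr")        # 2,000,000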
Additionally, researchers found that the Rocky Mountain region, with its large number of wells with a high frequency of unloadings that vent to the atmosphere, accounts for about half of overall emissions from liquid unloadings.
The study team hopes its measurements of liquid unloadings and pneumatic devices will provide a clearer picture of methane emissions from natural gas well sites and about the relationship between well characteristics and emissions.