Lightweight, Wearable Tech Efficiently Converts Body Heat To Electricity

Researchers at North Carolina State University have developed a new design for harvesting body heat and converting it into electricity for use in wearable electronics. The experimental prototypes are lightweight, conform to the shape of the body, and can generate far more electricity than previous lightweight heat harvesting technologies.

The researchers also identified the optimal site on the body for heat harvesting.

“Wearable thermoelectric generators (TEGs) generate electricity by making use of the temperature differential between your body and the ambient air,” says Daryoosh Vashaee, an associate professor of electrical and computer engineering at NC State and corresponding author of a paper on the work. “Previous approaches either made use of heat sinks – which are heavy, stiff and bulky – or were able to generate only one microwatt or less of power per centimeter squared (µW/cm²). Our technology generates up to 20 µW/cm² and doesn’t use a heat sink, making it lighter and much more comfortable.”

The new design begins with a layer of thermally conductive material that rests on the skin and spreads out the heat. The conductive material is topped with a polymer layer that prevents the heat from dissipating to the outside air. This forces the body heat to pass through a centrally located TEG that is one square centimeter (1 cm²) in area. Heat that is not converted into electricity passes through the TEG into an outer layer of thermally conductive material, which rapidly dissipates it. The entire system is thin – only 2 millimeters – and flexible.
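To make the heat path concrete, the layered design can be approximated as a one-dimensional chain of thermal resistances between the skin and the air. The Python sketch below does exactly that; every layer value, temperature, and convection coefficient is an illustrative assumption, not a figure from the paper.

```python
# Toy 1-D thermal model of the wearable TEG stack described above.
# All parameter values are illustrative assumptions, NOT from the paper.

def layer_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Conductive thermal resistance of a flat layer: R = L / (k * A)."""
    return thickness_m / (conductivity_w_mk * area_m2)

area = 1e-4                                        # 1 cm^2 TEG, in m^2
r_spreader = layer_resistance(5e-4, 200.0, area)   # inner heat-spreading layer
r_teg      = layer_resistance(1e-3, 1.5, area)     # TEG element itself
r_outer    = layer_resistance(5e-4, 200.0, area)   # outer heat-spreading layer
r_air      = 1.0 / (10.0 * area)                   # convection to air, h ~ 10 W/m^2K

t_skin, t_air = 33.0, 22.0    # assumed skin and ambient temperatures, deg C
q = (t_skin - t_air) / (r_spreader + r_teg + r_outer + r_air)  # heat flow, W

print(f"heat through TEG: {q * 1e3:.1f} mW")
print(f"temperature drop across TEG: {q * r_teg:.2f} K")
```

With these numbers, convection to still air dominates the chain, so only a small temperature difference develops across the TEG itself, which is why harvested power is measured in microwatts rather than milliwatts.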

“In this prototype, the TEG is only one centimeter squared, but we can easily make it larger, depending on a device’s power needs,” says Vashaee, who worked on the project as part of the National Science Foundation’s Nanosystems Engineering Research Center for Advanced Self-Powered Systems of Integrated Sensors and Technologies (ASSIST) at NC State.

The researchers also found that the upper arm was the optimal location for heat harvesting. While skin temperature is higher around the wrist, the wrist’s irregular contour limited the surface area in contact with the TEG band. Wearing the band on the chest, meanwhile, restricted air flow – and thus heat dissipation – since the chest is normally covered by a shirt.

In addition, the researchers incorporated TEGs into T-shirts and found that the T-shirt TEGs were still capable of generating 6 µW/cm² – or as much as 16 µW/cm² if a person is running.
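Those reported power densities make rough device sizing a matter of simple arithmetic. The sketch below divides a hypothetical sensor power budget (the 50 µW figure is an assumption, not from the article) by each measured harvest rate:

```python
# Sizing a TEG from the power densities reported in the article.
# The 50-microwatt sensor budget is a hypothetical example value.

harvest_rates_uw_per_cm2 = {
    "upper-arm band":     20.0,  # article: up to 20 uW/cm^2
    "T-shirt (resting)":   6.0,  # article: 6 uW/cm^2
    "T-shirt (running)":  16.0,  # article: up to 16 uW/cm^2
}

sensor_budget_uw = 50.0  # hypothetical wearable-sensor power draw

for placement, rate in harvest_rates_uw_per_cm2.items():
    print(f"{placement}: {sensor_budget_uw / rate:.1f} cm^2 of TEG needed")
```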

“T-shirt TEGs are certainly viable for powering wearable technologies, but they’re just not as efficient as the upper arm bands,” Vashaee says.

“The goal of ASSIST is to make wearable technologies that can be used for long-term health monitoring, such as devices that track heart health or monitor physical and environmental variables to predict and prevent asthma attacks,” he says.

“To do that, we want to make devices that don’t rely on batteries. And we think this design and prototype moves us much closer to making that a reality.”

The paper, “Wearable thermoelectric generators for human body heat harvesting,” is published in the journal Applied Energy. Lead authors of the paper are Melissa Hyland, a graduate student at NC State, and Haywood Hunter, an undergraduate at NC State. They worked on the project while undergraduates, with support from an NSF Research Experiences for Undergraduates grant. Co-authors include Jie Liu, a postdoctoral researcher at NC State; and Elena Veety, education director of ASSIST at NC State. The work was funded by the NSF under grants EEC-1160483, ECCS-1351533 and CMMI-1363485; and by the Air Force Office of Scientific Research under grant FA9550-12-1-0225.

Calculating The Financial Risks Of Renewable Energy

For investors, deciding whether to invest money into renewable-energy projects can be difficult. The issue is volatility: Wind-powered energy production, for instance, changes annually — and even weekly or daily — which creates uncertainty and investment risks. With limited options to accurately quantify that volatility, today’s investors tend to act conservatively.

But MIT spinout EverVest has built a data-analytics platform that aims to give investors rapid, accurate cash-flow models and financial risk analyses for renewable-energy projects. Recently acquired by asset-management firm Ultra Capital, EverVest’s platform could help boost investment in sustainable-infrastructure projects, including wind and solar power.

Ultra Capital acquired the EverVest platform and team earlier this year, with aims of leveraging the software for its own risk analytics. The acquisition will enable the EverVest platform to expand to a broader array of sustainable infrastructure sectors, including water, waste, and agriculture projects.

“If an investor has confidence in the performance and risk they are taking, they may be willing to invest more capital into the sustainable infrastructure asset class. More capital means more projects get built,” says EverVest co-founder and former CEO Mike Reynolds MBA ’14, now director of execution at Ultra Capital. “We wanted to give people more firepower when it comes to evaluating risk.”

The platform’s core technology was initially based on research at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), by EverVest co-founder and former chief technology officer Teasha Feldman ’14, now director of engineering at Ultra Capital.

The strength of data

EverVest’s platform analyzes data on a variety of factors that may affect the performance of renewable-energy projects. A site’s layout and location, contract terms, equipment type, grid connection, weather, and operation and maintenance costs can all help predict the financial rate of return.

Today, financial analysts use Excel spreadsheets to project a flat annual production average for the next 20 to 30 years. “It leaves a lot to the imagination,” Reynolds says. “Renewable energy is volatile and uncertain.”

By the time of its acquisition, EverVest had clients in the United States and Europe, including banks, investors, and developers of wind and solar power projects. Users entered information about their prospective projects into the software, which provided a detailed future cash-flow model along with a statistical analysis of the project’s financial risks.

“It’s the strength of the data that we wanted to give to investors, banks, and developers, to get a better understanding of their assets,” Reynolds says.

For example, consider a wind farm. With location data, the platform can draw on public data sets covering the last few decades of wind speeds to estimate the project’s overall performance. Location can also help determine the project’s profitability in the market: California could be a better market than, say, Texas or Maine.

Specific types of equipment and manufacturers matter, too. If an investor considers a certain type of wind turbine, “we can pull data to determine that a turbine in that location is going to need $2 million of replacement parts in year five,” Reynolds says. “In year seven, you might have a 50 percent probability that something is going to fail, potentially resulting in a shut-down of the site.”

The end result is a more detailed projection of the rate of return, Reynolds says. While a spreadsheet might give an average rate of return of, say, 12 percent, EverVest’s platform would show a full analysis of the quarterly performance, including the statistical uncertainty of the rate of return. While 12 percent may be the average, the returns may vary between 4 and 18 percent. “By understanding that range of risk, you can understand the true value,” Reynolds says.
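The article does not describe EverVest’s internal model, but the difference between a flat average and a distribution of outcomes is easy to illustrate with a toy Monte Carlo cash-flow simulation. In the Python sketch below, every parameter (capital cost, cash-flow level, volatility) is an invented example value:

```python
# Toy Monte Carlo cash-flow model: instead of one flat average, simulate
# many possible production histories and report the resulting spread of
# returns. All parameters are illustrative, not EverVest's model or data.
import numpy as np

rng = np.random.default_rng(0)

capex = 10e6          # hypothetical up-front investment, $
years = 20
mean_cash = 1.6e6     # hypothetical mean annual net cash flow, $
volatility = 0.25     # hypothetical year-to-year production volatility

def irr(cashflows):
    """Internal rate of return via roots of the NPV polynomial."""
    # NPV = sum cf_t * x**t with x = 1/(1+r); np.roots wants highest degree first.
    roots = np.roots(cashflows[::-1])
    real = roots[np.isreal(roots)].real
    real = real[real > 0]
    return (1.0 / real - 1.0).max() if real.size else np.nan

samples = np.array([
    irr(np.concatenate(([-capex],
                        mean_cash * rng.lognormal(0.0, volatility, years))))
    for _ in range(2000)
])

print(f"mean IRR: {np.nanmean(samples):.1%}")
print(f"5th-95th percentile range: "
      f"{np.nanpercentile(samples, 5):.1%} to {np.nanpercentile(samples, 95):.1%}")
```

Even with an attractive mean, the percentile range is what tells an investor how bad a bad year of wind can get.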

Now at Ultra Capital, Feldman is further developing the platform. Reynolds is using it to invest in a wide array of sustainable-infrastructure projects, including solar energy projects, waste-to-energy assets, water treatment facilities, and recycling plants. “We’ve brought our technology in-house and have expanded it a great deal,” Reynolds says. “Now I get to use the software we built to make better investments.”

EverVest: The happy accident

EverVest (formerly Cardinal Wind) began as a CSAIL research project that was refined and developed through MIT’s entrepreneurial ecosystem before going to market.

As an MIT junior in 2012, Feldman wanted to branch out from her theoretical physics coursework to focus on renewable energy. She discovered a CSAIL project, led by research scientist Una-May O’Reilly, that involved collecting and analyzing data on wind farm energy. “I showed up in [O’Reilly’s] office and begged her to let me work on the project,” Feldman says.

Within a year, Feldman had designed a machine-learning algorithm that used 30 years of wind data collected from airports and other sites to predict wind power at those locations over the next 30 years. During that time, she sought enrollment in Course 15.366 (Energy Ventures), where students from across departments plan businesses around clean technologies. Undergraduates are seldom accepted. But as luck would have it, the class wanted O’Reilly to speak about her research — and O’Reilly told them to ask Feldman.

“I said, ‘Yes, I’m working on that research. You should just let me into the class,’” Feldman says, laughing.

Enrolling in fall 2013, Feldman pitched her algorithm to the class, and it caught the eye of one student. Reynolds had come to MIT Sloan School of Management, he says, “with scars from working on Wall Street in investment banking … and I wanted to open my horizons and work with engineers who were building amazing things at MIT.”

During his time as an investment banker, Reynolds dealt with funding large projects in infrastructure, energy, and transportation. So Feldman’s prediction algorithm resonated immediately. “I saw her algorithm and thought of how great it would be for investors to have a more accurate way to measure the rate of return for a potential wind project investment,” Reynolds says.

Joining forces, Feldman and Reynolds launched Cardinal Wind in 2013. The startup was somewhat of a “happy accident,” Feldman says. “The company took an insane amount of hard work to start and build. But by showing up in a lab and convincing them to give me a job, and then bringing the research to class, we were able to determine that there was a great opportunity and need for better financial risk analysis tools in the marketplace.”

The following summer, Cardinal Wind entered the Global Founders’ Skills Accelerator (GFSA), run by the Martin Trust Center for MIT Entrepreneurship, “which was a huge boost,” Reynolds says. Mentors and entrepreneurs-in-residence offered guidance and feedback on pitches, and generous GFSA funding paid the startup’s bills. “And we worked alongside other startups going through the same challenges,” Reynolds says. “All those resources were incredibly helpful.”

By October 2015, Cardinal Wind had expanded Feldman’s algorithm into a full cash-flow modeling platform that also included analyses for solar power projects. That month, Cardinal Wind rebranded as EverVest, and this July it was acquired by Ultra Capital.

A key to EverVest’s success, Feldman says, was constantly developing the technology to fit customer needs — such as including solar power. “When we found the actual need was more than just predicting wind patterns, we departed from using that particular algorithm, and we’ve built a lot of our core platform since then,” she says.

Finding Could Improve Nuclear Reactors, Detectors

Found in nuclear fuel and nuclear weapons, plutonium is an incredibly complex element with far-ranging energy, security, and environmental effects. To understand plutonium, scientists at Pacific Northwest National Laboratory and Washington State University delved into a plutonium compound with a relatively simple composition: plutonium tetrafluoride (PuF4). While the formula is simple, the four bonds proved more complex: the electrons stay relatively close to each atom, creating ionic bonds — not the expected electron-sharing covalent bonds. Even though the plutonium and fluorine atoms are tied together in a lattice, they act as isolated ions.

“Bonding is one of the big questions for plutonium and its actinide neighbors on the Periodic Chart,” said Dr. Herman Cho at PNNL, who led the research. “Answering this question is of huge importance because plutonium’s chemistry depends on how it bonds. PuF4 leans toward electrostatic attraction. This work provides a clearer picture of why that is.”

Why It Matters: Plutonium is formidably complex because of the large cloud of electrons that surrounds its nucleus, and it doesn’t always act as expected. Adding to the challenge is the limited number of institutions that can safely handle and study the radioactive element. The team’s research sheds new light on plutonium’s true nature and could provide insights about key molecules involved in nuclear power, national security, and environmental cleanup.

“Plutonium doesn’t fit within the simple pictures that apply to lighter elements,” said Cho. “This work answers tough questions as to why plutonium acts the way it does.”

Methods: The researchers began with highly radioactive PuF4 from the long-shuttered Plutonium Finishing Plant in Washington State, where scientists once created hockey-puck-sized “buttons” of plutonium. They analyzed the material using nuclear magnetic resonance (NMR) spectroscopy, which elucidates key features of the electronic structure near plutonium centers. The instruments reside in the Radiochemical Processing Laboratory at PNNL and the Rad Annex of the U.S. Department of Energy’s EMSL, a national scientific user facility. The labs are two of the few in the world that can perform NMR measurements on plutonium-containing solids.

Cho and his colleagues examined the atoms in PuF4. Specifically, they probed the fluorine atoms around the plutonium centers to measure the magnetic fields produced by plutonium (Pu4+), which revealed how the electrons were distributed in the sample. They determined that the plutonium and fluorine atoms aren’t particularly generous: both tend to hold on to their electrons, acting more like ions in a salt, where electrostatic forces hold the atoms together.

This research brings scientists closer to understanding the nuances of plutonium and the other actinides, the radioactive elements near the bottom of the periodic table.

What’s Next? Cho and his colleagues are continuing to delve into the nuances of plutonium as well as uranium, neptunium, thorium, and similar complex actinides to understand how these elements interact with other atoms and groups of atoms.

Making Catalysts Smarter

The industrial catalysts of the future won’t just speed up reactions; they’ll also control how chemical processes work and determine how much of a particular product is made.

A team of researchers led by Phillip Christopher, assistant professor of chemical and environmental engineering at the University of California, Riverside’s Bourns College of Engineering, demonstrated this—as well as how these catalysts look in action—in a paper published Monday, Sept. 19, in the journal Nature Chemistry.

Titled “Adsorbate-mediated strong metal-support interactions in oxide-supported Rh catalysts,” the paper describes a new approach to dynamically tune how a catalyst operates, enabling the researchers to control and optimize the product made in the reaction. The team, which includes scientists from the University of California, Irvine and Columbia University, also used advanced microscopy and spectroscopy approaches to view the catalyst in action on an atomic scale.

The researchers focused on an important chemical reaction that involves the conversion of carbon dioxide to carbon monoxide and synthetic natural gas. The benefits of this reaction are two-fold: it offers the potential for the removal of harmful carbon dioxide from the atmosphere, and the carbon monoxide and natural gas produced can be used as a chemical precursor and fuel, respectively. The team focused on understanding how the catalyst drives the reaction at the atomic scale, which will allow researchers to modify the catalyst’s properties to increase efficiency in the reaction.
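For context, the two textbook routes for this chemistry are the reverse water-gas shift, which yields carbon monoxide, and the Sabatier reaction, which yields methane, the main component of synthetic natural gas:

CO2 + H2 → CO + H2O (reverse water-gas shift)
CO2 + 4 H2 → CH4 + 2 H2O (Sabatier methanation)

Which product dominates depends on the catalyst, and that selectivity is precisely what the researchers set out to tune.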

Christopher said the findings unlock new opportunities for carbon dioxide conversion chemistry, and the dynamic tuning and visualization techniques demonstrated in this research could be replicated in a variety of other important chemical processes.

“The real uniqueness of the paper was being able to observe what was happening at an atomic scale and how physical changes in the catalyst affected the outcome of the carbon dioxide conversion reaction. The insights we gained pave the way for the design of more effective processes to produce fuels and chemicals,” Christopher said.

John Matsubu, a graduate student in chemical engineering in Christopher’s lab, was the lead author on the paper. Other contributors included Leo DeRita, also a graduate student in chemical engineering at UCR; Shuyi Zhang, George Graham and Xiaoqing Pan from the University of California Irvine; and Nebojsa Marinkovic and Jingguang Chen from Columbia University. The research was funded primarily by the National Science Foundation, with additional support from the U.S. Department of Energy.

We’re Not In Kansas Anymore: Fluorescent Ruby Red Roofs Stay As Cool As White

Elementary school science teaches us that in the sun, dark colors get hot while white stays cool. Now new research from Lawrence Berkeley National Laboratory (Berkeley Lab) has found an exception: Scientists have determined that certain dark pigments can stay just as cool as white by using fluorescence, the re-emission of absorbed light.

The researchers tested this concept by coloring cool roof coatings with ruby red (aluminum oxide doped with chromium). Led by Berkeley Lab scientist Paul Berdahl, they first found that white paint overlaid with a layer of ruby crystals stayed as cool as a commercial white coating. Next, they synthesized ruby pigment to mix into coatings. Their results were published recently in the journal Solar Energy Materials & Solar Cells, in an article titled “Fluorescent cooling of objects exposed to sunlight—The ruby example.”

Substantial research over the years from Berkeley Lab’s Heat Island Group has found that reflective roofs and walls can cool buildings and cars. This reduces the need for air conditioning and mitigates the urban heat island effect. By reflecting the sun’s rays back to space, these cool materials also release less heat into the atmosphere, thus cooling the planet and offsetting the warming effects of substantial amounts of greenhouse gas emissions.

However, wider adoption of cool roofs has been hindered by aesthetic considerations. “We’ve heard many times (from roofing materials manufacturers), ‘We can’t sell white or pastel roofs; our customers want dark green, dark brown, and so on,’” Berdahl said.

Over the past 15 years, Heat Island Group researchers have used special pigments that strongly reflect invisible “near-infrared” light to make dark surfaces that stay cooler in the sun than conventional dark surfaces, though still not as cool as white surfaces. This new work shows that fluorescent cooling can boost the performance of these pigments by re-emitting at longer wavelengths some of the visible light that the surface must absorb to appear dark.

This opens the door to darker colors not only for cool roofs but for any object that is subject to prolonged periods of sun exposure, including vehicles, ships, storage tanks, and PVC piping. “We do think cars will be a likely application,” Berdahl said. “And it’s not just a matter of comfort or saving energy by avoiding AC use. We learned from colleagues that with electric vehicles, the battery lifetime is degraded by higher temperatures, so if you can keep the automobile cooler with use of a suitable coating then it extends the life of the battery.”

Using fluorescence, or photoluminescence, for cool materials is a new concept, and Berdahl, who is a physicist by training, has a patent pending on the technology. “People understand that materials that fluoresce are emitting energy,” he said. “What’s new here is the use of the fluorescence process to keep buildings cooler.”

When light hits a fluorescent material, the material actively emits energy in response, rather than passively reflecting the energy. Berdahl’s idea was to find a material that would absorb visible light and fluoresce (re-emit) mostly or entirely in the invisible near-infrared portion of the sun’s spectrum. “There have been thousands of fluorescent compounds identified,” he said. “Ruby’s properties are well known and well studied, and I realized it’s a material that could work.”
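The cooling effect can be illustrated with a simple steady-state energy balance: absorbed sunlight must be carried away by convection and thermal radiation, and fluorescence effectively reduces the absorbed fraction. The Python sketch below solves that balance with illustrative values that are assumptions, not measurements from the study:

```python
# Toy steady-state energy balance for a sunlit surface, showing how
# fluorescent re-emission lowers the effective solar absorptance and
# hence the equilibrium temperature. All values are illustrative.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2 K^4
SUN   = 1000.0    # solar irradiance, W/m^2
H     = 12.0      # convective coefficient, W/m^2 K (assumed)
EPS   = 0.9       # thermal emissivity (typical coating, assumed)
T_AIR = 300.0     # ambient temperature, K

def surface_temp(absorbed_fraction):
    """Bisection solve of a*S = h*(T - T_air) + eps*sigma*(T^4 - T_air^4)."""
    def net_flux(t):
        return (absorbed_fraction * SUN
                - H * (t - T_AIR)
                - EPS * SIGMA * (t**4 - T_AIR**4))
    lo, hi = T_AIR, T_AIR + 100.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if net_flux(mid) > 0 else (lo, mid)
    return 0.5 * (lo + hi)

# A dark surface absorbing 90% of sunlight, the same surface if
# fluorescence returned a third of that energy to the sky, and white.
for label, absorbed in [("ordinary dark", 0.90),
                        ("fluorescent dark", 0.60),
                        ("white", 0.20)]:
    print(f"{label:16s} T = {surface_temp(absorbed) - 273.15:5.1f} C")
```

The fluorescent case lands tens of degrees below the ordinary dark surface in this toy calculation, which is the qualitative effect the ruby experiments demonstrated.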

His first experiment was to use an array of synthetic ruby crystals, which he purchased online and said were surprisingly inexpensive. Attached to a bright white coating and exposed to sunlight, the dark-red, ruby-covered coating stayed cooler than an off-white surface.

Berdahl and Berkeley Lab research associate Sharon Chen then synthesized ruby powder (aluminum oxide doped with varying amounts of chromium) to create different shades of red pigment. They prepared ruby paint from the powders and applied these paints over bright white substrates. When exposed to sunlight, the ruby paint samples stayed as cool as white materials.

“The ruby powder does need more work to make it as deep red as the ruby crystal,” Berdahl said.

If the product were to be commercialized, Berdahl said, the cost is not expected to be substantial and its durability is expected to be similar to that of other coatings. “Rubies have a reputation for being expensive, but they’re mostly aluminum oxide, which sells for about 70 cents per kilogram (or about 30 cents per pound),” he said.

PPG Industries, a Pittsburgh, Pennsylvania-based coatings manufacturer that was also involved in the research, is conducting weathering tests with prototype fluorescent coatings.

In follow-up work, Berdahl has identified blue materials that also fluoresce and has shown that they can be combined with other colors to yield green and even black materials that stay cool.

The research was supported by the Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE). Additional funding was provided by the California Energy Commission. Co-authors of the paper were Sharon Chen, Hugo Destaillats, Thomas Kirchstetter, and Ronnen Levinson of Berkeley Lab, and Michael Zalich of PPG’s Coatings Innovation Center.

Social Media Posts Reveal Bad-Air Days In Chinese Cities

Residents of China’s megacities who post comments about air quality to social media can give environmental scientists a window into pollution levels there.

A multidisciplinary study by Rice University researchers showed that the frequency of key words like dust, cough, haze, mask and blue sky can be used as a proxy measurement of the amount of airborne particulate matter in the country’s urban centers at any given time.

The words were culled from millions of posts to China’s Weibo, a popular microblogging platform. The posts were collected by Rice computer scientists for a study on Chinese censorship of social media three years ago.

The research, led by Rice computer scientist Dan Wallach and environmental engineer Daniel Cohan, appears this month in the open-access journal PLOS ONE.

“The big takeaway is that people grouse about air quality, and as it gets worse, people complain more,” said Wallach, a professor of computer science and electrical and computer engineering, whose lab collected the publicly available posts.

“When it’s really bad, it flattens out,” he said. “They’re as complained-out as they’re going to be. And if it gets good enough, few people complain. But there’s a zone in the middle where people really grouse, and we can measure that.”

“A city the size of Beijing has air-quality meters, but not many,” Wallach said. “But if you have millions of people, you potentially have millions of meters. It’s a way of adding extra data.”

The researchers came up with a metric, the Air Discussion Index (ADI), based on the frequency with which pollution-related terms appeared in 112 million posts from 2011 to 2013 by residents of Beijing, Shanghai, Guangzhou and Chengdu, where pollution is thought to be most troublesome in China.

“We looked at what words correlated with the pollution-level data we had,” Wallach said. “Some words that came out were nonsense. But others, like cough or wheeze, clearly had something to do with the conditions. Others, like blue sky, inversely correlated with the weather or pollution.”
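The paper defines the ADI precisely; the toy Python sketch below only illustrates the underlying idea, counting pollution-related keywords in posts and correlating the daily frequency with sensor readings. The posts and PM2.5 values here are fabricated for illustration:

```python
# Toy version of the idea behind the Air Discussion Index: count posts
# containing pollution-related keywords and correlate the daily frequency
# with PM2.5 readings. All posts and readings below are made up.
import numpy as np

# "blue sky" correlated inversely in the study and would enter a real
# index with the opposite sign; this toy counts only positive indicators.
KEYWORDS = {"dust", "cough", "haze", "mask"}

def keyword_frequency(posts):
    """Fraction of posts mentioning at least one pollution keyword."""
    hits = sum(any(k in post.lower() for k in KEYWORDS) for post in posts)
    return hits / len(posts)

days = [   # (PM2.5 reading, that day's posts) -- fabricated examples
    (35,  ["nice blue sky today", "lunch downtown", "weekend plans"]),
    (150, ["wearing a mask again", "this haze is awful", "train delayed"]),
    (260, ["can't stop coughing", "dust everywhere", "masks sold out"]),
]

pm25 = np.array([d[0] for d in days], dtype=float)
freq = np.array([keyword_frequency(d[1]) for d in days])
print(f"daily keyword frequencies: {freq}")
print(f"correlation with PM2.5: r = {np.corrcoef(pm25, freq)[0, 1]:.2f}")
```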

“There’s a lot of discussion about censorship in Chinese media, including in Dan Wallach’s work, but one of the things we like about this particular study is that it relies on data that are almost never censored, the most innocuous terms of all,” said co-author Aynne Kokas, an assistant professor of media studies at the University of Virginia and an affiliate of Rice University’s Baker Institute for Public Policy.

“These terms are almost impossible to censor because of how common they are,” she said. “As a result, we think this method is really effective not only in China but could also work in other contexts where there are heavily regulated social-media environments.”

The most accurate ADI readings were those for Beijing: when matched against hourly sensor readings from the U.S. Embassy there, the technique estimated pollution levels with an accuracy of 88.2 percent. ADI performance was lower for the other cities, where pollution isn’t as severe and Weibo posts aren’t as plentiful: 63 percent for Shanghai, 42 percent for Guangzhou and 36 percent for Chengdu.

Particulate matter measuring less than 2.5 microns in diameter — about one-thirtieth the diameter of an average human hair — is known to permanently damage the lungs. The United States’ air-quality standard limits concentrations of particulate matter this size to no more than 35 micrograms (millionths of a gram) per cubic meter over any 24-hour period and an annual average of no more than 12 micrograms per cubic meter.

Cohan said Chinese air pollution standards aren’t vastly different from those in the U.S., but the pollutant concentrations are. “Particulate matter levels in Beijing are often 10 times as high as we typically observe in U.S. cities,” he said.

Wallach said he was surprised by the level of air-quality information that was found in the Weibo posts — data that he and colleagues had collected for a 2013 study on social media censorship.

“I was chatting with Dan Cohan, and I said, ‘Hey, I’ve got all this data about China. Do you think we could measure something about pollution from all this data?’” Wallach recalled. “We all got together to see if the Weibo data told a story, and it turns out it did.”

Cohan said, “China is an ideal testbed, because the pollutant levels are so high and so variable that you can literally see the difference day to day. Still, I was surprised that social media posts could correlate so strongly with air-quality conditions.”

Wallach said it was interesting to note that the U.S. Embassy measurements correlated well with the Chinese government’s own ground-level reporting on urban pollution. “Some people in China think their government might be lying to them about air quality, but based on what we found, that isn’t the case,” he said.

Co-authors of the paper include Rice alumnus Zhu Tao, now at Google, and Rice postdoctoral fellow Rui Zhang, now at the National Park Service. Cohan is an associate professor of civil and environmental engineering.

Researchers Report Advance In Low-Cost Clean Energy Generation

Researchers at the University of Houston and Massachusetts Institute of Technology have reported a substantial advance in generating electricity through a combination of concentrating solar power and thermoelectric materials.

By combining concentrating solar power – which converts light into heat that is then used to generate electricity – with segmented thermoelectric legs made up of two different thermoelectric materials, each working in a different temperature range, the researchers said they have demonstrated a promising new alternative solar energy technology.

Their findings are published in Nature Energy.

Zhifeng Ren, MD Anderson Professor of physics at the University of Houston and an author of the paper, said the work illustrates a new low-cost, nontoxic way to generate power. While it’s not intended to replace large-scale power plants, it could prove especially useful for isolated areas that aren’t on a traditional electric grid, powering small clusters of homes or businesses, for example, he said. In addition to generating electricity, the technology also can produce hot water – valuable for both private and industrial purposes.

In addition to Ren, other authors on the paper include Gang Chen, Daniel Kraemer, Kenneth McEnaney, Lee A. Weinstein and James Loomis, all of MIT, and UH researchers Qing Jie, Feng Cao and Weishu Liu.

Ren, who also is a principal investigator at the Texas Center for Superconductivity at UH, said the work draws on the researchers’ earlier work, which demonstrated proof of the concept. For this project, supported in part by the Department of Energy, they actually built a device to measure how well optical concentration worked to improve the overall system efficiency.

They demonstrated an efficiency of 7.4 percent but reported that, based upon their calculations, the device could achieve an efficiency of 9.6 percent. Their previous work resulted in an efficiency of 4.6 percent.

“The performance improvement is achieved by the use of segmented thermoelectric legs, a high-temperature spectrally selective solar absorber enabling stable vacuum operation with absorber temperatures up to 600 °C, and combining optical and thermal concentration,” the researchers wrote. “Our work suggests that concentrating STEGs (solar thermoelectric generators) have the potential to become a promising alternative energy technology.”

To gain the higher efficiency, the researchers used a solar absorber, boosted by optical concentrators to increase the heat and improve the energy density. The absorber was placed on legs constructed of thermoelectric materials. While their previous work used only bismuth telluride – a well-known thermoelectric material – this version used skutterudite for the top half of the legs and bismuth telluride for the lower half.

Thermoelectric materials produce electricity by exploiting the flow of heat from a warmer area to a cooler area. By using two materials, the researchers said they were able to take advantage of a broader range of temperatures produced by the solar absorber and boost generating efficiency.

Skutterudite, for example, performs best at temperatures above 200 degrees Celsius, while bismuth telluride works optimally at temperatures below that level.
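The payoff of segmenting can be seen in the standard thermoelectric efficiency formula, in which each segment converts heat across its own temperature span according to its figure of merit ZT. The Python sketch below applies that formula with illustrative ZT values that are assumptions, not the paper’s measurements:

```python
# Standard thermoelectric conversion-efficiency formula applied to a
# segmented leg: each material covers the temperature span where it
# performs best. The ZT values are illustrative assumptions.
import math

def te_efficiency(t_hot, t_cold, zt):
    """eta = (1 - Tc/Th) * (sqrt(1+ZT) - 1) / (sqrt(1+ZT) + Tc/Th)."""
    root = math.sqrt(1.0 + zt)
    return (1.0 - t_cold / t_hot) * (root - 1.0) / (root + t_cold / t_hot)

T_HOT, T_MID, T_COLD = 873.0, 473.0, 323.0  # ~600 C absorber, 200 C joint, 50 C sink

eta_skutterudite = te_efficiency(T_HOT, T_MID, zt=1.0)   # hot segment
eta_bi2te3       = te_efficiency(T_MID, T_COLD, zt=0.9)  # cool segment

# Heat not converted by the first segment flows on into the second.
eta_leg = eta_skutterudite + (1.0 - eta_skutterudite) * eta_bi2te3
print(f"skutterudite segment: {eta_skutterudite:.1%}")
print(f"bismuth telluride segment: {eta_bi2te3:.1%}")
print(f"combined leg efficiency: {eta_leg:.1%}")
```

A leg-level estimate like this ignores optical and absorber losses, which is one reason it can exceed the 7.4 percent measured for the full device.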

“The record-high efficiencies are achieved by segmenting two thermoelectric materials, skutterudite and bismuth telluride, coupled to a spectrally selective surface operated at close to 600 °C by combined optical and thermal concentration of the sunlight,” they wrote.

Unraveling Complexities Of Nuclear Reactors

In order to devise new designs for safer, more efficient nuclear reactors, it is essential to be able to simulate the reactors’ performance at a very high level of detail. But because the nuclear reactions taking place in these reactor cores are quite complex, such simulations can strain the capabilities of even the most advanced supercomputer systems.

That’s a challenge that Benoit Forget has been tackling throughout his research career: how to provide efficient, high-fidelity simulations on modern computing architectures, and thus enable the development of the next generation of reactors.

Addressing those challenges has earned Forget tenure in MIT’s Department of Nuclear Science and Engineering, where he is now an associate professor.

Forget grew up in the small town of Temiscaming in the province of Quebec, Canada. His father was the principal of the local high school and his mother was a teacher there. “My mom was my French, history, and geography teacher,” he recalls. His graduating class had about 20 students.

“Science was always second-nature to me since I was a kid,” he says. “I spent a lot of time just tinkering around, breaking stuff apart and building it back up.” He also had a very supportive science teacher in high school, he says, who encouraged him to explore.

He began studying engineering at the École Polytechnique de Montréal, where he earned a BS in chemical engineering and an MS in energy engineering, and he did an internship at Hydro-Quebec, the local utility, which had a single nuclear power plant at the time.

It was during those first few years in Montréal that Forget developed his interest in nuclear engineering. “That’s when I had my first modern physics class and studied quantum mechanics, and that’s when I got hooked and wanted to study nuclear engineering. … I had a very good professor who introduced us to some of these concepts,” he says. From that one initial class, Forget made the decision to pursue this area of study.

“The [relatively small] amount of energy in a chemical reaction compared to the energy in one fission event is quite remarkable,” he says. “If you want to produce a lot of power quickly with little fuel or waste, this is the way to do it. So that made it my career choice.”

Forget moved to the United States to work on his PhD in nuclear engineering at Georgia Tech. He received the degree in 2006 and then spent a year and a half working at Idaho National Laboratory, before accepting an appointment at MIT. His wife, an aerospace engineer who received her doctorate a week prior to his, took a job at the MITRE Corporation, allowing them to begin their jobs in the Boston area at the same time. The couple now has a three-year-old son, Thomas, who “keeps us very busy,” Forget says.

Since arriving at MIT, Forget has concentrated on developing new ways of streamlining the complex software needed to simulate the vast numbers of random interactions that take place inside a nuclear reactor core, in order to better understand how to develop new generations of improved reactor architectures.

His team, the MIT Computational Reactor Physics Group, consists of two faculty members (himself and Kord Smith, the KEPCO Professor of Nuclear Science and Engineering), 15 to 20 graduate students, and five to 10 undergraduates. They “focus on modeling and simulation of the nuclear reactor itself — the physics that describes what goes on inside a nuclear reactor, how heat is being generated, where it’s being deposited, and how we extract that heat from the system,” Forget says.

“We focus primarily on the neutron and photon transport in the core, which is essentially the source of the fission reaction — we want to know precisely how much power is being produced where, so we can stay below the temperature limits, the material limits, and control everything else that goes on inside the reactor.” But even using the most efficient, streamlined computer code, simulating the operation of a whole plant for just one instant in time can take 100,000 CPU-hours, he says.

Currently, there are a lot of computer models, developed over the last half-century, that simulate the present generation of nuclear reactors. “These codes perform very well for the current generation,” he says. “But in the near future, there’s a lot of interest in looking at advanced reactors, new concepts, new designs, new materials — all designed to have more inherent safety, better economics, better fuel utilization. All of these cannot necessarily rely on the methods of the past. We’re going to need some more advanced methodologies.”

Since it’s impractical to build test reactors for every new concept, “we rely much more on high-fidelity modeling and simulation,” he says. “We’re still going to need experiments, but we want to design better experiments, so that they provide better information at lesser cost.” That’s where his team’s expertise comes into play. Among other projects, the researchers have developed two large pieces of software, called OpenMC and OpenMOC, which are both open-source packages available to anyone.

One of these code packages, OpenMC, is based on Monte Carlo simulations — a statistical technique for simulating complex systems in which random events play a significant role, by generating vast numbers of simulations that each involve slight variations on the others. The system Forget and his team have developed uses new approaches made possible by massively parallel computing and distributed computation. “Now we end up with a new paradigm for modern architectures in hardware, where we can do a lot of calculations directly” that used to require huge lookup tables of precomputed data, he says. And as a result, the team can more precisely capture the details of the physics, while actually streamlining the computations.
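OpenMC itself is a full-featured transport code, but the Monte Carlo idea it rests on can be shown in miniature: follow many random particle histories and average the outcomes. The toy Python sketch below estimates neutron transmission through a slab; it is a textbook exercise with assumed material data, not OpenMC code.

```python
# Miniature Monte Carlo transport estimate, in the spirit of (but far
# simpler than) OpenMC: follow random neutron histories through a 1-D
# slab and tally transmission. The material data are assumed values.
import math
import random

random.seed(1)

SIGMA_T   = 0.5    # total macroscopic cross section, 1/cm (assumed)
ABSORB_P  = 0.3    # probability a collision absorbs the neutron (assumed)
THICKNESS = 5.0    # slab thickness, cm

def history():
    """Follow one neutron; return True if it escapes the far side."""
    x, mu = 0.0, 1.0                     # start at the surface, heading inward
    while True:
        x += mu * (-math.log(random.random()) / SIGMA_T)   # sample free flight
        if x >= THICKNESS:
            return True                  # transmitted through the slab
        if x < 0.0:
            return False                 # leaked back out the near side
        if random.random() < ABSORB_P:
            return False                 # absorbed in a collision
        mu = random.uniform(-1.0, 1.0)   # isotropic scatter (direction cosine)

n = 100_000
p = sum(history() for _ in range(n)) / n
print(f"transmission probability: {p:.4f} "
      f"(+/- {math.sqrt(p * (1 - p) / n):.4f})")
```

The statistical error bar shrinks only as one over the square root of the number of histories, which is part of why full-reactor calculations of this kind consume so many CPU-hours.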

“We kind of bridge a gap between fundamental physics and high-performance computing. We dig a little bit deeper, and try to reformulate and use more fundamental physical representations,” he says. As they develop the models, he hopes to be able to simulate both long-term and short-term transients in addition to current steady state capabilities. Ultimately, “the goal is to be able to simulate a whole reactor, over its full lifetime, with as much detail as possible, for all possible operating conditions,” he says.

New Technology Helps Pinpoint Sources Of Water Contamination

When the local water management agency closes your favorite beach due to unhealthy water quality, how reliable are the tests they base their decisions on? As it turns out, those tests, as well as the standards behind them, have not been updated in decades. Now scientists from Lawrence Berkeley National Laboratory (Berkeley Lab) have developed a highly accurate, DNA-based method to detect and distinguish sources of microbial contamination in water.

Using the award-winning PhyloChip, a credit card-sized device that can detect the presence of more than 60,000 species of bacteria and archaea, the new method was found to be more sensitive than conventional methods at assessing health risks. In tests at the Russian River watershed in Northern California, the Berkeley Lab researchers found instances where their method identified potential human health risks that conventional fecal indicator tests had failed to detect. Conversely, they also found instances where the conventional tests flagged bacteria that weren’t likely risks to human health.

The research was led by Eric Dubinsky and Gary Andersen, microbial ecologists at Berkeley Lab, and was published recently in the journal Water Research in an article titled, “Microbial source tracking in impaired watersheds using PhyloChip and machine-learning classification.” Steven Butkus of the North Coast Regional Water Quality Control Board, which supported part of the research, was also a co-author.

“With the PhyloChip, in an overnight test we can get a full picture of the microorganisms in any given sample,” Dubinsky said. “Instead of targeting one organism, we’re essentially getting a fingerprint of the microbial community of potential sources in that sample. So it gives us a more comprehensive picture of what’s going on. It’s a novel way of going about source tracking.”

What local water agencies currently do is collect water samples, culture the bacteria overnight, and then check the growth level of two types of bacteria, E. coli and Enterococcus, which are presumed to be indicators of fecal contamination.

Power of the PhyloChip

However, this method doesn’t distinguish between sources. The bacteria could have come from humans, cows, ducks, sewage, or even decaying vegetation.

“These tests have been used for decades and are relatively primitive,” Dubinsky said. “Back in the 1970s when the Clean Water Act was developed and we had sewage basically flowing into our waters, these tests worked really well. Epidemiological studies showed an association of these bacteria with levels of illness of people who used the water. These bacteria don’t necessarily get you sick, but they’re found in sewage and fecal matter. That’s why they’re measured.”

As pollution from point sources—single identifiable sources such as sewage—has been cleaned up over time, the emerging concern has become what are known as nonpoint sources, or diffuse sources, throughout the watershed, such as agricultural lands.

“The picture is much more complicated now than it was back then, when the concern was really point sources,” Dubinsky added.

The PhyloChip, which was developed by Andersen and several other Berkeley Lab scientists, has been used for a number of medical, agricultural, and environmental purposes, including understanding air pollution, the ecology of coral reefs, and environmental conditions of the Gulf of Mexico after the BP oil spill. With 1 million probes, it identifies microbes based on variations of a specific gene, with no culturing needed.

“About seven years ago we started doing water quality work, and we realized the PhyloChip could provide a fundamentally new and improved method for doing source tracking,” Andersen said.

A Library of Poop

Determining the source of any particular pathogen is not a straightforward task. In most cases, a single microbe is not a definitive marker of an animal or other source. “A microbial community is complex,” Dubinsky said. “A cow may have 1,000 different organisms.”

So Andersen and Dubinsky had an idea. “We had Laleh Coté, an intern at the time and now a Lab employee, run around and basically collect poop from all sorts of animals,” said Andersen. “What we’ve done since then is develop a reference library of the microbial communities that occur in different types of poop—we have cows, horses, raccoons, humans, different types of birds, pigs, sea lions, and other animals, as well as sewage and septage. We used that library to develop a model.”

The new method takes the unknown sample and compares it against this microbial reference library. “We’ve used the PhyloChip in a way that it hasn’t been used before by using machine learning models to analyze the data in order to detect and classify sources,” Andersen said. “It’s essentially giving you a statistical probability that a microbial community came from a particular source.”
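The paper’s model is more elaborate, but the shape of the approach, training a classifier on reference fecal communities and reporting source probabilities for an unknown sample, can be sketched with scikit-learn. Everything below, from the taxa count to the source “signatures,” is fabricated toy data:

```python
# Sketch of microbial source tracking as supervised classification:
# train on reference "library" communities, then report the probability
# that an unknown sample came from each source. All data are fabricated.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
N_TAXA = 200   # toy stand-in for the PhyloChip's probed taxa

def fake_communities(frac_signature_taxa, n=30):
    """Presence/absence profiles scattered around a source's signature."""
    signature = rng.random(N_TAXA) < frac_signature_taxa
    present_p = np.where(signature, 0.8, 0.05)
    return (rng.random((n, N_TAXA)) < present_p).astype(int)

sources = ["human", "cow", "bird"]
X = np.vstack([fake_communities(f) for f in (0.15, 0.20, 0.10)])
y = np.repeat(sources, 30)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# A noisy copy of a human-library sample plays the "unknown" water sample.
unknown = X[5:6] ^ (rng.random((1, N_TAXA)) < 0.02).astype(int)
for source, prob in zip(clf.classes_, clf.predict_proba(unknown)[0]):
    print(f"P(source = {source}): {prob:.2f}")
```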

They validated their method by comparing it to about 40 other methods of microbial source tracking in a California study. “We were the only method that could detect all sources and get them right,” Dubinsky said.

If the source is an animal that is not in the reference library, their method can still point you in the right direction. “For example, in that study, one sample was a chicken,” said Dubinsky. “We hadn’t analyzed chickens, but we had geese, gulls, and pigeons. We were still able to determine that the sample was a bird.”

In extensive testing throughout the Russian River watershed, which is out of compliance with the Clean Water Act, the Berkeley Lab researchers found widespread contamination by human sources close to areas where communities rely on aging septic tanks.

They also found significant human contamination immediately after a weekend jazz festival, whereas testing by conventional methods yielded a much weaker signal after a time lag of a couple of days. “Our method is more sensitive to human contamination than those fecal indicator tests are,” Dubinsky said.

Next Steps

The team is now working on characterizing the microbial community of naturally occurring E. coli and Enterococci, using Hawaii with its warm waters as a testing ground. “They can occur naturally in sediments and decaying kelp and vegetation,” Dubinsky said. “It is known that they do, but nobody has developed a test to definitively show that.”

The researchers will also be able to study whether climate affects microbial communities. “Does a Hawaiian cow look like a California cow in terms of fecal bacteria composition? That’s a good question and something we’ll be able to find out,” he said.

They are working closely with the U.S. Environmental Protection Agency (EPA), which is looking at new technologies for what it calls “next generation compliance.” Ultimately the goal is to develop their method—possibly with a downsized version of the PhyloChip—to the point where it can be universally used in any location and by non-experts.

Dubinsky says the method should also be useful for the burgeoning issue of algal blooms: for example, to understand the processes by which they form and the microbial dynamics before and after a bloom, and specifically whether runoff from livestock production in the Midwest is related to algal blooms in the Great Lakes, a question the team is investigating with the EPA.