Friday, July 31, 2015

Machine Learning’s Impact on Solar Energy

In 2013, solar was the second-largest source of new electricity-generating capacity in the U.S., exceeded only by natural gas. The U.S. Department of Energy’s SunShot Vision Study suggests solar power could meet as much as 14% of U.S. electricity demand by 2030, and 27% by 2050.

There are currently two main customers for renewable energy forecasting technologies: utility companies and independent system operators (ISOs). Because accurate solar and wind forecasts are difficult to produce, electric utilities have had to hold larger energy reserves for these sources than for conventional generation. Yet solar installations grow each day, and rising penetration levels make accurate solar forecasting ever more valuable.

With better solar and wind forecasts, solar energy’s contribution to the U.S. energy supply could reach as much as 50%. Without them, intermittency is expected to cap solar’s share at 20 to 30%. A collaboration between IBM and the U.S. Dept. of Energy (DOE) could double the accuracy of solar and wind forecasts within the next year with the help of IBM Research’s machine learning technology.
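The article doesn’t detail IBM’s models, but a common machine-learning approach to improving renewable forecasts is to learn how to blend several imperfect forecast models against measured output. Below is a minimal Python sketch of that idea, using synthetic data and three invented “forecast models”; nothing in it reflects IBM’s actual system.

```python
# Minimal sketch of forecast blending (synthetic data; not IBM's system).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
hours = 1000

# Synthetic stand-in for measured solar output
actual = np.clip(np.sin(np.linspace(0, 60, hours)), 0, None)

# Three hypothetical forecast models, each with its own bias and noise
model_a = 1.2 * actual + 0.05 * rng.normal(size=hours)
model_b = 0.8 * actual + 0.10 * rng.normal(size=hours)
model_c = actual + 0.20 * rng.normal(size=hours)
forecasts = np.column_stack([model_a, model_b, model_c])

# Learn blending weights on the first 800 hours, evaluate on the rest
blender = LinearRegression().fit(forecasts[:800], actual[:800])
blend = blender.predict(forecasts[800:])

def rmse(pred):
    return np.sqrt(np.mean((pred - actual[800:]) ** 2))

for name, pred in [("model A", model_a[800:]), ("model B", model_b[800:]),
                   ("model C", model_c[800:]), ("blend", blend)]:
    print(f"{name}: RMSE = {rmse(pred):.3f}")
```

The learned blend typically beats every individual model, which is the sense in which machine learning can sharpen forecasts built from multiple weather models.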

Superfast fluorescence sets new speed record

Researchers have developed an ultrafast light-emitting device that can flip on and off 90 billion times a second and could form the basis of optical computing.

At its most basic level, your smartphone’s battery powers billions of transistors that use electrons to flip on and off billions of times per second. But if microchips could use photons instead of electrons to process and transmit data, computers could operate even faster.

First, though, engineers must build a light source that can be turned on and off that rapidly. Lasers can fit this requirement, but they are too energy-hungry and unwieldy to integrate into computer chips.

Duke Univ. researchers are now one step closer to such a light source. In a new study, a team from the Pratt School of Engineering pushed semiconductor quantum dots to emit light more than 90 billion times a second, a rate above 90 gigahertz. This so-called plasmonic device could one day be used in optical computing chips or for optical communication between traditional electronic microchips.

Simulations lead to design of near-frictionless material

Argonne National Laboratory scientists used Mira to identify and improve a new mechanism for eliminating friction, which fed into the development of a hybrid material that exhibited superlubricity at the macroscale for the first time. Argonne Leadership Computing Facility (ALCF) researchers helped enable the groundbreaking simulations by overcoming a performance bottleneck, doubling the speed of the team's code.

While reviewing the simulation results of a promising new lubricant material, Argonne researcher Sanket Deshmukh stumbled upon a phenomenon that had never been observed before.

"I remember Sanket calling me and saying 'you have got to come over here and see this. I want to show you something really cool,'" said Subramanian Sankaranarayanan, Argonne computational nanoscientist, who led the simulation work at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility.

How Much Is Too Much? Information Overload in Disease and Drug Research

Biology is a rapidly evolving science. Every new discovery uncovers new layers of complexity that must be unraveled in order to understand the underlying biological mechanisms of disease and to develop drugs successfully.

Driven both by community need and by the increased likelihood of positive returns on the large investment required, drug discovery research has often focused on the most common diseases: those with relatively straightforward causes that affect large numbers of individuals. Today, companies continue to push the boundaries of innovation to alleviate the debilitating effects of complex diseases, which affect smaller patient populations or vary greatly from patient to patient. This requires looking deeper into the available data.

The big data revolution
Understanding complex and variable diseases requires examining data from large numbers of afflicted patients. More than 90% of the world’s data has been created in the past two years, and the pace is accelerating. High-throughput technologies generate ever-expanding quantities of data for researchers to mine. But solving one problem has created another: how can researchers find the specific information they need among the mass of data?
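One widely used answer, offered here purely as an illustration: index the literature and rank documents against a query. A minimal TF-IDF sketch with a few invented abstracts (the data here are placeholders, not any real corpus or vendor tool):

```python
# Illustrative literature search: rank invented abstracts against a query
# using TF-IDF vectors and cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstracts = [
    "Kinase inhibitor shows efficacy in rare pediatric leukemia subtypes.",
    "Genome-wide association study of type 2 diabetes in a large cohort.",
    "Patient-to-patient variability in response to checkpoint inhibitors.",
]

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(abstracts)

query = "variability in drug response across patients"
scores = cosine_similarity(vectorizer.transform([query]), doc_matrix).ravel()

# Print abstracts from most to least relevant to the query
for idx in scores.argsort()[::-1]:
    print(f"{scores[idx]:.2f}  {abstracts[idx]}")
```

Real drug-discovery platforms go far beyond keyword ranking, but the core problem, surfacing the handful of relevant records from an enormous pile, is the same.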

Violent or Not, Video Games May Cause Aggression

Next time you feel frustration’s vise-like grip closing, you may want to hold off on venting your anger via video games.

A recent study from the Univ. of Wisconsin-Madison delved into how video games are used to handle emotions, and found that while video games may bolster mood, both violent and nonviolent video games can increase accessibility to aggressive cognitions.

“We believe our nonviolent game, which at some points required participants to respond quickly to stimuli to progress in the game, might have activated aggressive cognitions to motivate achievement,” the authors wrote in their research paper, published in Computers in Human Behavior.

The study enrolled 82 undergraduate communications students. Half of them were tasked with playing an online game called Maximum Frustration, a highly difficult game meant to induce frustration, before playing either a violent or nonviolent video game.

“The game is designed to be nearly impossible to complete, although the subjects were led to believe they should be able to go through all the levels in 10 min,” according to the university.

Afterwards, the student subjects played either the PlayStation 3 game “LittleBigPlanet 2,” a nonviolent title, or “Fist of the North Star: Ken’s Rage,” a violent title, both for 18 min.

Robots do check-in and check-out at cost-cutting Japan hotel

From the receptionist that does the check-in and check-out to the porter that's an automated trolley taking luggage up to the room, this hotel in southwestern Japan, aptly called Weird Hotel, is "manned" almost totally by robots to save labor costs.

Hideo Sawada, who runs the hotel as part of an amusement park, insists using robots is not a gimmick, but a serious effort to utilize technology and achieve efficiency.

The receptionist robot that speaks in English is a vicious-looking dinosaur, and the one that speaks Japanese is a female humanoid with blinking lashes. "If you want to check in, push one," the dinosaur says. The visitor still has to punch a button on the desk, and type in information on a touch panel screen.

Henn na Hotel, as it is called in Japanese, was shown to reporters Wednesday, complete with robot demonstrations, ahead of its opening to the public Friday.

Cutting cost and power consumption for big data

Random-access memory, or RAM, is where computers like to store the data they’re working on. A processor can retrieve data from RAM tens of thousands of times more rapidly than it can from the computer’s disk drive.

But in the age of big data, data sets are often much too large to fit in a single computer’s RAM. The data describing a single human genome would take up the RAM of somewhere between 40 and 100 typical computers.
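A back-of-the-envelope check of that figure, with assumed numbers: raw sequencing data for one genome runs to a few hundred gigabytes, while a typical 2015 machine might have 4 to 8 GB of RAM.

```python
# Rough arithmetic only; both figures below are assumptions, not from the article.
genome_data_gb = 400      # assumed: raw reads plus quality scores
typical_ram_gb = [4, 8]   # assumed RAM range for a "typical" computer

for ram in typical_ram_gb:
    print(f"{ram} GB RAM/machine -> ~{genome_data_gb // ram} machines")
# ~100 machines at 4 GB each, ~50 at 8 GB: the same ballpark as "40 to 100"
```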

Flash memory—the type of memory used by most portable devices—could provide an alternative to conventional RAM for big-data applications. It’s about a tenth as expensive, and it consumes about a tenth as much power.

The problem is that it’s also a tenth as fast. But at the International Symposium on Computer Architecture, MIT researchers presented a new system that, for several common big-data applications, should make servers using flash memory as efficient as those using conventional RAM, while preserving their power and cost savings.

The researchers also presented experimental evidence showing that, if the servers executing a distributed computation have to go to disk for data even 5% of the time, their performance falls to a level that’s comparable with flash, anyway.
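A rough effective-latency model shows why: with order-of-magnitude access times (assumed here, not measured figures from the paper), a 5% disk-miss rate drags average access time into flash territory.

```python
# Order-of-magnitude access times (assumptions, not figures from the paper)
ram_ns = 100            # DRAM access, ~100 nanoseconds
flash_ns = 100_000      # flash access, ~100 microseconds
disk_ns = 10_000_000    # disk seek + read, ~10 milliseconds

miss_rate = 0.05        # 5% of accesses fall through to disk
effective_ns = (1 - miss_rate) * ram_ns + miss_rate * disk_ns

print(f"RAM with 5% disk misses: {effective_ns:,.0f} ns per access")
print(f"all-flash:               {flash_ns:,} ns per access")
# ~500,095 ns vs 100,000 ns: the disk-bound mix is within a factor of a few
# of flash, yet thousands of times slower than pure RAM.
```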

Opening a new route to photonics

A new route to ultrahigh density, ultracompact integrated photonic circuitry has been discovered by researchers with the Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley. The team has developed a technique for effectively controlling pulses of light in closely packed nanoscale waveguides, an essential requirement for high-performance optical communications and chip-scale quantum computing.

Xiang Zhang, director of Berkeley Lab’s Materials Sciences Division, led a study in which a mathematical concept called “adiabatic elimination” is applied to optical nanowaveguides, the photonic versions of electronic circuits. Through the combination of coupled systems – a standard technique for controlling the movement of light through a pair of waveguides – and adiabatic elimination, Zhang and his research team are able to eliminate an inherent and vexing “crosstalk” problem for nanowaveguides that are too densely packed.
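The article gives no equations, but adiabatic elimination can be illustrated with a generic three-waveguide coupled-mode model: strongly detune the middle guide, and its amplitude merely follows the outer two, leaving an effective direct coupling of roughly kappa^2/delta between them while the middle guide stays dark. A minimal numerical sketch with invented parameters (a textbook toy model, not the Berkeley team’s actual design):

```python
# Toy three-waveguide coupled-mode model (all parameters invented).
# With detuning delta >> kappa, the middle guide can be adiabatically
# eliminated, leaving an effective outer-outer coupling kappa**2 / delta.
import numpy as np
from scipy.integrate import solve_ivp

kappa = 1.0   # nearest-neighbor coupling (arbitrary units)
delta = 20.0  # detuning of the middle waveguide, >> kappa

def coupled_modes(z, a):
    a1, a2, a3 = a
    return [1j * kappa * a2,
            1j * (delta * a2 + kappa * (a1 + a3)),
            1j * kappa * a2]

# Propagate for half a beat length of the effective coupling,
# z_end = pi * delta / (2 * kappa**2), where transfer should peak
z_end = np.pi * delta / (2 * kappa**2)
sol = solve_ivp(coupled_modes, (0.0, z_end), [1 + 0j, 0j, 0j],
                rtol=1e-8, atol=1e-10)

p1, p2, p3 = (abs(sol.y[i, -1]) ** 2 for i in range(3))
print(f"guide 1: {p1:.3f}, middle: {p2:.4f}, guide 3: {p3:.3f}")
# Light transfers between the outer guides while the detuned middle
# guide stays nearly empty: the crosstalk channel is suppressed.
```

In this toy picture, the detuned middle waveguide carries almost no power even as it mediates transfer between its neighbors, which is the sense in which tightly packed waveguides can exchange light without crosstalk.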