University of Washington researchers have created a system, called EnerJ, that in simulations reduces energy consumption by up to 50 percent, and could potentially cut it by as much as 90 percent.
'We all know that energy consumption is a big problem,' said Luis Ceze, assistant professor of computer science and engineering at the University of Washington, who led the study.
'With our system, mobile phone users would notice either a smaller phone, or a longer battery life, or both. Computing centres would notice a lower energy bill.'
The basic idea is to take advantage of processes that can survive tiny errors introduced when, say, voltage is decreased or correctness checks are relaxed, according to a University of Washington statement.
Some examples of possible applications are streaming audio and video, games and real-time image recognition for augmented-reality applications on mobile devices.
'Image recognition already needs to be tolerant of little problems, like a speck of dust on the screen,' said co-author Adrian Sampson, a doctoral student working with Ceze.
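The error tolerance described above can be illustrated with a short Java sketch. This is not EnerJ's actual API; it simply simulates, in software, the kind of small per-value error that approximate hardware might introduce (here, dropping the two low-order bits of each pixel) and shows that an error-tolerant computation such as average image brightness barely changes. The class and method names are illustrative only.

```java
// Illustrative sketch: an error-tolerant computation (average image
// brightness) tolerates small per-pixel errors. Truncating low-order
// bits stands in for errors that approximate hardware (lower voltage,
// relaxed checks) might introduce. Names are hypothetical, not EnerJ's.
public class ApproxDemo {
    // Exact average of 8-bit pixel values.
    static double exactAverage(int[] pixels) {
        long sum = 0;
        for (int p : pixels) sum += p;
        return (double) sum / pixels.length;
    }

    // "Approximate" average: each pixel loses its two low-order bits,
    // mimicking small errors on approximate storage or arithmetic.
    static double approxAverage(int[] pixels) {
        long sum = 0;
        for (int p : pixels) sum += (p & ~0x3);
        return (double) sum / pixels.length;
    }

    public static void main(String[] args) {
        int[] pixels = new int[256];
        for (int i = 0; i < pixels.length; i++) pixels[i] = i; // gradient
        // The approximate result stays within a couple of brightness
        // levels of the exact one -- acceptable for tasks like image
        // recognition, while the hardware saves energy.
        System.out.println(exactAverage(pixels) + " vs " + approxAverage(pixels));
    }
}
```

In EnerJ's actual design, the programmer marks which data may be stored or computed approximately and which must stay precise, and the system guarantees the two never mix unsafely; the sketch above only demonstrates why such approximation is acceptable for this class of workload.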
Some experts believe we are approaching a limit on the number of transistors that can run on a single microchip.
The team's approach would work like a dimmer switch, letting some transistors run at a lower voltage.
Other hardware techniques for saving energy include lowering the refresh rate and reducing the voltage of memory chips. Simulations of such hardware show that running EnerJ would cut energy use by about 20 to 25 percent on average.
For one programme, the energy saved was almost 50 percent. Combining the software and hardware methods, the researchers believe, could cut power use by about 90 percent.
These findings will be presented next week in San Jose at the annual conference on Programming Language Design and Implementation.