Thursday, May 28, 2015

Breaking Down Barriers: Streamlining Data Management to Boost Knowledge Sharing

Research in the pharmaceutical and industrial science sectors has become increasingly global, multidisciplinary and data-intensive. The trend is evident in the evolution of patent approvals, which are also a reliable measure of innovation in these industries. Innovation itself, of course, is a cumulative effect, requiring access to multiple fragments of knowledge from disparate sources and the exchange of technology and ideas.

While the benefits of innovation in such a competitive environment are clear, investment in research is driven primarily by companies' strategic behavior and by the importance of market share. Patents and publications help establish corporate reputation, enable controlled technology transfer through strategic joint ventures, and raise barriers that prevent competitors from eroding market share.

Enron becomes unlikely data source for computer science researchers

Computer science researchers have turned to unlikely sources, including Enron, to assemble huge collections of spreadsheets that can be used to study how people use this software. The goal is for the data to facilitate research to make spreadsheets more useful.

"We study spreadsheets because spreadsheet software is used to track everything from corporate earnings to employee benefits, and even simple errors can cost organizations millions of dollars," says Emerson Murphy-Hill, an assistant professor of computer science at NC State and co-author of two new papers on the work.
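As a rough illustration of how such a corpus might be gathered, here is a minimal sketch of collecting spreadsheet files from a directory tree; the function name, directory layout, and file extensions are assumptions for illustration, not details from the papers:

```python
# Hypothetical first step in building a spreadsheet corpus: recursively
# gather spreadsheet files from a document dump (e.g. a released email
# archive) by file extension.
from pathlib import Path

def collect_spreadsheets(root):
    """Return all spreadsheet files under `root`, sorted by path."""
    exts = {".xls", ".xlsx"}
    return sorted(p for p in Path(root).rglob("*")
                  if p.suffix.lower() in exts)
```

Real corpus construction would also deduplicate files and filter out corrupted ones, but extension-based discovery is the obvious starting point.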

Computer scientists speed up mine detection

Computer scientists at the Univ. of California, San Diego, have combined sophisticated computer vision algorithms and a brain-computer interface to find mines in sonar images of the ocean floor. The study shows that the new method speeds up detection considerably compared with existing methods—mainly visual inspection by a mine detection expert.

“Computer vision and human vision each have their specific strengths, which combine to work well together,” said Ryan Kastner, a professor of computer science at the Jacobs School of Engineering at UC San Diego. “For instance, computers are very good at finding subtle, but mathematically precise patterns while people have the ability to reason about things in a more holistic manner, to see the big picture. We show here that there is great potential to combine these approaches to improve performance.”
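The division of labor Kastner describes can be sketched as a simple triage loop, where a detector scores candidates and only the ambiguous ones are routed to a human analyst. Every name and threshold below is hypothetical, not taken from the UC San Diego system:

```python
# Hypothetical two-stage pipeline: a computer-vision detector scores sonar
# image patches, and only uncertain candidates go to the human/BCI stage.
def triage(patches, score_fn, auto_accept=0.9, auto_reject=0.2):
    """Split patches into confident detections, confident clutter,
    and ambiguous cases needing human review."""
    accepted, rejected, needs_review = [], [], []
    for p in patches:
        s = score_fn(p)
        if s >= auto_accept:
            accepted.append(p)       # machine is confident: mine-like
        elif s <= auto_reject:
            rejected.append(p)       # machine is confident: clutter
        else:
            needs_review.append(p)   # ambiguous: route to human analyst
    return accepted, rejected, needs_review
```

The point of such a split is exactly the complementarity in the quote: the machine handles the mathematically clear-cut cases at scale, while human judgment is reserved for the patches where holistic reasoning matters.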

New chip architecture may provide foundation for quantum computer

Quantum computers are in theory capable of simulating the interactions of molecules at a level of detail far beyond the capabilities of even the largest supercomputers today. Such simulations could revolutionize chemistry, biology and material science, but the development of quantum computers has been limited by the difficulty of increasing the number of quantum bits, or qubits, that encode, store and access large amounts of data.

In a paper appearing in the Journal of Applied Physics, a team of researchers at Georgia Tech Research Institute and Honeywell International has demonstrated a new device that allows more electrodes to be placed on a chip—an important step that could help increase qubit densities and bring us one step closer to a quantum computer that can simulate molecules or perform other algorithms of interest.

The next step in DNA computing

Conventional silicon-based computing, which has advanced by leaps and bounds in recent decades, is pushing against its practical limits. DNA computing could help take the digital era to the next level. Scientists are now reporting progress toward that goal with the development of a novel DNA-based GPS. They describe their advance in The Journal of Physical Chemistry B.

Jian-Jun Shu and colleagues note that Moore’s law, which marked its 50th anniversary in April, posited that the number of transistors on a computer chip would double every year. This doubling has enabled smartphone and tablet technology that has revolutionized computing, but continuing the pattern will come with high costs. In search of a more affordable way forward, scientists are exploring the use of DNA for its programmability, fast processing speeds and tiny size. So far, they have been able to store and process information with the genetic material and perform basic computing tasks. Shu’s team set out to take the next step.
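Moore's doubling is easy to put in numbers; the starting count below is illustrative, not a figure from the article:

```python
# Back-of-envelope Moore's-law projection: transistor count after
# `years` of exponential growth, doubling every `doubling_period` years.
def projected_transistors(initial, years, doubling_period=1):
    """Project transistor count, assuming one doubling per period."""
    return initial * 2 ** (years // doubling_period)

# Fifty years of annual doubling (the original 1965 formulation),
# starting from a hypothetical 1,000-transistor chip:
fifty_year_count = projected_transistors(1000, 50)
```

Even from a modest starting point, fifty doublings multiply the count by 2^50 (about 10^15), which is why sustaining the trend in silicon has become so costly and why alternatives such as DNA computing draw interest.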

Computing at the speed of light

Univ. of Utah engineers have taken a step forward in creating the next generation of computers and mobile devices capable of speeds millions of times faster than current machines.

The Utah engineers have developed an ultracompact beamsplitter—the smallest on record—for dividing light waves into two separate channels of information. The device brings researchers closer to producing silicon photonic chips that compute and shuttle data with light instead of electrons. Electrical and computer engineering associate professor Rajesh Menon and colleagues describe their invention in Nature Photonics.

Silicon photonics could significantly increase the power and speed of machines such as supercomputers, data center servers and the specialized computers that direct autonomous cars and drones with collision detection. Eventually, the technology could reach home computers and mobile devices and improve applications from gaming to video streaming.

Digitizing neurons

Supercomputing resources at the U.S. Dept. of Energy (DOE)’s Oak Ridge National Laboratory (ORNL) will support a new initiative designed to advance how scientists digitally reconstruct and analyze individual neurons in the human brain. Led by the Allen Institute for Brain Science, the BigNeuron project aims to create a common platform for analyzing the 3-D structure of neurons.

Mapping the complex structures of individual neurons, which can contain thousands of branches, is a labor-intensive and time-consuming process when done by hand. BigNeuron’s goal is to streamline this process of neuronal reconstruction—converting two-dimensional microscope images of neurons into 3-D digital models.
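The raw input to such reconstruction pipelines is a stack of 2-D microscope slices assembled into a 3-D volume; here is a minimal sketch with synthetic data (array sizes and values are placeholders, not drawn from BigNeuron):

```python
# Minimal sketch: assemble a 3-D image volume from a stack of 2-D
# microscope slices, the starting point for neuron-tracing algorithms.
import numpy as np

# Three synthetic 4x4 slices standing in for real microscope images.
slices = [np.zeros((4, 4)) for _ in range(3)]
slices[1][2, 2] = 1.0  # a bright voxel (e.g. part of a neurite) mid-stack

# Stack along a new z-axis: shape (z, y, x).
volume = np.stack(slices, axis=0)
```

Reconstruction algorithms then trace connected bright paths through this volume to recover the neuron's branching tree, which is the step BigNeuron aims to benchmark and standardize across methods.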