ORNL’s computational expertise is built on a foundation of computer science, mathematics, and “big data”—or data science. The projects we undertake run the gamut from basic to applied research, and our ability to efficiently apply the massive computing power available at ORNL across a range of scientific disciplines sets us apart from other computing centers. We have decades of experience in developing applications to support basic science research in areas ranging from chemistry and materials science to fission and fusion, and we apply that expertise to solving problems in a number of other areas.
For example, our experience developing materials science applications has allowed us to build a “virtual” nuclear reactor that our scientists and industrial partners use to optimize current and future reactor designs. Similarly, our computational capabilities enable us to create highly detailed climate models and interactive simulations designed to improve the reliability and efficiency of the nation’s electric grid and transportation infrastructure.
ORNL has a 60-year history of computing stretching from Titan, currently one of the world’s most powerful supercomputers, back to ORACLE (Oak Ridge Automatic Computer and Logical Engine), the fastest computer in the world in 1954. Our experience in providing computational expertise and facilities to the U.S. Department of Energy has given us the in-depth understanding of computer science needed to wring the greatest scientific benefit from every dollar invested in these big machines.
Making the most of these world-class supercomputers requires a dedicated team of computer scientists, mathematicians, and computational scientists. Having a talented, interdisciplinary staff, and the resources to support them, also appeals to potential collaborators and prospective employees seeking broad opportunities for their interests and abilities.
Recent Research Highlights
January 02, 2014 — A team led by Oak Ridge National Laboratory’s Jeremy Smith, the director of ORNL’s Center for Molecular Biophysics and a Governor’s Chair at the University of Tennessee, has uncovered information that could help others harvest energy from plant mass. The team’s conclusion—that less ordered cellulose fibers bind less lignin—was published in the August issue of Biomacromolecules.
The Need for Speed
January 02, 2014 — Since the Oak Ridge Leadership Computing Facility’s (OLCF’s) Titan supercomputer began accepting its full suite of users on May 31, science has been picking up steam. With a hybrid architecture that pairs traditional CPUs with GPUs, Titan represents a new paradigm in high-performance computing’s quest to reach the exascale with only marginal increases in power consumption for the world’s leading systems.
Simulation Shuffles Protons and Electrons
December 16, 2013 — Advances may lead to better catalysts and more efficient solar energy. Plants solved the solar energy challenge billions of years ago, with photosynthesis. The sunlight-fueled process begins with two very plentiful molecules—carbon dioxide (CO2) and water (H2O).
OpenCore seeks to help businesses make best use of data
January 25, 2014 — Working as a research assistant at Oak Ridge National Laboratory, James Horey helped a number of federal agencies use their data more effectively.
Titan propels GE wind turbine research into new territory
January 17, 2014 — The share of global electricity supplied by wind, the world’s fastest growing energy source, is expected to increase from 2.5% to as much as 12% by 2020. Under ideal conditions, wind farms can be three times more efficient than coal plants, but going wherever the wind blows is not always so easy.
CORAL: the next big thing in supercomputing; next-gen machines for Oak Ridge, Argonne, Livermore
January 17, 2014 — Three of the nation’s premier national labs — Oak Ridge, Argonne, and Lawrence Livermore — are banding together to push the boundaries of supercomputing and acquire next-generation supercomputers for each of the labs at the best value for the U.S. government.