
Friday, April 5, 2013

Texas Unleashes Stampede for Science


By Aaron Dubrow, Texas Advanced Computing Center
Published: March 27, 2013

Texas Advanced Computing Center’s latest supercomputer powers transformative discoveries across science and engineering

You hear it before you see it — a roar like a factory in full production. But instead of cars or washing machines, this factory produces scientific knowledge.

Stampede, the newest supercomputer at the Texas Advanced Computing Center (TACC) and one of the most advanced scientific research instruments in the world, fills aisle after aisle of a new 11,000-square-foot data center on the J.J. Pickle Research Campus. Through the glass machine room doors, you can see 182 racks holding more than 500,000 interconnected computer processors. Inside, wind whips from in-row coolers, wires snake over the racks and chilled water courses below the floor as Stampede performs calculations on behalf of scientists and engineers nationwide.

Stampede Research Bits

Check out examples of research already underway using Stampede’s supercomputing power.

Getting Beneath the Tip of the Iceberg
Omar Ghattas’ team at The University of Texas at Austin is using Stampede to better understand and represent the flow of ice from Antarctica into the sea using detailed numerical models.

Predicting the Big One
Thomas Jordan, of the Southern California Earthquake Center, is using Stampede to forecast the frequency of damaging earthquakes in California.

Analyzing Audio
Researchers in the social sciences, digital humanities and arts are using Stampede to enable new discoveries. Stampede provides high-performance computing and large-scale visualization to sound archivists to help them search for patterns and gain insights into spoken language and music.

Over the past year TACC staff designed, built and deployed Stampede, working closely with Dell and Intel engineers and university researchers. TACC and The University of Texas at Austin competed against the top supercomputing centers and universities to claim one of the most advanced systems in the world — and won. The award was funded by the National Science Foundation (NSF) with an estimated investment of more than $50 million over a four-year period. The project may be renewed in 2017, which would enable four more years of open science research.

According to the November 2012 Top 500 list of supercomputers, Stampede is the seventh-most powerful advanced computing system on the planet and the most powerful in the U.S. dedicated to academic research, capable of outperforming 100,000 home computers.
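That "100,000 home computers" figure can be sanity-checked with back-of-the-envelope arithmetic. The numbers below are assumptions for illustration, not from the article: Stampede's theoretical peak was on the order of 10 petaflops, and a typical 2013 home desktop managed on the order of 100 gigaflops.

```python
# Rough sanity check of the "100,000 home computers" comparison.
# Both figures are order-of-magnitude assumptions, not from the article.
stampede_peak_flops = 10e15   # ~10 petaflops peak (assumed)
home_pc_flops = 100e9         # ~100 gigaflops for a 2013 desktop (assumed)

equivalent_pcs = stampede_peak_flops / home_pc_flops
print(f"roughly {equivalent_pcs:,.0f} home computers")
```

Under those assumed figures, the ratio comes out at about 100,000, consistent with the comparison in the article.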

On Wednesday, March 27, leaders from government, academia and industry, including The University of Texas at Austin’s President Bill Powers, Jay Boisseau of TACC, Marius Haas of Dell, Diane Bryant of Intel, Farnam Jahanian of the NSF and U.S. Congressman Lamar Smith, will dedicate Stampede and kick off a new era of advanced computing at the university and nationally.

“This is a proud moment for The University of Texas,” President Powers says. “Stampede will serve UT and the nation as it enables scientific exploration that would not otherwise be possible, and it continues TACC’s tradition of providing powerful, comprehensive and leading-edge advanced computing technologies to the open science community.”

As its name suggests, Stampede harnesses the power of a half-million computer processors and combines them to tackle ever larger and more challenging computational problems. Sixteen times more powerful than the recently decommissioned Ranger system (which was most recently ranked as the 50th-fastest supercomputer in the world), Stampede will enable scientists to address new classes of problems they’ve never been able to approach before.

“How often does a scientist get an instrument that’s an order of magnitude more powerful than the one it replaces?” says Omar Ghattas, the John A. and Katherine G. Jackson Chair in Computational Geosciences. “It’s a massive step.”

In recent years, supercomputers have become critical general-purpose instruments for conducting scientific research. Known as the “third pillar” of science, computer simulations and models complement theory and experimentation and allow researchers to explore phenomena that cannot be captured via observation or laboratory experiments.

Supercomputers also allow scholars to mine massive databases of information for digital needles-in-haystacks that otherwise would go unnoticed — such as subtle changes in DNA or the signs of a newly discovered galaxy in a far corner of the universe.

Closer to home, the weather report you checked before going outside, the car you drove to work, the flu shot that protected you from illness — all of these were, at least in part, designed, improved or predicted by a supercomputer.

The reason supercomputers are so important is simple: The universe is governed by mathematical equations, and computers can solve these equations far faster than humans. Enormous supercomputers, like Stampede, enable researchers to solve scientific problems that humans alone would find impossible — the kind that help predict where a hurricane will make landfall, how a new drug will interact with its target, or what mutations in our genetic code make us prone to developing certain diseases.

Stampede acts as a “computational microscope” that allows scientists to explore the inner dynamics of the cell better than with the best imaging devices; helps astronomers peer deeper into the universe’s past than is possible with the most powerful telescopes; enables researchers to develop new materials to remove CO2 from the atmosphere; identifies brain tumors more accurately; and discovers new medicines faster and less expensively than in a laboratory. Stampede will even help researchers in the digital arts and humanities study literature and music in ways they never imagined before.

In the past, supercomputers were often used for a small subset of science and engineering problems. But systems like Stampede, with its comprehensive capabilities, allow many more users to simulate, visualize, analyze, store and share their knowledge with others around the world.

In the first three months of operations, approximately 600 projects and more than 1,200 scientists have used Stampede, which came online in January 2013. These include top researchers in every field of inquiry from mechanical engineering to linguistics to neuroscience. In its lifetime, Stampede is expected to deliver the equivalent of more than 400,000 years of computing to tens of thousands of scientists. Imagine the results Stampede will enable across all fields of knowledge.

Read more about research already underway using Stampede:

Improving Brain Tumor Imaging

The Chemistry of Water

A New Era in Computational Biology

Carbon Dioxide Capture and Conversion

_____________________________________________________________

ResearchBlogging.org
Aaron Dubrow (2013).
Texas Unleashes Stampede for Science
UT – Texas Advanced Computing Center

_____________________________________________________________




Thursday, April 4, 2013

Video Tip of the Week: Enzyme Portal and User-Centered Design

This week’s video tip of the week introduces Enzyme Portal, an interface from the EBI for exploring data about enzymes, proteins with important catalytic roles. In the video, Jenny Cham, one of the authors of the paper below, takes you through the main features of their newly designed resource.

I learned about the new effort from this blog post at BMC: Designing better web experiences for bioinformatics. In this post, the team talks about the backstory and the philosophy of user-centered design that they employed to create the site. They also note that the article describes not only their experience, but also offers guidance for people who might be building resources of their own.

The resource they deliver provides categorized and integrated information about the proteins, genes, EC numbers, structures, pathways, disease relationships, small molecules, and the literature. So from that perspective it might sound similar to other resources. But their re-organization of that data into easy tab navigation, and the quick way to switch among species, make it easier to use than some other resources I’ve tried. I do like the quick access to graphical representations like the ones you can see on this page: http://www.ebi.ac.uk/enzymeportal/search/P09104/reactionsPathways . And I like that they link to Reactome, which is one of my preferred pathway resources. But there are links to many other useful tools and resources as well, exactly the ones I’d expect to need when seeking out more details.

In the paper I liked their summary of the “challenges” associated with applying user-centered design (UCD) to bioinformatics. I have seen some of the resistance to this first-hand, beginning over 15 years ago when a friend of mine was trying really hard to encourage usability and design for bioinformatics tools (right Michael?). And getting very little support for that. Alas. I hope people begin to appreciate this at some point….

So have a look and think about how you might use this tool. And offer them feedback; I’m sure they’d want your input. If you are creating tools for end-users, think about ways you might incorporate some of their strategies. A lot of tools I’ve seen could benefit from a bit more thought about how they’ll be used by people who don’t write the code.

Quick link:

Enzyme Portal at EBI: http://www.ebi.ac.uk/enzymeportal/

Reference:

de Matos, P., Cham, J., Cao, H., Alcántara, R., Rowland, F., Lopez, R., & Steinbeck, C. (2013). The Enzyme Portal: a case study in applying user-centred design methods in bioinformatics. BMC Bioinformatics, 14(1). DOI: 10.1186/1471-2105-14-103




Wednesday, April 3, 2013

X-ray Laser Explores How to Write Data with Light


A look inside the RCI sample chamber while researchers close up the chamber for vacuum for an experiment at LCLS. (Credit: Diling Zhu/SLAC)


As the laser light hits the sample, iron spin currents are generated that transfer their angular momentum to gadolinium spins within nanoscale regions of the sample. (Credit: Greg Stewart/SLAC)

By Glenn Roberts Jr.

Using laser light to read and write magnetic data by quickly flipping tiny magnetic domains could help keep pace with the demand for faster computing devices.

Now experiments with SLAC’s Linac Coherent Light Source (LCLS) X-ray laser have given scientists their first detailed look at how light controls the first trillionth of a second of this process, known as all-optical magnetic switching.

The experiments show that the optically induced switching of the magnetic regions begins much faster than conventional switching and proceeds in a more complex way than scientists had thought – a level of detail long sought by the data storage industry, which is eager to learn more about the key drivers of optical switching. The new insight could help guide efforts to engineer materials that better control and speed this process.

“This is really one of the first examples of new materials science that can be done with LCLS, which allows you to look at very short time scales and very small length scales,” said Hermann Dürr, a staff scientist for the Stanford Institute for Materials and Energy Sciences (SIMES) and a principal investigator of the multinational team that performed the experiment, detailed in the March 17 issue of Nature Materials. SIMES is a joint institute of SLAC and Stanford.

The experiments were performed on tiny samples of a metallic alloy containing iron, cobalt and gadolinium, a combination singled out years ago by the data storage industry for its unique magnetic properties.

In magnetic storage devices, information is stored and retrieved by quickly flipping the orientations of electrons’ spin – which have two possible directions, similar to the poles of a magnet – to produce the equivalent of “ones” and “zeroes.” In conventional devices this is done by applying an electrical current or magnetic field. But the hope is that light can make the switch faster, and thus speed computing.

Watch a slideshow: http://www.flickr.com//photos/slaclab/sets/72157633026273379/show/

In the LCLS experiments, scientists started the switching process by hitting a sample with ultrafast pulses of visible light. Then they hit the same sample with a carefully timed pulse from the LCLS X-ray laser, and used a technique called X-ray scattering to probe the evolution of the switching process during the first trillionth of a second. This revealed previously unknown activity at the near-atomic scale of the magnetic regions.

The results showed that the optical laser flipped the magnetic state of the material up to 1,000 times faster, as well as more efficiently, than magnetic switching used in current commercial devices.

The experiments also revealed that the elements within the sample were distributed less uniformly at the nanoscale than previously believed, and that this strongly affected the switching process.

The sample was found to contain tiny regions that were rich in iron or gadolinium. In the instant following the optical laser pulse, the electron spins of the two types of regions began to interact and align. This “spin current” flowed most actively from iron-rich to smaller gadolinium-rich areas, which appeared to serve as localized “traps” for transferring and switching spins.

“People are trying to understand how laser light interacts with spins on these ultrafast timescales in magnetic materials,” said Catherine Graves, a graduate student in the Department of Applied Physics at Stanford University who was a lead author of the research paper. “If you can understand why these areas are acting as spin traps – what’s happening microscopically – you can design that into a material.”

The varied landscape of the material at the nanoscale was unexpected, Graves added: “Nobody thought they would see this variation in the material” or realized it would greatly impact the optical switching process.

The team of researchers is planning follow-up LCLS experiments using faster detectors and a variety of samples, with a goal of finding better-optimized materials for optical switching at the nanoscale and better understanding and optimizing the switching process.

“There is still so much to learn about ultrafast magnetic processes,” said Alexander Reid, a SIMES research associate and another lead author of the research paper. “People are still trying to understand the mechanisms. This gives us a very nice piece of the puzzle.”

Participants in the LCLS experiment routinely conduct related research at SLAC’s Stanford Synchrotron Radiation Lightsource (SSRL), Lawrence Berkeley National Laboratory’s Advanced Light Source and other synchrotrons. This unique combination of scientific facilities is ideal, Reid said. “Only now, through tools like the LCLS and SSRL, can we really begin to examine how magnetism behaves at its fundamental length and time scales,” he said.

Collaborators in the research were from SLAC’s LCLS and SSRL; SIMES; Stanford University; Radboud University Nijmegen in the Netherlands; Nihon University in Japan; the Swiss Federal Institute of Technology (ETH); and the German Electron Synchrotron (DESY), Center for Free-Electron Laser Science (CFEL), PNSensor, Max Planck Institute for Extraterrestrial Physics and Jülich Research Center in Germany.

Citation: C. E. Graves et al., Nature Materials, 17 March 2013. DOI: 10.1038/nmat3597

_____________________________________________________________

ResearchBlogging.org
Glenn Roberts Jr. (2013).
X-ray Laser Explores How to Write Data with Light
SLAC National Accelerator Laboratory News

_____________________________________________________________




Tuesday, April 2, 2013

Modern Privacy: More Access to Cells than Toilets

Posted on March 27, 2013 by Katja Keuchenius

What’s happening to privacy in today’s world? At first sight, it seems to be growing. Most people in the world now live in a city, feeling fairly anonymous. An even larger number of people no longer have to defecate out in the open, but have access to a toilet. And then there’s the growing number of people who can talk in private on their cell phones. But beware: these are false senses of privacy.

In Nature’s Scientific Reports, researchers from Massachusetts and Belgium published their findings on what you can do with mobility data. With probably thousands of other people permanently moving around you, it might feel as if your own whereabouts easily get lost in the data pool of all the others. But it turns out the way you move around is very unique.

If someone knows just four recent locations of your phone, they can already trace your identity in at least 95 percent of cases. And it’s not only your phone calls that make you traceable. Think about all the apps that retrieve your geographic location and the wifi spots you use.
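Why so few points suffice can be illustrated with a toy simulation. This is a hedged sketch using synthetic, uniformly random traces, not the study’s real data or method; real mobility traces are far less uniform, but the same sparsity effect applies: each extra known point shrinks the set of matching people dramatically.

```python
import random

random.seed(42)

N_PEOPLE = 1000     # synthetic population (assumed, for illustration)
N_HOURS = 24        # one observed location per hour
N_LOCATIONS = 50    # possible antenna/cell locations

# Each person's trace: which location they were at during each hour.
traces = [
    [random.randrange(N_LOCATIONS) for _ in range(N_HOURS)]
    for _ in range(N_PEOPLE)
]

def fraction_unique(k, trials=500):
    """Fraction of trials in which k known (hour, location) points from a
    target's trace match that target and no one else in the population."""
    unique = 0
    for _ in range(trials):
        target = random.choice(traces)
        hours = random.sample(range(N_HOURS), k)
        points = [(h, target[h]) for h in hours]
        matches = sum(
            all(t[h] == loc for h, loc in points) for t in traces
        )
        if matches == 1:   # only the target fits all k points
            unique += 1
    return unique / trials

for k in (1, 2, 3, 4):
    print(f"{k} known point(s): {fraction_unique(k):.0%} of traces uniquely identified")
```

In this synthetic setup, one known point typically matches dozens of people, while four points almost always single out exactly one trace, which mirrors the paper’s 95-percent finding in spirit.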

Mobility data is among the most sensitive data currently being collected, and more and more people are getting plugged into this data pool. The UN recently found that of the 7 billion people on our planet, 4.5 billion have access to working toilets, while 6 billion have access to mobile phones.

Photo: Flickr, Scallop Holden
Source: Time, Phys.org
de Montjoye, Y.-A., Hidalgo, C. A., Verleysen, M., & Blondel, V. D. (2013). Unique in the Crowd: The privacy bounds of human mobility. Scientific Reports, 3. PMID: 23524645


