141662998
submission
aarondubrow writes:
A study in the Bulletin of the Seismological Society of America presents results from RSQSim, a new earthquake simulator that recreates hundreds of thousands of years of seismic history in California. Coupled with another code, CyberShake, the framework can calculate the amount of shaking each simulated quake would produce. The framework makes use of two of the most powerful supercomputers on the planet: Frontera, at the Texas Advanced Computing Center, and Summit, at Oak Ridge National Laboratory.
The new approach improves seismologists' ability to pinpoint how big an earthquake might strike a given location and how hard the ground would shake there, allowing building code developers, architects, and structural engineers to design more resilient buildings that can survive earthquakes.
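The article doesn't include code, but the core hazard-curve idea is simple to sketch: given a long synthetic catalog of earthquakes and an estimate of the shaking each one produces at a site, count how often the shaking exceeds a given level. The catalog and the attenuation function below are invented for illustration and are not taken from RSQSim or CyberShake:

```python
# Minimal sketch of a hazard curve from a synthetic earthquake catalog.
# All numbers here are hypothetical, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
YEARS = 100_000                      # span of the hypothetical synthetic catalog
N_EVENTS = 5_000                     # hypothetical number of catalog events

# Invented catalog: magnitude and distance-from-site (km) for each event.
mags = rng.uniform(5.0, 8.0, size=N_EVENTS)
dists = rng.uniform(5.0, 200.0, size=N_EVENTS)

def peak_ground_accel(mag, dist_km):
    """Toy attenuation relation: shaking grows with magnitude, decays with distance."""
    return 10 ** (0.3 * mag - 1.5 * np.log10(dist_km + 10.0) - 0.7)

pga = peak_ground_accel(mags, dists)   # shaking at the site for every event, in g

# Annual rate of exceeding each shaking level: the site's hazard curve.
for level in (0.05, 0.1, 0.2):
    rate = (pga > level).sum() / YEARS
    print(f"PGA > {level:.2f} g: {rate:.2e} exceedances per year")
```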
135162729
submission
aarondubrow writes:
NASA's Jet Propulsion Laboratory (JPL) is developing autonomous capabilities that could allow future Mars rovers to travel farther and faster and do more science. Training machine learning models on the Maverick2 supercomputer at the Texas Advanced Computing Center, the team developed and optimized models for Drive-By Science and Energy-Optimal Autonomous Navigation. The team presented the results of this work at the IEEE Aerospace Conference in March 2020. The project was a finalist for the NASA Software Award.
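The summary doesn't describe the algorithms, but energy-optimal navigation is at heart a shortest-path problem with energy as the edge cost. A minimal sketch (not JPL's model; the terrain map and cost constants are invented) using Dijkstra's algorithm over a grid where climbing costs extra:

```python
# Rough illustration of energy-optimal path planning: Dijkstra's algorithm
# over a terrain grid, with an invented cost model that penalizes uphill moves.
import heapq

elev = [
    [0, 1, 2, 4],
    [1, 1, 3, 5],
    [0, 2, 2, 3],
    [0, 1, 1, 2],
]
ROWS, COLS = len(elev), len(elev[0])

def energy_cost(a, b):
    """Base cost per move, plus a penalty proportional to elevation gain."""
    climb = max(0, elev[b[0]][b[1]] - elev[a[0]][a[1]])
    return 1.0 + 2.0 * climb

def cheapest_path_cost(start, goal):
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            return d
        if d > dist.get(cell, float("inf")):
            continue                         # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < ROWS and 0 <= nc < COLS:
                nd = d + energy_cost(cell, (nr, nc))
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

print(cheapest_path_cost((0, 0), (3, 3)))    # total energy for the cheapest route
```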
115018674
submission
aarondubrow writes:
The Texas Advanced Computing Center (TACC) at The University of Texas at Austin today launched Frontera, the fastest supercomputer at any university and the fifth most powerful system in the world. TACC is also home to Stampede2, the second-fastest supercomputer at any American university. The launch of Frontera solidifies UT Austin's place among the world's academic leaders in advanced computing.
Frontera has been supporting science applications since June and has already enabled more than three dozen teams to conduct research on a range of topics from black hole physics to climate modeling to drug design, employing simulation, data analysis, and artificial intelligence at a scale not previously possible.
First announced in August 2018, Frontera was built in early 2019 and earned the #5 spot on the twice-annual TOP500 list in June, achieving 23.5 PetaFLOPS (23.5 quadrillion floating-point operations per second) on the high-performance LINPACK benchmark, a measure of a system's computing power.
In August, Frontera added two new subsystems using technologies from NVIDIA, IBM and Green Revolution Cooling (GRC); these provide an additional 11 PetaFLOPS of single-precision performance and allow TACC to explore alternate computational architectures for the future.
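For context on the LINPACK figure: the HPL benchmark solves a dense n-by-n linear system, conventionally counted as roughly (2/3)n^3 + 2n^2 floating-point operations, and the reported rate is that count divided by runtime. A back-of-the-envelope sketch (the problem size below is hypothetical; only the 23.5 PetaFLOPS figure comes from the article):

```python
# What 23.5 PFLOPS means in practice for an HPL-style dense solve.
RMAX = 23.5e15            # sustained flop/s reported for Frontera's HPL run

def hpl_flops(n):
    """Standard operation count for solving a dense n-by-n linear system."""
    return (2.0 / 3.0) * n**3 + 2.0 * n**2

n = 8_000_000             # hypothetical matrix dimension, for illustration
seconds = hpl_flops(n) / RMAX
print(f"n = {n:,}: ~{hpl_flops(n):.2e} flops, ~{seconds / 3600:.1f} hours at 23.5 PFLOPS")
```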
108165540
submission
aarondubrow writes:
Supercomputers at the Texas Advanced Computing Center (TACC) made vital contributions to the first-ever image of a black hole, in the galaxy M87. Those systems helped lay the groundwork for black hole imaging and provided the theoretical foundation that allowed scientists to read the mass, underlying structure, and orientation of the black hole and its environment. Using data collected by the Event Horizon Telescope (EHT), a global network of radio telescopes, research teams employed TACC's Stampede1 and Stampede2 supercomputers to simulate the physical properties of M87 in three dimensions and predict observational features of the black hole. Further models used ray-tracing methods to render those dynamics into an image of how the black hole would appear from Earth. Another team used TACC's Jetstream cloud environment to develop cloud-based data analysis pipelines that combine the massive EHT data troves and share the data worldwide.
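The production codes trace rays along curved geodesics through general-relativistic simulation output; purely as a toy illustration of the rendering step, here is a flat-space version that integrates a made-up emissivity field along straight lines of sight to form an image:

```python
# Toy ray-marching renderer: sum emission along each line of sight through a
# 3-D volume to build a 2-D image. The glowing-ring emissivity is invented;
# real EHT-related codes bend rays through GRMHD output, which this skips.
import numpy as np

N = 64                                   # image is N x N pixels
xs = np.linspace(-2.0, 2.0, N)

def emissivity(x, y, z):
    """Invented ring of emission standing in for simulated plasma."""
    r_cyl = np.sqrt(x**2 + y**2)
    return np.exp(-((r_cyl - 1.0) ** 2) / 0.05) * np.exp(-(z**2) / 0.1)

zs = np.linspace(-2.0, 2.0, 200)         # samples along each ray
dz = zs[1] - zs[0]
image = np.zeros((N, N))
for i, x in enumerate(xs):
    for j, y in enumerate(xs):
        image[i, j] = emissivity(x, y, zs).sum() * dz   # straight ray along z

print(image.max())                       # brightest pixel; save or plot as needed
```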
99283547
submission
aarondubrow writes:
Astronauts and future space tourists face risks from radiation, which can cause illness and injure organs. Researchers from Texas A&M, NASA and the University of Texas Medical Branch used supercomputers at the Texas Advanced Computing Center to investigate the radiation exposure astronauts would have faced on the Manned Orbiting Laboratory mission, planned for the 1960s and '70s, a period during which a dangerous solar storm occurred. They also explored the historical limitations of radiation research and how those limitations could be addressed in future endeavors.
90813589
submission
aarondubrow writes:
Finding new drugs that can more effectively kill cancer cells or disrupt the growth of tumors is one way to improve survival rates for ailing patients. Researchers are using supercomputers at the Texas Advanced Computing Center to find new chemotherapy drugs and to test known compounds to determine if they can fight different types of cancer. Recent efforts have yielded promising drug candidates, potential plant-derived compounds and new target sites that can lead to more effective drugs.
82305663
submission
aarondubrow writes:
Researchers working on the Severe Hail Analysis, Representation and Prediction (SHARP) project at the University of Oklahoma used the Stampede supercomputer to gain a better understanding of the conditions that cause severe hail to form, and to produce hail forecasts with far greater accuracy than those currently used operationally. The team's model has six times finer resolution than the National Weather Service's highest-resolution official forecasts and applies machine learning algorithms to improve its predictions. The researchers will publish their results in an upcoming issue of the American Meteorological Society journal Weather and Forecasting.
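The summary doesn't say how the machine learning is applied; one common pattern in hail forecasting, sketched below with invented features and synthetic data (not the SHARP team's actual method, which is described in their paper), is to train a classifier on storm-scale model output to predict whether severe hail occurs:

```python
# Sketch of ML-assisted hail prediction: classify storms as severe-hail
# producers from model-derived features. Features, data, and model choice
# are all invented for this example.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2_000
# Hypothetical predictors: updraft speed (m/s), hail mass aloft (kg/m^2),
# freezing-level height (m).
X = np.column_stack([
    rng.uniform(5, 60, n),
    rng.uniform(0, 10, n),
    rng.uniform(2000, 5000, n),
])
# Synthetic label: stronger updrafts and more hail mass aloft make severe
# hail more likely in this toy world.
y = (0.04 * X[:, 0] + 0.2 * X[:, 1] - 0.0005 * X[:, 2]
     + rng.normal(0, 0.5, n)) > 1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```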
81855943
submission
aarondubrow writes:
Computer science researchers from the University of Rochester developed an app for health departments that uses natural language processing and artificial intelligence to identify likely food poisoning hot spots. Las Vegas health officials recently used the app, called nEmesis, to improve the city's inspection protocols and found it was 63% more effective at identifying problematic venues than the current state of the art. The researchers estimate that if every inspection in Las Vegas became adaptive, the approach could prevent over 9,000 cases of foodborne illness and 557 hospitalizations annually. The team presented the results at the 30th AAAI Conference on Artificial Intelligence in February.
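As a rough illustration of the NLP piece, a bag-of-words classifier can flag posts that likely describe foodborne illness. The training examples below are invented, and the real nEmesis system uses far richer models and data:

```python
# Minimal sketch of text classification for illness reports. Invented toy
# data; nEmesis itself is a much more sophisticated system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "got food poisoning at that buffet, been sick all night",
    "threw up after dinner there, never again",
    "stomach cramps since lunch at the taco place",
    "amazing pasta, great service, will be back",
    "the dessert menu here is incredible",
    "long wait but the burgers were worth it",
]
labels = [1, 1, 1, 0, 0, 0]   # 1 = likely illness report

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Prints the model's guess for a new post (1 = flag for inspectors).
print(model.predict(["felt sick an hour after dinner at their buffet"]))
```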
79537953
submission
aarondubrow writes:
Today, NSF, in partnership with DOD, NASA, NIH and USDA, announced $37 million in new awards to spur the development and use of co-robots: robots that work cooperatively with people. From unmanned vehicles that can inspect and fix ailing infrastructure to co-robots that collaborate with workers on manufacturing tasks, scientists and engineers are developing the next generation of robots that can handle critical tasks in close proximity to humans, providing unprecedented safety and resilience. This year, the initiative funded 66 new research proposals at 49 distinct institutions in 27 states.
77257209
submission
aarondubrow writes:
The National Science Foundation announced $74.5 million in grants for basic research in cybersecurity. Among the awards are projects to understand cryptocurrencies and make them more reliable; to invent technologies that broadly scan large swaths of the Internet and automate the detection and patching of vulnerabilities; and to establish the science of censorship resistance by developing accurate models of censors' capabilities. According to NSF, long-term support for fundamental cybersecurity research has resulted in public key encryption, software security bug detection, spam filtering and more.
76406769
submission
aarondubrow writes:
Robin Murphy, director of the Center for Robot-Assisted Search and Rescue at Texas A&M University and one of the leading researchers in the field of disaster robotics, has used robots and UAVs for search-and-rescue missions and structural inspections during more than 20 disasters, from 9/11 to Katrina to Fukushima to the 2015 Texas floods. The Huffington Post carried a story in which she describes five lessons she's learned from her robot deployments and research.
76129133
submission
aarondubrow writes:
To its advocates and participants, the Maker Movement embodies the characteristics that many believe make America great: independence and ingenuity, creativity and resourcefulness. But as impressive as today's tools are, they're not accessible to many Americans simply because of their cost and the high technological barrier to entry. An article in the Huffington Post describes efforts supported by the National Science Foundation and other federal agencies to create new tools, technologies and approaches that make the Maker Movement more inclusive and democratic.
71660173
submission
aarondubrow writes:
The National Science Foundation funds basic cyberlearning research and, since 2011, has awarded roughly 170 grants, totaling more than $120 million, to EdTech research projects around the country. However, NSF's approach to cyberlearning has been different from other public, private and philanthropic efforts: NSF funds compelling ideas, helps rigorously test them and then assists in transitioning the best of them from research to practice. A story in the Huffington Post describes seven examples of leading cyberlearning projects, from artificial intelligence to augmented reality, that are transforming education.
70023811
submission
aarondubrow writes:
Automakers have presented a vision of the future in which the driver can check email, chat with friends or even sleep while shuttling between home and the office. To AI experts, however, it's not clear that this vision is realistic. In many areas, including driving, we'll go through a long period in which humans act as co-pilots or supervisors before the technology reaches full autonomy (if it ever does). In such a scenario, the car needs to communicate with the driver to alert him or her when it's time to take over control. And in cases where the driver is unresponsive, the car must be able to decide autonomously to move safely to the side of the road and stop. Researchers from the University of Massachusetts Amherst have developed "fault-tolerant planning" algorithms that allow semi-autonomous machines to devise and enact a "Plan B."
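The paper formulates this as fault-tolerant planning; purely as an illustration of the control-transfer idea (not the UMass algorithms themselves), the fallback logic can be sketched as a simple state machine:

```python
# Toy state machine for handing control to the driver, with an autonomous
# pull-over as "Plan B" if the driver stays unresponsive. The alert limit
# is hypothetical; this only shows the idea, not the actual planner.
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    ALERTING = auto()
    DRIVER_CONTROL = auto()
    PULL_OVER = auto()       # Plan B: stop safely at the roadside

MAX_ALERTS = 3               # hypothetical: alerts before giving up on the driver

def step(mode, driver_responsive, alerts_sent):
    """One decision step; returns (new_mode, alerts_sent)."""
    if mode is Mode.AUTONOMOUS:
        return Mode.ALERTING, 0              # situation needs a human: start alerting
    if mode is Mode.ALERTING:
        if driver_responsive:
            return Mode.DRIVER_CONTROL, alerts_sent
        if alerts_sent + 1 >= MAX_ALERTS:
            return Mode.PULL_OVER, alerts_sent + 1
        return Mode.ALERTING, alerts_sent + 1
    return mode, alerts_sent                 # terminal modes stay put

mode, alerts = Mode.AUTONOMOUS, 0
for responsive in (False, False, False, False):   # driver never responds
    mode, alerts = step(mode, responsive, alerts)
print(mode)   # Mode.PULL_OVER
```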
67018411
submission
aarondubrow writes:
As supercomputing becomes central to the work and progress of researchers in all fields, new kinds of computing resources and more inclusive modes of interaction are required. Today, the National Science Foundation (NSF) announced $16 million in awards to support two new supercomputing acquisitions for the open science community. The systems — "Bridges" at the Pittsburgh Supercomputing Center (PSC) and "Jetstream," co-located at the Indiana University Pervasive Technology Institute (PTI) and The University of Texas at Austin's Texas Advanced Computing Center (TACC) — respond to the needs of the scientific computing community for more high-end, large-scale computing resources while helping to create a more inclusive computing environment for science and engineering.