Massive-Scale Analytics Applied to Real-World Problems
David A. Bader (Georgia Institute of Technology, US)
PASC18 PUBLIC LECTURE
This event is free of charge and open to the general public. The lecture is given in English.
Emerging real-world graph problems include detecting and preventing disease in human populations, revealing community structure in large social networks, and improving the resilience of the electric power grid. Unlike traditional applications in computational science and engineering, solving these societal problems at scale raises new challenges: the sparsity and lack of locality in the data; the need for research on scalable algorithms and for frameworks that solve these real-world problems on high-performance computers; and the need for improved models that capture the noise and bias inherent in torrential data streams.
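As a minimal illustration (not from the talk) of the sparsity and locality issue the abstract raises, the sketch below builds a Compressed Sparse Row (CSR) adjacency structure, a layout commonly used in large-scale graph analytics. Each vertex's neighbor list is stored contiguously, yet a traversal still jumps unpredictably across the arrays, which is exactly the poor memory locality the abstract refers to. Function names here are illustrative.

```python
def to_csr(num_vertices, edges):
    """Build CSR arrays (row offsets + column indices) from a directed edge list."""
    degree = [0] * num_vertices
    for src, _ in edges:
        degree[src] += 1
    # Prefix sum of degrees gives the start offset of each vertex's neighbor list.
    offsets = [0] * (num_vertices + 1)
    for v in range(num_vertices):
        offsets[v + 1] = offsets[v] + degree[v]
    cols = [0] * len(edges)
    fill = offsets[:-1].copy()  # next free slot within each row
    for src, dst in edges:
        cols[fill[src]] = dst
        fill[src] += 1
    return offsets, cols

def neighbors(offsets, cols, v):
    """Neighbors of v are one contiguous slice, but following them scatters
    subsequent accesses across the whole structure."""
    return cols[offsets[v]:offsets[v + 1]]

# Tiny example graph: 4 vertices, 4 edges.
offsets, cols = to_csr(4, [(0, 1), (0, 2), (1, 3), (2, 3)])
```

For a sparse graph with millions of vertices, the two flat arrays keep memory use proportional to edges rather than vertices squared, at the cost of the irregular access pattern noted above.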
In this talk, Bader will discuss the opportunities and challenges in massive data-intensive computing for applications in social sciences, physical sciences, and engineering.
Video available here.
David Bader is Professor and Chair of the School of Computational Science and Engineering at Georgia Institute of Technology, and is regarded as one of the world's leading experts in data science. His interests are at the intersection of high performance computing (HPC) and real-world applications, including cybersecurity, massive-scale analytics, and computational genomics. Bader has co-authored over 200 articles in peer-reviewed journals and conferences, and is an associate editor for high-impact publications including IEEE Transactions on Computers, ACM Transactions on Parallel Computing, and ACM Journal of Experimental Algorithmics. He is a Fellow of the IEEE and AAAS, and has served on a number of advisory committees in scientific computing and cyber-infrastructure, including the White House's National Strategic Computing Initiative. Bader has served as a lead scientist in several DARPA programs and is a co-founder of the Graph500 list, a rating of "Big Data" computing platforms. He was recognized as a "Rock Star of HPC" by InsideHPC and was named to HPCwire's "People to Watch" list in 2012 and 2014.
Challenges in the First Principles Modelling of Magneto Hydro Dynamic Instabilities and their Control in Magnetic Fusion Devices
Marina Becoulet (CEA/IRFM, France)
The main goal of the International Thermonuclear Experimental Reactor (ITER) project is the demonstration of the feasibility of future clean energy sources based on nuclear fusion in magnetically confined plasma. In the era of ITER construction, fusion plasma theory and modelling provide not only a deep understanding of a specific phenomenon, but moreover, modelling-based design is critical for ensuring active plasma control.
The most computationally demanding aspect of the project is first principles fusion plasma modelling, which relies on fluid models – such as Magneto Hydro Dynamics (MHD) – or increasingly often on kinetic models. The challenge stems from the complexity of the 3D magnetic topology, the large difference in time scales from Alfvénic (10⁻⁷ s) to confinement time (hundreds of seconds), the large difference in space scales from micro-instabilities (millimetres) to the machine size (a few metres), and most importantly, from the strongly non-linear nature of plasma instabilities, which need to be avoided or controlled.
The current status of first principles non-linear modelling of MHD instabilities and active methods of their control in existing machines and ITER will be presented, focusing particularly on the strong synergy between experiment, fusion plasma theory, numerical modelling and computer science in guaranteeing the success of the ITER project.
Marina Becoulet is a Senior Research Physicist in the Institute of Research in Magnetic Fusion at the French Atomic Energy Commission (CEA/IRFM). She is also a Research Director and an International Expert of CEA, specializing in the theory and modelling of magnetic fusion plasmas, in particular non-linear MHD phenomena. After graduating from Moscow State University (Physics Department, Plasma Physics Division) in 1981, she obtained a PhD in Physics and Mathematics from the Institute of Applied Mathematics of the Russian Academy of Sciences (1985). She worked at the Russian Academy of Sciences in Moscow and on the Joint European Torus in the UK, and since 1998 has been employed at CEA/IRFM, France.
Unraveling Earthquake Dynamics Through Extreme-Scale Multi-Physics Simulations
Alice Gabriel (Ludwig Maximilian University of Munich, Germany)
Earthquakes are highly non-linear multiscale problems, encapsulating the geometry and rheology of faults within the Earth's crust that are torn apart by propagating shear fracture, emanating seismic wave radiation.
This talk will focus on using physics-based scenarios, modern numerical methods and hardware-specific optimizations to shed light on the dynamics, and severity, of earthquake behaviour. It will present the largest-scale dynamic earthquake rupture simulation to date, which models the 2004 Sumatra-Andaman event, an unexpected subduction zone earthquake that generated a rupture over 1,500 km in length beneath the ocean floor, followed by a series of devastating tsunamis.
The core components of the simulation software will be described, highlighting the benefits of strong collaborations between domain and computational scientists. Lastly, future directions in coupling the short-term elastodynamics phenomena to long-term tectonics and tsunami generation will be discussed.
Video available here.
Alice-Agnes Gabriel is an Assistant Professor of Geophysics at Ludwig Maximilian University of Munich. She received a PhD in seismology from ETH Zurich in 2013. She fuses expertise from Earth science, physics and computational mathematics to study the fundamentals of earthquake physics and develop methodological innovations for seismology. She is specifically interested in simulating waves and rupture processes within arbitrarily complex geological structures to enhance classic probabilistic seismic hazard assessment and a wide range of industry applications. Her career is distinguished by first-rate earthquake scenarios realized on some of the largest supercomputers worldwide.
Prediction: Use Science or History?
Eng Lim Goh (Hewlett Packard Enterprise, US)
PASC18 SPONSORED LECTURE
This event is promoted by Hewlett Packard Enterprise (PASC18 Platinum Sponsor)
Traditionally, scientific laws have been applied deductively: predicting the performance of a pacemaker before implant, the downforce of a Formula 1 car, the pricing of financial derivatives, or the motion of planets for a trip to Mars. With Artificial Intelligence, we are starting to also use the data-intensive inductive approach, enabled by the re-emergence of machine learning, which has been fueled by decades of accumulated data.
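The deductive/inductive contrast can be sketched with a toy example (mine, not the speaker's): predicting free-fall distance from the known law d = ½gt², versus recovering g from observations by least squares. All names here are illustrative.

```python
def deductive_fall_distance(t, g=9.81):
    """Deductive: apply the known physical law d = 1/2 * g * t^2 directly."""
    return 0.5 * g * t ** 2

def inductive_estimate_g(samples):
    """Inductive: estimate g from (time, distance) observations by least
    squares on the model d = g * x, where x = t^2 / 2 (closed form,
    since there is a single parameter)."""
    num = sum(0.5 * t ** 2 * d for t, d in samples)
    den = sum((0.5 * t ** 2) ** 2 for t, _ in samples)
    return num / den

# "Observations" generated from the law itself, standing in for measured data.
data = [(t, deductive_fall_distance(t)) for t in (1.0, 2.0, 3.0)]
g_learned = inductive_estimate_g(data)
```

With noiseless data the fit recovers g exactly; with real measurements the inductive estimate improves as data accumulates, which mirrors the abstract's point about machine learning being fueled by decades of accumulated data.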
Video available here.
Eng Lim Goh is the VP and CTO for HPC and AI at Hewlett Packard Enterprise. His current research interest is the progression from data-intensive computing to analytics, inductive machine learning, deductive reasoning, and artificial specific-to-general intelligence. In collaboration with NASA, he is currently principal investigator of a year-long experiment aboard the International Space Station; this project won both the 2017 HPCwire Top Supercomputing Achievement Award and the Hyperion Research Innovation Award. In 2005, InfoWorld named Dr. Goh one of the 25 Most Influential CTOs in the world. He was included twice in the HPCwire list of People to Watch. In 2007, he was named a Champion 2.0 of the industry by BioIT World magazine and received the HPC Community Recognition Award from HPCwire.
Dr. Goh completed his postgraduate work at Cambridge University, UK. He has been granted six U.S. patents with three pending.
From Weather Dwarfs to Kilometre-Scale Earth System Simulations
Nils P. Wedi (ECMWF, UK)
The European Centre for Medium-Range Weather Forecasts (ECMWF) leads a number of Horizon 2020 activities (ESCAPE) with innovation actions for developing a holistic understanding of energy efficiency for extreme-scale applications on heterogeneous HPC architectures by: (a) defining and encapsulating the fundamental algorithmic building blocks ("Weather and Climate Dwarfs") underlying weather and climate services; (b) combining frontier research on algorithm development with hardware adaptation using domain-specific languages (DSLs); (c) developing benchmarks and cross-disciplinary Verification, Validation, and Uncertainty Quantification (VVUQ) for weather and climate applications; and (d) synthesizing the complementary skills of global numerical weather prediction with those of leading European researchers.
This talk will illustrate the need for and practicality of producing ensembles of km-scale simulations, summarize progress on accelerating state-of-the-art global weather and climate predictions, and discuss outstanding issues and future directions on producing and analysing big weather data while balancing time-critical customer needs with energy- and time-to-solution.
Video available here.
Nils P. Wedi has a PhD from Ludwig Maximilian University of Munich and joined ECMWF in 1995.
His career at ECMWF encapsulates a diverse range of work both technical and scientific. He leads ECMWF's Earth System Modelling section that addresses all aspects of scientific and computational performance relating to ECMWF's forecast model and the ensemble forecasting system. He develops strategies to secure the scalability of the model on future high-performance computing systems. He is the scientific coordinator of the European H2020 projects ESCAPE and ESCAPE-2 to address the challenges of rising energy cost for computing towards affordable, exascale high performance simulations of weather and climate, and he is a member of the World Meteorological Organization working group on numerical experimentation (WGNE).