Indiana University Hosts Big Data and Extreme-Scale Computing Workshop
December 4, 2018 - High-performance computing researchers gathered at the Indiana University campus in Bloomington for the second Big Data and Extreme-Scale Computing Workshop (BDEC2), held November 28-30 in the Cyberinfrastructure Building.
Professors Geoffrey Fox and Judy Qiu, both in the Department of Intelligent Systems Engineering at the IU School of Informatics, Computing, and Engineering, jointly brought BDEC2 to IU. The workshop was funded by the US National Science Foundation and Intel Corp.
Participants from around the world - including national and university labs and supercomputing centers in Japan, China, Saudi Arabia, Spain and the United States - brainstormed strategies for converging big data with the high-performance computing tools required to understand massive datasets.
How big is big data? By 2020, when the amount of digital data is expected to reach 40 zettabytes, the number of devices connected to the network - sensors, actuators, instruments, computers and data storage - is expected to reach 50 billion, roughly five times the planet's projected population for that year.
"Machine learning and big data, in combination with edge computing, have been a disruptive force in HPC," said Pete Beckman, co-organizer of BDEC2 and director of the Northwestern-Argonne Institute of Science and Technology at Argonne National Laboratory. "Now people want to connect millions of smart devices - the Internet of Things - merge multiple datasets from satellites, medical databases, and more, and run them continuously on supercomputers, which is a challenge." With the increasing internationalization of science, Beckman said, any solution developed must be global. "Climate modeling, for example, isn't done in just one country; it draws on data from around the world and uses HPC tools in many different countries to understand that data."
Satoshi Matsuoka echoed Beckman's sentiments. He is director of the RIKEN Center for Computational Science, the largest high-performance computing center in Japan, where he oversees the K supercomputer and its upcoming exascale successor, the Post-K supercomputer.
"A solution to the convergence problem of big data and high-performance computing will not be achieved immediately, but we will solve it together with international cooperation," he said. "It's very important that we create infrastructure platforms and software components to bring it all together."
Matsuoka said a new concept is emerging in Japan: Society 5.0. In this fifth phase of human society, knowledge is drawn from data through artificial intelligence to address humanity's biggest problems, such as hunger, climate change and disease.
Jack Dongarra, of the University of Tennessee and Oak Ridge National Laboratory, is a BDEC2 co-organizer. He said the workshop was an ideal way to tackle a complex problem and advance scientific discovery.
"By bringing a number of communities together, we can focus on ideas that drive big data and extreme-scale computing forward without duplicating effort," he said. "BDEC2 makes ideas accessible to other groups of people, enabling collaboration among universities, countries and institutions in a way that benefits science as a whole."