What do High-Performance Computing and exascale computing mean?
High-Performance Computing (HPC) is a branch of computing that deals with scientific and engineering tasks so computationally demanding that they cannot be performed on general-purpose computers. The machines used in HPC are often referred to as supercomputers.
The next supercomputing frontier is exascale performance, i.e. at least 10^18 (one billion billion) calculations per second, which is expected to be reached around 2021-2022.
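To get a feel for that number, here is a rough back-of-the-envelope illustration (not from the source; the 1 gigaFLOPS figure for an ordinary processor core is an assumption for the sake of the comparison):

```python
# Illustration: the scale of exascale computing.
EXA = 10**18           # exaFLOPS: floating-point operations per second
laptop_flops = 10**9   # assumed ~1 gigaFLOPS general-purpose machine

# Seconds a 1 GFLOPS machine would need to match one second of exascale work
seconds = EXA / laptop_flops
years = seconds / (365 * 24 * 3600)
print(f"{seconds:.0e} seconds = about {years:.0f} years")
```

In other words, one second of exascale computation corresponds to roughly 32 years of work on the assumed general-purpose machine.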
What kind of scientific, industrial and societal challenges can HPC address?
Given its inter-disciplinary nature and its ability to process large amounts of data and carry out complex computations, HPC is essential to address a wide range of scientific, industrial and societal challenges:
- Earth science and Climate:
HPC underpins climate study and prediction by enabling more accurate, real-time weather forecasting, the prediction and management of large-scale natural disasters such as devastating hurricanes, and the study of ocean behaviour. Severe weather cost 149,959 lives and EUR 270 billion in economic damages in Europe between 1970 and 2012. Weather forecasting depends heavily on numerical simulation on supercomputers. The more powerful the supercomputer, the more precisely and the further in advance climate scientists can predict the size and paths of storms and floods, helping decision-makers to activate early warning systems in time to evacuate the population and save human lives.
HPC technologies also enable ever higher-resolution simulations of climate change (for example, studying the behaviour of the oceans) and more precise monitoring of the evolution of Earth's resources. HPC also improves our knowledge of geophysical processes and of the structure of the Earth's interior, which helps us better understand natural disasters such as earthquakes. To create earthquake hazard maps, seismologists use ground-motion data recorded by more than 10,000 seismometers installed worldwide, and processing this huge amount of data is only feasible on powerful HPC infrastructures.
- Secure, clean and efficient energy:
HPC is a critical tool in designing renewable energy parks and high-performance photovoltaic materials, in testing new and more efficient materials for solar panels, and in optimising turbines for electricity production. For instance, the commercial viability of wind farms can be predicted with accurate wind resource assessments, farm design and short-term micro-scale wind simulations to forecast daily power production. For the wind energy industry, HPC is a crucial tool, especially at sites with complex terrain.
Thanks to fusion energy, today's nuclear power plants could one day be replaced by a safer, greener and virtually inexhaustible power source. Current experimental fusion reactors use HPC to simulate the behaviour of fusion plasma, including instabilities, turbulent transport, plasma-wall interaction and heating.
- Health, demographic change and wellbeing:
HPC is a driving force in developing new forms of medicine. Personalised and precision medicine strongly relies on HPC to process information about a person's genes, proteins, and environment to prevent, diagnose, and treat diseases. For example, in the case of cancer, each disease has its own genetic makeup, giving each tumour cell and tissue a unique character with specific tendencies and vulnerabilities. Personalised and precision medicine will redirect patients to the right treatment while answering their specific needs.
The early detection of rare diseases is another challenge that HPC can address in a more efficient manner. Thanks to HPC technologies, diagnosis and analysis that today would take weeks could be completed in a few days.
HPC also enables faster and more effective analysis of genome sequences. Roughly 4,100 genetic diseases affect humans; they are one of the main causes of infant death.
In biomolecular research, HPC is also used to investigate the dynamics of biomolecules and proteins in human cells, which is crucial to treating autoimmune diseases, as well as cancer and diabetes, more effectively. In brain research, such as in the Human Brain Project FET Flagship, HPC is used for multi-scale, high-resolution simulation and modelling of the human brain to understand its organisation and functioning.
Last but not least, HPC is a cornerstone of new drug development. Developing a new drug takes between 10 and 17 years, and rising costs make the process increasingly unaffordable for both companies and patients. Testing drug candidate molecules can be greatly accelerated using HPC, which can also help reposition existing drugs for new diseases. This benefits patients while strongly reducing the costs of the process.
- Food security, sustainable agriculture, marine research and the bio-economy:
HPC is crucial to developing more sustainable agriculture by optimising food production, analysing sustainability factors and monitoring pests, disease control and the effects of pesticides. HPC-enabled applications use, for instance, radio-frequency identification (RFID) tags that can hold and automatically transmit a mass of data on a bale's moisture content, weight and GPS position. In the future, micro-tags the size of soil particles could be deployed extensively to measure factors such as moisture, disease burden, and even whether a crop is ready to harvest.
HPC technologies can also help to manage water and agricultural resources in a more efficient manner and assist vulnerable communities in the region through improved drought management and response.
- Cybersecurity and defence:
HPC is also essential for national security and defence, for example in developing complex encryption technologies, in tracking and responding to cyberattacks and in deploying efficient forensics, or in nuclear simulations.
In cybersecurity, HPC combined with Artificial Intelligence and Machine Learning techniques is used to detect anomalous system behaviour, insider threats, electronic fraud and early cyber-attack patterns (within hours instead of days), as well as potential misuse of systems, and to take automated, immediate action before hostile events occur.
HPC is also increasingly used in the fight against terrorism and crime, for example for face recognition or detection of suspicious behaviour in cluttered public spaces.
- Smart, green and integrated urban planning:
HPC technologies support the development of smarter cities thanks to more efficient control of large transport infrastructure, which requires real-time analysis of huge amounts of data.
The development of autonomous vehicles will, for instance, rely on HPC, since these vehicles will use a large variety of data to constantly monitor and optimise navigation, road conditions, the state of the vehicle, and passenger comfort and safety. Driverless cars will permanently exchange data with management and supervision systems and will sync up with large databases that constantly feed them real-time information about the local environment, traffic situation, emergency alerts and weather conditions.
- Cosmology and astrophysics
Scientists use HPC to observe space more accurately, to simulate violent events following the Big Bang that may have produced gravitational waves, to detect supernovae and binary star systems, and to understand dark matter and dark energy.
What is the relevance of HPC to the Digital Single Market?
The aim of the Digital Single Market is to tear down virtual barriers and move from 28 national markets to a single one. A fully functional Digital Single Market could contribute EUR 415 billion per year to the EU's economy and create hundreds of thousands of new jobs. HPC has huge potential for creating jobs within the Digital Single Market.
HPC is a key factor for the digitisation of industry and for its innovation and competitiveness. The EuroHPC environment will provide European industry, and in particular small and medium-sized enterprises (SMEs), with better access to supercomputers to develop innovative products.
By handling and processing huge amounts of data in real time, HPC is fundamental to building a vibrant data economy, and an integrated exascale computing and big data ecosystem will enable the EU to make the most of it while ensuring a high level of data protection and security. The EuroHPC infrastructure will allow sensitive data to be processed in Europe, preserving their privacy, ownership and the rights of access and exploitation within Europe.
Why is there a need for the EU to take initiative on HPC?
Despite the efforts and investments made so far, the EU does not have the highest-performing supercomputers, and those it does have depend on non-European technology. The available supply of computation time cannot satisfy an ever-growing demand. To fill the gap, European scientists and industry increasingly process their data outside the EU. This can create problems related to privacy, data protection, commercial trade secrets and data ownership, in particular for sensitive applications.
The European HPC technology supply chain is weak, and the integration of European technologies into operational HPC machines remains insignificant. Without clear prospects of a lead market and of selling an exascale machine to the public sector, European suppliers will not take the risk of developing the machines on their own.
Moreover, today each Member State invests on its own to develop and acquire HPC infrastructure. Despite significant investments at both national and Union level, Europe is clearly underinvesting in HPC compared with its competitors in the USA, China and Japan, with a funding gap of EUR 500-750 million per year. The resources and financial investments needed to build a sustainable exascale-level HPC ecosystem have become so large that no single country in Europe has the capacity to build it sustainably within timeframes compatible with those of non-EU competitors. Member States therefore need to coordinate their HPC investment strategies at European level and pool resources.
Pooling and rationalising efforts at EU level is a must. A shared infrastructure and common use of existing capabilities would benefit everyone: industry, SMEs, science, the public sector and especially the (smaller) Member States without self-sufficient national HPC infrastructures. In particular, it would secure the EU's own independent access to top HPC technology.
What is the EuroHPC initiative?
On 10 May 2017, in the Communication on the mid-term review of the Digital Single Market strategy, the European Commission confirmed its plans to invest in HPC and announced its intention to propose a new legal instrument providing a procurement framework for an EU integrated exascale supercomputing and data infrastructure. The objective was to find an effective and efficient way for Europe and the Member States to co-invest jointly in creating a leading European HPC and Big Data ecosystem in terms of technology, applications and skills, underpinned by a world-class high-performance computing and data infrastructure.
The initiative will allow the joint procurement of HPC machines, giving all Member States access to supercomputers with performance comparable to the best machines in the world. These machines, integrated into a pan-European infrastructure, will be available to scientific and industrial researchers and the public sector, independently of their location. The increased availability and accessibility of top HPC resources will motivate users to keep their activities and data in Europe, helping to retain critical know-how and human potential in the Member States.
Why is the Commission proposing a Joint Undertaking to implement the EuroHPC initiative?
Current funding instruments have limitations when applied to such large-scale cooperation on supercomputers. An impact assessment found that EuroHPC is best implemented through a Joint Undertaking. This legal instrument allows the Commission to join forces with the Member States to support the development of a pan-European High-Performance Computing and data infrastructure. It will address three urgent needs:
- to procure and deploy in Europe in competitive timeframes a world-class pre-exascale HPC infrastructure;
- to make it available to public and private users for developing leading scientific and industrial applications;
- to support the timely development of the next generation of European HPC technologies and their integration into exascale systems within timeframes competitive with those of our world competitors.
The EuroHPC Joint Undertaking will make it possible to efficiently combine the joint procurement and ownership of supercomputers with investment, shared between the Commission and the Member States, in the development of technology for the procured machines.
Who will be the members of the EuroHPC Joint Undertaking?
There will be two categories of members in the EuroHPC Joint Undertaking: public and private members. The public members will be the European Union (represented by the European Commission) and the 13 Member States and associated countries which have already signed the EuroHPC Declaration. Other Member States and associated countries can join the Joint Undertaking at any time.
The private members of the Joint Undertaking will be representatives of HPC and Big Data stakeholders, including academia and industry. Two contractual public-private partnerships (the ETP4HPC and the Big Data Value Association) have submitted letters of support for the implementation of the EuroHPC Joint Undertaking.
The EuroHPC Joint Undertaking is expected to start operating in 2019 and to remain operational until the end of 2026.
What will be the budget of the EuroHPC Joint Undertaking?
The EuroHPC Joint Undertaking will be jointly funded by its members. The Union's financial contribution will cover administrative and operational costs and amount to up to EUR 486 million, through budgetary commitments under the current Multiannual Financial Framework (MFF), specifically the Horizon 2020 and Connecting Europe Facility (CEF) programmes.
This amount is to be matched by a similar amount from the EuroHPC participating states as part of their national HPC programmes. The private entities should also provide in-kind contributions, as part of their current commitments to the contractual public-private partnerships ETP4HPC and BDVA.
With a total budget of approximately EUR 1 billion, the Joint Undertaking will function until 2026.
The EuroHPC Joint Undertaking will provide financial support in the form of procurement or Research & Innovation grants to participants following open and competitive calls, similar to those the Commission operates under Horizon 2020 or for innovation procurement purposes.
What is the link between HPC, Artificial Intelligence and deep learning technologies?
Deep learning technologies are facilitated by the use of HPC. Supercomputing capacity used together with Artificial Intelligence allows Machine Learning to become faster and more efficient, which in turn helps to create more innovative solutions and technologies that improve our daily lives.
Recently, using Artificial Intelligence and deep learning techniques in combination with HPC has also led to major breakthroughs in areas like image segmentation (pattern recognition), speech recognition (recognition and translation of spoken language into text by computers) or self-driving cars.
The combination of HPC, Artificial Intelligence and deep learning technologies is important in fields like cybersecurity, where it helps to detect anomalous system behaviour, insider threats, electronic fraud and other cyber-attack patterns at an early stage (within hours instead of days). It also helps to identify potential misuse of systems and to take automated, immediate action before hostile events occur.
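The core idea behind such anomaly detection can be illustrated with a deliberately minimal sketch (my own illustration, not the actual large-scale ML systems the text describes): establish a statistical baseline for a monitored metric and flag observations that deviate strongly from it. The sample data and the z-score threshold below are assumptions chosen for the example.

```python
# Minimal sketch of statistical anomaly detection on a monitored metric.
from statistics import mean, stdev

def anomalies(samples, threshold=2.0):
    """Return indices of samples more than `threshold` standard
    deviations away from the sample mean."""
    mu, sigma = mean(samples), stdev(samples)
    return [i for i, x in enumerate(samples) if abs(x - mu) > threshold * sigma]

# Hypothetical data: login attempts per minute; the spike at index 5
# (e.g. a brute-force attempt) stands out from the baseline.
rates = [12, 9, 11, 10, 13, 480, 11, 12, 10, 9]
print(anomalies(rates))  # -> [5]
```

Production systems replace this fixed-threshold baseline with learned models of normal behaviour, and HPC is what makes training and applying those models across millions of events per second feasible.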
For More Information