Introducing a new class of machine: the exascale supercomputer

What does the term «exascale» mean? It sounds like something out of science fiction, but the definition is fairly straightforward: an exascale machine can perform more than one quintillion calculations in roughly the time it takes to say "one Mississippi," that is, in about a second. By comparison, a person working through the math by hand needs about a second to complete a single calculation.
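
To put that rate in perspective, here is a small back-of-the-envelope calculation in Python. The one-quintillion figure comes from the definition of exascale; the one-calculation-per-second human is purely illustrative.

```python
# Back-of-the-envelope comparison: one second of exascale work vs. a person
# doing one calculation per second (illustrative figures only).

EXA = 10**18                          # calculations per second at exascale
SECONDS_PER_YEAR = 365.25 * 24 * 3600

human_years = EXA / SECONDS_PER_YEAR
print(f"One exascale-second of arithmetic would take a person "
      f"about {human_years:.1e} years at one calculation per second.")
# -> roughly 3.2e10 years, more than twice the age of the universe.
```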

Exascale Computing Project (ECP)

Frontier, the first exascale computer, arrived on the global stage in 2022. Built at Oak Ridge National Laboratory, a highly regarded research institution, it immediately demonstrated impressive speed: the new machine was roughly 2.5 times more powerful than the previous record holder, which now ranks second in the world. Competition is expected to be fierce in the coming years as other machines of this sort come online, such as El Capitan at Lawrence Livermore National Lab and Aurora at Argonne National Lab.

It should come as no surprise that these top-tier computers are built by experts at renowned research institutions. The National Nuclear Security Administration (NNSA), one of the Department of Energy's major agencies, is a driving force behind them: exascale computers are crucial for the calculations this federal body needs to carry out its mission of maintaining the nuclear stockpile. The machines are also put to work on open scientific research.

Once Frontier moves past its trial phase and into full operation, scientists in many fields will use it to probe the questions that matter most to them. They will run simulations of topics like how energy is produced and how the universe has evolved, to name just two. Projects of this scale would have been impossible even on the powerful supercomputers of previous generations.

Douglas Kothe, associate director for the Computing and Computational Sciences Directorate at the Department of Energy's Oak Ridge National Lab, says that in principle an exascale computer could have been designed and built years earlier, but it would not have been practical, reliable, or affordable. The obstacles included the difficulty of massively parallel processing, enormous energy consumption, dependability, memory and storage limits, and a shortage of software able to exploit such machines. After years of concerted work with the high-performance computing industry, those obstacles were finally cleared to scientists' satisfaction.

Compared with its predecessors, Frontier works about seven times faster and holds four times as much data in memory. It contains roughly 10,000 central processing units (CPUs), the chips that execute a computer's instructions, and approximately 38,000 graphics processing units (GPUs). GPUs were originally designed to render video-game graphics quickly and smoothly, but because they excel at handling data in parallel, they have been repurposed for scientific computation.

Inside such a machine, the two kinds of processor split the work. The GPUs churn through repetitive arithmetic in parallel, which, Kothe notes, frees the CPUs to coordinate the job and finish their own tasks faster. Complex problems are broken into many small pieces, the pieces are solved simultaneously, and the partial results are combined into the final answer. In this arrangement the CPUs act like supervisors and the GPUs like a large workforce.
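
The division of labor Kothe describes, CPUs orchestrating while GPUs grind through bulk arithmetic in parallel, is the same pattern found in everyday GPU-accelerated code. The sketch below is a minimal illustration using the CuPy library; it is not Frontier's actual software stack, and it assumes a CUDA-capable GPU and the cupy package are available.

```python
import numpy as np
import cupy as cp  # assumption: a CUDA-capable GPU and CuPy are installed

# CPU ("supervisor"): prepare the problem.
a_host = np.random.random(10_000_000)
b_host = np.random.random(10_000_000)

# Hand the data to the GPU ("workers").
a_dev = cp.asarray(a_host)
b_dev = cp.asarray(b_host)

# GPU: millions of identical multiply-add operations, executed in parallel.
result_dev = a_dev * b_dev + 1.0

# CPU: pull the results back and combine them into a final answer.
total = float(cp.asnumpy(result_dev).sum())
print(f"combined result: {total:.3f}")
```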

The system also comprises 9,472 nodes, interconnected so that data can move quickly from one to another. Significantly, Frontier is not merely faster than the machines of past years. Its memory is also much larger, so it can run bigger simulations and keep vast amounts of information right where the data is being processed, which is both convenient and a major time saver.
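
Spreading a problem across thousands of nodes is usually done with message passing: each node keeps its own slice of the data in local memory, computes on it, and the partial answers are combined over the network. The sketch below uses the mpi4py library to show that generic pattern; it is not Frontier's application code, and the workload (summing random numbers) is just a stand-in.

```python
# Generic domain-decomposition pattern with MPI (via mpi4py).
# Run with, e.g.:  mpirun -n 4 python split_domain.py   (filename hypothetical)
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()      # which process am I?
size = comm.Get_size()      # how many processes in total?

# Each process holds its own slice of the data in its own memory,
# right next to where it is processed.
local_data = np.random.random(1_000_000)

# Compute on the local slice...
local_sum = local_data.sum()

# ...then combine the partial answers across the network.
global_sum = comm.allreduce(local_sum, op=MPI.SUM)

if rank == 0:
    print(f"{size} processes, global sum = {global_sum:.2f}")
```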

With these capabilities, Frontier and the machines that follow it can teach us about the world in ways that were previously out of reach. They can make weather forecasts more detailed and more timely. Frontier can also accelerate chemistry research, for example by screening many molecular arrangements to find the ones best suited to promising drug molecules or superconductors. Frontier has already been used to study genetic changes in SARS-CoV-2, the virus behind COVID-19, helping researchers understand how those mutations affect its transmissibility. Because the simulations run so quickly, researchers can revise their hypotheses and launch new rounds of digital experiments at a rapid pace.

According to Kothe, this level of computational power means scientists no longer have to settle for the rough approximations they relied on in the past. On earlier systems he often had to concede that certain terms were negligible, or that perhaps an equation could be dropped entirely, just to make a problem fit the machine. The joke about the "spherical cow" captures this habit of radically simplifying something complicated, a cow, into something much simpler, a ball. With exascale machines, scientists hope to simulate the actual cow: something much closer to a faithful depiction of reality.

That transformation rests largely on Frontier's improved hardware. Yet hardware alone doesn't help scientists much without software that can take advantage of the machine's power. For that reason, 24 new scientific software projects have been funded alongside the machines. They are part of the Exascale Computing Project (ECP), a collaboration that brings together the Department of Energy, its National Nuclear Security Administration, and industry partners.

These software projects cannot simply port old simulation code, say, a program that models the growth of severe storms, run it once, and declare the quick forecast good enough. They need rewritten, optimized codes to obtain more precise results. Kothe, who serves as ECP's director, says the effort is squarely focused on outcomes.

Even so, obtaining meaningful results can be a challenge, says Salman Habib, who leads a scientific initiative called ExaSky. In his view, supercomputers are blunt instruments; it is up to researchers to use them wisely. How does one do that? Habib, the head of Argonne's computational science division, believes the key is to test diverse scenarios, including how the universe began and how its contents developed. Simulation makes it possible to build up knowledge of such complicated questions by studying how the relevant conditions evolve over time.

Professional astronomers are already engaged in major observational efforts, such as the Dark Energy Spectroscopic Instrument in Arizona. These surveys have begun to reveal the universe's hidden workings by charting how galaxies form, take on their present shapes, and spread apart as the universe expands. But observations alone cannot explain every phenomenon seen in the cosmos.

That is where theory and modeling efforts such as ExaSky come in. If theorists suspect that dark energy behaves in a particular way, or that our understanding of gravity is incomplete, they can modify the simulation to incorporate those ideas. Astronomers can then examine how the digital universe matches, or departs from, the data their telescopes collect. For theorists and modelers, Habib says, the computer serves as a virtual cosmos.
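
The workflow Habib describes, tweak an assumption, rerun the virtual universe, and check it against what telescopes see, boils down to comparing model output with observational data. The toy sketch below is purely illustrative: the one-line "simulation," the parameter values, and the synthetic measurements are all invented for the example and have nothing to do with ExaSky's actual physics.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_growth(w, redshifts):
    """Hypothetical stand-in for a cosmological simulation: predicts a
    structure-growth signal as a function of a dark-energy parameter w."""
    return np.exp(w * redshifts)          # invented formula, illustration only

# Pretend these are observations with measurement uncertainty.
redshifts = np.linspace(0.1, 2.0, 20)
true_w = -1.0
sigma = 0.02
observed = simulate_growth(true_w, redshifts) + rng.normal(0, sigma, redshifts.size)

# Try several candidate values of w and see which virtual universe
# matches the data best (simple chi-squared score).
for w in (-1.2, -1.0, -0.8):
    model = simulate_growth(w, redshifts)
    chi2 = np.sum(((observed - model) / sigma) ** 2)
    print(f"w = {w:+.1f}  chi^2 = {chi2:10.1f}")
```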

ExaSky builds on simulation codes developed for earlier supercomputers, but those models have not yet settled the big questions about the universe's contents. Habib says the research getting under way now opens enormous opportunities for modeling: with the right techniques, exascale machines can simulate far larger volumes of space in greater detail, so the picture should soon become clearer.

ExaStar, another Frontier project, led by Daniel Kasen of Lawrence Berkeley National Lab, will probe a different cosmic mystery. It will simulate supernovae, the explosive deaths of massive stars, and the synthesis of heavy elements that occurs in them. Exactly how these stars blow themselves apart is still only loosely understood.

Most supernova simulations run in previous years assumed that stars are spherically symmetric, and researchers relied on simplified physics to keep the problem tractable. Exascale computers let them build far more detailed three-dimensional models. And rather than executing the code for a single explosion, they can run entire suites of simulations, spanning many types of stars and physics assumptions, to see which combinations reproduce what observers actually see in the sky.
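
Running "entire suites" of explosions rather than a single run is essentially a parameter sweep: the same simulation code launched many times over a grid of stellar masses and physics options, with the outcomes compared against observations afterwards. The schematic sketch below uses a placeholder simulate_supernova function invented for this example; the real ExaStar simulations are three-dimensional and enormously more expensive.

```python
from itertools import product

def simulate_supernova(mass_msun, rotation, neutrino_transport):
    """Placeholder: pretend to run one explosion model and return a summary.
    In a real suite this would launch a full 3-D simulation across many nodes."""
    explosion_energy = 1.0e51 * (mass_msun / 15.0) * (1.2 if rotation else 1.0)
    return {"mass": mass_msun, "rotation": rotation,
            "transport": neutrino_transport, "energy_erg": explosion_energy}

masses = [12, 15, 20, 25]                  # progenitor masses in solar masses
rotations = [False, True]
transport_schemes = ["leakage", "two-moment"]

# The suite: every combination of star and physics assumptions.
suite = [simulate_supernova(m, r, t)
         for m, r, t in product(masses, rotations, transport_schemes)]

# Afterwards, each run's observables would be compared with telescope data.
for run in suite[:4]:
    print(run)
```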

Kasen finds these stellar explosions remarkable in their own right, but they also play a central role in the universe's history: they forged the materials that went into Earth, humanity, and the telescopes that let us look beyond our planet. Since a supernova's violence cannot be reproduced in any laboratory, digital experiments are both feasible and far safer than physical ones.

Another undertaking focuses on extraordinary phenomena closer to home: nuclear reactors and the processes inside them. The ExaSMR project will use exascale computing to analyze the behavior of small modular reactors, a compact design that could prove useful in the future. Past supercomputers could model only one element of a reactor at a time; machines that followed could begin to tackle the complete system. Steven Hamilton of Oak Ridge, one of the team's leaders, says the research being done today will advance the development of nuclear reactors.

Hamilton and his colleagues will simulate how neutrons move, how they drive the chain reaction produced by nuclear fission, and how heat circulates through the reactor. In the past, a computer's memory could not hold calculations for the entire process at once, which limited how deeply the movement of heat through the system could be understood.
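
A common way to model neutron behavior is Monte Carlo particle tracking: follow a large number of random neutron histories and tally what happens to them. The toy one-dimensional version below is only a cartoon of that idea, with made-up material properties; it is not the ExaSMR codes, which couple far more detailed transport to heat-flow calculations.

```python
import random

# Toy 1-D Monte Carlo neutron transport: follow many neutron histories
# through a slab and count how many are absorbed vs. escape.
SLAB_THICKNESS = 10.0      # arbitrary units
MEAN_FREE_PATH = 1.0       # average distance between collisions
ABSORPTION_PROB = 0.3      # chance a collision absorbs the neutron

def follow_neutron(rng):
    x, direction = 0.0, 1.0
    while True:
        # Fly an exponentially distributed distance to the next collision.
        x += direction * rng.expovariate(1.0 / MEAN_FREE_PATH)
        if x < 0.0 or x > SLAB_THICKNESS:
            return "escaped"
        if rng.random() < ABSORPTION_PROB:
            return "absorbed"
        direction = rng.choice([-1.0, 1.0])   # isotropic scatter (1-D)

rng = random.Random(42)
histories = [follow_neutron(rng) for _ in range(100_000)]
print({outcome: histories.count(outcome) for outcome in ("absorbed", "escaped")})
```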

According to Hamilton, the next step is to develop more thorough reactor designs that improve both performance and safety.

Unsurprisingly, nuclear energy has always been closely tied to the development of fission-based weapons. At Lawrence Livermore, Teresa Bailey leads a roughly 150-person team preparing the codes needed to run weapons simulations on El Capitan. An associate director at the lab in charge of its computational physics team, she is responsible for keeping the key phases of the Advanced Simulation and Computing initiative on track. Research groups at the NNSA laboratories receive support for this code-modernization work from both the ECP and the Advanced Technology Development and Mitigation program.

Ask any scientist whether machines like Aurora, El Capitan, or Frontier will ultimately be enough, and you will never get a yes. Researchers will always ask for more computing power. There is also external pressure to keep advancing the field, not just for the sake of reputation, welcome as that is, but because more accurate simulations could lead to new medications, innovative materials, or fresh Nobel Prizes, all of which help keep the nation at the forefront of development.

For all of those reasons, scientists are already discussing the "post-exascale" era: what comes after they can routinely perform a quintillion calculations per second. That future may involve quantum computing, or augmenting exascale systems with more artificial intelligence. Or it may be something else entirely. Perhaps someone should run a simulation to find the most likely outcome, or the best course of action.

 

Source: Apostol Dmitry (ADE). With acknowledgement to Bcm, Bloomberg, and Refinitiv.
