Randomized algorithms in systems without coordination and centralization

Detailed bibliography
Title: Randomized algorithms in systems without coordination and centralization
Authors: Oksana Kubaychuk, Denis Sai
Source: Collection "Information Technology and Security". 12:80-90
Publisher information: Igor Sikorsky Kyiv Polytechnic Institute, 2024.
Year of publication: 2024
Description: Evaluating the complexity of algorithms based solely on the worst possible variant of the input data is often not justified. Developing algorithms that run predictably fast on all possible inputs is of practical importance. If the distributions of input values for a problem can be modeled reasonably well, probabilistic analysis can be used as a method for designing efficient algorithms. When information about the distribution of input values is insufficient for numerical modeling, algorithms are instead designed by giving part of the algorithm itself a random character: these are randomized algorithms. Randomization allows an algorithm to operate with minimal need to store internal state or past events, and the algorithms themselves are compact. The paper studies problems for which relatively efficient deterministic algorithms already exist. As will be shown, however, constructing appropriate randomized algorithms leads to effective and efficient parallel computing schemes with linear average-case complexity. The advantages of randomization are especially evident in large computer systems and communication networks that operate without coordination or centralization; the networks of currently popular cryptocurrencies are examples of such distributed systems. Randomized heuristics allow the system to adapt to changing operating conditions and minimize the likelihood of conflicts between processes. The paper demonstrates the advantages of a randomized algorithm over deterministic algorithms for the routing problem in a network with a hypercube topology. A theorem estimating the expected number of steps required by Valiant's randomized algorithm to deliver all messages to their destinations is proved. The expected linear complexity of Valiant's algorithm is a direct consequence of this theorem.
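The central object of the abstract, Valiant's two-phase randomized routing, can be illustrated with a minimal Python sketch. The sketch below is not taken from the paper; the function names, the lowest-bit-first fixing order, and the driver code are illustrative assumptions. The idea it shows is the one the abstract describes: each packet first travels to a uniformly random intermediate node and only then to its true destination, which breaks worst-case congestion patterns that defeat any fixed deterministic route.

```python
import random

def bit_fixing_path(src: int, dst: int, dim: int) -> list[int]:
    """Nodes visited when routing src -> dst on a dim-dimensional
    hypercube, correcting differing address bits lowest-first."""
    path, cur = [src], src
    for i in range(dim):
        if (cur ^ dst) & (1 << i):  # bit i differs: cross that edge
            cur ^= 1 << i
            path.append(cur)
    return path

def valiant_route(src: int, dst: int, dim: int) -> list[int]:
    """Two-phase Valiant routing: detour via a uniformly random
    intermediate node, then bit-fix toward the real destination."""
    mid = random.randrange(1 << dim)    # random intermediate node
    phase1 = bit_fixing_path(src, mid, dim)
    phase2 = bit_fixing_path(mid, dst, dim)
    return phase1 + phase2[1:]          # mid appears only once

if __name__ == "__main__":
    dim = 4  # 4-dimensional hypercube, 16 nodes
    route = valiant_route(0b0000, 0b1111, dim)
    print(" -> ".join(format(v, f"0{dim}b") for v in route))
```

Each phase fixes at most dim address bits, so a single route never exceeds 2*dim hops. The theorem proved in the paper concerns the harder quantity: the expected number of steps when all messages traverse the network concurrently and contend for edges.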
Document type: Article
ISSN: 2518-1033; 2411-1031
DOI: 10.20535/2411-1031.2024.12.1.306274
Rights: CC BY
Accession number: edsair.doi...........d21d4639e5eb7016673e05bc9c8b0fea
Database: OpenAIRE