METHODS OF WEB DEVELOPMENT AND TESTING OF DATA AND OBJECT TOKENIZATION FOR TRUSTED SOCIAL SERVICES IN DISTRIBUTED COMPUTER NETWORKS

Bibliographic Details
Title: МЕТОДИ ВЕБ-РОЗРОБКИ ТА ТЕСТУВАННЯ ТОКЕНІЗАЦІЇ ДАНИХ ТА ОБ'ЄКТІВ ДЛЯ РЕАЛІЗАЦІЇ ДОВІРЧИХ СОЦІАЛЬНИХ СЕРВІСІВ У РОЗПОДІЛЕНИХ КОМП'ЮТЕРНИХ МЕРЕЖАХ (Ukrainian)
Alternate Title: METHODS OF WEB DEVELOPMENT AND TESTING OF DATA AND OBJECT TOKENIZATION FOR TRUSTED SOCIAL SERVICES IN DISTRIBUTED COMPUTER NETWORKS (English)
Authors: Міхав Володимир Володимирович, Мелешко Єлизавета Владиславівна, Якименко Микола Сергійович, Босько Віктор Васильович, Лисенко Ірина Анатоліївна
Source: Cybersecurity: Education, Science, Technique / Kiberbezpeka: Osvita, Nauka, Tekhnika; 2025, Vol. 3 Issue 31, p386-404, 19p
Subject Terms: WEB development, TEST methods, INFORMATION technology security, ELECTRONIC data processing, SEMANTIC Web, METADATA, BLOCKCHAINS
Abstract: The article presents a comprehensive study of web development methods and tools for implementing tokenization of data and objects in the Web 3.0 environment, with a focus on the backend level. Modern smart-contract standards ERC-20, ERC-721/1155, ERC-1400/3643, SPL, FA2, and Cadence are analyzed, along with metadata storage approaches (on-chain, IPFS, Arweave), indexing mechanisms such as The Graph, and scaling architectures including L1/L2, zk-rollup solutions, and alternative L1 networks. A classification of token types by purpose and of their implementation methods is developed, as well as a classification of web tools by functional layers, which made it possible to identify specific features and differences in requirements for performance, interoperability, compliance, and information security. The comparative analysis shows that EVM-based implementations are the most mature and optimal for utility tokens, whereas NFT systems impose higher demands on secure off-chain storage, and security and RWA tokens provide regulatory controls and access management at the cost of a lower degree of decentralization. Based on a review of projects, it is shown that second-layer technologies, in particular Polygon zkEVM and Immutable X, reduce gas costs and increase transaction throughput compared to base networks. The paper also proposes a unified formal methodology for testing tokenization processes, which covers verification of operation correctness, state invariants, access policies, cryptographic guarantees, and transaction execution efficiency. The methodology integrates formal models of token behavior with gas-efficiency criteria and enables comprehensive assessment of the reliability of tokenization systems regardless of the underlying standard or technological stack.
The results can be used for designing Web 3.0 backend architectures, developing trust-based social services in distributed computer networks, and creating a methodological foundation for automated token testing in multichain ecosystems. [ABSTRACT FROM AUTHOR]
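The testing methodology summarized in the abstract (verification of operation correctness, state invariants, and access policies) can be sketched against a minimal in-memory model of an ERC-20-style token. This is an illustrative assumption, not the authors' actual framework: the class `TokenModel`, its methods, and the invariant checker are hypothetical names chosen for the sketch.

```python
class TokenModel:
    """Minimal in-memory model of an ERC-20-style fungible token
    (illustrative sketch, not the paper's actual test harness)."""

    def __init__(self, owner: str, initial_supply: int):
        self.owner = owner
        self.total_supply = initial_supply
        self.balances = {owner: initial_supply}

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        # Operation correctness: reject transfers the sender cannot cover.
        if amount < 0 or self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount

    def mint(self, caller: str, recipient: str, amount: int) -> None:
        # Access policy: only the contract owner may mint new tokens.
        if caller != self.owner:
            raise PermissionError("mint restricted to owner")
        self.total_supply += amount
        self.balances[recipient] = self.balances.get(recipient, 0) + amount


def check_supply_invariant(token: TokenModel) -> bool:
    # State invariant: balances must always sum to the recorded total supply.
    return sum(token.balances.values()) == token.total_supply


if __name__ == "__main__":
    token = TokenModel("alice", 100)
    token.transfer("alice", "bob", 40)
    assert check_supply_invariant(token)          # invariant holds after transfer
    try:
        token.mint("bob", "bob", 1)               # access-policy violation
    except PermissionError:
        pass
    token.mint("alice", "alice", 10)              # owner may mint
    assert check_supply_invariant(token)          # invariant holds after mint
```

In a real multichain test suite, the same invariants and access-policy checks would be asserted against deployed contracts (e.g., via property-based testing over transaction sequences) rather than an in-memory model.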
Copyright of Cybersecurity: Education, Science, Technique / Kiberbezpeka: Osvita, Nauka, Tekhnika is the property of Cybersecurity: Education, Science, Technique / Kiberbezpeka: Osvita, Nauka, Tekhnika and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
Database: Complementary Index
ISSN:26634023
DOI:10.28925/2663-4023.2025.31.1027