Randomized Distributed Function Computation with Semantic Communications: Applications to Privacy
| Published in: | 2024 IEEE International Workshop on Information Forensics and Security (WIFS 2024), pp. 1-6 |
|---|---|
| Main author: | |
| Medium: | Conference paper |
| Language: | English |
| Publication details: | IEEE, 02.12.2024 |
| Series: | IEEE International Workshop on Information Forensics and Security |
| Subject: | |
| ISBN: | 9798350364439, 9798350364422 |
| ISSN: | 2157-4774 |
| Summary: | Randomized distributed function computation refers to remote function computation where transmitters send data to receivers, which compute function outputs that are randomized functions of the inputs. We study the applications of semantic communications in randomized distributed function computation to illustrate significant reductions in the communication load, with a particular focus on privacy. The semantic communication framework leverages generalized remote source coding methods, where the remote source is a randomized version of the observed data. Since satisfying security and privacy constraints generally requires a randomization step, semantic communication methods can be applied to such function computation problems, where the goal is to remotely simulate a sequence at the receiver such that the transmitter and receiver sequences follow a target probability distribution. Our performance metrics guarantee (local differential) privacy for each input sequence, used in two different distributed function computation problems, which is possible by using strong coordination methods even without common randomness. This work provides lower bounds on Wyner's common information (WCI), which is one of the two corner points of the coordination-randomness rate region characterizing the ultimate limits of randomized distributed function computation. The WCI corresponds to the case when there is no common randomness shared by the transmitter and receiver. Moreover, numerical methods are proposed to compute the other corner point for continuous-valued random variables, for which an unlimited amount of common randomness is available. Results for two problems of practical interest illustrate that leveraging common randomness can significantly decrease the communication load compared to the WCI corner point. We also illustrate that semantic communication gains over lossless compression methods are achieved even without common randomness, motivating further research on limited common randomness scenarios. (See the notes after this record for standard definitions of these quantities and an illustrative randomization sketch.) |
|---|---|
| DOI: | 10.1109/WIFS61860.2024.10810724 |
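
For reference, the two quantities named in the summary have standard definitions in the literature; the block below recalls them in the usual notation. The symbols $X$, $Y$, $W$, and $\varepsilon$ are introduced here for illustration, and the paper's exact formulation may differ.

```latex
% Wyner's common information: the minimum rate of a variable W that renders
% X and Y conditionally independent (X - W - Y forms a Markov chain).
C_W(X;Y) \;=\; \min_{P_{W\mid XY}\,:\; X - W - Y} I(X,Y;W)

% epsilon-local differential privacy for a randomized mechanism P_{Y|X}:
% no two inputs x, x' can change the likelihood of any output y by more
% than a factor of e^{epsilon}.
P_{Y\mid X}(y\mid x) \;\le\; e^{\varepsilon}\, P_{Y\mid X}(y\mid x')
\qquad \forall\, x, x' \in \mathcal{X},\ y \in \mathcal{Y}.
```

As a concrete illustration of the randomization step that (local differential) privacy constraints generally require, the following minimal Python sketch implements k-ary randomized response, a standard ε-LDP mechanism. This is not the coding scheme proposed in the paper; the function name and parameters are illustrative assumptions.

```python
import numpy as np


def randomized_response(x, epsilon, alphabet, rng):
    """Illustrative k-ary randomized response (not the paper's scheme).

    Reports the true symbol with probability e^eps / (e^eps + k - 1) and a
    uniformly chosen other symbol otherwise; any output's likelihood then
    changes by at most a factor of e^eps across inputs, i.e. epsilon-LDP.
    """
    k = len(alphabet)
    p_true = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    if rng.random() < p_true:
        return x
    others = [a for a in alphabet if a != x]
    return others[rng.integers(len(others))]


# Example: privatize a binary input sequence with epsilon = 1.0.
rng = np.random.default_rng(0)
inputs = rng.integers(0, 2, size=10)
outputs = [randomized_response(int(x), 1.0, [0, 1], rng) for x in inputs]
print(inputs.tolist(), outputs)
```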

