Information-theoretic bounds on average signal transition activity [VLSI systems]
| Published in: | IEEE Transactions on Very Large Scale Integration (VLSI) Systems, Volume 7, Issue 3, pp. 359-368 |
|---|---|
| Main authors: | , , |
| Format: | Journal Article |
| Language: | English |
| Published: | Piscataway, NJ: IEEE (Institute of Electrical and Electronics Engineers), 01.09.1999 |
| Subjects: | |
| ISSN: | 1063-8210, 1557-9999 |
| Summary: | Transitions on high-capacitance buses in very large scale integration systems result in considerable system power dissipation. Various coding schemes have therefore been proposed in the literature to encode the input signal and thereby reduce the number of transitions. In this paper, we derive lower and upper bounds on the average signal transition activity via an information-theoretic approach, in which symbols generated by a process (possibly correlated) with entropy rate H are coded with an average of R bits per symbol. The bounds are asymptotically achievable if the process is stationary and ergodic. We also present a coding algorithm, based on the Lempel-Ziv data-compression algorithm, that achieves the bounds. Bounds are also obtained on the expected number of ones (or zeros). These results are applied to determine the activity-reducing efficiency of coding algorithms such as entropy coding, transition signaling, and bus-invert coding, and to determine a lower bound on the power-delay product given H and R. Two examples are provided in which transition activity within 4% and 9% of the lower bound is achieved when blocks of eight and 13 symbols, respectively, are coded at a time. (An illustrative sketch of bus-invert coding is given after the record below.) |
|---|---|
| DOI: | 10.1109/92.784097 |
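
The summary names bus-invert coding among the baseline schemes whose activity-reducing efficiency the paper evaluates. The Python sketch below is purely illustrative and not taken from the paper: it assumes the classic bus-invert rule (invert the transmitted word whenever more than half of the bus lines would otherwise toggle, signalled on one extra invert line), and the function names and sample data are made up for this example.

```python
# Illustrative bus-invert coding sketch. Names and sample data are
# assumptions made for this example, not code from the paper.

def hamming_distance(a: int, b: int) -> int:
    """Number of bit positions in which two words differ."""
    return bin(a ^ b).count("1")

def bus_invert_encode(words, width=8):
    """Encode a word stream for a `width`-bit bus plus one invert line.

    If driving a word onto the bus would toggle more than width/2 lines,
    the complemented word is driven instead and the invert line is raised,
    so each transfer toggles at most width/2 + 1 lines.
    Returns (encoded_words, invert_bits, total_transitions).
    """
    mask = (1 << width) - 1
    prev_bus, prev_inv = 0, 0          # assume bus and invert line start at 0
    encoded, inv_bits, transitions = [], [], 0
    for w in words:
        w &= mask
        if hamming_distance(w, prev_bus) > width // 2:
            w ^= mask                  # drive the complement onto the bus
            inv = 1
        else:
            inv = 0
        transitions += hamming_distance(w, prev_bus) + (inv ^ prev_inv)
        encoded.append(w)
        inv_bits.append(inv)
        prev_bus, prev_inv = w, inv
    return encoded, inv_bits, transitions

if __name__ == "__main__":
    data = [0x00, 0xFF, 0x0F, 0xF0, 0xAA]
    _, _, coded_t = bus_invert_encode(data, width=8)
    raw_t = sum(hamming_distance(a, b) for a, b in zip([0] + data, data))
    print(f"raw transitions: {raw_t}, bus-invert transitions: {coded_t}")
```

On this made-up stream the encoder roughly halves the transition count relative to sending the raw words; schemes of this kind are the baselines whose distance from the information-theoretic lower bound (given H and R) the paper quantifies.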