Complexity Lower Bounds from Algorithm Design
| Published in: | Proceedings of the 36th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS), pp. 1–3 |
|---|---|
| Format: | Conference Proceeding |
| Language: | English |
| Published: | IEEE, 29.06.2021 |
| Summary: | Since the beginning of the theory of computation, researchers have been fascinated by the prospect of proving impossibility results on computing. When and how can we argue that a task cannot be efficiently solved, no matter what algorithm we try to use? In this short article, I will briefly introduce some of the ideas behind a research program in computational complexity that I and others have studied for the last decade. (The accompanying talk will contain more details.) The program begins with the observations that: (a) Computer scientists know a great deal about how to design efficient algorithms. (b) However, we do not know how to prove many weak-looking complexity lower bounds. It turns out that certain knowledge we have from (a) can be leveraged to prove complexity lower bounds in a systematic way, making progress on (b). For example, progress on faster circuit satisfiability algorithms (even those that barely improve upon exhaustive search) automatically implies circuit complexity lower bounds for interesting functions. |
| DOI: | 10.1109/LICS52264.2021.9470522 |
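
The summary's closing claim about circuit satisfiability has a standard formal instance. The statement below is background drawn from the author's earlier work on this program, not from the record itself, and should be read as an illustrative sketch:

$$\text{Circuit-SAT on } n\text{-input, } \mathrm{poly}(n)\text{-size circuits solvable in time } 2^n/n^{\omega(1)} \;\Longrightarrow\; \mathsf{NEXP} \not\subseteq \mathsf{P/poly}.$$

That is, even a super-polynomial savings over the $2^n$-time exhaustive search for circuit satisfiability already yields a circuit lower bound for nondeterministic exponential time.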