Phil Karlton famously said there are only two hard things in Computer Science: cache invalidation and naming things. I gather many folks take "naming things" to mean choosing between camel case and snake case, or picking the particular symbols we attach to things, which is obviously trivial and mundane. But I always assumed Karlton meant the problem of making references work: relating intension (names and ideas) to extension (the things designated) in a reliable way. That is really the same problem as cache invalidation, which comes down to knowing when to stop trusting an association once it has gone stale.
https://web.archive.org/web/20130805122711/http://lambda-the...
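To make that concrete, here is a minimal sketch (in Python, with hypothetical names like NameCache, bind, and resolve, and a simple TTL policy chosen just for illustration) of a name-to-thing binding that carries an expiry, so "naming" and "cache invalidation" show up as two faces of the same question: when does the association stop being valid?

```python
# A minimal sketch: a name (intension) is bound to a thing (extension)
# through a lookup table, and the hard part is deciding when that
# binding has gone stale. All names here are hypothetical.

import time
from typing import Any, Optional


class NameCache:
    """Maps names to values, each binding carrying an expiry time."""

    def __init__(self) -> None:
        self._bindings: dict[str, tuple[Any, float]] = {}

    def bind(self, name: str, value: Any, ttl_seconds: float) -> None:
        """Associate a name with a value, valid for ttl_seconds."""
        self._bindings[name] = (value, time.monotonic() + ttl_seconds)

    def resolve(self, name: str) -> Optional[Any]:
        """Return the value bound to name, or None if the binding is
        missing or expired (the 'when to invalidate' question)."""
        entry = self._bindings.get(name)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._bindings[name]  # drop the stale association
            return None
        return value


cache = NameCache()
cache.bind("config", {"retries": 3}, ttl_seconds=60.0)
print(cache.resolve("config"))  # {'retries': 3} while the binding is fresh
```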
1) cache invalidation
2) naming things
0) off-by-one errors
f[<Intension|Extension>] == 0
Also: “Jevons paradox”. That one is nasty! For example: just about anything we do to cut fossil fuel use by some small percent makes the corresponding process more efficient and more profitable, so it happens more, and total consumption can end up higher than before. That’s a nasty, nasty problem. I guess it’s not specific to computer science, but to all of engineering.
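A toy rebound calculation makes the point; the numbers and the demand_elasticity parameter below are purely illustrative, not data. A 20% efficiency gain lowers the fuel needed per unit of useful work, but if demand responds strongly enough to the cheaper work, total fuel burned still goes up.

```python
# Toy Jevons/rebound arithmetic: an efficiency gain lowers the effective
# cost of useful work; if demand for that work is elastic enough, total
# fuel consumption rises. All numbers are illustrative.

def total_fuel(efficiency: float, demand_elasticity: float,
               baseline_demand: float, baseline_fuel_per_unit: float) -> float:
    """Fuel consumed after an efficiency improvement.

    efficiency: useful work per unit of fuel, relative to baseline (1.0 = no change)
    demand_elasticity: how strongly demand grows as work gets cheaper
    """
    fuel_per_unit = baseline_fuel_per_unit / efficiency          # less fuel per unit of work
    demand = baseline_demand * efficiency ** demand_elasticity   # cheaper work -> more of it
    return demand * fuel_per_unit


before = total_fuel(1.0, 1.5, baseline_demand=100, baseline_fuel_per_unit=1.0)
after = total_fuel(1.2, 1.5, baseline_demand=100, baseline_fuel_per_unit=1.0)
print(before, after)  # 100.0 vs ~109.5: a 20% efficiency gain, yet more fuel burned
```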