As you note, it's important to distinguish between code size and data size. In 2003 I posted an article to comp.arch (now contained in https://jlforrest.wordpress.com/2017/03/06/the-forrest-conjecture/) pointing out that 32 bits is still plenty for code, while something larger than 32 bits is required for data. Since it wouldn't make sense to build a processor with different code and data address sizes, today's processors, with 64 bits for both, are a very good workaround.
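
To make the arithmetic concrete: a 32-bit address covers at most 2^32 bytes = 4 GiB. Here's a minimal C sketch (the 6 GiB figure is just an arbitrary example of mine, not from the article) showing that a data buffer of that size can't even be expressed in a 32-bit build, while it's routine on a 64-bit one:

```c
/* Sketch: why data outgrows 32 bits while code does not.
 * A 32-bit address space tops out at 2^32 bytes = 4 GiB, so a
 * single 6 GiB buffer cannot even be requested, let alone
 * allocated, when size_t is 32 bits wide. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

int main(void) {
    unsigned long long six_gib = 6ULL << 30;  /* 6 GiB = 6 * 2^30 bytes */

    printf("pointer width: %zu bits\n", sizeof(void *) * 8);

    if (six_gib > SIZE_MAX) {
        /* size_t is 32 bits: the request overflows the address space. */
        printf("6 GiB > SIZE_MAX: 32 bits is not enough for this data\n");
        return 0;
    }

    /* Plausible on a 64-bit build (subject to available memory). */
    void *p = malloc((size_t)six_gib);
    printf("6 GiB allocation %s\n", p ? "succeeded" : "failed");
    free(p);
    return 0;
}
```

Code, by contrast, rarely comes anywhere near the 4 GiB limit: even very large executables typically have text segments well under a gigabyte.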

As I mention in the article, it's hard to imagine this not remaining true for as long as programs are written by humans. I have no idea how long it will be before programs are written by some kind of AI, or what the necessary processor model will have to be then.
