The Unix epoch problem isn't really about a program being 32-bit. The architecture limits the addressable memory space and sets the native word size, but it doesn't cap the width of the types you can define: you could declare and use a 128-bit integer in a 16-bit environment, for example.
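To make that concrete, here's a minimal C sketch (the specific value is just for illustration): a fixed-width 64-bit integer behaves the same whether the target's native word is 16, 32, or 64 bits; the word size only changes how many instructions the arithmetic compiles to.

```c
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

int main(void) {
    /* int64_t is exactly 64 bits wherever it's provided (essentially every
       modern toolchain), regardless of the CPU's word size; a 16- or 32-bit
       machine just needs several instructions per operation. */
    int64_t wide_seconds = INT64_C(4102444800);  /* 2100-01-01 UTC, well past INT32_MAX */
    printf("seconds since the epoch: %" PRId64 "\n", wide_seconds);
    return 0;
}
```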
The epoch problem is simply due to a bad design call a long time ago - picking a signed 32-bit count of seconds - one that proved foundational and incredibly difficult to change once it had become an entrenched standard. They could have made timestamps 64-bit at the time, and probably would have if they'd known their work would still be in use decades later, when that choice would finally catch up with everyone.
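As a rough illustration of where the 32-bit choice actually bites (my own sketch, assuming a system with a 64-bit time_t): a signed 32-bit count of seconds since 1970 tops out in January 2038, while a 64-bit count is good for roughly 292 billion years.

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    /* Largest value a signed 32-bit seconds-since-1970 counter can hold. */
    time_t last_second = (time_t)INT32_MAX;   /* assumes time_t is 64-bit here */

    struct tm tm_utc;
    char buf[64];
    if (gmtime_r(&last_second, &tm_utc) != NULL) {
        strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", &tm_utc);
        /* Prints 2038-01-19 03:14:07 UTC; one tick later a signed 32-bit
           time_t wraps around to December 1901. */
        printf("last representable 32-bit timestamp: %s\n", buf);
    }
    return 0;
}
```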