The really important old languages would be back-documented and the old hardware simulated. I'm sure you could run multiple complete IBM 360 machines (from the 60s, and 370s from the 70s) on your laptop.
What generative AI (at its current level) does is produce things that would, roughly speaking, generate the prompt if run in reverse. So prompt engineering matters. You don't just specify "build me a Java program that does what this 50-year-old program does." You have to give example inputs and outputs (test cases), specify which versions of code libraries, database formats, and query languages it must use and interact with, and so on. You can't really do that on a grand scale, but the more granular the pieces, the easier they become. Each piece has interfaces with other pieces of code, and those can all be specified in the "prompt."
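To make that concrete, here is a minimal sketch in Java of what the prompt-level spec for one small piece might look like: the interface the generated code must implement, plus a few example inputs and outputs it must reproduce. The RegistrationFeeCalculator name and the fee amounts are hypothetical, invented purely for illustration.

```java
import java.math.BigDecimal;
import java.util.List;

// Minimal sketch of the kind of contract you would pin down in the prompt for one small
// piece: the interface the generated Java must implement, plus concrete test cases.
// RegistrationFeeCalculator and the fee values are hypothetical, not from any real system.
public class PromptSpecSketch {

    /** The interface the generated code must implement, stated verbatim in the prompt. */
    interface RegistrationFeeCalculator {
        BigDecimal feeFor(String vehicleClass, int modelYear);
    }

    /** Example inputs and expected outputs, also pasted into the prompt. */
    record Example(String vehicleClass, int modelYear, String expectedFee) {}

    static final List<Example> EXAMPLES = List.of(
            new Example("PASSENGER", 2020, "96.00"),
            new Example("PASSENGER", 1998, "54.50"),
            new Example("COMMERCIAL", 2020, "212.00"));

    /** Harness any candidate implementation has to pass before it is accepted. */
    static boolean passes(RegistrationFeeCalculator candidate) {
        return EXAMPLES.stream().allMatch(ex ->
                candidate.feeFor(ex.vehicleClass(), ex.modelYear())
                         .compareTo(new BigDecimal(ex.expectedFee())) == 0);
    }

    public static void main(String[] args) {
        // A deliberately wrong stub, just to show the harness running end to end.
        RegistrationFeeCalculator stub = (cls, year) -> BigDecimal.ZERO;
        System.out.println("Stub passes the spec? " + passes(stub)); // prints false
    }
}
```

The point is that the test cases and the interface, not the English description, end up defining what "does what the old program does" means for that piece.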
Also, converting large, important old programs is more than just running a stream of code through a magic translator, because interfaces, libraries, databases, and file formats have changed, and so has programming knowledge. The Java libraries only partially overlap the COBOL libraries. Modern libraries and databases can be much more efficient, but taking advantage of that requires refactoring the code. Data may have to be presented in novel arrangements.
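As a rough illustration of that refactoring, here is one small before-and-after in Java. The "before" is the kind of thing a literal translation of a COBOL sequential master-file search produces; the "after" lets an indexed, parameterized query do the work instead. The table and column names (registrations, plate, owner_name) are invented for the example.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.Optional;

// Hypothetical before-and-after: a line-for-line translation of an old sequential
// record scan versus a version rewritten to let the database do the lookup.
public class LookupRefactorSketch {

    record Registration(String plate, String ownerName) {}

    /** Literal translation: read every record and compare, as the old batch program did. */
    static Optional<Registration> findByPlateSequential(Connection db, String plate) throws SQLException {
        try (PreparedStatement ps = db.prepareStatement("SELECT plate, owner_name FROM registrations");
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                if (rs.getString("plate").equals(plate)) {
                    return Optional.of(new Registration(rs.getString("plate"), rs.getString("owner_name")));
                }
            }
        }
        return Optional.empty();
    }

    /** Refactored version: push the lookup into an indexed, parameterized query. */
    static Optional<Registration> findByPlateIndexed(Connection db, String plate) throws SQLException {
        try (PreparedStatement ps = db.prepareStatement(
                "SELECT plate, owner_name FROM registrations WHERE plate = ?")) {
            ps.setString(1, plate);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next()
                        ? Optional.of(new Registration(rs.getString("plate"), rs.getString("owner_name")))
                        : Optional.empty();
            }
        }
    }
}
```

Both versions "work," but only the second actually benefits from the modern database underneath it, and getting there means restructuring the code, not just transliterating it.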
Whereas an old DMV program might have required names in all CAPITAL letters, with no punctuation and dead-simple Roman-letter character codes, modern software can and should accommodate multilingual names and data with accents, cedillas, diacritical marks, and dingbats. That has ramifications through the depth and breadth of the code.
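A tiny Java sketch of what those ramifications look like at the code level, assuming nothing about any particular DMV system: the old style could get away with uppercasing plain ASCII, while the modern version has to normalize and compare Unicode properly (java.text.Normalizer and Collator here).

```java
import java.text.Collator;
import java.text.Normalizer;
import java.util.Locale;

// Contrast between the 1970s assumption (plain Roman letters, uppercase everything)
// and Unicode-aware handling that treats equivalent spellings of a name as equal.
public class NameHandlingSketch {

    /** Roughly what the old program assumed: plain A-Z, nothing else survives. */
    static String legacyKey(String name) {
        return name.toUpperCase(Locale.ROOT).replaceAll("[^A-Z ]", "");
    }

    /** Unicode-aware comparison: normalize first, then compare case-insensitively but accent-sensitively. */
    static boolean sameName(String a, String b) {
        Collator collator = Collator.getInstance(Locale.ROOT);
        collator.setStrength(Collator.SECONDARY); // ignore case, keep accents significant
        return collator.compare(
                Normalizer.normalize(a, Normalizer.Form.NFC),
                Normalizer.normalize(b, Normalizer.Form.NFC)) == 0;
    }

    public static void main(String[] args) {
        String precomposed = "Jos\u00e9"; // é as a single code point
        String combining   = "Jose\u0301"; // e followed by a combining acute accent
        System.out.println(legacyKey(precomposed));            // "JOS"  (the accented letter is dropped)
        System.out.println(legacyKey(combining));              // "JOSE" (same person, different key)
        System.out.println(sameName(precomposed, combining));  // true
    }
}
```

Once names can carry accents, every place the code uppercases, sorts, truncates, or keys on a name has to change along these lines, which is why the effect runs through the whole program rather than one input routine.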