I'm sure the folks who wrote this are every bit as good programmers as I ever was, but I'm still having trouble wrapping my head around this.

1) PC-BASIC is written in Python, and the 28MB package basically includes a full Python distribution plus a number of third-party libraries and a whole lot of other cruft that may or may not be necessary for it to run.

The directory for PC-BASIC contains about 28MB of files, and Task Manager shows 26.1MB of RAM in use. That's nothing on my computer with over 1TB of storage and 16GB of RAM, but it still amazes me when I consider that GW-BASIC took 70KB. It ran on top of MS-DOS, which I think took about 50KB, and for the sake of easy math let's say there was a BIOS that took 20KB, for a total of 140KB. Basically, it now takes 200 times as much code to do the same thing we did over 30 years ago.

How does that happen? Were programmers really 200 times as diligent about being efficient? I know computers were smaller and necessity is the mother of invention, but really, 200 times? That's extraordinarily astounding. I'm a programmer myself and I simply can't imagine how that happens. It did everything I needed it to do, so I gave it 5 stars, but it truly amazes me how inflated computer programs have become.
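For what it's worth, the 200x figure checks out as back-of-the-envelope arithmetic. Here's a quick sketch in Python (PC-BASIC's own language); the individual sizes are the rough estimates quoted above, not measured values:

```python
# Rough sizes quoted in the review, in kilobytes (estimates, not measurements).
gw_basic_kb = 70          # GW-BASIC interpreter
msdos_kb = 50             # resident MS-DOS footprint (a guess)
bios_kb = 20              # assumed BIOS size
old_total_kb = gw_basic_kb + msdos_kb + bios_kb   # 140 KB all-in

pc_basic_kb = 28 * 1024   # ~28 MB PC-BASIC install directory

ratio = pc_basic_kb / old_total_kb
print(f"{old_total_kb} KB then vs {pc_basic_kb} KB now: about {ratio:.0f}x")
# → about 205x, i.e. roughly "200 times" as much
```

So even granting generous estimates for DOS and the BIOS, the modern package is a couple hundred times the size of the entire 1980s stack it emulates.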