
It's quite clear that the 6502 version of Microsoft BASIC at all levels uses substantially the same structure and technique as the earlier 8080 and 6800 versions. As has been pointed out in various places on this site and elsewhere, this came at a noticeable cost in efficiency. As just one example, supercat mentions:

...if program lines and strings had been stored in reverse order, that would have made a lot of things more efficient on the 6502, since code to iterate through a string could simply output (stringPtr),y and decrement y until it hits zero, rather than having to check each index against the string length.
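
To make that concrete, here is a hypothetical sketch (not Microsoft's actual code; stringPtr, strlen, and putchar are made-up names) of the two loop shapes: the forward walk has to compare the index against the length on every pass, while the reverse walk over a reverse-stored string gets its termination test for free from the Z flag that DEY sets.

    ; A hypothetical sketch, not Microsoft's actual code.  "stringPtr" is
    ; assumed to be a two-byte zero-page pointer, "strlen" a one-byte
    ; length, and "putchar" some character-output routine.

    ; Forward walk of a normally-stored string: Y counts up, so every
    ; pass needs an explicit compare against the length.
            LDY #$00
    fwd:    CPY strlen          ; reached the end?
            BEQ fdone
            LDA (stringPtr),Y   ; fetch the next character
            JSR putchar
            INY
            BNE fwd             ; strings are under 256 bytes, so always taken
    fdone:  RTS

    ; Reverse walk, assuming the text is stored back-to-front at offsets
    ; 1..strlen above stringPtr: DEY both steps the index and sets the
    ; Z flag, so the per-character compare disappears.
            LDY strlen
            BEQ rdone           ; empty string
    rev:    LDA (stringPtr),Y
            JSR putchar
            DEY
            BNE rev             ; drops out when Y reaches zero
    rdone:  RTS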

This no doubt was inefficient not only for speed but also for space, which was a known problem with the 6502 version. "8K BASIC" for the 8080 and 6800 did indeed run in an 8K RAM system (albeit leaving little room for a program) and fit easily into an 8K ROM. Despite all efforts, the code for "BASIC M6502 8K VER 1.1" came in at about 9 KB. (MS lost a sale to Atari due to the size; in 1978 Atari tried to squeeze it down to 8K, failed, and contracted out to Shepardson Microsystems for a new BASIC that did fit.)

What would cause them to code it this way, rather than optimising for the rather different architecture of the 6502?


Note: If you come across further particularly good references for the efficiency issues of the 6502 version, feel free to mention them in the comments or even edit them into this post.

  • 19
    As Larry Wall pointed out in Programming Perl, there are many things you can optimize for: "computer speed or programmer speed, verbosity or conciseness, readability or maintainability or reusability or portability or learnability or teachability". Microsoft chose programmer speed (and, with it, development cost and time-to-market) and the reliability of a proven piece of software. Keep in mind (a) that there were other BASICs around (hence the need to get their version out fast), and (b) that BASIC would eventually go into ROM with no way to fix bugs in the field. Commented Jun 6, 2022 at 7:35
  • 8
    Indeed. "Optimizing" is meaningless until you say what you're optimizing for. Programmers are expensive, so there's a lot of sense in optimization for low programmer latency (which is, of course, why we have high-level languages).
    – dave
    Commented Jun 6, 2022 at 12:40
  • 2
    I would not expect over 10% of space to be saved without a radical rewrite, hence just changing the direction of loops, etc., may have been discounted, as it would not solve the problem. Commented Jun 7, 2022 at 10:06
  • 3
    @MauryMarkowitz Ghastly results for you, yes, if no BASIC is better for you than a slow BASIC. For Atari, it's probable that not having a computer with a working BASIC at the January 1979 CES would have been the far more ghastly outcome. And since they were spending the money and bearing the financial consequences, they got to choose.
    – cjs
    Commented Jun 7, 2022 at 16:28
  • 2
    @hotpaw2 I think you are confused. While the MS dialect of BASIC is based on DEC's BASIC-PLUS, which Bill Gates had used, I've seen nothing ever to indicate that any of the code of the interpreter itself took its design from another implementation. Further, even a casual look at BASIC-PLUS indicates that it had dramatic design differences, such as being a compile-and-go system using bytecode rather than an interpreter, lack of "tokenised" save formats for editable code, and so on.
    – cjs
    Commented Jun 16, 2022 at 10:29

1 Answer


The main reason was no doubt reliability and speed of development. Back in the late '70s, comprehensive automated testing suites for microcomputer software were uncommon at best, and Microsoft certainly didn't have one for the early versions of BASIC. By the time they were writing the 6502 version of Microsoft BASIC, they already had two versions, for the 8080 and the 6800, that were well debugged and quite reliable. Thus it made sense to start from those and re-use as much of the structure and techniques as possible, since this would make development somewhat faster and save considerable time in testing.

(As Michael Graf points out, the pressure for reliability and speed may not have been entirely internal to Microsoft. The BASIC was intended for ROM, which would greatly increase the cost of any bugs, and in those early days there may have been both demand from clients to get something quickly for their upcoming computers and other competitors wanting to supply that demand.)

It's also possible that the developers of the 6502 version at that early point simply didn't understand the quirks of the 6502 architecture as well as they understood the 8080 and 6800. It was a relatively new CPU when MS BASIC was ported to it and hadn't yet had a lot of software written for it.

  • 1
    I suppose it's likely that even the developers of the 6502 didn't really understand how the various aspects of the design would fit together in practical applications, as evidenced by the design of pre-indexed addressing. I wonder if the silicon footprint would have had enough space to make that mode useful for working with 16-bit data by masking out the bottom bit of the operand when performing address calculations, but then making it so that the bottom bit of the target address would be set if the bottom bit of the operand was likewise? I think that mode could have been...
    – supercat
    Commented Jun 6, 2022 at 15:26
  • 1
    ...vastly more popular under such a scenario, whereas at present it's just about useless. The only programs I can think of that ever use it do so when X is zero, meaning all the circuitry associated with the pre-indexing address calculations is essentially wasted. (See the sketch below these comments.)
    – supercat
    Commented Jun 6, 2022 at 15:27
  • 1
    Jef Raskin quote on AppleSoft BASIC (which was a contracted customization of Microsoft's 6502 BASIC): "more bugs than an African swamp".
    – hotpaw2
    Commented Jun 7, 2022 at 22:36
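
As an illustration of supercat's point about pre-indexed addressing above (a hypothetical sketch with a made-up ptr label, not taken from any real program): with X = 0 the (zp,X) mode collapses into a plain load through a zero-page pointer, so its extra address-calculation circuitry buys nothing.

    ; A hypothetical sketch; "ptr" is a made-up two-byte zero-page pointer.
    ; Indexed indirect -- LDA (ptr,X) -- adds X to the zero-page address of
    ; the pointer itself before the indirection, so X picks which pointer
    ; to use, not an offset into the data.  In practice it almost always
    ; appears with X = 0:
            LDX #$00
            LDA (ptr,X)         ; with X = 0 this is just "load via ptr"

    ; The usual idiom is indirect indexed instead, where Y offsets into
    ; the data that the pointer addresses:
            LDY #$05
            LDA (ptr),Y         ; sixth byte of whatever ptr points at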
