
The Real Story of Hacking Together the Commodore C128

samzenpus posted about 9 months ago | from the back-in-the-day dept.

Hardware 179

szczys writes "Bil Herd was the designer and hardware lead for the Commodore C128. He reminisces about the herculean effort his team took on in order to bring the hardware to market in just five months. At the time the company had the resources to roll their own silicon (that's right, custom chips!) but this also meant that for three of those five months they didn't actually have the integrated circuits the computer was based on."


179 comments


Mistake (0)

Anonymous Coward | about 9 months ago | (#45642605)

"roll their of silicon" should be, roll their OWN silicon.

Re:Mistake (0, Offtopic)

girlintrainingpants (1954872) | about 9 months ago | (#45642731)

My first experience with programming started on this computer. For some reason, knowing that it was possible to create your own stuff (besides running somebody else's stuff) fascinated me. A cousin of mine (who already had some programming experience on the Commodore) showed me the basics. I also owned several C= programming books (given to me by some relatives) which I used as a reference, although I was not always able to understand all those concepts as a kid.

The first C128 BASIC program I ever wrote looked basically like this:

10 INPUT "WHAT IS YOUR NAME";A$
20 PRINT "HELLO ";A$;"!"

It was just a very simple program which asked the user to type his name and responded with a friendly greeting. Of course, these two lines were a little boring, so usually I added two lines at the beginning which cleared the screen and changed the color of the text, and I used some POKE'ing to change the colors of the main screen and screen border to make the program look a little prettier.
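For the record, that prologue looked something like this (a from-memory sketch, assuming C64/C128 40-column mode, where CHR$(147) clears the screen, CHR$(5) switches the text to white, and 53280/53281 are the VIC-II border and background color registers):

5 PRINT CHR$(147);: REM CLEAR THE SCREEN
6 PRINT CHR$(5);: REM WHITE TEXT
7 POKE 53280,0: POKE 53281,0: REM BLACK BORDER AND BACKGROUND
10 INPUT "WHAT IS YOUR NAME";A$
20 PRINT "HELLO ";A$;"!"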

Hehehe!
-GiTP =)

Re:Mistake (0, Offtopic)

mingot (665080) | about 9 months ago | (#45642815)

My first program was:

10 PRINT "FUCK YOU ";
20 GOTO 10

I lovingly entered it into every department store demo that I ever walked past.

Re:Mistake (4, Funny)

GrahamCox (741991) | about 9 months ago | (#45642959)

Ah, but you weren't a true C64 department-store hacker until you entered the couple of POKEs that disabled the RUN/STOP and RESTORE keys before entering that loop.
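If memory serves (treat the exact values as department-store folklore rather than gospel), the classic pair redirected the IRQ and NMI vectors:

10 POKE 788,52: REM IRQ VECTOR TWEAK - RUN/STOP IS NO LONGER SCANNED
20 POKE 792,193: REM NMI VECTOR TWEAK - RESTORE DOES NOTHING
30 PRINT "HELLO ";: REM OR MINGOT'S MESSAGE OF CHOICE
40 GOTO 30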

Re:Mistake (1)

Dishevel (1105119) | about 9 months ago | (#45643271)

I used to do shit like that at Federated. Loved Fred Rated.

Re:Mistake (1)

Neo-Rio-101 (700494) | about 9 months ago | (#45644839)

10 FOR Z=1 TO 10000:NEXT:REM DUMMY LOOP TO ALLOW ME TO ESCAPE STORE
20 POKE649,0: REM BYE BYE KEYBOARD
30 SYS 64747: REM AND LEAVE NO TRACE OF TAMPERING!

Re:Mistake (3, Interesting)

Neo-Rio-101 (700494) | about 9 months ago | (#45644943)

sorry it should be SYS64767
Getting old here.

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45643115)

I lovingly

You misspelled "smugly and self-adulatingly."

Re:Mistake (1)

game kid (805301) | about 9 months ago | (#45643509)

Hey, self-love is love too right?

Re:Mistake (1)

BancBoy (578080) | about 9 months ago | (#45644531)

"Hey, don't knock masturbation. It's sex with someone I love." - Woody Allen

Re:Mistake (0, Troll)

Anonymous Coward | about 9 months ago | (#45642847)

Want to do a crazy program you can't write on modern computers?

Simply loop through a sequence of poking two random numbers, and incrementing a number that you print.

Every time, the system will do different things.

If you did this on a modern computer, eventually it'd corrupt system files and the thing wouldn't boot.

It makes you wonder why modern OSes aren't hardened with the theory: No matter what the user does, allow the computer to boot up safely next time.

Re:Mistake (3, Insightful)

fisted (2295862) | about 9 months ago | (#45642919)

Want to do a crazy program you can't write on modern computers?

What?

Simply loop through a sequence of poking two random numbers, and incrementing a number that you print.

What?

Every time, the system will do different things.

What ?

If you did this on a modern computer, eventually it'd corrupt system files and the thing wouldn't boot.

WHAT?

It makes you wonder why modern OSes aren't hardened with the theory: No matter what the user does, allow the computer to boot up safely next time.

You're an idiot.

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45643043)

No, you're the idiot for not providing example code!
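For what it's worth, the whole "crazy program" is about three lines of Commodore BASIC (a sketch; the chaos comes from POKEing a random byte to a random address, which may land on screen memory, I/O registers, or the stack):

10 I=0
20 POKE INT(RND(1)*65536),INT(RND(1)*256): REM RANDOM BYTE, RANDOM ADDRESS
30 PRINT I: I=I+1: GOTO 20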

Re:Mistake (0)

angel'o'sphere (80593) | about 9 months ago | (#45643047)

I second that.
Lol ... what a post ... I'm literally sitting in a pub and trying tomavoidnto rofl, but could not resist to slap my theights ... the whole pup is looking at me wondering why I'm laughing so hard ...

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45644175)

Man I don't want to know what "theights" are, but I hope it's legal to slap them in a pub. And why would you bring a puppy to a pub? It's just Monday, you must be quite a tough act to follow by Friday!

Re:Mistake (1)

Charliemopps (1157495) | about 9 months ago | (#45643131)

What he's suggesting is possible but what's far more likely is the application and/or computer would crash long before anything truly bad could happen. In Linux I think you could even script such a thing with ptrace, not like I'd ever bother.

Re:Mistake (4, Informative)

nedlohs (1335013) | about 9 months ago | (#45643149)

Want to do a crazy program you can't write on modern computers?

What?

Yeah, "can't" is a blatant lie.

Yeah, that's trivial to do on a modern computer too. A trivial loadable kernel module in linux could do so, for example.

Simply loop through a sequence of poking two random numbers, and incrementing a number that you print.

What?

That is what it says, write a random value to a random memory location in a loop.

Every time, the system will do different things.

What ?

Of course it will. Sometimes your random memory location will be the memory mapped to the screen and a character will show up. Sometimes you'll change a return address on the stack and run some random code.

If you did this on a modern computer, eventually it'd corrupt system files and the thing wouldn't boot.

WHAT?

That's true, eventually you'll write over some file data just before it is flushed to disk and trash a file required for booting. Or screw with memory the file system is using and mess that up on the next write (though given the use of checksums that's pretty unlikely). The key word is *eventually*, since you'll have to run it a *lot* of times before it does something like that rather than just crashing itself.

And of course not when running as a normal user process.

It makes you wonder why modern OSes aren't hardened with the theory: No matter what the user does, allow the computer to boot up safely next time.

You're an idiot.

Yes he is.

Computers that have the OS on ROM unsurprisingly aren't susceptible to making the system unbootable by screwing with boot files. The same is true of a modern computer hardwired to boot off of ROM as well though. And of course it makes upgrading that base OS essentially impossible (short of replacing the ROM, or actually using an EEPROM - and of course if software can do the upgrade then the random memory setting could also cause it to happen and screw up booting)

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45643317)

Computers that have the OS on ROM unsurprisingly aren't susceptible to making the system unbootable by screwing with boot files. The same is true of a modern computer hardwired to boot off of ROM as well though. And of course it makes upgrading that base OS essentially impossible (short of replacing the ROM, or actually using an EEPROM - and of course if software can do the upgrade then the random memory setting could also cause it to happen and screw up booting)

Yes, but, according to the idiot earlier poster, this is what made old computers superior, the fact that he/she could concoct a very specific and patently absurd scenario that has no practical function whatsoever where, given enough time and being in good graces with the RNG, a modern system built for flexibility MIGHT eventually become nonbootable (completely ignoring reinstalling the OS from external media, which is a prerequisite for this ridiculous scenario to begin with) while an older computer would simply crash and come back up on the next reboot*.

The poster probably took a long time to come up with that back in The Day(tm) (read: adolescence), and that's probably the sole contribution to computing they've made, so, damnit, that's what makes those computers more superiorer.

*: Apart from poorly-made memory/voltage regulators in hardware where poking the wrong area of memory in just the wrong way could cause hardware damage, but again, we're just ignoring all of this so the idiot poster can feel better.

Re:Mistake (2, Interesting)

Anonymous Coward | about 9 months ago | (#45643567)

Well, he's saying that you can't write a program that simply twiddles random memory regions because all modern OSs employ protected memory schemes to avoid exactly this. Modern computers are multi-tasking systems that do hundreds or thousands of things at once, and doing so would be disastrous.

On an older computer without protected memory you can write such a program, and the results are beyond bizarre. Most forget that these old systems are very, very "bare metal". They don't have the layers of abstraction, exception handlers, memory protection, etc. that you take for granted. The theory of operation for a modern system is to look for dangerous program behavior and to stop, halt, throw an exception, or otherwise bring everything to a screeching halt. The idea is that there is nothing worse than data corruption (especially silent corruption!) and it's safer to stop than to let an unknown or erroneous state continue. Old systems didn't have any of those luxuries, and the aftermath of a program randomly changing memory regions is fascinating. Weird video, weird sounds, drives making odd noises. Those computers were simple, and changing a single register would often bring very significant results.

I once read an account from a guy who wrote a simple "game of life" implementation. It worked great, but he forgot to add bounds. A self-replicating/crawling pattern "escaped" the edge of the screen and continued crawling through other memory regions not intended for the game space. He said it was fascinating watching the computer glitch and misbehave as the pattern activated registers for all sorts of things, from keyboard lights to disk drives.

Re:Mistake (0)

labnet (457441) | about 9 months ago | (#45644659)

Yeah, that's trivial to do on a modern computer too. A trivial loadable kernel module in linux could do so, for example.

Simply loop through a sequence of poking two random numbers, and incrementing a number that you print.

What?

That is what it says, write a random value to a random memory location in a loop.

Every time, the system will do different things.

What ?

Of course it will. Sometimes your random memory location will be the memory mapped to the screen and a character will show up. Sometimes you'll change a return address on the stack and run some random code.

If you did this on a modern computer, eventually it'd corrupt system files and the thing wouldn't boot.

WHAT?

That's true, eventually you'll write over some file data just before it is flushed to disk and trash a file required for booting. Or screw with memory the file system is using and mess that up on the next write (though given the use of checksums that's pretty unlikely). The key word is *eventually*, since you'll have to run it a *lot* of times before it does something like that rather than just crashing itself.

And of course not when running as a normal user process.

And you have just described why Evolution can't work!

Re:Mistake (1)

Anrego (830717) | about 9 months ago | (#45642949)

Indeed.

The "reset-ability" of older systems (I myself learned on a dragon32, which is basically a trs-80 knockoff) was very reassuring. You could crash the thing, but press the black button on the side and it's like it never happened.

Of course bringing this to modern computers would probably be hard and have all kinds of other consequences.

Re:Mistake (1)

Charliemopps (1157495) | about 9 months ago | (#45643055)

You can certainly do it on modern systems. LOTS of systems are written this way. You write your OS to media that can't be changed after the fact (DVD) and boot from that. No changes to the OS can happen without re-burning it. That's how my hardware firewall is set up. It stores logs and such to the hard drive but the entire OS and all config files are stored on a DVD. There's no hacking that and making a permanent change.

The wright ability of modern OS's is definitely a "Feature" and very important, but it does come at a cost to security. But then again, if I had my firewall on a regular hard drive, it could update itself automatically and protect against 0-days before I even had heard about them. It's all a series of tradeoffs.

Re:Mistake (1)

Anonymous Coward | about 9 months ago | (#45643369)

The wright ability of modern OS's

Wright ability: bicycle repair and aeronautical engineering? Man, I'm definitely not taking full advantage of what these new-fangled OS's have to offer.

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45644489)

Indeed.

The "reset-ability" of older systems (I myself learned on a dragon32, which is basically a trs-80 knockoff) was very reassuring. You could crash the thing, but press the black button on the side and it's like it never happened.

Of course bringing this to modern computers would probably be hard and have all kinds of other consequences.

And then there were the REALLY old systems. Mainframes. Where if you crashed the system, chances were that hundreds of irate users would hunt you down with torches and pitchforks. Which is why the hardware and OS were designed to make that as difficult as possible.

Of course, if you had security clearance and were fast enough, you could pull the Big Red Knob. That required an IBM Customer Engineer to come in and reset it. You didn't do that unless something was actually on fire or the like. And you intended to look for a new job.

There are times when I think that CTRL+ALT+DEL was one of the worst things that ever happened to computing. It allowed "Get it Right!" to be replaced with "Git 'er Dun!"

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45642989)

When your OS is in ROM, it's real easy to get the computer to boot up safely next time, every time.

Re:Mistake (0)

Anonymous Coward | about 9 months ago | (#45643085)

delete system32 to maek it go faster!

Re:Mistake (2)

meerling (1487879) | about 9 months ago | (#45643151)

I talked to someone back in the win3x days who deleted his DOS directory because he didn't know what it was, so he figured it wasn't important.

OS in flash (1)

unixisc (2429386) | about 9 months ago | (#45643287)

Well, a ROM then was pretty small, but today, given how the BIOS has been redefined - w/ UEFI & all that - wouldn't it be possible again? Take a flash memory device that's 32GB, put an OS on it and make that the BIOS. For the rest of the stuff - the applications and all that - take a suitably sized SSD and put it on that. Anything portable would go on a USB drive.

Lock that OS BIOS, making it alterable only by the owner (in the same way that we currently alter the BIOS), and all attacks that cripple an OS should disappear. Currently, the highest-density serial NOR flash product is 1Gb. How much of the Windows 7 or 8 kernel can fit into that? How about Linux, XNU or the BSD kernels?

Re:Mistake (2)

istartedi (132515) | about 9 months ago | (#45643237)

It past the spell choker.

Re:Mistake (1)

MoreThanThen (2956881) | about 9 months ago | (#45643397)

and than it pissed the spill chuck

Mind blowing (4, Informative)

50000BTU_barbecue (588132) | about 9 months ago | (#45642635)

It's really cool to hear about this stuff. It's just sad to realize that the 128 was a terrible idea and Commodore spread itself too thin making all kinds of bizarre 8-bit computers around that time instead of making a true successor to the C64. The C65 should have been what made it to market, not the weird 128 with its obsolete-the-day-it-left-the-factory CP/M mode running at half the speed of its competitors.

The people I knew with 128s back then all used the 64 mode but used the 128 as an excuse to buy a better monitor. I never knew anyone using the CP/M mode.

Re:Mind blowing (1)

Grax (529699) | about 9 months ago | (#45642667)

This was my first computer. I tried booting in CP/M mode about twice; the rest of the time I was happy in C64 mode or C128 mode.

Re:Mind blowing (2)

50000BTU_barbecue (588132) | about 9 months ago | (#45642759)

Things like Super Snapshot or Action Replay cartridges pretty much forced the machine into 64 mode anyhow. A better graphics chip and an extra SID on the 128 would have made it more compelling. The Apple IIgs was a powerful 6502/816 machine with superior graphics and sound so there was a market. A 640x400 interlaced display with at least 64 colors and 16 sprites and an Atari-style copper would have been awesome instead of the lame VDC.

No smooth scrolling on IIGS (1)

tepples (727027) | about 9 months ago | (#45643983)

The Commodore 64 had hardware pixel-level smooth scrolling and hardware sprites, putting it close to the 8-bit consoles (Sega Master System and Nintendo Entertainment System) in capability. The Apple IIGS had better color depth but no ability to scroll the screen, so games had to either flip screens or scroll jerkily, like ColecoVision and Spectrum and MSX games.

Re:No smooth scrolling on IIGS (1)

CronoCloud (590650) | about 9 months ago | (#45644753)

The C64 is not quite as good as the NES for sprite games; the NES can do more sprites, and the NES has tile-based backgrounds.

However, for pure bitmap and custom character set games (RPGs), the C64 had certain advantages... at least till the slow 1541 mattered.

Re:Mind blowing (1)

marsu_k (701360) | about 9 months ago | (#45643119)

Was my first as well, and ditto, I tried CP/M a few times and that was pretty much it; it didn't really serve a purpose for my seven-year-old self.

What was cool about the C128 mode, though, was the extended BASIC. I was way too young / not autistic enough for assembler back then, but the BASIC had features like a rudimentary sprite editor and easy access to joystick input. I was able to create some "games" with moderate ease - they were horrible, of course, but at least I didn't just spend my time playing games. That, and Ultima V had music in it instead of just sound effects when it was run from the C128 mode :)
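To give a taste of BASIC 7.0 next to the C64's BASIC 2.0, something like this (a from-memory sketch; it assumes a sprite shape has already been drawn with the built-in SPRDEF editor) drives a sprite around with the joystick in port 2:

10 SPRITE 1,1,7: REM TURN SPRITE 1 ON, COLOR 7
20 MOVSPR 1,160,130: REM PARK IT MID-SCREEN
30 J=JOY(2) AND 15: REM DIRECTION 1-8, 0 = CENTERED (BIT 128 WOULD MEAN FIRE)
40 IF J THEN MOVSPR 1,(J-1)*45#3: REM GLIDE IN THE STICK DIRECTION
50 IF J=0 THEN MOVSPR 1,0#0: REM CENTERED STICK STOPS THE SPRITE
60 GOTO 30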

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45642747)

I miss my VIC20.

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45642749)

It's really cool to hear about this stuff. It's just sad to realize that the 128 was a terrible idea and Commodore spread itself too thin making all kinds of bizarre 8-bit computers around that time instead of making a true successor to the C64. The C65 should have been what made it to market, not the weird 128 with its obsolete-the-day-it-left-the-factory CP/M mode running at half the speed of its competitors.

The people I knew with 128s back then all used the 64 mode but used the 128 as an excuse to buy a better monitor. I never knew anyone using the CP/M mode.

It was crap. I had one as a kid and its BASIC interpreter was garbage.

Any nice things I can say about it would go over you yong'ins heads.You're all abstract now - frameworks and whatnot - Java and shit like that .... do you kids even know what a register is? I think not ..

Never mind. Mod me down down - Matlock is on and there's Banana pudding tonight in the TV room so you can't hurt me!

Whippersnappers!

Re:Mind blowing (1)

K. S. Kyosuke (729550) | about 9 months ago | (#45642809)

Any nice things I can say about it would go over you yong'ins heads.You're all abstract now - frameworks and whatnot - Java and shit like that .... do you kids even know what a register is? I think not ..

Not all people are abstract today. [greenarraychips.com]

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45643013)

Not all people are abstract today. [greenarraychips.com]

That is easily in the top 10 most useless architectures I've ever seen.

Re:Mind blowing (2)

K. S. Kyosuke (729550) | about 9 months ago | (#45643391)

Define useless. People care about bitops per joule these days.

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45643431)

Chuck Moore is hardly a whippersnapper.

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45642777)

You're right. Commodore should have made a real successor. The C128 never really took off because it didn't have much to offer. And since it wasn't a big enough leap forward, not enough people bought them. And since not enough people bought them, not a whole lot of C128 software was made. It is the classic chicken-and-egg problem that faces developers even today.

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45642869)

Was there any C128 mode software at all besides Infocom text adventures?

Re:Mind blowing (1)

50000BTU_barbecue (588132) | about 9 months ago | (#45642955)

GEOS 128. The higher resolution video and faster processor helped but not enough it seems.

Re:Mind blowing (1)

Tempest_2084 (605915) | about 9 months ago | (#45643191)

There were a few 'upgraded' games that offered music or better graphics (Rocky Horror, Ultima V, Last V8, etc.) but most of the available 128 games were 80 column text adventures or homebrews. Even then there really aren't all that many (50 or so tops).

Re:Mind blowing (1)

marsu_k (701360) | about 9 months ago | (#45643997)

Quick googling turned up this list [commodore128.org]. A somewhat useless list though, as it doesn't specify what was different from the C64 version and whether the software was C128-only (I'm guessing very few were).

Re:Mind blowing (5, Interesting)

Webcommando (755831) | about 9 months ago | (#45642839)

I went from a VIC-20 to a C128 instead of a C64. I was amazed that I could use CP/M and a very advanced BASIC. The power of this machine enabled me and a good friend to build a robot in college made of nothing but old car parts, DC motors, relays, and plates with holes drilled in them for encoders. That directly led to my first job as an automation engineer.

The C128 also was the last computer that fueled my dreams. I went to college to become a computer engineer so I could build what I called the "compatibility machine". This machine could execute all the major 8-bit computer software (they all had Z80s or 6502s) without the user intervening or worrying what version of software they purchased. The C128 showed me it could be possible!

By the time I left school the writing was on the wall that Mac / IBM style PCs would rule the world. It didn't stop me from getting an Amiga, but it was pretty clear that CBM was on the way out.

Re:Mind blowing (1)

50000BTU_barbecue (588132) | about 9 months ago | (#45642905)

Ah good point, BASIC V7 was far better than 2.0. Did you use the user port or make a custom expansion cartridge? The closest I got to robotics back then was the Radio Electronics interface board to the Armatron... I never got the Armatron...

Re:Mind blowing (2)

Webcommando (755831) | about 9 months ago | (#45643109)

The closest I got to robotics back then was the Radio Electronics interface board to the Armatron... I never got the Armatron...

We used the user port to drive a board with 5-volt relays that, in turn, were used to turn the DC motors (re-purposed windshield wiper motors) on and off. For input, I used the joystick ports, since BASIC 7 had features to react to button presses, etc., and all the I/O was essentially just switches. I could POKE on the gripper motor and have the system react when the gripper-closed switch (wired as a "fire button") was hit, before turning it off.

Reading and reacting to the encoders required a machine language routine to keep up with the pulses. I think that lived in the cassette buffer, but I'm fuzzy on whether that was the case. One final cool feature: I could use the joysticks to train the robot to move by recording its actions and then replaying them (after trading out the joystick for the I/O plug). This was fairly amazing to most people in the late 80's.
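The flavor of it, as a rough sketch (addresses from memory and simplified to one relay: 56577/56579 are the CIA2 user-port data and data-direction registers, and BASIC 7's JOY() adds 128 when the fire line closes):

10 POKE 56579,255: REM ALL USER PORT DATA LINES AS OUTPUTS
20 POKE 56577,1: REM BIT 0 HIGH - RELAY PULLS IN, GRIPPER MOTOR ON
30 IF JOY(1)<128 THEN 30: REM WAIT FOR THE GRIPPER-CLOSED SWITCH ON THE FIRE LINE
40 POKE 56577,0: REM CLEAR THE PORT - MOTOR OFF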

Re:Mind blowing (1)

50000BTU_barbecue (588132) | about 9 months ago | (#45643551)

Hey that does sound pretty cool. Reading joysticks in BASIC V2 was crappy and not very fast. You don't have any pictures?

Re:Mind blowing (1)

Miamicanes (730264) | about 9 months ago | (#45644737)

OMG. You just reminded me about my first (sort of) "robot" -- I connected an Erector Set motor's power lugs to the switched power traces on the cassette interface of my c64 using alligator clips, and attached a weak rubber band to pull it back. It was utterly useless, and did nothing besides pivot a rod back and forth, but it WAS technically a crude robot capable of moving atoms via software ;-)

Thank ${deity} I didn't fry the cassette port. That would have really sucked, and it's the kind of thing that doesn't even OCCUR to you when you're ~12 years old :-D

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45643021)

Commodore had the Amiga in 1985. It should have been the only successor to the C64.

Re:Mind blowing (4, Informative)

50000BTU_barbecue (588132) | about 9 months ago | (#45643521)

Commodore didn't design the Amiga, they bought it.

Re:Mind blowing (0)

Anonymous Coward | about 9 months ago | (#45643069)

The people I knew with 128s back then all used the 64 mode but used the 128 as an excuse to buy a better monitor. I never knew anyone using the CP/M mode.

I actually used CP/M to run a bulletin board for a few months. We had a fairly large CP/M group that met during our monthly computer club meetings.

In a way I really miss those days.

Megahertz myth and the 6502 (4, Informative)

goombah99 (560566) | about 9 months ago | (#45643357)

The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced. It had only 2 data registers (A, B) and two 8-bit address registers (X, Y) and fewer complicated ways to branch. Instead it effectively memory mapped the registers by using instructions like: offset Y by A, treat that as an address, and get the byte at that location. Because it could do all that in one clock cycle, this effectively gave it 256 memory mapped registers. It also didn't have separate input lines for peripherals, and instead memory mapped those.

Nearly every instruction took a microsecond. Thus while the clock rate was 1 MHz, it was much faster than a 4 MHz 8080-series chip since those could take multiple cycles to do one instruction. Few memory chips (mainly static memory) could keep pace with that clock rate so the memory would inject wait states that further slowed the instruction time. The 6502's leisurely microsecond time was well matched to memory speeds. Moreover, on the 6502 only half the clock cycle was used for the memory fetch. This left the other half free for other things to access memory on a regular basis.

The regularity of that free memory access period was super important. It meant you could do two things. First, you could piggyback the video memory access onto that period. On the 8080s using main memory you could often see glitches on video displays that would happen when the video access was overridden by the CPU access at irregular clock cycles. As a result most 8080-series based video systems used a dedicated video card like a CGA or EGA. Hence we had all these ugly character-based graphics with slow video access by I/O in the Intel computer world. In the 6502 world, we had main-memory-mapped graphics. This is why the C64/Amiga/Apple were so much better at games.

This regular clock rate on the main memory had a wonderful side effect. It meant you could use dynamic memory, which was faster, cheaper, denser, and MUCH MUCH lower power than static memory. With the irregular access rates of the 8080, refreshing a page of dynamic memory required all sorts of tricky circuitry that tried to opportunistically find bus idle times to increment the dynamic refresh address, occasionally having to halt the CPU to do an emergency refresh cycle before the millisecond window of memory lifetime expired. As a result, 8080-series computers like Cromemco, IMSAI, Altair and North Star all had whopper power supplies and big boxes to supply the cooling and current the static memory needed.

So the C64s and Apples were much nicer machines. However they had a reputation of being gaming machines. At the time that didn't mean "high end" like it does now. It meant toys. The big-iron micros were perceived as business machines.

Oddly, that was exactly backwards. But until VisiCalc, business software tended to be written for the 8080 series.

I think it was this memory mapping style, rather than formal I/O lines to dedicated cards for peripherals (keyboard decoders, video, etc.), that led Apple to strive for replacing chips with software. They software-decoded the serial lines (rather than using USART chips), they soft-sectored the floppy drives rather than using dedicated controller chips, etc. And that was what led to making the Macintosh possible: less hardware to fit in the box, a lower chip count, and lower-power, more efficient power supplies.

Eventually, however, the megahertz myth made the PCs seem like more powerful machines than the 68000 and PowerPC.

Re:Megahertz myth and the 6502 (3)

PhantomHarlock (189617) | about 9 months ago | (#45643455)

And as a descendant of that, it was amazing what the Amiga did with the 68000 and its custom graphics and sound chips, as you mention at the very end. You never saw smooth scrolling and sprite movement on a PC. The Amiga and the C=64 both had arcade-quality graphics locked to the 60 Hz interlaced or half-vertical-res (single field) refresh rate of a standard NTSC television signal. Since the whole thing was timed to that frequency, you never got tearing. The only downside was interlace flicker without a frame doubler, but not a lot of applications used interlaced mode.

EGA and VGA scrolling (2)

tepples (727027) | about 9 months ago | (#45644123)

you never saw smooth scrolling and sprite movement on a PC.

This was true of CGA, but after EGA and VGA became popular, John Carmack figured out how to use these newer cards' scroll registers and built Commander Keen in 1990.

Re:Megahertz myth and the 6502 (1)

Anonymous Coward | about 9 months ago | (#45643931)

Your post is really bizarre.

The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800),

Ok, with you there.

Nearly every instruction took a microsecond. Thus while the clock rate was 1 MHz, it was much faster than a 4 MHz 8080-series chip since those could take multiple cycles to do one instruction.

Well, that's just a bunch of crap: http://www.obelisk.demon.co.uk/6502/reference.html (look at the "Cycles" column.)

Few memory chips (mainly static memory) could keep pace with that clock rate so the memory would inject wait states that further slowed the instruction time. The 6502's leisurely microsecond time was well matched to memory speeds.

The Wikipedia article on the 6502 indicates that DRAM access times were on the order of 250ns - 450ns. In particular, 250ns access times are well-matched to 4 MHz clock rates; do the math. At 1 MHz, 250ns DRAM has time to go make a sandwich before it needs to supply the next memory cells.

On the 8080s using main memory you could often see glitches on video displays that would happen when the video access was overridden by the CPU access at irregular clock cycles.

No. Then, as now, video display glitches were caused by updating video RAM directly outside of a VSync pulse. You could just as easily get video glitches on 6502s as on 808x machines. Which leads us to:

As a result most 8080-series based video systems used a dedicated video card like a CGA or EGA. Hence we had all these ugly character-based graphics with slow video access by I/O in the Intel computer world. In the 6502 world, we had main-memory-mapped graphics.

Patently false. Video memory on an 808x machine (even on CGA and EGA cards) was most certainly memory mapped.

I think it was this memory mapping style, rather than formal I/O lines to dedicated cards for peripherals (keyboard decoders, video, etc.), that led Apple to strive for replacing chips with software. They software-decoded the serial lines (rather than using USART chips), they soft-sectored the floppy drives rather than using dedicated controller chips, etc. And that was what led to making the Macintosh possible: less hardware to fit in the box, a lower chip count, and lower-power, more efficient power supplies.

I/O lines on an 808x machine formed a bus. (Think of it like an address bus, but with slightly different electrical characteristics.) There were a handful of dedicated pins for specific interrupts, just like on a 6502. Also, having a separate I/O bus actually allowed for better performance: once pipelining and out-of-order execution started to get into CPUs, having a long transaction cycle on a separate bus for slow hardware meant you weren't occupying the memory bus - which is important when you realize that most modern CPUs are memory-bound because DRAM cycle times haven't kept up with Moore's Law at all.

Eventually, however, the megahertz myth made the PCs seem like more powerful machines than the 68000 and PowerPC.

68000 was definitely a qualitatively better CPU than even an 80286. The 68040 paled in comparison to an 80486, though (although you can certainly argue that the 80486 is excessively complex - a product of having to still be able to run 16-bit-mode 8086 applications.) MIPS, Sparc, and PowerPC were arguably better architecturally: RISC, simple for compilers to optimize for, easier to design and build silicon for. Having said that, IIRC there is a RISC core beating at the heart of every x86-compatible modern Intel CPU behind the scenes...

Wrong (4, Insightful)

Anonymous Coward | about 9 months ago | (#45644571)

Since I designed, wirewrapped, and programmed embedded 6502 and 8080 systems in that era, I am well prepared to assess your claims. In a nutshell, you are an arrogant tard and the original poster is figuratively accurate, if inexact.

Your post is really bizarre.

The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800),

Ok, with you there.

Nearly every instruction took a microsecond. Thus while the clock rate was 1 MHz, it was much faster than a 4 MHz 8080-series chip since those could take multiple cycles to do one instruction.

Well, that's just a bunch of crap: http://www.obelisk.demon.co.uk/6502/reference.html [demon.co.uk] (look at the "Cycles" column.)

What the original poster was likely saying, since it becomes clear later in the article, was that all the 6502 instructions were divided up into alternating cycles of memory fetches and internal calculations, with an exact period of 1 microsecond for those. The 8080 series would use 1, 2, 3, 4 and more cycles (plus wait states) for an instruction, with no regular pattern (in terms of predictable future times) of when the bus would be busy.

So you are wrong, have a reading comprehension problem, and are an ass about it.

Few memory chips (mainly static memory) could keep pace with that clock rate so the memory would inject wait states that further slowed the instruction time. The 6502's leisurely microsecond time was well matched to memory speeds.

The Wikipedia article on the 6502 indicates that DRAM access times were on the order of 250ns - 450ns. In particular, 250ns access times are well-matched to 4 MHz clock rates; do the math. At 1 MHz, 250ns DRAM has time to go make a sandwich before it needs to supply the next memory cells.

Sigh, again you have a reading comprehension problem. The original author was discussing static memory. Moreover, the cycle time for memory access always involves some overhead. The time when the CPU reads the data bus needs to occur after the bus has settled, which is not at the start of the memory's data-valid period. But most of all, 250ns memory was rare and expensive. Most computers in that time period did use wait states. Why do you think processors even allowed wait states?

Again you are being an ass about this as well.

On the 8080s using main memory you could often see glitches on video displays that would happen when the video access was overridden by the CPU access at irregular clock cycles.

No. Then, as now, video display glitches were caused by updating video RAM directly outside of a VSync pulse. You could just as easily get video glitches on 6502s as on 808x machines.

That was an additional restriction on 8080 machines. But on 6502 machines one did not have to wait for the vertical sync to update the video memory. In fact that is EXACTLY what the original poster was pointing out, without trying to flaunt jargon like you.

This makes you look stupid now.

Which leads us to:

As a result most 8080-series based video systems used a dedicated video card like a CGA or EGA. Hence we had all these ugly character-based graphics with slow video access by I/O in the Intel computer world. In the 6502 world, we had main-memory-mapped graphics.

Patently false. Video memory on an 808x machine (even on CGA and EGA cards) was most certainly memory mapped.

Yes, it could be done. But then you had the problem of glitches or waiting for VSYNC (or, if you liked to live dangerously, HSYNC). It wasn't pretty to build hardware or write code for. Your interaction with it didn't treat it like main memory but rather as some very special memory. The CGA normally had two banks of memory, one for character graphics and one for bit graphics, and you could switch the video mode (usually with a huge glitch in the CRT sync that took many moments to recover from).

I don't think you know what you are talking about.

I think it was this memory mapping style, rather than formal I/O lines to dedicated cards for peripherals (keyboard decoders, video, etc.), that led Apple to strive for replacing chips with software. They software-decoded the serial lines (rather than using USART chips), they soft-sectored the floppy drives rather than using dedicated controller chips, etc. And that was what led to making the Macintosh possible: less hardware to fit in the box, a lower chip count, and lower-power, more efficient power supplies.

I/O lines on an 808x machine formed a bus. (Think of it like an address bus, but with slightly different electrical characteristics.) There were a handful of dedicated pins for specific interrupts, just like on a 6502. Also, having a separate I/O bus actually allowed for better performance: once pipelining and out-of-order execution started to get into CPUs, having a long transaction cycle on a separate bus for slow hardware meant you weren't occupying the memory bus - which is important when you realize that most modern CPUs are memory-bound because DRAM cycle times haven't kept up with Moore's Law at all.

Now you are contradicting yourself. So you now agree there was a distinct input/output modality separate from the memory's use of the data bus. But it's a much less versatile and more inconvenient system than memory mapping. It did have some utility, though, which is why it existed. So did things like DMA mode on the Z-80, which would rapidly increment the address bus for you so that some other device could supply the data to the memory. It was really useful for a few things, like data acquisition at rates faster than the processor could handle. But it also was a giant kludge. Ask anyone who had to deal with the horror of assigning interrupt addresses to different peripherals on their unstable DOS system. Your Sound Blaster card would want the same ones as the mouse or the floppy drive. It was pure hell. That's why DOS and early Intel machines were awful. Memory-mapped cards, each with their own address space, were a much cleaner way to do it. Things like the S-100 bus were designed around cards that used the I/O system to send data; memory mapping was possible but not compatible between cards.

Eventually, however, the megahertz myth made the PCs seem like more powerful machines than the 68000 and PowerPC.

68000 was definitely a qualitatively better CPU than even an 80286. The 68040 paled in comparison to an 80486, though (although you can certainly argue that the 80486 is excessively complex - a product of having to still be able to run 16-bit-mode 8086 applications.) MIPS, Sparc, and PowerPC were arguably better architecturally: RISC, simple for compilers to optimize for, easier to design and build silicon for. Having said that, IIRC there is a RISC core beating at the heart of every x86-compatible modern Intel CPU behind the scenes...

Just a confusing ramble comparing apples to oranges. Yes, later processors tended to leapfrog each other. 8080s and their ilk were contemporary with the 6502.

Re:Megahertz myth and the 6502 (2)

tlhIngan (30335) | about 9 months ago | (#45644775)

The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced.

The reason for the popularity of the 6502 came down to one factor - cost. An 8086, 68000, Z80, etc., would've run you about $200 or so, while MOS was selling the 6502 for... $20. And you got a databook too.

The 6800 from Motorola was supposed to be the "cheap" chip (compared to the 68000), but it was still pricey - enough so that a bunch of Motorola engineers broke away and formed MOS and designed a 6800 workalike. This they called the 6500. Motorola sued them for releasing a competing product (it was basically pin-compatible), so what they did was switch a few pins around and re-release the 6500 as the 6502.

So that, and the cost, meant a lot of hobbyists used 6502s, including one little company named after a fruit.

Re:Megahertz myth and the 6502 (1)

biobogonics (513416) | about 9 months ago | (#45644961)

The 6502 was an amazing processor. The Apple II was also a 6502. Unlike its near contemporaries, the 8086 and Z-80 (and 6800), the instruction set was reduced. It had only 2 data registers (A, B) and two 8-bit address registers (X, Y) and fewer complicated ways to branch. Instead it effectively memory mapped the registers by using instructions like: offset Y by A, treat that as an address, and get the byte at that location. Because it could do all that in one clock cycle, this effectively gave it 256 memory mapped registers. It also didn't have separate input lines for peripherals, and instead memory mapped those.

Actually the 6502 only had one accumulator, the A register. The 6809 had A and B. It is correct that the 6502 had very nice addressing modes. Zero page addresses acted more like machine registers. One commonly used addressing mode was z-page indirect indexed by Y. Two consecutive locations on z-page acted like a 16-bit pointer and register. Either that could be incremented OR Y could be incremented. So a block move of a 256-byte page was easy.
I don't think I *ever* used ($23,X), where X selects the z-page locs ("register pair") to use as a pointer.

At one time I had an Apple 2+ with a hardware accelerator board which ran at 3 MHz instead of the standard 1 MHz. For many tasks, my fast 2+ outran a contemporary PC-AT machine. For word processing, the Apple was much more responsive.
The conventional wisdom at the time was that the 65xx was, clock for clock, 4x more powerful. 3x4 effectively was 12 MHz, which was faster than an AT. (Yes, I'm ignoring memory and disk....)

Re:Megahertz myth and the 6502 (1)

Miamicanes (730264) | about 9 months ago | (#45645311)

What's kind of sad is that technically, VGA *did* have some of the same low-level capabilities as the C64 (besides sprites, obviously). At least, if you had a VRAM-based card like the ET4000. They just weren't supported by the BIOS, so they were (almost) never used in commercial software. You had to know how the video subsystem was wired together, where the various control registers were mapped, and bitbang them directly by hijacking system timers and dead reckoning.

One of the more hardcore examples I remember involved setting an interrupt handler to fire on VBLANK, using THAT handler to set a timer to fire (by dead reckoning) at the moment you hoped would give you enough time to pre-load the 486's registers, NOP a few cycles, then blindly ram new values into the VGA card's control registers during (what you hoped was) the horizontal retrace. From what I remember, it only worked (in 1991, at least) on a Tseng ET4000 video card (I'm pretty sure it required VRAM to avoid bus contention). As far as I know, no commercial software EVER took advantage of this trick, but lots of Eurodemos did.

Another cool capability that was very rarely used: you could rewrite soft fonts on the fly. As far as I know, exactly two real apps actually DID it... the MS-DOS 6 shell, and ProTracker. They replaced whatever was under the mouse with a 3x3 matrix of custom characters, then redefined them to whatever characters they replaced & XOR'ed a mouse pointer on top. Kludgy, but elegant in a way.

Re:Mind blowing (4, Interesting)

Dogtanian (588974) | about 9 months ago | (#45643519)

The C65 should have been what made it to market, not the weird 128 with its obsolete-the-day-it-left-the-factory CP/M mode running at half the speed of its competitors.

Whatever the merits or demerits of the two machines, that's irrelevant; the C128 came out in 1985, whereas the C65 [wikipedia.org] wasn't developed until circa 1990-91.

C64 diehards have an obsession with the C65 and Commodore's perceived mistake in abandoning it, but despite the latter's numerous crap decisions, I'm sorry to say that in this case they were absolutely right.

The C64 was still selling as a budget option circa 1991 (*), viable due to sheer momentum. The 16/32-bit Amiga was not only established as the successor, it had already taken over (in Europe, at least) and was already nearing *its* own commercial peak(!)

Trying to release a (sort of) new 8-bit format by that point, even a very good one, would have made absolutely no sense, flopped horribly and stood on the low-end Amiga models' toes, muddying the waters pointlessly.

They could have sold it as cheaply as the C64 (i.e. a new machine with high manufacturing costs selling at the same price as an established, wrung-dry cash-cow model), but what would the point of that have been?

The C128 at least came to market when there was still *possibly* a gap in the market for a high-end 8-bit machine between the C64 and the new (but still very expensive) Amiga.

(*) Apparently C= were still making them when they went bankrupt circa mid-1994(!)

Re:Mind blowing (1)

Dogtanian (588974) | about 9 months ago | (#45643687)

Edit; sorry, should read "The C65 wasn't developed until circa 1990-91".

Re:Mind blowing (1)

50000BTU_barbecue (588132) | about 9 months ago | (#45643761)

Yes, Commodore should have started work on the C65 much earlier instead of spreading out into bizarre orphan architectures like the C16, C116, Plus/4, B128, C264 and all the other useless cruft they came up with.

Commodore was right to abandon the C65 by 1991. Yes. I think we agree there; I'm just saying C= should have focused earlier, and the C65 would have made more sense in the marketplace in 1986. Granted, it wouldn't have been the 1991 C65, sure.

But if C= had taken its engineers away from all the useless cruft they were working on in the mid 80s and just asked for a C64++, my opinion is that this would have been the correct approach.

Oh, and software. Bundling GEOS was the correct move; a GEOS for a C64++ in 1986, plus a proper marketing strategy, could have been enough to sustain C=.

Re:Mind blowing (1)

Dogtanian (588974) | about 9 months ago | (#45644339)

Spreading out into bizarre orphan architectures like the C16, C116, Plus/4, B128, C264 and all the other useless cruft they came up with.

While they (like Tramiel's Atari Corp. did later on) probably did too many overlapping things at once, it's only fair to point out that the apparently pointless introduction of a new, C64-incompatible architecture for the C16, C116 and Plus/4 family did supposedly start out for sensible reasons. According to the WP article [wikipedia.org], Jack Tramiel was paranoid that (as they'd done in many other industries) the Japanese would swoop in and undercut everyone with ultra-cheap consumer-oriented machines. That's why the chipset is inferior in many ways to the older C64 design; its original purpose was to be much *cheaper* than the C64 to manufacture, and apparently the rubber-keyed (i.e. low-cost) C116 [wikipedia.org] was closest to the original intent.

However, the perceived threat to the home computer market never materialised (*), Tramiel left Commodore, and the management was left with a chipset they didn't know what to do with. Presumably, for political and business reasons, it was better for management to launch *something* rather than write off the chipset, but this would explain why the decision didn't seem to make sense: by the time the machines came out, the chipset's raison d'être was past, and management had to do something, so they shoved it into some would-be midrange machines that overlapped with the established C64.

(*) Ironically, the Japanese took over the US market another way, by launching the NES and everyone buying them for gaming instead of home computers.

Re:Mind blowing (1)

ackthpt (218170) | about 9 months ago | (#45643601)

I had a C64 for years and at one time was slaving it to an Apple ][ with a nifty little interface, which I still have in a box somewhere. It was a dream to hack and play games on, despite having a mainframe at work which could do things I could only dream of at home (such as load/save from/to a HDD). My brother bought a 128 but never did anything with it as he wasn't a coder and had no idea what I was doing. Eventually I'd move to an Amiga 500 and then to a 2000 (which I still have.)

Re:Mind blowing (2)

CronoCloud (590650) | about 9 months ago | (#45644551)

WordStar! The 1571 floppy can read/write Kaypro-formatted discs, as well as some other CP/M formats, and Commodore's own GCR'd CP/M format. With software, the 1571 can read/write practically any 5.25" format out there, including DOS.

IIRC, I've read that CP/M on the 128 was popular with BBS sysops since it was inexpensive.

U.S. Navy? (0)

Anonymous Coward | about 9 months ago | (#45642673)

I didn't know that there was a computer named after a rank of the United States Navy. I learned something new. Thanks for posting the link.

Re:U.S. Navy? (1)

dpilot (134227) | about 9 months ago | (#45642737)

I also heard it called Comma-toy and Commode-door, at the time.

Re:U.S. Navy? (0)

Anonymous Coward | about 9 months ago | (#45642795)

Usually by people with IBM PCs hooked up to amber monitors who, if they were lucky, could get their speaker to beep at various frequencies.

They were toys, yes. But they were what the cool kids had for toys back then.

Re:U.S. Navy? (-1)

Anonymous Coward | about 9 months ago | (#45643023)

"Cool kids" like you'd ever know. LOL

Re:U.S. Navy? (5, Interesting)

wcrowe (94389) | about 9 months ago | (#45643161)

True. I was one of those guys initially. I was a CS major in 1985 and my computer experience consisted of mainframes, CP/M machines and IBM PCs. Anything else was a "toy". I had a new girlfriend who suggested I could do my CS homework at her house because her dad "had a computer". She didn't know what kind it was, but he was an engineer, so I figured it was probably pretty nice. I took her up on her offer, figuring that the suggestion was nothing more than a ploy to get me to come over to her house. When we got to her house she took me to the room he used as his office and pointed. I literally guffawed. It was a Commodore 64. She was somewhat offended at my reaction and I quickly apologized. Over the next few weeks I was a frequent visitor to her house and I began playing with the C64. The more I worked with it, the more respect I had for the platform. I especially liked the serial interface and how components could be daisy-chained. Far from being a toy, the C64 had the capability to do some pretty advanced stuff. And it was a LOT less expensive than an IBM PC. Eventually, the girlfriend became my wife, and her dad gave me the Commodore after he moved on to a PC. The wife and I broke up several years ago. I still have the C64.

Re:U.S. Navy? (4, Funny)

bmajik (96670) | about 9 months ago | (#45643751)

This is such a Slashdot story :)

"A girl invited me to her house on several occasions. Each time, I spent more and more time being impressed with the Commodore 64"

Re:U.S. Navy? (3, Informative)

wcrowe (94389) | about 9 months ago | (#45644105)

Well, you know, there was other stuff going on, as her father and stepmother were out of town that winter, but this is Slashdot, not Penthouse Forum, so...

Hey, I eventually married the girl. ;-)

Re:U.S. Navy? (1)

CronoCloud (590650) | about 9 months ago | (#45643699)

But they were what the cool kids had for toys back then.

Cool "affluent" kids, the vast majority of kids back then didn't own computers. It is only on Slashdot where everyone assumes everyone was one of those spoiled suburban kids with a WarGames or Ferris Bueller style set up like they had. You'll see things like:

"When I was 15, my Quantumlink/Compuserve/Source bill was around 300 a month"

Re:U.S. Navy? (1)

An ominous Cow art (320322) | about 9 months ago | (#45643689)

I called it "Commode-odor". I was an Atari fan, but most of my friends had C=64s.

A few years later, though, I got an Amiga.

Re:U.S. Navy? (4, Insightful)

Dogtanian (588974) | about 9 months ago | (#45645167)

I called it "Commode-odor". I was an Atari fan, but most of my friends had C=64s. A few years later, though, I got an Amiga.

Assuming you mean the 8-bit Atari 400 and 800 (and their compatible redesigns, the XL and XE series), I did pretty much the same thing - was an Atari fanboy, but ended up with an Amiga. When one knows a little more about the "Commodore" Amiga and "Atari", it all seems a bit silly.

The major irony is that the Amiga developers included a number of ex-Atari staff - most significantly Jay Miner - who had worked on the 400/800 and the VCS/2600 before that. It represented (some have argued) a continued thread of architectural design that the 400/800 had significantly improved upon from the VCS, and it had the same state-of-the-art custom-chipset approach as its predecessors. (Indeed, just as happened with the 400 and 800, the Amiga was originally meant to be a console before it evolved into a computer.)

Also worth noting that "Amiga" was originally an independent company and it was only later bought by Commodore (after some legal wrangling with Atari, who'd had some involvement with them).

Meanwhile, Jack Tramiel had left Commodore (after falling out with the management) and bought Atari Inc.'s computer and console division (i.e. the one that brought us the VCS and 400/800), which formed his new Atari Corp. The latter was a very different company from Atari Inc. (very obviously a much more shoestring operation). The Atari ST was designed by a different team after Tramiel had sacked most of the old Atari Inc. engineers, and it very much reflected the "new" Atari: affordable, but built from much more off-the-shelf parts.

Atari Corp continued selling the XL and XE (cost-reduced versions of the 400 and 800), but they didn't design them; they merely milked the profits from a design they'd inherited while they focused on *their* Atari ST.

So... which was really the "true" successor to the Atari 400 and 800? By any measure, it was the "Commodore" Amiga. Who cares who made it? I briefly owned an ST because I couldn't afford an Amiga, but I ended up selling it and buying the latter a year later.

One grain of salt (2, Informative)

Anonymous Coward | about 9 months ago | (#45642781)

From the Article: "Commodore C-128, the last mass production 8 bit computer and first home computer with 40 and 80 column displays"

The C-128 was in 1985; the Acorn BBC had 20, 40 & 80 column modes (and a teletext mode) in 1981.

Re:One grain of salt (1)

Dogtanian (588974) | about 9 months ago | (#45643919)

The C-128 was in 1985; the Acorn BBC had 20, 40 & 80 column modes (and a teletext mode) in 1981.

Yes, this is correct. Technically, I guess it could depend on how one interprets

[The] first home computer with 40 and 80 column displays, dual processors, three operating systems, 128k memory via MMU and one heck of a door stop.

Was the BBC truly a "home computer"? I'd say yes, though it overlapped with the educational market too; one could argue the point.

And perhaps it could have meant "(40 and 80 column displays) BOOLEAN-AND (dual processors) AND (three operating systems) AND (128k memory via MMU)".

That said, this is probably overanalysing. The BBC Micro wasn't that successful outside the UK, and the US tech industry (well, the US in general!) tends to assume that itself == the worldwide situation. So my suspicion is that Herd probably wasn't aware of it, or at least wasn't aware of it being a "home computer" (if it was).

Re:One grain of salt (2)

CronoCloud (590650) | about 9 months ago | (#45644485)

The thing with the C128 is that you can use both displays at the same time, meaning you can have a 40-column display hooked up AND an 80-column display. Most people used dual-mode monitors, but some software had you do certain things in 40-column mode and then displayed special output in 80 columns, or vice versa.
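(A minimal C128 BASIC 7.0 sketch of the idea, assuming both monitors are connected; GRAPHIC 0 selects the 40-column VIC screen, GRAPHIC 5 the 80-column VDC screen, and each keeps its contents while the other is active:

10 GRAPHIC 0: REM OUTPUT NOW GOES TO THE 40-COLUMN (VIC) SCREEN
20 PRINT "MAP VIEW ON THE 40 COLUMN DISPLAY"
30 GRAPHIC 5: REM OUTPUT NOW GOES TO THE 80-COLUMN (VDC) SCREEN
40 PRINT "TEXT DETAIL ON THE 80 COLUMN DISPLAY")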

Learn electronics repair from Bil himself (3, Informative)

Anonymous Coward | about 9 months ago | (#45642783)

Bil will be teaching a class at the Vintage Computer Festival East [vintage.org] next spring. He also lectured about the 128 and Commodore repair at the same event in 2012. Details are on c128.com.

Hrmmmmm (3, Interesting)

Anonymous Coward | about 9 months ago | (#45642841)

He is claiming a lot of "firsts" that I would swear were in my Apple ][e prior to Winter '85...

Re:Hrmmmmm (1)

50000BTU_barbecue (588132) | about 9 months ago | (#45643089)

Yup, as I read more of it, he's claiming some historically dubious things. But now you know how it feels to have history rewritten.

Re:Hrmmmmm (0)

Anonymous Coward | about 9 months ago | (#45643703)

I was thinking the same thing. Must have had the marketing department write the article.

Re:Hrmmmmm (1)

CronoCloud (590650) | about 9 months ago | (#45644321)

Which ones? While 80 columns and 128K were options on the //e, the //e didn't come with 128K as standard until 1987 with the Platinum //e. That was also the first //e with a numeric keypad by default. The 1571 also has a higher capacity than Apple's 5.25" drives.

So yes, the C128 did have some features as standard before the //e.

Re:Hrmmmmm (1)

NJRoadfan (1254248) | about 9 months ago | (#45644565)

The Apple IIc had 128K in 1984. Was the 1571 any faster than the 1541? Most Apple IIe systems were equipped with two 5.25" drives, so it wasn't that big a deal. Apple did offer the Unidisk 3.5" drive plus interface card for the Apple IIe, for 800K worth of storage. The first revision of the Apple IIc had built-in support for the drive as well.

So what? (2)

anotheryak (1823894) | about 9 months ago | (#45642985)

A lot of early personal computers have a similar story. Software is often written against breadboarded, or even nonexistent, hardware.

What is unique about the idea of custom silicon LSI chips for a 1980's PC?

The original Atari 800 (a design later copied by Commodore for the VIC-20 and Commodore-64 computers) had three custom chips (ANTIC, CTIA, POKEY) which made up the majority of the machine's circuitry when designed in 1978. And the OS and other early programs were written without the benefits of that completed hardware.

Only two LSI parts were off the shelf: the 6502 CPU and the 6520 PIA. Atari later replaced the CTIA with the GTIA (delayed by design issues) and the 6502 with a custom "Sally" variant that brought the formerly external tri-state logic on-chip, allowing ANTIC to shut off the CPU's access to RAM every other clock cycle so the RAM could be accessed by the ANTIC graphics chip.

That design was in active production for over ten years.

Even the lowly 2600 was basically a custom TIA chip that originally existed as discrete logic parts wire-wrapped together.

I fail to see how this story is either unique or great. If anything, it seems average.

It was worth having the 128... (2)

PhantomHarlock (189617) | about 9 months ago | (#45643275)

...to play Ultima V in dual SID mode.
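(For context, "dual SID" meant a second SID sound chip wired in alongside the stock one, which Ultima V could drive for extra voices. A hedged BASIC sketch of touching both chips, assuming the second SID is mapped at $D420, the address usually cited for that mod; both chips share the standard SID register layout:

10 S1=54272: S2=54304: REM STOCK SID AT $D400; SECOND SID ASSUMED AT $D420
20 FOR S=S1 TO S2 STEP 32: REM SAME POKES FOR EACH CHIP
30 POKE S+24,15: REM MASTER VOLUME
40 POKE S+5,9: POKE S+6,240: REM VOICE 1 ATTACK/DECAY AND SUSTAIN/RELEASE
50 POKE S+1,25: REM VOICE 1 PITCH, HIGH BYTE
60 POKE S+4,17: REM TRIANGLE WAVE, GATE ON
70 NEXT S)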

After several C=64s and the 128, I moved to the Amiga, which got me into the VFX business thanks to the Video Toaster and Lightwave.

Looking forward to reading this article. If it's good I'll stash a copy next to my "Rise and Fall of Commodore" book.

This quote is great (4, Insightful)

phantomfive (622387) | about 9 months ago | (#45643321)

You never know what marketing will do to you as an engineer.

a couple of weeks later the marketing department in a state of delusional denial put out a press release guaranteeing 100% compatibility with the C64. We debated asking them how they (the Marketing Department) were going to accomplish such a lofty goal but instead settled for getting down to work ourselves.

Re:This quote is great (1)

TWiTfan (2887093) | about 9 months ago | (#45643723)

It was actually a pretty important selling point of the C128. Keep in mind that I (and many others) had a collection of *hundreds* of C64 games before we bought the 128 (thank you, early DRM crackers). I probably wouldn't have bought one if all it could play was C128 software (what little there was of it).

Re:This quote is great (1)

NJRoadfan (1254248) | about 9 months ago | (#45644663)

One theory behind the lack of C128 software was that the machine could run C64 software, so developers didn't bother writing software that most people couldn't run. Why write C128 software when you can write C64 software that runs on both new and old machines? The Atari STe line had the same problem with games: very few took advantage of the improved graphics and digital sound available on the newer machine.

Re:This quote is great (2)

Dogtanian (588974) | about 9 months ago | (#45645315)

The Atari STe line had the same problem with games, very few took advantage of the improved graphics and digital sound available on the newer machine.

The STe was clearly designed to close the gap between the "vanilla" ST (and STFM) and the Amiga, which had come down in price by that point. It might have worked... had Atari directly replaced the STFM with the STe at the same price when it launched.

Problem was that, almost certainly due to Jack Tramiel's penny-pinching short-sightedness, they charged more for the STe and continued to sell it alongside the STFM. So anyone buying an ST because it was cheap would get the STFM, and anyone with a bit more to spend would have gone for the Amiga, whose superior power was already exploited by existing software.

Hence there was no reason to buy an STe, so no-one bought an STe, so no-one developed software to take advantage of it, so there was no reason to buy an STe.... vicious circle.

Had the STe become the base model, there would eventually have been enough in circulation to make it worth supporting. They didn't do that, and it flopped. The STe *did* eventually replace the STFM circa mid-1991, but too little, too late: the ST's terminal decline had already started by then.

My dad bought me this as my first computer. (5, Interesting)

JoshDM (741866) | about 9 months ago | (#45643323)

I was in 5th or 6th grade, and I woke up to a new computer in my room. The printer immediately broke, and I noticed the desk was half upside down. My dad had assembled it and the desk in the dark, during the night, while I was asleep (I'm a heavy sleeper). He was no technician, but I appreciated the effort. I traded C64 games with kids at school and stacks of 5.25" floppies via mail. Commodore games were fantastic; much better than NES. Junior year of high school, I finally had the initiative to figure out what my dad had done to the printer, and it turned out to be a simple problem that I fixed. I used 80-column mode to type and print essays for school for the next two years. Much praise to my old man. Granted, first year of college he helped me acquire a 386 with Windows 3.0, which I had for three years before building my own. I'll never forget my C=128. Thanks, dad!

Re:My dad bought me this as my first computer. (3, Informative)

wcrowe (94389) | about 9 months ago | (#45643717)

Just wanted to say that's a great story about your dad.

Too little too late (3, Interesting)

RedMage (136286) | about 9 months ago | (#45643333)

I was a big fan, and a game developer for the C64. Those were the days when a machine could be fully understood by an untrained person with a knack for programming. When the C128 came out, I was interested, especially in the 80-column screen and CP/M software compilers. But there were too many limits on the machine (no easy way to add a hard drive, no real OS, etc.) and it didn't feel like enough of an advancement over the C64. My grandfather did buy one, and I had some time with his, but that never really sparked much either. My next machine would be the Amiga, and as soon as that became somewhat affordable for a college student (the A500), I never looked back.

RM

Two months? Luxury. (1)

residents_parking (1026556) | about 9 months ago | (#45643721)

I've had hardware dumped on my desk the *day before* the proto was due to ship. I knocked up enough code in a week and a half; it worked great, and it survived virtually intact into production.

But here's the rub: as long as I keep on working miracles, the hardware will keep on getting later.

My First Machine (0)

Anonymous Coward | about 9 months ago | (#45644049)

It was mostly just a C64 for playing Jumpman. The only time I actually booted into C128 mode was to run this sweet word processor. I used to be able to slap some clip art into my homework and pull an A+ for creativity. This was going to be easy...

About time (4, Funny)

Tablizer (95088) | about 9 months ago | (#45644273)

Excellent, my wife's been on me to upgrade my C64
