This site has actually gone and researched this urban legend and found it to be false. Quite an interesting site if you've got an hour or so free to have a look around.
The PC standard is really crap in my opinion. It's had features and other parts grafted on over the years. ACPI and PNP are good examples of recent brain-dead design at work. ACPI needs a huge interpreter in the kernel just to handle initialising the hardware, and PNP is great if it works. Totally crap if it decides not to. Give me back the days of jumper blocks.
I'm not saying that the original hardware was any better, though. IBM didn't actually follow the specifications laid out by Intel and redefined the meaning of several of the interrupts. Don't get me started on the A20 line gate inside the keyboard controller...
On a related note, Bill Gates denies ever having said that 640k would be enough for anybody. He was interviewed a while back here, and he gives quite a convincing reason.
The mains voltage doesn't matter for the television standards. It's the frequency that is important. US electricity is at 60 Hz whilst Europe is mostly 50 Hz. The reason why the television standards used the mains frequency is that it is quite stable, as the power station actually has extra equipment to lock it to 50 or 60 hertz. If the picture field scan rate matches the mains frequency, then visible interference from the mains is also reduced.
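The arithmetic behind this is simple enough to sketch. A hypothetical Python snippet (not from the original post; the 1000/1001 factor is the adjustment colour NTSC later adopted, which broke the exact lock to 60 Hz):

```python
# Interlaced TV draws two fields per frame, so the *field* rate is what
# was locked (at least originally) to the local mains frequency.

def frame_rate(mains_hz, fields_per_frame=2):
    """Field rate tracks mains; frame rate is field rate / fields per frame."""
    return mains_hz / fields_per_frame

# Europe: 50 Hz mains -> 50 fields/s -> 25 frames/s (PAL/SECAM)
print(frame_rate(50))        # 25.0

# US: 60 Hz mains -> 60 fields/s -> 30 frames/s (original black-and-white NTSC)
print(frame_rate(60))        # 30.0

# Colour NTSC shifted the field rate slightly to 60 * 1000/1001,
# so modern "60 Hz" video is really about 59.94 fields per second.
print(60 * 1000 / 1001)      # roughly 59.94
```

So a "50 Hz country" gets 25 frames per second and a "60 Hz country" gets 30 (well, 29.97 after the colour tweak), which is exactly why the two standards ended up with different frame rates.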
- Trevor