EvanAnderson 13 hours ago

Ahh, alternative futures...

If the FCC hadn't been so strict I think there's a good chance we'd be using computers with a lineage going back to Atari versus IBM today.

Commodore ate Atari's lunch with the C64 and pricing, but Atari could have launched the 400/800 at lower price points with more lax emission standards. They would have had lower peripheral price points, too, since the SIO bus and "smart peripheral" design was also an emissions strategy.

On the home computer front the Atari 8-bits beat the pants off the PET for graphics and sound capabilities. More success in the late 70s might have enabled Atari to build down to a price point that would have prevented the C64 from even happening.

On the business side Atari R&D had interesting stuff going on (a Unix workstation, Transputer machines). Alan Kay even worked there! They were thinking about business computing. If the 8-bits had had more success I think more interesting future products could have been brought to market on the revenue they generated.

  • squeedles 13 hours ago

    I happened to buy an Atari 800 at the peak of this and was amazed at the metal castings that surrounded everything. That little 6502 could survive small arms fire! That shielding was far beyond anything else at the time.

    And you make a good point about the SIO bus - this was when every other machine had unshielded ribbon cables everywhere. Their devotion to daisy-chained serial really crippled them in terms of speed, and when USB finally arrived, I initially scorned it due to the prejudice formed by my experience with the Atari peripherals! It turns out they were on the right track all along!

    • EvanAnderson 11 hours ago

      You may not be aware, but Joe DeCuir, who worked for Atari on the VCS and 8-bit computers, also worked on the development of USB. Some of his Atari engineering notes helped fend off a patent troll who tried to claim USB was infringing. It's a neat story. There are a ton of interviews with him about it. He gave a nice presentation at VCF a few years ago where it was mentioned, too: https://youtube.com/watch?v=dlVpu_QSHyw

  • fidotron 13 hours ago

    > If the FCC hadn't been so strict I think there's a good chance we'd be using computers with a lineage going back to Atari versus IBM today.

    And/or many of the other manufacturers of that era. I have encountered execs from that era who still believe the whole thing was some sort of shrouded protectionism.

MountDoom 12 hours ago

The regulatory landscape here is pretty funny. In all likelihood, the worst RFI offenders in your home are LED lights, followed by major appliances. Both are regulated less strictly than something like a computer mouse. For lightbulbs, I think the manufacturers just self-certify.

I guess there are two ways to look at it. Either the regulation was wildly successful, so the problems persist only in the less-regulated spaces. Or we spend a lot of effort chasing the wrong problem.

  • Flamingoat 12 hours ago

    If I turn my kettle or microwave on in my kitchen, it will kill any Bluetooth or wifi signal. My microwave is getting on for 15 years old, so maybe newer ones are better, but the kettle was bought last year.

    • elevation 11 hours ago

      If you cannot change the microwave, consider trying a different wifi channel. I once had a 2012 Panasonic microwave that killed 802.11g channels 7 and 14 but not channel 1.

      • FuriouslyAdrift 10 hours ago

        Microwave ovens hover in and around the 2.4 GHz range, just like 802.11b/g. Switching to 5 or 6 GHz (802.11a/n/ac/ax, etc.) can help immensely.
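
        To make this concrete: the 2.4 GHz channel centers follow a simple formula (2407 + 5n MHz, with channel 14 a special case at 2484 MHz), so you can see which channels sit on top of a magnetron. A rough Python sketch, assuming a nominal 2450 MHz oven frequency (real magnetrons drift and splatter well beyond a single spot frequency):

            # 2.4 GHz Wi-Fi channel centers: 2407 + 5 * n MHz (channel 14
            # is a special case at 2484 MHz). 802.11b/g channels are about
            # 22 MHz wide, i.e. +/- 11 MHz around the center.
            MAGNETRON_MHZ = 2450  # nominal consumer oven frequency

            for ch in list(range(1, 14)) + [14]:
                center = 2484 if ch == 14 else 2407 + 5 * ch
                low, high = center - 11, center + 11
                near = low <= MAGNETRON_MHZ <= high
                print(f"ch {ch:2d}: {center} MHz ({low}-{high} MHz)"
                      + ("  <- near magnetron" if near else ""))

        By this rough model, channels 7-10 land right on top of 2450 MHz, which fits the "killed channel 7" experience above.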

      • Flamingoat 8 hours ago

        I am not too bothered about it. I only use it every other day for about 3 minutes to heat up some porridge. I keep on meaning to buy a new microwave, but I bought this one in ASDA 15 years ago for £30 and it just keeps on working.

schoen 13 hours ago

I remember being confused as a kid about the "this device must accept any interference received, including interference that may cause undesired operation" labels.

I kept reading "must accept" as a technical requirement, somehow like "must not be shielded against" or "must not use technical means to protect against", rather than what I now think is the intended legal sense: "does not have any legal recourse against".

It's weird that they phrased it in terms of how the device itself must "accept" the interference, rather than the owner accepting it.

  • bitwize 10 hours ago

    I always thought it meant "must continue to function despite" such interference, i.e., it must not blow up or break permanently in the presence of such interference.

alwa 11 hours ago

I was gratified by the last little tidbit: a nod to the Ohio “tinkerer” whose 2019 experiment in home automation interfered with neighbors’ 315 MHz-band devices to the point that the power company shut off the whole block in an attempt to isolate the interference.

https://www.nytimes.com/2019/05/04/us/key-fobs-north-olmsted...

(https://archive.is/aTWZ2)

Apparently the regulations work well enough to provoke an official response when garage door openers stop working over the area of a few houses… a level of reliability I’d long taken for granted

afandian 13 hours ago

I'd heard, probably in a Centre for Computing History [0] interview or similar, that these regulations contributed to the BBC Micro never getting a good foothold in the USA and losing to Apple.

It had an amazing selection of ports, all unshielded and designed for flat ribbon cables. But that wouldn't fly in the USA.

[0] https://www.youtube.com/@ComputinghistoryOrgUk1

anjel 13 hours ago

Back in the late 70s, we used to put an AM radio anywhere on the desk next to the TRS-80, and the revealed cacophony was endlessly fascinating. As I recall, radio tuning was unnecessary.

  • whartung 12 hours ago

    We used to put them next to our TI-58/59 calculators. You could use the radio for sound effects in games.

    • bitwize 10 hours ago

      The first application developed for the MITS Altair, besides watching the blinkenlights blink, was playing music on a nearby AM radio.
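
      For anyone curious how that trick worked: the CPU's broadband RF hash is effectively keyed on and off by whatever code is running, so alternating a "noisy" tight loop with an idle stretch at an audio rate lets a nearby AM radio demodulate the envelope as a tone. A conceptual Python sketch (the original used cycle-counted 8080 loops, the note frequencies here are just an illustrative run up a C major scale, and you're unlikely to hear much from a modern machine):

          import time

          # Key the CPU's RF hash on and off at an audio rate; a nearby
          # AM radio demodulates the envelope as a tone.
          def play_tone(freq_hz, duration_s):
              half = 1.0 / (2 * freq_hz)
              end = time.monotonic() + duration_s
              while time.monotonic() < end:
                  t = time.monotonic() + half
                  while time.monotonic() < t:  # "noisy" half: busy work
                      _ = 12345 * 6789
                  time.sleep(half)             # "quiet" half: mostly idle

          for f in (262, 294, 330, 349, 392):  # C D E F G
              play_tone(f, 0.3)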

whartung 12 hours ago

Anecdotes from the age.

When I would fire up my KIM-1, the TV would turn to snow.

There was a toy called the "Big Trak", a programmable ATV toy. If you ran one underneath a desk with a TRS-80 on it, the computer would crash.

The TRS-80 Model I was notorious for this, as you connected the computer to the expansion interface with a bare 40(ish?)-pin ribbon connector. It was a beautiful broadcast antenna for computer signals.

The FCC was an impetus for the Model III.

jcalvinowens 11 hours ago

The worst RFI I encounter in my day-to-day life is from Ethernet switches... I really wish the FCC would stop allowing the use of 125.0 MHz on airband. My local airport (KPAO) uses that as its ground frequency, and it's every bit as terrible as you'd expect :D
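
The collision is no coincidence: 100BASE-TX and 1000BASE-T both signal at 125 Mbaud, so the fundamental sits exactly on 125.0 MHz. A quick Python sketch of which harmonics of common Ethernet PHY clocks land in the 118-137 MHz VHF airband (the clock list is illustrative, not exhaustive):

    # Which harmonics of common Ethernet PHY clocks land inside the
    # 118-137 MHz VHF air band?
    AIRBAND = (118.0, 137.0)  # MHz
    CLOCKS = {"25 MHz reference clock": 25.0, "125 Mbaud line rate": 125.0}

    for name, f in CLOCKS.items():
        hits = [n * f for n in range(1, 8) if AIRBAND[0] <= n * f <= AIRBAND[1]]
        print(f"{name}: {hits or 'none'}")

Both land exactly on 125.0 MHz, squarely in the middle of the band.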

mmastrac 13 hours ago

For those of you who watch Adrian Black on YouTube, you might remember him angrily tearing out RF shielding from the older computers.

On the other hand, I had been struggling to get my IP KVM at home working, and it turned out that the cause of its failure was some cheap PoE extractors that spewed noise across the spectrum, especially interfering with VGA sync.

Modern equipment, assuming you aren't buying bargain-basement AliExpress junk (which I do, from time to time), is surprisingly good at RF rejection.

And, amusingly, this just popped up on Twitter: https://x.com/hakluke/status/1980479234159398989

mrandish 10 hours ago

Having lived through the early 8-bit home computer era as a teenaged user, and then worked in mid-80s tech startups making hardware peripherals for 16-bit computers (although not as a hardware designer), here's my perspective. Early digital devices definitely could occasionally cause interference with TVs and radios in their immediate area, so there needed to be some regulation to address the issue.

However, once designers were aware of the potential problems, it wasn't too hard or even very expensive to design hardware that avoided the most serious ones. Properly grounding components and a little bit of light shielding here and there would generally suffice to ensure most devices wouldn't cause noticeable issues more than two walls and 30 feet away. I think by the 90s the vast majority of hardware designers knew how to mitigate these issues, while the evolution of consumer device speeds and designs reduced the risks of actual interference on both the 'interferor' and 'interferee' sides.

Unfortunately, the FCC's regulatory testing requirements didn't similarly evolve. Hardware designers I worked with described an opaque process of submitting a product for FCC testing only to receive a "Pass/Fail" with no transparency into how it was tested. Sometimes the exact same physical product could be resubmitted a month later with zero changes and pass. This made things unpredictable and slow, which could be a lethal combination for small hardware startups. So there emerged a sub-industry of "independent RF testing labs" you could pay to test your device with their pricey gear and claimed expertise, tell you why it failed, and let you make a change right there and retest until you passed. This made things more predictable, but it could cost upwards of $10K (in 90s dollars), which was a serious hardship for garage startups. I was told a lot of the minor hardware changes made during such interactive testing probably did nothing to decrease actual interference in the real world and only served to pass the test.

Then came the era of "weaponizing" FCC certification. Small startups could avoid the costs and delay of FCC testing by filing their product as a "Class A" device (meant only for use in industrial/scientific environments) instead of as a "Class B" (consumer) device. The devices still had to not interfere, but their makers could self-certify their internal tests without going through FCC testing. When a new hardware startup threatened a large, established company's product with a cheaper, better product shipped as "Class A", BigCo would report them for interfering, or simply for the device being used in consumer environments, despite it very likely not interfering with anything.

This created a lot of problems for such startups: if their cool new product ended up even once in an arguably "retail distribution channel", they could get hit with big fines, all without ever causing any actual interference, and even if the device could have passed FCC testing and been certified as Class B. It got especially ridiculous when a lot of cheaper products were simply generic designs, like a modem using the standard Rockwell chipset and reference design. These were often made on the same production line, and even used the same circuit board in a different case, as other products which all passed FCC testing. But if you didn't have your official "FCC Cert", you could get busted.

I left the hardware space in the early 2000s so I never heard if these regs were ever modernized, but it sure seemed like they were in need of it.

  • bji9jhff 7 hours ago

    I like how what you described matches how I imagine the OSA will be used.

jaydenmilne 13 hours ago

Tangentially related: I once bought a no-name Amazon HDMI switch that would cause FM interference, but only when the screen was mostly white: https://youtu.be/n2DPLEvwO-k

Another reason to use dark mode, I guess.

  • mmastrac 13 hours ago

    What's interesting is that HDMI is supposed to have a scrambling system that prevents any repeating pattern from causing EMI. I wonder if there was an unshielded, unscrambled raw data path somewhere in the switch.
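
    For illustration, the idea behind scrambling is just XORing the pixel data with a pseudo-random bit stream, so a constant input (like an all-white screen) no longer produces a strong repeating signal on the wire. A toy Python sketch; the LFSR taps below are a textbook maximal-length choice picked for illustration, not HDMI's actual polynomial:

        # XORing data with an LFSR stream spreads a constant input
        # across the spectrum instead of radiating one strong tone.
        def lfsr_stream(seed=0xACE1):
            state = seed
            while True:
                yield state & 1
                # taps 16, 14, 13, 11: a standard maximal-length LFSR
                fb = (state ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
                state = (state >> 1) | (fb << 15)

        stream = lfsr_stream()
        all_white = [1] * 32                       # repetitive input
        scrambled = [b ^ next(stream) for b in all_white]
        print(scrambled)  # no long runs -> energy spread across the band

    If the switch does expose an unscrambled stretch of the link somewhere, as speculated above, a mostly-white screen would become exactly the kind of strong periodic signal that radiates well.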

superkuh 14 hours ago

I wish the FCC would go back to doing their job re: radio interference and unintentional emissions. The entire concept of a computer case 'window' is about as anti-social and rude as littering in a public park. But it's so pervasive and socially accepted that when I bring it up, the only responses I receive are quite hostile. Even worse are the computer builds that don't have any metal case at all. They're broadcasting that <30 MHz interference from their high-speed switching electronics around the world, ruining a shared medium for everyone, and they have no idea.
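
For a sense of scale, the standard EMC envelope for a trapezoidal switching waveform shows why fast edges dump energy well up into the HF bands: harmonics roll off slowly until about 1/(pi * t_rise), and modern edges are a few nanoseconds. A back-of-the-envelope Python sketch with purely illustrative numbers:

    import math

    A = 3.3        # logic swing in volts (illustrative)
    f0 = 1e6       # 1 MHz switching fundamental (illustrative)
    duty = 0.5
    t_rise = 5e-9  # 5 ns edges (illustrative)

    def envelope(n):
        # standard trapezoid Fourier envelope:
        # 2*A*d * |sinc(n*pi*d)| * |sinc(n*pi*f0*t_rise)|
        s1 = abs(math.sin(math.pi * n * duty)) / (math.pi * n * duty)
        x = math.pi * n * f0 * t_rise
        s2 = abs(math.sin(x)) / x
        return 2 * A * duty * s1 * s2

    # odd harmonics only (even ones vanish at 50% duty)
    for n in (1, 5, 15, 25):
        print(f"{n * f0 / 1e6:5.1f} MHz: {20 * math.log10(envelope(n)):6.1f} dBV")

Even the 25th harmonic of a 1 MHz switcher comes out only ~28 dB down from the fundamental, and without a metal case there's nothing to keep it from radiating.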

And really, it's not the consumer's place to be aware of these things. It's the regulators'. And they've dropped the ball.

  • transpute 13 hours ago

    > computer builds that don't have any metal case at all

    Would be nice to have more metal cases for SBCs, like the one on the R4S, https://www.androidpimp.com/embedded/nanopi-r4s-review

    KKSB makes metal cases for some SBCs, https://kksb-cases.com

    • superkuh 12 hours ago

      I run a handful of ODROID SBCs constantly, with 3x RTL-SDR USB radio receivers each. For their cases I used the cardboard boxes they shipped in, covered in overlapping 2" copper tape with conductive adhesive. And lots of type/mix 63 ferrites on the cords.

  • dontlaugh 10 hours ago

    I also don’t get the point. Why would you want to look inside your PC or even have lots of coloured lights in it to look at?

    • sokoloff 8 hours ago

      I think the insides of a PC look more interesting than a flat piece of sheet metal. Adding addressable LEDs makes that even more true.

      Visually, I don't care particularly much one way or the other, but on a 6-12 layer PCB there's plenty of opportunity to closely couple and shield fast-changing signals, so I wouldn't expect a surrounding Faraday cage to be needed (and certainly I've never noticed an issue from the computers near my RF receivers).

      • dontlaugh 8 hours ago

        Neither is very visible under your desk, except for possibly the front of the case.

        I went out of my way to avoid extra lights, which sadly wasn’t possible for the GPU. I had to figure out how to turn those off in software.

  • oakwhiz 12 hours ago

    Metallized windows would be nice, at least.

    • transpute 12 hours ago

      One workaround is conductive mesh (metal/fabric) over the window interior, bridged to bare metal of the case interior.