intel CPU history 

intel has made a fuck ton of CPUs. do you know how many? lots of them. every new process requires new factories and all kinds of waste. but did you know there's a DARK secret? ok maybe not DARK or a secret, but something really worth digging into. so let's look at the history here

intel CPU history 

1971 intel releases their first microprocessor, the 4004. it's a 4-bit CPU max clocked around 750 KILOHERTZ. it has 2250 transistors and 16 pins. you'd use it with ROM, DRAM and shift registers. this bad boi could do some math shit, hit up 16 registers (though each register only held a single 4-bit digit) and address 640 BYTES of RAM. it ran at 15 volts for some reason and used 5v logic levels. this shit was fucking DOPE

intel CPU history 

1972 calls. intel released the 8008, a fucking BYTE-oriented CPU. it was a bit slower but it was a step up: instead of calculators this was intended to drive a CRT terminal. but instead of 16 registers it had 7, with the rest of the scratchpad used for the program counter and call stack. yes, call stacks went in the registers. the 4004 had a 3 level call stack, the 8008 had 7. it had 18 pins, 8 of them being the data bus, and supported 14-bit addresses with 48 instructions

intel CPU history 

separate chips would be used to actually handle things like mapping memory, bank switching, IO registers and stuff. this is the kind of architecture you'd find in a retrocomputer, where many discrete chips handled separate tasks, while modern 'system on a chip' designs include these all on one die. this saves cost and power, and also makes for faster computers: electrical signals are limited by the speed of light, so the closer together the modules are, the faster they can talk

intel CPU history 

what we'd call modern x86 'CPUs' are actually systems on chips like that but with a TON of shit packed in, including what used to be the northbridge/southbridge chips. another more pressing reason for this kind of integration is pin count. each chip needs power pins, data pins, clock pins, etc. this adds up fast, to the point we have 'system on modules' or 'system in packages' which are just chips on a single PCB all connected, with the useful pins on a connector

intel CPU history 

anyway it's now 1974. intel released the 4040, a straight upgrade from the 4004. you get 8 more registers, INTERRUPTS and a larger subroutine stack. interrupts are a key feature in CPUs- they let devices make the CPU jump to certain sections of code, 'interrupt handlers'. this is a blessing and a curse because on one hand interrupts are necessary but also hard to handle. let me explain *pulls up whiteboard*

intel CPU history 

say you have a program that needs to check email. it can check it when it has free time, or get a notification (an interrupt) and decide to do something about it. the main problem here is that when the CPU has an interrupt, it jumps to a different code address and that's 'all'. the programmer has to write code to save all the current task's state and be very careful about which resources it touches. and then what if you get an interrupt during that?
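here's the save-everything-then-restore dance as a toy sketch in python (standing in for what would really be assembly; the register names are made up for illustration):

```python
# toy model of what an interrupt handler has to do: save every bit of CPU
# state it will clobber, do its work, then restore everything before
# returning -- or the interrupted code resumes with corrupted state.

cpu = {"a": 1, "b": 2, "pc": 0x100}  # state of the interrupted program

def interrupt_handler(cpu):
    saved = dict(cpu)        # 'push' all state (real code pushes to a stack)
    cpu["a"] = 0xFF          # now the handler is free to clobber registers
    cpu["pc"] = 0x8000       # ... jump around, talk to the device, etc ...
    cpu.update(saved)        # 'pop' everything back before returning

interrupt_handler(cpu)
assert cpu == {"a": 1, "b": 2, "pc": 0x100}  # interrupted code none the wiser
```

and yeah, an interrupt arriving in the middle of *that* is exactly why real handlers disable further interrupts while saving state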

intel CPU history 

you effectively have a multitasking CPU that will switch tasks, and as such you need to program as if your code is running in a somewhat distributed environment. you have to lock resources that are in use and deal with things like critical sections. this is the price of being able to be responsive to things like mouse moves and hard drives being ready with data
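the critical-section problem looks like this in python, using threads as a stand-in for 'main code vs interrupt handler' (everything here is invented for illustration):

```python
import threading

# two 'tasks' bump a shared counter. the read-modify-write of counter += 1
# is not atomic, so without the lock, concurrent updates can be lost.
counter = 0
lock = threading.Lock()

def bump(n):
    global counter
    for _ in range(n):
        with lock:            # only one task inside the critical section
            counter += 1

threads = [threading.Thread(target=bump, args=(10000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter == 20000       # with the lock held, no updates are lost
```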

intel CPU history 

ANYWAY it's still 1974 and intel releases the legendary 8080 CPU. it's an 8-bit CPU with a 16-bit address bus. 40 pins, 16 for addressing, 8 for data. this thing did away with keeping the call stack in registers (it lives in main memory now) and instead left you with a painful 7 8-bit general purpose registers, 6 of which can be used in pairs to make 16-bit values and addresses. it also does away with requiring chips for muxing peripherals, instead it supports IO 'ports' separate from the address space
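the 'separate IO space' idea, as a toy python model (the sizes match the 8080, everything else is invented):

```python
# the 8080's IN/OUT instructions talk to a 256-entry port space that is
# completely separate from the 64k memory space. same numbers, different
# address spaces.

memory = bytearray(65536)   # reached by memory load/store instructions
ports = bytearray(256)      # reached only by IN/OUT instructions

def out(port, value):
    ports[port] = value      # does NOT touch memory[port]

def inp(port):
    return ports[port]

out(0x10, 0xAB)
assert inp(0x10) == 0xAB
assert memory[0x10] == 0     # same number, different space
```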

intel CPU history 

running at 2mhz instead of the 8008's 500khz, this thing is what armchair historians like me collectively call 'hot shit' and luddites call 'ground zero'. this chip went on to be used in the first microcomputer: the Altair 8800. then it (or a clone) was used for various 8-bit home microcomputers during the 70s and 80s. this shit was what they call 'cash', to the point intel decided compatibility with the 8080 was important from then on

intel CPU history 

where do you go from here? by releasing the 8085 in 1976. what's the difference? it runs off a single 5v supply, instead of the 8080's three rails (+12v, +5v and -5v). you know what else happened that year? zilog released the z80, which is what most home microcomputers used. this shit had more useful instructions, a secondary set of registers just for running interrupts and extra good stuff. this CPU is still produced to this day and used in calculators and shit


intel CPU history 

1978: with zilog effectively taking the 8080 and saying 'this is mine now', intel had to think fast and improve. naturally intel added more bits and made the 8086! it's 16-bit! it was intended as a stop-gap until they could work out a better architecture. narrator: they didn't

so you have 8 16-bit CPU registers, but 4 of them (AX, BX, CX, DX) also split into pairs of 8-bit registers. it also borrowed the z80's idea of having index registers for addresses.
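the register aliasing is just bit-packing, sketched here in python (not real 8086 code):

```python
# AX is AH and AL glued together: AX = AH * 256 + AL. writing one half
# changes the whole register; they're the same bits, two views.

def ax(ah, al):
    return (ah << 8) | al

assert ax(0x12, 0x34) == 0x1234
assert ax(0x12, 0x00) == 0x1200  # changing AL changed AX too
```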

intel CPU history 

the 8086 was used by the IBM PC to give us ... PC shit i guess? DOS? that's super cool. but i have a slight bone to pick: memory segmentation. see, 16-bit memory addresses only go up to 64k. not great right? so x86 used two registers: a segment and an offset, to address up to 1 megabyte of memory. you're probably wondering how that works given 32-bit systems can access 4GB of memory. well, intel made a tradeoff

intel CPU history 

since 1M ought to last forever and the 8086 was a stop-gap, intel decided to compute addresses as segment × 16 + offset (the segment shifted left 4 bits, then added to the offset). which meant segments and offsets overlapped, so multiple combinations of segment:offset could hit the same final address. think of it as having a 64k window of addressable memory you could slide through 1M of RAM in 16-byte steps
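the math is simple enough to sketch in python:

```python
# 8086 real mode address calculation: physical = segment * 16 + offset,
# truncated to 20 bits (the 8086 only had 20 address pins).

def phys(segment, offset):
    return ((segment << 4) + offset) & 0xFFFFF

# different segment:offset pairs alias the same physical byte
assert phys(0x1000, 0x0000) == 0x10000
assert phys(0x0FFF, 0x0010) == 0x10000
# and 0xFFFF:0xFFFF wraps around past the top of the 1M space
assert phys(0xFFFF, 0xFFFF) == 0x0FFEF
```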

intel CPU history 

is this a smart idea? i would think so: it allows DOS to pack segments together next to each other if a segment won't use a full 64k of memory. so if a program uses 8k, the rest of the 64k segment wouldn't go to waste on a 256k system

intel CPU history 

anyway, the 8088 comes out in 1979. this is actually what the IBM PC uses, but it only has 8 data lines. thanks i guess. 1982 they release the 80186 AND the 80286? straight upgrades. kinda. the 80286 is fucking bizarre, man. intel didn't expect consumers to use it, and it was designed for multitasking systems. to do this it provided a really cool feature: protected mode. but they kind of fucked it up? let me explain

intel CPU history 

protected mode separates actual physical memory addresses from the virtual addresses programs see. on the 286 this worked through segment descriptor tables, which is already great because the OS places and protects segments for you. the 386's paging later took it further: you don't even need segments, because every 4k 'page' of virtual memory is backed by 4k of physical memory at a completely different address. this lets the OS be sneaky and load/save RAM to disk on demand without programs knowing, and hide programs' virtual memory addresses from each other for security
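paging translation, as a toy python model (the page table contents are invented; real page tables live in memory and are walked by the MMU):

```python
PAGE_SIZE = 4096

# toy page table mapping virtual page numbers to physical page numbers.
# None means 'swapped out to disk'.
page_table = {0: 7, 1: 3, 2: None}

def translate(vaddr):
    vpn, offset = divmod(vaddr, PAGE_SIZE)
    ppn = page_table.get(vpn)
    if ppn is None:
        # this is where the OS would sneakily load the page back from disk
        raise RuntimeError("page fault")
    return ppn * PAGE_SIZE + offset

assert translate(0x0123) == 7 * PAGE_SIZE + 0x123  # vpage 0 lives at phys page 7
assert translate(PAGE_SIZE + 5) == 3 * PAGE_SIZE + 5
```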

intel CPU history 

the 16-bit 'real mode' and 16-bit 'protected mode' both acted differently (as expected) but intel made a critical and really REALLY silly mistake: they made it impossible to switch from protected mode BACK to real mode (short of resetting the whole chip, which is exactly the hack the IBM PC AT ended up using). so you couldn't load a super cool program that uses protected mode then jump back to DOS afterwards. absolute trash. what the fuck? anyway

intel CPU history 

the year is 1985. intel is sweating pretty bad at this point because they can't seem to actually make a CPU worth a damn, unless... intel release the 80386, a 32-bit version of the 8086 architecture. it includes protected mode, and a new virtual 8086 mode so you can run DOS programs from protected mode. you have ... 4 registers? uh... yes, 4 general registers. each can be accessed as a 16-bit register, which itself splits into two 8-bit registers. you also have some index and pointer registers. uh

intel CPU history 

In 1989 they release the 80486. it go faster. then 1993 they release the pentium, or P5, or 80586. this is where scary CPU optimization happened with superscalar architectures, speculative execution, FAST SHIT. fucking SCORE. 1995 they release the P6, or i686 architecture. things are coming up intel, even though AMD is competing hard. how could it go wrong?

intel CPU history 

2000 intel released the NetBurst architecture that went FAST MAYBE SOMETIMES. later revisions of it also implemented AMD's x86-64 specification, which is probably the sanest x86 architecture. eventually though intel realized this design was so trash they junked it and went back to the P6 architecture, branching that off into the Core stuff. OK, SO. THAT'S NOT ACTUALLY WHAT I WAS HERE TO TALK ABOUT TODAY

intel CPU ALTERNATE TIMELINE 

from the past 1 hour of tooting you might have gotten the impression that intel doesn't know how to design CPUs and just ups the clock speed or core count or whatever number there is. that's partially true. BUT. did you know that during the 80s intel tried to kill the 8086? ON THREE SEPARATE OCCASIONS?

intel CPU ALTERNATE TIMELINE: i960 

intel released three non-x86 CPUs during the 80s, with development starting during the late 70s. starting with the most successful: the i960, released in 1988. this was a really strong selling RISC CPU and apparently is still used today. it supported 32-bit tagged memory for hardware memory protection, superscalar execution, and a flat memory space. turns out nobody really wanted it as a general purpose CPU, but the team behind it went on to work on the P6 architecture

intel CPU ALTERNATE TIMELINE: i860 

in 1989 there was the i860 cpu. i have no idea why it's a lower number. this one was fairly well known but not very successful. it was a VLIW CPU, had shit like SIMD, was kinda 64-bit. but the design relied on the compiler being smart. intel somehow didn't learn their lesson, repeating it later with the Itanium architecture. context switching could cause a 62 to 2000 cycle pipeline stall, which wasn't great.

intel iapx 432 

so now we get to the big boy. the design intel was working on while the 8086 served as a stop-gap? the iAPX 432. this CPU is truly, TRULY bizarre. i've talked to you so far about buses, and registers, but the iAPX 432 instead had OBJECTS, GARBAGE COLLECTION, MULTITASKING, FLOATING POINT, CAPABILITY-BASED ADDRESSING. in hardware.

intel iapx 432 MYSTERY 

it's a truly alien CPU. why did this fail? because it was slow and unoptimized and the 8086 took off. so here comes the absolutely truly bizarre part of this: it's gone. the only things left of it are some reference manuals, one guy's website listing various products that are known to exist, and a scientific paper criticizing the performance. no emulators, no software, no compilers. the guy who did the website (brouhaha) has some boards, but none are working and few have chips

intel iapx 432 MYSTERY 

there's been a lot of interesting CPU architectures out there, FUCK TONS. but none have been like the iAPX 432, which attempted to build a hardware-based bubble of objects. don't get me wrong, it's a bad idea to do that. don't do that. but how is it that intel managed to tape out actual boards, chips, reference manuals, software packages, and have all of that disappear?

intel iapx 432 MYSTERY 

unless everyone physically destroyed their boards, surely there's a working one out there? unless intel destroyed all their designs, software, manuals, etc, surely there's an emulator or simulator out there? the chip was produced from 1981 to 1985. there were factories to make this thing. there were printers for manuals, engineers for design, testing jigs, debug tools. where'd it all go?

intel iapx 432 MYSTERY 

was it aliens? probably not. the most likely scenario i can think of is that it didn't get past limited runs of leased dev kits that drew little interest, and intel, to protect its IP, recalled what wasn't wanted to avoid leaks. for something to have made it out of intel's control it would have had to be lost or sold to consumers in some way, such as through finished products, which weren't viable

intel iapx 432 MYSTERY 

even if that's the case and the men in black (intel lawyers?) took everything back, where did those go? to the trash? what about software developed by companies for the dev kits? surely that's archived somewhere.

though i suppose it's been 35-40 years. nobody has shown interest in this, there's not even any youtube videos on it. it might have actually been considered trash that wasn't useful

intel CPU ALTERNATE TIMELINE: i960 

@jookia everyone loves the iAPX 432

intel CPU ALTERNATE TIMELINE: i960 

@a_breakin_glass sssssssshhh SPOILERS

re: intel CPU ALTERNATE TIMELINE: i960 

@jookia @a_breakin_glass Especially since the iAPX432 project was launched before the 80286. That's where the 286/386 "descriptor tables" all come from. ;)

intel iapx 432 MYSTERY 

@jookia Apparently it was also optimized later in its life for Ada, so the DoD might have been part of the initial runs with it, and that might explain why it disappeared. That and Intel not wanting to admit failure.

intel iapx 432 MYSTERY 

@craigmaloney yeah, that's one idea that makes sense. it might have been that the DoD were their only customer, or built solutions that still exist today and required buying up all the existing chips in advance or something


re: intel iapx 432 MYSTERY 

@jookia

This is such an amazing thread.


re: intel iapx 432 MYSTERY 

@jookia

don’t get me wrong, it’s a bad idea to do that. don’t do that.

OMG you are correct. That sounds super limiting

re: intel iapx 432 MYSTERY 

@jookia The Amiga architecture did do a lot of stuff in hardware which led to two things, one good one bad.

Super longevity and relevance long past its age
Really difficult to upgrade

I guess with the solarpunk we-wanna-reuse mindset, those things are actually both good♥

re: intel iapx 432 MYSTERY 

@Sandra yeah basically all my computers are rescue/obsolete computers so there's a lot of time to reflect on architectural mistakes. luckily though the amigas have expansion ports right

re: intel iapx 432 MYSTERY 

@Sandra yeah part of the reason the iapx 432 was dead on arrival is because it implemented high level concepts in microcode, and it did them slowly. and because it's hardware, fixing it is very expensive. between VLIW ISAs where programs must know the hardware and the 432's ISA where the hardware is invisible, something like RISC or CISC as an ABI *seems* to work well enough

intel iapx 432 

@jookia whoa that's fucking cool! but way too much to try to pull off at that point in time.

wonder if it would technically work better now given that transistor counts have gotten far larger since then

kind of reminds me of the lisp machine architecture

intel iapx 432 

@KitRedgrave yeah it's a bit similar to that except lisp machines compiled to like stack machines that were the actual hardware. this is way higher level.

it'd be cool to see how it would work on systems that have a clock speed limit where having higher level instructions would mean better optimization, like FPGAs. though you still have a transistor limit there :\

intel CPU history 

@jookia this was enhanced mode? (Or enchanted mode as I seem to remember calling it)

intel CPU history 

@marnanel yeah i think so. enchanted mode is pretty good. unreal mode is when the 386 somehow runs 16-bit code with a flat address space, also called voodoo mode

intel CPU history 

@jookia they used the 8088 in the first PC, but the differences are minor to your overall point.
