Hardware, electricity and computing

Nice thread.

[...] fundamentals of how computers work at the material level and how that relates to drivers, the kernel and software.
Depending on what you actually want to learn, that is quite a span of (domain) knowledge.

[...] However, trying to learn a CPU from logic gates up is very challenging.
I think so too. Consider also that hardware projects with “lots of wires” can get you bogged down in electronics/timing issues that have little to do with the logical design. Precisely stepped projects like the great ones from Ben Eater are a good basis to steer you clear of those (mostly). With an Arduino kit you’ll have far fewer wires.

Ben Eater shows you how a small CPU and peripheral components interact, and how they can be programmed in a simple way (assembler). For a nice impression and Q&A of a breadboard CPU, see Tech Talk on Building a Custom TTL CPU.

If you want less or no actual hardware and want to concentrate on the logical CPU design, you could opt for a design in a hardware description language with simulation, probably something in VHDL or comparable*. I don’t have any current references for that, however.

For an overview, I can recommend Code: The Hidden Language of Computer Hardware and Software, 2nd edition - Announcing “Code” 2nd Edition. It may not offer hands-on practical experience, but it does give an overview that addresses your original broad question. Have a look at the contents and preview text, and also at the companion website, which contains interactive examples.

Along the span from low-level hardware (basic TTL circuits) to driver software interacting with the kernel, I suggest you pick a place of interest to start and see what you actually like spending time on.

___
* Pre-VHDL, during my studies a long time ago, I followed a university course where we designed and simulated parts of a CPU in APL, including the system clock timing signals. Prof. Gerrit Blaauw, a co-designer of the IBM System/360 architecture, gave the course.


edit: less broad than your question, but as you are already engaged with FreeBSD and mentioned device drivers, I think these are at least worth mentioning:
  • The Design and Implementation of the FreeBSD Operating System, 2nd edition by Marshall Kirk McKusick, George V. Neville-Neil
  • FreeBSD Device Drivers by Joseph Kong
See: FreeBSD Development: Books, Papers, Slides
 
It is reasonably easy to learn how semiconductors work and how logic gates are made from them.

However, trying to learn a CPU from logic gates up is very challenging.
Glad you say that.

I found it easy to understand logic gates, combine them, and build some stuff with them.
I also found it understandable how to build an 8-bit computer with a given CPU (there were periodicals for radio amateurs and electronics hobbyists, and they showed how to do it), and then combine both pieces and build peripherals.
But what stayed obscure to me for years afterwards is: how does that CPU work inside? How does it manage to make each assembler command do its specific thing?
 
I touched upon logic gates very briefly in my first year of college; there was one chapter on that in my introductory textbook, and it was covered for a week in class. The importance of it didn't quite sink in until later: the logic in C programming (another class that I took the following term) is actually a pretty decent reflection of the hardware logic circuitry. That was kind of the logic of my curriculum: learn some basics, then learn how those basics can be put together, all the way up to the modern computer.
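As a small aside, here's what I mean by that reflection, as a minimal C sketch (my own example, names invented): the same condition written once with C's logical operators and once the way the gates would see it, an AND fed by an inverter.

Code:
/* The same condition as C logic and as the gate view of it.
 * (Illustrative only; a and b stand in for two input signals.) */
#include <stdio.h>

int main(void)
{
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++) {
            int c_view    = (a && !b);   /* C: "a AND NOT b"                        */
            int gate_view = a & (b ^ 1); /* gates: AND gate fed by an inverter on b */
            printf("a=%d b=%d  C:%d  gates:%d\n", a, b, c_view, gate_view);
        }
    return 0;
}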
But what stayed obscure to me for years afterwards is: how does that CPU work inside? How does it manage to make each assembler command do its specific thing?
It's those exact same breadboard transistors, and the assembler - except that the transistors are 5 nanometers across these days, thanks to Intel and TSMC... Putting it all together, coming up with specific patterns, assigning them meaning, materials research to make the whole thing even smaller, coming up with ways to regulate electrical impulses to do things correctly... all in the name of making that same breadboard power a Threadripper.
 
A few months ago I cleaned out some old boxes. Found my old school project. What we would now call an SBC: 6809 CPU, 6264 (8Kx8) RAM, 2764 (8Kx8) EPROM, MC146818 (real-time clock), 6551 ACIA (serial port) and some glue logic. Built this in 1988. I should fire it up again, see if it still works. Some of the tantalums might be shorted or blow up though; they tend to break after 35+ years.

 
[...] Some of the tantalums might be shorted or blow up though; they tend to break after 35+ years.

Yeah, it's nice to see those quality DIP sockets with machined round contacts, though.
All on—taking an educated guess :)—an "in-school" designed and etched PCB?
 
There were kits in the 1970s/80s for building your own 6800/6502/Z80 computers.
Well, such kits for building retro computers, like the Apple I, are still available today. BTW, an Apple I is easier to build than a Commodore 64, since the SID chip in the C64 was a proprietary chip and is no longer in production. So getting hold of a SID chip is not so easy.

There is also a thing called the Gigatron TTL Computer, which is a kit for building a simple 8-bit computer using only TTL logic chips, so it has no dedicated CPU. Instead you build the CPU out of those logic chips. Of course this needs some soldering experience to get the job done, but it is targeted at people who want to understand what makes a simple computer tick.

View: https://www.youtube.com/watch?v=_2uXqTi42LI


Deeper introduction:
View: https://www.youtube.com/watch?v=QUfdASs82Lw


BTW the whole design of the Gigatron is open source, and its source code is also under the BSD license.
 
the logic in C programming (another class that I took the following term) is actually a pretty decent reflection of the hardware logic circuitry..

To this day, when thinking about the flow of a program, I still catch myself sometimes picturing in my mind the address and data lines between the processor and memory.
 
All on—taking an educated guess :)—an "in-school" designed and etched PCB?
At that time, the PCB was supplied by the teacher. I'm guessing it was produced by a 'proper' PCB maker, but I honestly don't know who designed it. It's a really nice double-sided board, solder mask and everything. Not impossible to do yourself, but it does require some specialized equipment; certainly not the kind of equipment the school had in the late 80s. It basically came in a kit and we had to put it together. That said, other lessons did include boolean algebra, reading timing diagrams and working out the glue logic to make the thing work. Then learning how to code for it, learning how to query the I/O chips, interrupts, the whole nine yards. The year after that we had to design and implement a 'daughterboard' for it ourselves. There's a double-sided pin header that basically connects to the CPU bus, and the daughterboard plugged into it.

BTW, an Apple I is easier to build than a Commodore 64, since the SID chip in the C64 was a proprietary chip and is no longer in production. So getting hold of a SID chip is not so easy.
The VIC-II (video) was a custom chip too. Impossible to get nowadays. The Apple used a fairly 'simple' circuit to drive the video signal. Easy to reproduce with modern components. You don't have to find a source for 40 year old chips whose production stopped eons ago ;)

The 6502 is still being produced today. The one I linked earlier is a modern variant; they even fixed some of the bugs of the original 6502. And it runs at a whopping 14 MHz. By comparison, the Apple, C64/VIC-20, NES, etc. typically ran at 1 or 2 MHz. I read an article some time ago about reverse engineering some modern bus controller, and it turned out there's a soft-core 6502 running the show internally. Makes sense too: it's a reasonably simple CPU, not a lot of transistors/components. It doesn't take up much space in an FPGA or ASIC, yet it's still quite powerful.

I've looked around for an 8088 or 8086 but those are not produced anymore. I really would have liked to put together an actual Altair 8800 or IMSAI 8080 clone, a proper clone. I know of the modern replicas/clones, but they all use an Arduino or ESP32 to emulate the machine. Cool, but not the same thing.
 
I've looked around for an 8088 or 8086 but those are not produced anymore. I really would have liked to put together an actual Altair 8800 or IMSAI 8080 clone, a proper clone. I know of the modern replicas/clones, but they all use an Arduino or ESP32 to emulate the machine. Cool, but not the same thing.
Sometimes, you can find an old, out-of-production device that someone just didn't have the heart to toss out... then you can pull it apart and salvage original circuitry... And who knows, maybe you can get lucky that way. ;)
 
I found it easy to understand logic gates, combine them, and build some stuff with them.
I also found it understandable how to build an 8-bit computer with a given CPU (there were periodicals for radio amateurs and electronics hobbyists, and they showed how to do it), and then combine both pieces and build peripherals.
But what stayed obscure to me for years afterwards is: how does that CPU work inside? How does it manage to make each assembler command do its specific thing?

I agree with you to some extent.

With a few months' worth of effort (evenings and/or weekends), one can understand how transistors and FETs work. Build discrete circuits, which make lights blink or receive and transmit radio waves. Trying this out (breadboarding) is relatively simple with modern tools: schematic capture followed by simulation, or use an actual breadboard (https://www.sparkfun.com/products/12614 is a $20 large one), and buy the components at Digikey or similar mail-order houses (or sometimes there are still hobby stores that have them in stock).

From there to TTL is not a huge leap: a NAND gate is just a few transistors, and a 74LS00 is nothing but four NAND gates in a convenient 14-pin box. Building small circuits which make lights flash in interesting patterns with TTL chips takes another few hours if one has experience with power supplies, wires, and transistors. You can build counters, you can add numbers (which are input with little switches and displayed bit-by-bit on LEDs), you can build "if statements" (only turn the red LED on if switches 2 and 4 are off).
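To make that last "if statement" concrete, here is a minimal sketch in C (my example, signal names made up) of the breadboard version: everything built from the 2-input NAND you get four of in a 74LS00.

Code:
/* "Red LED on only if switches 2 and 4 are both off", built from NAND gates only. */
#include <stdio.h>

static int nand_g(int a, int b) { return !(a && b); }           /* one gate of a 74LS00     */
static int not_g(int a)         { return nand_g(a, a); }        /* inverter from a NAND     */
static int and_g(int a, int b)  { return not_g(nand_g(a, b)); } /* AND = NAND plus inverter */

int main(void)
{
    for (int sw2 = 0; sw2 <= 1; sw2++)
        for (int sw4 = 0; sw4 <= 1; sw4++) {
            int red_led = and_g(not_g(sw2), not_g(sw4));
            printf("sw2=%d sw4=%d -> red LED %s\n", sw2, sw4, red_led ? "on" : "off");
        }
    return 0;
}

Four NAND gates in total, so exactly one 74LS00.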

The next step used to be reasonably easy: Grab an off-the-shelf CPU (like a 6502, 8080, Z80, 6809) running at a reasonably low clock rate (MHz or below), grab some memory chips with TTL interfaces (2764 for ROM, 4164 for RAM), some TTL chips for the glue, and pretty soon you have a functioning computer. The only problem with it: it has no interfaces. Adding a serial port for a terminal and a parallel port for the printer is not that hard, but who even has a terminal or printer these days? And building a video controller is actually hard at that level (BTDT).

Now coming from the other end: Anyone can push icons around on a GUI. From that, learning a CLI and knowing what files are and how to execute them isn't terribly hard. Given that skill set, programming shell scripts or BASIC is a small step, and one can quickly proceed to a serious language (Python for example), and from there to compiled languages. By the time one is programming in C, one knows concepts like bytes and word sizes, reading and writing memory, branching in the code, and if statements. Armed with that knowledge, programming in assembly is tedious, but not very hard.
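To ground that, here's a minimal C sketch (my example, not from any particular course): each line touches one of the concepts named above, with comments noting roughly the kind of machine instruction it becomes (6502-style mnemonics for flavour).

Code:
/* Bytes vs. words, reading/writing memory through an address, and a branch. */
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    uint8_t  byte_val = 0x41;        /* one byte: an 8-bit load/store                       */
    uint16_t word_val = 0x1234;      /* a 16-bit word: two memory accesses on an 8-bit CPU  */
    uint8_t  ram[16]  = { 0 };       /* a tiny block of "memory"                            */
    uint8_t *p        = &ram[3];     /* an address, as it would appear on the address bus   */

    *p = byte_val;                   /* write memory: roughly an STA (store)                */
    if (*p == 0x41)                  /* read + compare: LDA, CMP, then a conditional branch */
        word_val++;                  /* the taken branch lands here                         */

    printf("ram[3]=0x%02X word=0x%04X sizeof(word)=%zu\n", ram[3], word_val, sizeof word_val);
    return 0;
}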

Once one is writing assembly, it becomes possible to actually write all the software for the computer we built a few paragraphs above. After all, the BIOS is just a few thousand lines of assembly, and then you can find an off-the-shelf BASIC interpreter, or add a floppy interface to load a commercial operating system (CP/M or OS-9).

All these are things that an energetic amateur can do, and I've done them all. In my basement is a home-built CP/M machine, with a wire-wrapped CPU and serial/parallel interface board, a few floppy drives, and a home-written BIOS that boots CP/M.

But the one mystery remains: how does the CPU chip work inside? The number of people who understand that is remarkably small. Even in computer science classes, it is not actually taught. One can sort of make an educated guess: bus drivers, a little bit of memory (based on flip-flops) for the registers, an ALU (which is nothing but adders and a bit of control logic), several copies of that (for example to be able both to add numbers and to increment the PC and stack pointer), and you're there. It turns out there are textbooks for this, but they are very uncommon. Our (sadly recently deceased) neighbor wrote one of these textbooks, called "Microprocessor Logic Design: The Flowchart Method" (it's 35 years old). He literally understood every gate in the Motorola 68000 CPU, because he architected and designed it. Ultimately, the stuff inside the CPU is just a reasonably large number of really simple components (registers, counters, ALUs), each of which is easy to understand.
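To spell out that educated guess, here is a minimal sketch in C (my own toy example, not taken from that textbook): a program counter, one register, a trivial ALU and a fetch/decode/execute loop over an invented 4-bit opcode encoding. Real CPUs replicate and pipeline these pieces, but the skeleton is the same.

Code:
/* A toy CPU: fetch, decode, execute. Opcodes and encoding are invented for illustration. */
#include <stdint.h>
#include <stdio.h>

enum { OP_LDI = 0x1, OP_ADD = 0x2, OP_JNZ = 0x3, OP_HLT = 0xF };

int main(void)
{
    /* Program: A = 3; loop: A = A - 1; if (A != 0) goto loop; halt. */
    uint8_t mem[16] = {
        0x13,   /* LDI 3                                            */
        0x2F,   /* ADD -1 (0xF as a 4-bit two's-complement operand) */
        0x31,   /* JNZ 1                                            */
        0xF0,   /* HLT                                              */
    };
    uint8_t pc = 0, a = 0;                     /* program counter and accumulator   */

    for (;;) {
        uint8_t insn = mem[pc++];              /* fetch, then increment the PC      */
        uint8_t op   = insn >> 4;              /* decode: high nibble is the opcode */
        uint8_t arg  = insn & 0x0F;            /* low nibble is the operand         */

        if      (op == OP_LDI) a = arg;                                        /* register load       */
        else if (op == OP_ADD) a = (uint8_t)(a + (arg >= 8 ? arg - 16 : arg)); /* the "ALU"           */
        else if (op == OP_JNZ) { if (a != 0) pc = arg; }                       /* conditional branch  */
        else                   break;                                          /* HLT: stop the clock */

        printf("pc=%d a=%d\n", pc, a);
    }
    return 0;
}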

If someone has a reference for a good modern book on CPU internals, I'd be delighted to hear about it.
 
Well, this is used as a textbook in many universities and covers almost everything: John Hennessy and David Patterson, Computer Architecture: A Quantitative Approach.

It's not cheap, it covers almost everything, and it is a monster. If you just want a gentler introduction that keeps things simpler, then this book is not for you.


 
Holy crap, someone built the Hack computer from Nand2tetris on breadboards using TTL ICs.

EE student, and it took him two years. Dunno that I would attempt it.
 
The next step used to be reasonably easy: Grab an off-the-shelf CPU (like a 6502, 8080, Z80, 6809) running at a reasonably low clock rate (MHz or below), grab some memory chips with TTL interfaces (2764 for ROM, 4164 for RAM), some TTL chips for the glue, and pretty soon you have a functioning computer. The only problem with it: it has no interfaces.
Originally I wasn't even interested in computers. I intended to go for audio electronics, stage equipment and such. But then the home computers appeared, and people started to knock on my door: hey, you can solder, can you fix my, whatever, C64 PSU? Or some such. (I actually had the equipment to create PCBs as well.)
So I decided to have a look into how that stuff works. I came across something that appeared to be a Commodore business machine (6509-based), only to find out that there was no means to store anything permanently. :( One would need to buy a floppy drive or such, but that would be business equipment with business price tags, so no way.
But the thing was also supposed to be able to put data on a cassette. That's probably a PITA, but I had an open-reel tape deck, and that had a remote control and a real-time tape counter. So it could be steered by the computer, and the computer could get (more or less) accurate information about where on the tape we were; it might do retry-on-error and all such fancy things.
So I started to write a tape storage system with a directory. It did work (it was painfully slow, obviously; audio decks are not built for quick positioning), and afterwards I knew how to program. I didn't get further with that, because then the IBM PC gained momentum and there were more interesting options.

But the one mystery remains: how does the CPU chip work inside? The number of people who understand that is remarkably small. Even in computer science classes, it is not actually taught.
Hm, I thought it would be.

Ultimately, the stuff inside the CPU is just a reasonably large number of really simple components (registers, counters, ALUs), each of which is easy to understand.
A huge number of these, and very well orchestrated in their interaction. It's probably difficult because one cannot just grab a few gates, put them together, then add a few more in the hobbyist way; it needs a good plan to begin with.

But the people who built the 4004 and the other early microprocessors didn't start from scratch. There were mechanical calculators before, so the logical requirements were already known.
 
If someone has a reference for a good modern book on CPU internals, I'd be delighted to hear about it.
Bit-Slice Microprocessor Design by Mick and Brick gets close. I own that plus another one, but I can't recall the authors or the title.
I studied transistor-level design on my own but never finished. I got to the point where I could tell which part of an X-rayed chip did what.
 
In a Computer Architecture course many, many years ago, in the 1980s, we used a book by Hill and Peterson.
I do not remember the title, but it was very clear and easy to read.

Surely outdated: it was oriented toward designing something like a PDP-11, or perhaps a mainframe, and it used something like APL as a hardware design language. But I think perhaps some of the concepts are still the same today.
 
In a Computer Architecture course many, many years ago, in the 1980s, we used a book by Hill and Peterson.
I do not remember the title, but it was very clear and easy to read.

Surely outdated: it was oriented toward designing something like a PDP-11, or perhaps a mainframe, and it used something like APL as a hardware design language. But I think perhaps some of the concepts are still the same today.


Fredrick J. Hill, Gerald R. Peterson - Digital Logic and Microprocessors (1984)

?
 
Fredrick J. Hill, Gerald R. Peterson - Digital Logic and Microprocessors (1984)
No, no microprocessors. It was the first one: "Digital Systems: Hardware Organization and Design".
It made very clear how the APL dialect "compiled" to a circuit with flip-flops, and how a computer is built that way.
A very nice book.

Something that I remember was that the price of using the hardware description language was that one needed more flip-flops. I can imagine that today there are ways to optimize that?
 
Something that I remember was that the price of using the hardware description language was that one needed more flip-flops. I can imagine that today there are ways to optimize that?
Oh yes. Sometimes the compiler pulls tricks you did not think about. But it sometimes performs at glacial speed. A friend of mine did a design that took 7 hours for a turnaround. He is also in the Altera(?) regression test set with a "design from hell", where he left 4 blocks unused and partly wired the rest in such a way that the router could do it. The next stages started pulling assert()s of the "How the hell did we get here?" type.
Did I mention he is high-functioning autistic?
 
I would suggest starting with a computer based on an 8-bit CPU. It's simple enough to wrap your head around the whole concept. The fun thing is that today's modern 64-bit computers aren't that different; they still use the same concepts, only much more complex.

8 bit 6502 based computer:
View: https://www.youtube.com/watch?v=LnzuMJLZRdU&list=PLowKtXNTBypFbtuVMUVXNR0z1mu7dp7eH


Entire CPU created from scratch:
View: https://www.youtube.com/watch?v=HyznrdDSSGM&list=PLowKtXNTBypGqImE405J2565dvjafglHU
Man, those Ben Eater videos are amazing! I couldn't stop watching them.
 