I figure we'll have a single thread for each broad set of "How Computers Work" episodes. We'll be posting links in here as topics come up. Of course, bother us with your questions and comments as well!
The podcast was quite heavy. I feel like I need to sit down and listen to this while doing nothing else. If you could slow it down, or even provide a visual on the front page, that would be great.
Aha, so you're reading The Difference Engine, Rym.
And Babbage's first name is Charles.
And according to recent discoveries, Babbage's Engine may not have been the first computer. The first one appears to have been the Antikythera Mechanism, a device fished up from the Mediterranean.
So, if any of you listened to the episode but don't understand how to convert binary into base 10, here goes:
Say you have a binary number (each digit of the binary sequence is one computing 'bit'; hence 64-bit processors can handle a string of 64 binary digits).
Say it is:
011001011
Now imagine that above each digit you have a number giving that digit's place value. Each place value is double the one to its right, so:
Place value:    256  128   64   32   16    8    4    2    1
Binary digit:     0    1    1    0    0    1    0    1    1
Now just add up the place values wherever there is a 1. In the sequence 011001011 we have 1*1, 1*2, 1*8, 1*64, and 1*128.
So 1 + 2 + 8 + 64 + 128 gives you... 203.
Therefore 011001011 is equal to 203. Happy? And yes, it is 0 volts or +5 volts, I believe.
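For anyone who would rather see the same calculation spelled out step by step, here is a quick Python sketch. It's just my own illustration of the place-value idea above, not anything from the episode:

    def binary_to_decimal(bits):
        # Walk the digits right to left; each place is worth double the one before it.
        total = 0
        place_value = 1
        for digit in reversed(bits):
            if digit == "1":
                total += place_value
            place_value *= 2
        return total

    print(binary_to_decimal("011001011"))  # 128 + 64 + 8 + 2 + 1 = 203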
I agree with amethisttomoe. This episode was a little heavy and fast.
I'm fully educated on the topic and would normally just nod along to everything you said about it. I often play simple, mindless puzzle games while listening to GN, and it rarely distracts me from the podcast so much that I don't understand what you are talking about. This time I had to quit playing puzzles and listen carefully, and I still had to rewind several times.
One obvious reason for this is that I'm not used to doing mathematics in English and am therefore a bit slow at that part, but I also think the topics of bits, bytes, and circuits are difficult to teach by audio alone.
As far as voltages in logic go, your computer's power supply (as you can see on the label) has +12, -12, +5, -5, and +3.3 volt rails. I won't get into why there are negative voltage rails, and the 12 volt rail is only for driving motors in the drives, not for logic circuits. All the logic is 5 V and 3.3 V. Older computers did more with 5 V, but newer computers do more logic on the 3.3 V rail. The CPU actually runs much lower: you can see the CPU voltage setting in the BIOS, usually in the 1.6 V to 1.8 V range depending on the processor. Using lower voltages helps reduce the power consumption and heat generation of the electronics.
Older ISA and other expansion slots were all based on 5 V. During the PCI era, you could have cards and slots that were 3.3 V, 5 V, or both (universal). Where the cut-out in the card (or the key on the connector) is located determines what voltage it will use for signaling. Most PCI devices and motherboard PCI slots are 3.3 V only now. I believe the new PCI-Express slots are 3.3 V only.
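To make the "it's either 0 volts or +5 volts" idea a bit more concrete, here's a small sketch of how a logic input decides whether it is seeing a 0 or a 1. The threshold numbers below are the classic 5 V TTL values and are my own assumption for illustration; 3.3 V and lower-voltage logic families use different thresholds:

    # Rough sketch of 5 V TTL-style input thresholds (assumed values, illustration only).
    V_IL_MAX = 0.8  # anything at or below this reads as logic 0
    V_IH_MIN = 2.0  # anything at or above this reads as logic 1

    def read_logic_level(voltage):
        if voltage <= V_IL_MAX:
            return 0
        if voltage >= V_IH_MIN:
            return 1
        return None  # in between is undefined; real circuits try not to linger here

    for v in (0.2, 4.8, 1.4):
        print(v, "->", read_logic_level(v))  # 0, 1, and undefined respectively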
I'm not really sure a computer novice can understand the significance of binary logic without a solid understanding of Boolean logic and binary mathematics.
It might be best to start with the definition of a computer (input->deterministic process->output), then expand on that by talking about basic logic and mathematics (if, x++, etc.). You could move from there to explaining how that logic is transformed into assembler, how that works, and ultimately how that is expressed in machine code.
From there, explain the physical components of a computer, from cpu/mobo to peripherals, then onto operating systems and software. Then maybe networking.
The trivia answer is the NAND gate. A transistor on its own may be hard to explain, but it's easy to say that a few of them wired together make a NAND gate, and everything else is built from those. But then I'm supposed to know that...
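Since the trivia answer is the NAND gate, here's a tiny sketch (my own, not from the episode) of why it's such a nice answer: every other basic gate can be wired up from NANDs alone.

    def nand(a, b):
        # The one primitive gate; everything below is built from it.
        return 0 if (a and b) else 1

    def not_(a):
        return nand(a, a)

    def and_(a, b):
        return not_(nand(a, b))

    def or_(a, b):
        return nand(not_(a), not_(b))

    # Quick check against the usual truth tables.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "NAND:", nand(a, b), "AND:", and_(a, b), "OR:", or_(a, b))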
All right. We'll talk solely about binary logic on the next one. I think if we keep it to one very narrow topic every episode, it will be easier to follow.
First year CS shit... oh yeah! The only real subject at university I slept through and still scored a solid distinction grade was based on what Rym and Scott are going to be talking about over the next few weeks. Revision never hurts, though. Might even learn something new.
Yeah, I have an exam coming up within the next week that's got some of this in it. So far I have slept through most of it, and I have still done solidly in all the tests and assignments.
In fact, I have 3 exams coming up, and have done nothing but read Ranma and X-Men comics for the past few days...
I'm surprised that Scrym didn't cover the Turing machine and Alan Turing. Sure, it's mostly a "thought experiment", but it's still the basis of modern computing. While not really related to how a computer works, I still think it's important to the logic behind computing.
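For anyone curious, the thought experiment itself is tiny: a Turing machine is just a tape, a read/write head, and a table of rules. Here's a toy example of my own (not from the episode) whose only job is to flip every bit on the tape and then halt:

    # A toy Turing machine: tape + head + state + a rule table.
    # (state, symbol read) -> (symbol to write, head move, next state)
    rules = {
        ("flip", "0"): ("1", +1, "flip"),
        ("flip", "1"): ("0", +1, "flip"),
        ("flip", "_"): ("_", 0, "halt"),   # "_" is a blank cell
    }

    def run(tape):
        tape = list(tape) + ["_"]          # blank cell marks the end of the input
        head, state = 0, "flip"
        while state != "halt":
            write, move, state = rules[(state, tape[head])]
            tape[head] = write
            head += move
        return "".join(tape).rstrip("_")

    print(run("011001011"))  # prints 100110100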
Rym, I might have your copy of Diamond Age. Scott lent me a few books and a couple of them were yours, I think. Just wanted to let you know in case you went looking for it and found it missing.
So, as some of you know, I am pretty new to GeekNights. I have been listening to old episodes, especially the science and tech ones, so please forgive the comically late addition to this conversation. The inner workings of computers are fascinating to me because they are simultaneously complex and simple.
As I was listening to the first part of this series on How Computers Work, there was a moment where Rym and Scott were going over the components that are not electrical, and Scott said, "...and some parts of the computer are magnetic, which is kind of electricity..." Rym said, "What!?... I'm gonna let that one slide," and Scott replied, "Yeah, I'm just bein' silly."
Now, since I'm years late to the discussion, it may be that you guys have since found out more about the subject, but I thought it important to say something in case someone walked away from listening to that thinking they were different forces.
I gotta say though, random comment aside, this series was super interesting, and it filled in more than a few gaps in my understanding of computer logic and how all that complexity can come from such a simple building block as a NAND gate.
So, as we all know from listening to the podcast, "Computers are deterministic; if they weren't, they would be utterly unusable." As a general rule I totally agree with this statement, but every once in several blue moons a computer will do something that, as a non-programmer, I can't explain, but that also has the ring of being human.
Like a few minutes ago: I started my computer as normal, and as my programs booted up, magicJack began to load. Now, if you're not familiar with it, magicJack lets you make calls through your internet connection, and it has been around for a while now. The UI got a facelift a few weeks ago to update the style a bit. So it was very strange to me when, on startup, it showed the old splash graphic for about a full second (an eternity for a computer) before switching to the new splash graphic. It felt very much like the computer was a bit groggy and threw up the old graphic out of habit before realizing its error.
Now, I know my computer IS in fact deterministic, just very, very complex. My guess for what caused this would be lazy programming: probably the update didn't actually get rid of the old image, and maybe the program always shows the old one just before the new one but usually makes the switch so fast that I can't see it (not a hard feat at all for a computer).
The reason I think it's interesting is how similar it still WAS to the quirky behavior of a human. I mean, isn't evolution the ultimate 'lazy programmer'? I think it could easily happen, given enough time, that an AI could accidentally be created. It may look very different from human intelligence, but given enough 'quirks', eventually glitches become indistinguishable from free will.
Anyway, that's my internet rant for the day, thanks!
When you think you see apparent human-like behaviour in things that are not humans, especially things that aren't even animals, be very wary of apophenia.
As far as artificial general intelligence is concerned, I very much doubt it would happen by accident, and that's a good thing. If and when it's created it will indeed be quite different to human intelligence, though.
As for the "free will" issue, insofar as free will is a coherent concept in the first place it has to be deterministic, and so there's no reason computers won't or can't have it. Glitches or quirks aren't really helpful, however.
A little slower would be good for me too.
Thanks
http://en.m.wikipedia.org/wiki/Electromagnetism
But, yeah, electricity and magnetism are just two sides of the same coin.