Where did Bytes Come From? - Computerphile

  • Published 27 May 2024
  • Why do we have 8 bits in a byte? Professor Brailsford on the origins of the humble byte.
    Why Use Binary?: • Why Use Binary? - Comp...
    Babbage's Puzzle: • Babbage's Puzzle - Com...
    Initial Orders II: • Bootstrapping EDSAC: I...
    The Font Magicians: • The Font Magicians - C...
    Colossus and Bletchley Park: • Colossus & Bletchley P...
    / computerphile
    / computer_phile
    This video was filmed and edited by Sean Riley.
    Computer Science at the University of Nottingham: bit.ly/nottscomputer
    Computerphile is a sister project to Brady Haran's Numberphile. More at www.bradyharan.com

Comments • 499

  • @patavinity1262
    @patavinity1262 3 years ago +306

    "At home I've got ten-digit log tables, a thick book of them"
    His bedtime reading, what a savage.

    • @ikrotzky
      @ikrotzky 1 year ago +2

      “- Professor Brailsford: I checked this out weeks ago for a bit of light reading.
      - Ron Weasley: This is light?”

    • @blitzmensch
      @blitzmensch 1 year ago +1

      @@ikrotzky hahah you got me bro

  • @BertGrink
    @BertGrink 6 years ago +2082

    This man ought to be declared a National Treasure.

    • @RustyTube
      @RustyTube 6 years ago +136

      *International

    • @TheTwick
      @TheTwick 6 years ago +60

      BertyFromDK He is the embodiment of the history of digital computers. I can listen to him talk for hours.

    • @BertGrink
      @BertGrink 6 years ago +17

      TheTwick I completely agree. I wish I had had the opportunity to study under his guidance.

    • @AllanBrunoPetersen
      @AllanBrunoPetersen 6 years ago +4

      Get this man on WFPB diet so we can have him around longer. :)

    • @thomasp2572
      @thomasp2572 6 years ago +4

      Then declare it

  • @TheDave000
    @TheDave000 4 years ago +271

    Also, as a telecoms engineer, I would like to point out that 8 bits wasn't always a byte, so when bandwidth matters we call them "octets" to be sure that we really do mean 8 bits. The term is still in use today.

    • @AlexSmith-fs6ro
      @AlexSmith-fs6ro 4 years ago +24

      Yes indeed. Some computer hardware used 9 bits for a byte, so telecoms engineers began to use the word octet to mean exactly 8 bits.

    • @brendonholder2522
      @brendonholder2522 4 years ago +2

      Hello, I am currently studying for my bachelor's in electrical engineering, and I'd like to chat with you about your career if that's okay?

    • @IronicHavoc
      @IronicHavoc 3 years ago +1

      I mean the way he mentioned it in the video I wouldn't have assumed 8 bits was always called a byte.

    • @FaeTheo
      @FaeTheo 3 years ago +29

      He literally had a section where he talked about how 8 bits weren't always a byte. 8:22

    • @twogitsinacar4811
      @twogitsinacar4811 2 years ago

      True!

  • @StephenFarthing
    @StephenFarthing 6 years ago +5

    I wrote my first programs on an ICL1903a back in 1971 at the Lanchester Polytechnic where I read Biogeography. It had an Algol 60 compiler and a 24 bit word length and six bit bytes. The ICL1900 series was very widely used in U.K. academia and government.

  • @LindaLooUK
    @LindaLooUK 6 years ago +360

    I do so love Professor Brailsford's videos. I understand about 1% of what he says usually (although this video was more approachable) but I'm happy to listen to whatever he has to say. His enthusiasm and depth of knowledge is extraordinary and he has such a lovely way about him. :)

    • @ZeedijkMike
      @ZeedijkMike 6 years ago +4

      I can only agree. It's always a pleasure to watch him talk - even if some of it flies right over one's head (-;

    • @LindaLooUK
      @LindaLooUK 6 years ago +7

      What an odd comment. I wasn't aware that I was pretending otherwise.

    • @woodywoodlstein9519
      @woodywoodlstein9519 5 years ago

      LindaLooUK same.

    • @woodywoodlstein9519
      @woodywoodlstein9519 5 years ago

      LindaLooUK his passion yes. Lol.

    • @Adrian_Finn
      @Adrian_Finn 4 years ago +2

      It is a special skill he has that even those who understand little of what he says will nonetheless sit up and listen. I wish all educators were the same.

  • @MTGR
    @MTGR 6 years ago +311

    I liked that quite a bit

    • @LMABeste
      @LMABeste 6 years ago +13

      *quyte

    • @NeilRoy
      @NeilRoy 6 years ago +5

      I see what you did there. ;)

    • @Roxor128
      @Roxor128 6 years ago +3

      Alexander Robohm - If a byte is 8 binary digits, then would a quyte be 8 quaternary digits?

    • @baganatube
      @baganatube 6 years ago +2

      A qubit?

    • @EchoHeo
      @EchoHeo 5 years ago +3

      Roxor128
      A quit?

  • @NeilRoy
    @NeilRoy 6 years ago +213

    One is a bit, four bits is a nibble, two nibbles is a byte, so therefore two bytes is... a word? Calling two bytes a "word" just breaks the whole eating theme! I propose we rename two bytes a "snack" and maybe four bytes a "meal"! ;)
    Great video anyhow, as always. Love Mr. Brailsford. And I definitely want to hear more from him about 8-bit computing and IBM, though I think I know the answer to that one, having grown up with them.

    • @eideticex
      @eideticex 6 years ago +32

      So by your logic, 8 bytes (64 bits) would be a "feast"? Extending further, would 16 bytes be a buffet? How many bits is a "take out"?

    • @PvblivsAelivs
      @PvblivsAelivs 6 years ago +22

      For computer speak, it's spelled "nybble."

    • @ThePharphis
      @ThePharphis 6 years ago +14

      2 bytes is a brownie

    • @stevefrandsen
      @stevefrandsen 6 years ago +14

      And if you lose a byte it could be a burp, losing 2 could be a hiccup, 4 a belch and 8 a puke, 16 a stomach pump and 32 DOA. Just continuing the food theme on a negative scale.

    • @ffggddss
      @ffggddss 6 years ago +4

      Cut a byte in half and it's a nybble; so . . . Cut a word in half and call those 2 bytes, a syllable.
      Fred

  • @LightDhampire
    @LightDhampire 6 years ago +20

    In my Computer Systems class, my professor had worked for IBM back in the day and told stories from his time there. One of those was the big debate over whether a byte should be 8 or 6 bits.

  • @Zackyzic
    @Zackyzic 6 years ago +31

    Honestly, like anyone is going to refuse an invitation to learn more about that beautiful historical background! I am but a lowly Electronics Engineering student, but knowing how the pioneers of the field set about innovating and planning early platforms is really interesting, because a lot of this deals with hardware solutions devised to solve practical problems. It reminds me of the (possibly) small group of amazing souls who perform the arduous task of optimising our compilers and working with lower-level languages like Fortran, C, assembly, etc. To imagine that people were just ploughing through these problems is somewhat humbling and also scary. It's like the Wild West, where the concrete boundaries between electronics, computer science and computer engineering were super blurry, before embedded systems came along and everything became compact. It's like staring into the heart of my subject, and that's what makes me appreciate these historical lessons. Please keep 'em coming, professor!

    • @gwenynorisu6883
      @gwenynorisu6883 5 years ago +1

      Well, it depends if we just get the well-worn story of the PC itself (which is somewhat fascinating, given how much of a departure it was from IBM's usual operating procedure, but has been done to death elsewhere), with which IBM were relatively late to market and only really managed to steal a leading share on the strength of their name (and, I guess, how easy it was to shove a relatively large amount of memory into it, as well as the option of a very sharp terminal-like display); or if we get the backstory of how they spent most of the 70s, basically from the arrival of the highly promising Intel 8008 / 8080 onwards, trying and failing to produce a "minicomputer in a shoebox" type system that would capture the small-and-home-office imagination in the way that all their rivals (and, up to 1981, much more successful dominators) seemed to pull off without even thinking about it. They had a lot of interesting and diverse stabs at it with gradually more PC-like offerings, but for some reason it's only the PC itself that finally saw meaningful sales. If it had also flopped, it's likely IBM would have abandoned the small-system arena altogether and remained concentrated on massive corporate mainframes and the like instead. Which, though I'm not really a fan of the PC and XT themselves, would still have been a shame, because they built some fine machines in the late 80s and early 90s, especially when branching out into laptops, and after an early falter with the CGA and (in most applications) EGA, produced some excellent and cost-effective display standards, rather earlier than you might have expected, in the VGA and XGA.

  • @timh.6872
    @timh.6872 6 years ago +6

    Anyone else want Dr. Brailsford to narrate a complete 'history of computing' online course? He's got the perfect mix of domain-specific knowledge and grandfatherly tone to make the topic interesting.

  • @ffggddss
    @ffggddss 6 years ago +5

    When I entered college in the mid-60's, the byte had yet to be standardized.
    We had a Bendix machine (two conjoined twin G-20's, called by the faculty a "G-21"), which was on its way out the door, and which had four 7-bit bytes per word.
    It was replaced with a Univac 1108, which had six 6-bit bytes per word.
    And then along came an IBM 360/95, with four 8-bit bytes per word.
    After I graduated, some time in the 70's I believe, the Intel 8080 took off, putting the microprocessor on the map, big-time.
    It was really the trilobite of the computer era - they were everywhere, in every conceivable device.
    And by then, the 8-bit byte was firmly established. (There was sometimes also a 4-bit nibble [nybble?].)
    Fred

    • @profdaveb6384
      @profdaveb6384 6 years ago

      Interesting story. But, for the Bendix machine and the Univac 1108, were those bytes actually directly addressable? Or was it just that 7 bits and 6 bits were the character widths?

    • @ffggddss
      @ffggddss 6 years ago

      They were the character widths, at least (the G-21 had a 127- or 128-character set); as for addressability, I don't know. All we students were doing with either machine was programming it in ALGOL; we never dealt in assembly or any machine-code-level addressing.
      It's certainly an interesting question. I kinda wish I knew how to find out.
      (BTW, this was at CMU in Pittsburgh. And I recall a researcher from the UK, Alan Bond, being there for a time, it would have been ca. 1967-68, to develop a CRT-screen interface for the G-21. Pretty sure it was a vector-based graphics system. Any chance you might have known him? I don't know what university he was from in the UK.)
      Fred

    • @gwenynorisu6883
      @gwenynorisu6883 5 years ago +1

      Wow, there was actually a computer with a 7-bit (well, 28-bit) architecture? How utterly bizarre. Were the G20s individually 14-bit, or did the conjoining work in a different way?

  • @rentzepopoulos
    @rentzepopoulos 6 years ago +36

    Great video! I fondly remember working on a Univac in the early 80s, where we had 36-bit words that could hold *either* four 9-bit (sort of) ASCII characters *or* six 6-bit FIELDATA characters, the standard character set used by the operating system :)
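
  A minimal C sketch of the packing arithmetic in the comment above, with placeholder character codes rather than the real FIELDATA table: six 6-bit codes fit a 36-bit word exactly, just as four 9-bit codes would (36 = 6 × 6 = 4 × 9).

      #include <stdint.h>
      #include <stdio.h>

      /* Pack six 6-bit character codes into the low 36 bits of a 64-bit
         integer, most significant character first. */
      uint64_t pack36_6bit(const unsigned codes[6]) {
          uint64_t word = 0;
          for (int i = 0; i < 6; i++)
              word = (word << 6) | (codes[i] & 0x3F);  /* 0x3F = six 1-bits */
          return word;
      }

      int main(void) {
          unsigned codes[6] = {1, 2, 3, 4, 5, 6};      /* placeholder codes */
          /* 36 bits print as exactly 12 octal digits: 010203040506 */
          printf("%012llo\n", (unsigned long long)pack36_6bit(codes));
          return 0;
      }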

  • @danieljensen2626
    @danieljensen2626 6 years ago +245

    I would watch the video he talks about making at the end

    • @TheR971
      @TheR971 6 years ago +23

      I would watch him talking to garlic bread flying to space for 2.5 hours.

    • @9999rav
      @9999rav 6 years ago

      I see what you did there...

    • @Ice_Karma
      @Ice_Karma 6 years ago +1

      Yes! This! So much this!

  • @realcygnus
    @realcygnus 6 years ago +12

    Brailsford's knowledge of the history/evolution of computing technology is always fascinating.

  • @Naeddyr
    @Naeddyr 6 years ago +11

    Professor Brailsford has become a real storyteller / showman. That end especially :D

  • @greenghost2008
    @greenghost2008 6 years ago +38

    He was there when the first computer was made. My man literally interned for Alan Turing.

    • @CTimmerman
      @CTimmerman 6 years ago +4

      Turing proved that NAND gates could compute anything, but Babbage designed a universal calculator in the 19th century.

    • @gwenynorisu6883
      @gwenynorisu6883 5 years ago +3

      Which is why I'm edgy about seeking clarification over what I heard elsewhere that contradicts this account. The written record is one thing, but right from the mouth of someone who was actually _there_ somewhat trumps it. It'd just be nice to make sure which one is indeed right.

    • @HashimAziz1
      @HashimAziz1 4 years ago +1

      @@gwenynorisu6883 which part of it is contradicted?

  • @davidstoneback6159
    @davidstoneback6159 6 years ago +824

    This man oozes intelligence every time he opens his mouth.

    • @twothreebravo
      @twothreebravo 6 years ago +11

      And "Cool"

    • @RCassinello
      @RCassinello 6 years ago +38

      He has a way of making everything sound exciting and interesting, too. I could watch him lecture about paint drying.

    • @woowooNeedsFaith
      @woowooNeedsFaith 6 years ago +12

      +Ronstar308
      There would probably be some quite interesting physics involved if we got to the details. But that's not computer science...

    • @NVE
      @NVE 5 years ago +1

      A lot of experience.

    • @CandidDate
      @CandidDate 5 years ago

      You all want to know the secret of representing space with a byte? Well, take a 2x2x2 cube and each of the eight 1x1x1 cubes within has a distinct value, 1 or 0. The only problem is when you look at this space-byte from the corner, there is one hidden bit. Now connect this way of representing space to a self driving car, and I claim you can represent the entire Universe with stacked space-bytes. Do any of you relate?

  • @Bl00drav3nz
    @Bl00drav3nz 6 years ago +48

    Really interesting story; I want to hear more about the 8-bit micro please! :D

  • @DodgaOfficial
    @DodgaOfficial 6 years ago +2

    One of my favorite channels when it comes to computing. Such a wealth of information.

  • @user-zl4kk7wi5u
    @user-zl4kk7wi5u 6 years ago +4

    Best computer science professor I ever had. And I didn't study at the University of Nottingham.

  • @russellcannon9194
    @russellcannon9194 6 years ago +8

    Yes, please do part 2. Cheers, Russ

  • @elirane85
    @elirane85 6 years ago +2

    As a software engineer, I could listen to Prof Brailsford all day long.
    If he ever decided to write a book, I would buy one for every coder I know.

  • @Legomegger
    @Legomegger 6 years ago

    It's always a pleasure to see a new video from Computerphile, especially with Professor Brailsford.

  • @atalantab8764
    @atalantab8764 6 years ago

    I want this video! It's so incredibly interesting watching and learning about the history behind the individual bytes we often carelessly toss around with reckless abandon.
    Please make it!

  • @notes_at_dusk
    @notes_at_dusk 6 years ago +9

    I could listen to this man all day. I wish that I knew all that he knows. Thank you!

  • @AtticusFinch65
    @AtticusFinch65 6 years ago +275

    You should make a video on little/big endian

    • @philrod1
      @philrod1 6 years ago +10

      Kyle Horne - on YouTube?! Are you mad? The comments will explode!

    • @mipmipmipmipmip
      @mipmipmipmipmip 6 years ago +9

      I once had the same software compiled on both a big and a little endian system. It used a binary format for configuration and output files. What a horrible time that was.

    • @MrTridac
      @MrTridac 6 years ago +6

      Oh, yeah. That would be awesome. I bet the Professor has some snarky comments on that mess.

    • @longlostwraith5106
      @longlostwraith5106 6 years ago +1

      mipmipmipmipmip
      A file system I'm working on currently has both big *and* little endian files inside it...

    • @Roxor128
      @Roxor128 6 years ago +14

      Get Tom Scott in to rant about the headaches that go with it. It'll be up there with his ones about time zones and internationalisation.
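
  For readers of the thread above, a minimal C sketch of why byte order bites: reinterpreting a 32-bit word as individual bytes exposes the machine's endianness, which is exactly what goes wrong when a binary configuration or output file crosses between big- and little-endian systems.

      #include <stdint.h>
      #include <stdio.h>

      int main(void) {
          uint32_t word = 0x01020304;
          const unsigned char *bytes = (const unsigned char *)&word;

          /* A little-endian machine prints 04 03 02 01;
             a big-endian machine prints 01 02 03 04. */
          for (int i = 0; i < 4; i++)
              printf("%02X ", bytes[i]);
          printf("\n");

          printf("This machine is %s-endian\n",
                 bytes[0] == 0x04 ? "little" : "big");
          return 0;
      }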

  • @GoodWoIf
    @GoodWoIf 6 years ago +41

    0:38 "word-based" [dramatic snap zoom]

  • @skab111
    @skab111 6 years ago

    I could listen to such great storytelling for days and days...

  • @willsi
    @willsi 6 years ago +24

    Prof' Brailsford brought up something interesting: that early computers were "word based." If there's any way to do a video on this, or anything, please do. Unfortunately it wasn't touched on in the rest of the video.

    • @w0ttheh3ll
      @w0ttheh3ll 4 years ago +6

      Actually, it's the key point of the rest of the video. What you're probably missing is that "word" refers to a specific amount of memory.
      For example, if a computer architecture has 32-bit registers, 32 bits is considered the word size of that computer.
      "Word based" means that a word is the smallest amount of main memory that can be read from or written to. Modern computers can usually address single bytes, allowing for finer-grained memory access (see the sketch after this thread).

    • @RobBCactive
      @RobBCactive 4 years ago

      The word size is the natural integer and address size used by the CPU. "Word based", as opposed to "byte addressable", relates to memory organisation. Both RISC and CISC computers prefer memory access aligned on natural sizes: every 4 bytes for int32 or every 8 for int64, for example.
      As registers and calculations can work on even larger units (e.g. AVX 256 or 512), and memory accesses transfer whole cache lines, the original concept of words has become less useful.
      For example, AMD64 cannot penalise use of 32-bit integers relative to 64-bit ones, as compatibility with Win32 binaries was a key requirement.
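
  A minimal C sketch of the byte-addressable model described in this thread, assuming a typical modern machine with 4-byte words: every byte gets its own address, whereas a purely word-based machine could only hand out whole words.

      #include <stdint.h>
      #include <stdio.h>

      int main(void) {
          uint32_t words[2] = {0, 0};        /* two 4-byte "words" */
          unsigned char *bytes = (unsigned char *)words;

          /* Byte-addressable: each of the 8 bytes has its own address. */
          bytes[5] = 0xAB;                   /* one byte inside words[1] */

          /* A word-based read can only fetch the whole 4-byte unit,
             so the single-byte change shows up inside words[1]. */
          printf("words[1] = 0x%08X\n", words[1]);

          /* Adjacent byte addresses differ by 1; adjacent words by 4. */
          printf("byte stride %td, word stride %td\n",
                 (char *)&bytes[1] - (char *)&bytes[0],
                 (char *)&words[1] - (char *)&words[0]);
          return 0;
      }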

  • @jeremybarker7577
    @jeremybarker7577 2 years ago

    Some word-oriented computers did have instructions for manipulating smaller units of data.
    For example, the 24-bit word ICL 1900 series used 6-bit characters packed 4 to a word. It had several instructions that used a special type of pointer to directly address characters within a word.
    Similarly, the 36-bit word DEC PDP-10 (DECsystem10 and DECSYSTEM20) had some processors with the capability to handle arbitrary-length "bytes" packed within 36-bit words, but most commonly used 6, 7, 8 and 9 bit "bytes".

  • @hoagy_ytfc
    @hoagy_ytfc 6 years ago +4

    The first big kit I used was a new Honeywell computer in 1980, which had nine-bit bytes, 36-bit words. And ran Multics, so we had a decent Algol 68 compiler. Nice :)

    • @captainz9
      @captainz9 2 years ago

      DECSYSTEM-20 (a 2060, actually a PDP-10) used 6-bit encoding for text; a 36-bit machine with 18-bit addressing. I learned a lot on that system, my first "big" machine after starting on the Apple II / TRS-80.

  • @Bring_MeSunshine
    @Bring_MeSunshine 5 years ago

    I understood about 2/3 of Prof Brailsford's topic here, but I just have to watch these vids, because his knowledge, delivery and passion pull you in and carry you along

  • @albrough
    @albrough 6 years ago

    We demand! Another great video. Thank you for all the effort and research that goes into making this great content

  • @ZorgFS
    @ZorgFS 6 years ago

    Thank you sir! Such knowledge along with such a perfect manner of narration is what I always wanted for my education. And having been in IT for more than 25 years, I still learn from your videos! I deeply appreciate that. Please continue to educate us and even younger generations who may even be interested in the basics. I bow my knee and head! Cheers from Russia!

  • @BrainboxccGames
    @BrainboxccGames 2 years ago

    This guy's voice is so relaxing. I could listen to him talk about bytes and computing history all day!

  • @kabalder
    @kabalder 6 years ago +11

    Hahaha. Wow. I wish Brailsford could have lectured my professors a while ago on how this actually hangs together, so they wouldn't have docked me a full grade for not just parroting the story in the textbook.

  • @revellights1000
    @revellights1000 6 years ago

    Clear, byte-sized description of their source. Thanks.

  • @kimsteinhaug
    @kimsteinhaug 6 years ago

    Your stories are great; thank you very much for taking the time to share them with us.

  • @CamTooling
    @CamTooling 4 years ago +1

    This man sounds really pleasant. Half of the time I have no idea what he is talking about, but his voice and manner of speech are really soothing lol

  • @roelvoort
    @roelvoort 2 years ago

    The cliffhangers are real. Awesome presentation.

  • @lordsqueak
    @lordsqueak 6 years ago +6

    I just read a thing in which "pieces of 8" are called bits. (That's an old Spanish coin that was often divided into smaller bits.)

    • @allanrichardson1468
      @allanrichardson1468 4 years ago

      That’s where expressions like “two bits” for 25 US cents came from. Of course, the professor is British, so he wouldn’t mention that.

  • @drval
    @drval 11 months ago

    Yes, do a video on how IBM saw personal computing but then had to give up the game. BTW, brilliant videos all! I came into computing in the 60s and 70s via numerical analysis and Lie group, rotation group and JTFA mathematics, as well as a good level of nonlinear dynamical systems analysis, and hearing you talk about all of this - and explain it very clearly - is a real joy. You remind me of one of my early mathematics teachers from high school (yes, I grew up in the US), so thank you!

  • @cyogian
    @cyogian 6 years ago +4

    Should make the sequel... waiting desperately for it...
    👍

  • @joelthomastr
    @joelthomastr 3 years ago

    Whoever did the captions is a world class perfectionist

  • @jasonmarktobin
    @jasonmarktobin 6 years ago +1

    Thank you for offering your content in 4K resolution!

  • @MattMcIrvin
    @MattMcIrvin 3 years ago

    One way you can see the shift to byte-based architectures is the supplanting of octal (base 8) by hexadecimal (base 16) as the prevalent compact way to write binary numbers. An octal digit is three bits, which isn't that convenient if your memory is organized into bytes -- a long octal number will have digits straddling byte boundaries. But a hexadecimal digit is exactly four bits, so two of them fit into a byte exactly.
    I remember my dad in the 70s bringing home core dumps out of the trash bin for me to scribble on the blank backs of, and these dumps were always pages and pages of octal numbers, just digits from 0 through 7. The machines he was working on used a 36-bit word, a multiple of 3, so it split nicely into octal digits. But in the microcomputer era everyone at that end was using "hex" instead: 0-9 mixed with the letters A-F, representing values zero through fifteen. Much more convenient if you dealt in bytes.
    DEC's machines are interesting. The PDP-1 and several others in the PDP line used an 18-bit word, and the PDP-8 was 12-bit (both multiples of 3, and divisors of that nice 36-bit length he mentioned in the video). Naturally you'd use octal to write constants at those widths. The PDP-11 was actually a 16-bit machine, but DEC people still used octal for everything at that point, which must have been a little awkward. I guess it was tradition. You'll notice that the C language (and its many descendants) seems to slightly privilege octal over hexadecimal -- you just need to preface an octal constant with "0", whereas hexadecimal constants take "0x".

    • @thomasstambaugh5181
      @thomasstambaugh5181 2 years ago

      I was a hardware engineer at Digital from 1974 through 1982, and I worked in the PDP-11 systems engineering group. At CMU, where I got my degree, a PDP-8 was in the main EE lab on campus. So far as I know, it was the only computer on campus that undergraduates could actually touch -- as in start, stop, and toggle in code (including the boot sequence). When I first got to Digital, there were four PDP-11 flavors -- each designed and built by a different designer. Digital created the VAX-11 as a 32-bit counterpart to the 16-bit PDP-11s.
      While I enjoyed this video, I have a different perspective on this history -- colored in no small part by my own history. It isn't that anything here is incorrect; it is that a lot was going on in the 1970s that isn't mentioned here. There is no mention of Digital, Prime, Data General, and so on. The first "personal computer" was the PDP-8 -- intentionally designed and built to be used by a single person. Digital arguably created the entire "low end" market by intentionally selling products where the price of each component was under $50K (some said $30K). This was not an arbitrary choice -- it was the threshold above which prospective customers had to capitalize the buy.
      The Digital business model was to bypass purchasing departments -- and therefore sidestep the dominance of IBM that this video talks about -- and instead sell directly to lab managers. I'm ignoring the "OEM play" that was the other big invention of Digital.
      In the early years of Digital, the 12-bit PDP-8 word was treated as four 3-bit digits, while the 16-bit PDP-11 word held two 8-bit bytes. The PDP-11 was always byte-addressable.
      Finally, this video fails to mention another design choice of enormous import -- how are bytes indexed within a word? Is byte 0 the most significant or the least significant byte? Anybody who's coded in C -- especially the C of the late 1970s -- is accustomed to worrying about endianness.
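
  A small C illustration of the octal-versus-hex point a couple of comments up: a hex digit is exactly four bits, so a byte is always two hex digits, while 3-bit octal digits straddle byte boundaries; and C really does mark octal constants with a bare leading 0 versus 0x for hex.

      #include <stdio.h>

      int main(void) {
          int n = 0xABCD;    /* hex: four 4-bit digits = two clean bytes */
          int m = 0125715;   /* the same value written in octal */

          printf("%d %d\n", n, m);    /* both print 43981 */
          printf("%X = high byte %X, low byte %X\n",
                 n, (n >> 8) & 0xFF, n & 0xFF);  /* AB and CD, digit-aligned */
          /* In octal the byte boundary falls mid-digit, because 8 is
             not a multiple of 3. */
          return 0;
      }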

  • @ONRIPRESENCE
    @ONRIPRESENCE 5 months ago

    I'm currently watching this video to better understand bytes so that I can properly translate the concept into my Native American language (Navaho). It will be crucial for my efforts to make a Navaho word for qubit. This is a side project on quantum hardware engineering in Navaho. Many thanks for your videos.

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 5 years ago +2

    5:31 Slight irony in IBM’s dominance, though: the most popular programming language for commercial applications was COBOL, which IBM wasn’t really keen on. It kept trying to push its customers to use its own “Commercial Translator”, but they kept pushing back and insisting on COBOL. So finally it had to give way.
    COBOL was also one of the first “portable” (or at least, largely portable) languages. So a nice side effect of its dominance was that it weakened vendors’ lock-in of their customers.

  • @jessicalaursen1790
    @jessicalaursen1790 5 years ago

    That enthusiasm and passion about computers should be emulated! Great discussion. What did I learn? The bit is like the atom of computers.

  • @HardCoreMore
    @HardCoreMore 6 years ago +3

    Great video and awesome explanation.

  • @aditisinghvyas4754
    @aditisinghvyas4754 2 years ago

    You remind me of Sir David Attenborough with your narration.
    Every time I hear you, I am LOST...!
    Thank you so much

  • @bsharpmajorscale
    @bsharpmajorscale 6 years ago +6

    I've been reading about computer history lately, so I will join in requesting the follow-up on IBM's first attempt at the personal computer.

  • @frogspawn7423
    @frogspawn7423 2 years ago +1

    I had to write a subprogram in PL/1 to multiply a ratio with 6 decimal places (which stayed the same) by a value with 4 decimal places that ran from 0.0001 to 9,999,999,999,999.0000 (this was a currency value). It took me months to get it working - and the solution was so easy in the end...

  • @fk319fk
    @fk319fk 5 years ago

    After I left college, probably in the mid-1990s, I was looking at using three values for a digit instead of two. Binary uses 1/0, on/off, +3V/0V. So I worked some math and saw that +/0/- had some very nice elegance: you can increase the digit density by a factor of about 3 to 2, since 2^3 ≈ 3^2. Later I did some yahooing and saw that someone else had done the circuitry, but he left his notation as 0/1/2.
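
  A minimal C sketch of the balanced-ternary idea above (a toy encoder, not the circuitry the commenter found): digits take the values +1, 0 and -1, and two trits carry slightly more information than three bits, since 3^2 = 9 > 2^3 = 8.

      #include <stdio.h>

      /* Print an integer in balanced ternary, most significant digit
         first, using '+', '0' and '-' for +1, 0 and -1. */
      void balanced_ternary(int n) {
          char digits[32];
          int len = 0;
          while (n != 0) {
              int r = n % 3;
              if (r < 0) r += 3;          /* normalize remainder to 0, 1, 2 */
              if (r == 2) { digits[len++] = '-'; n = (n + 1) / 3; }
              else        { digits[len++] = r ? '+' : '0'; n = (n - r) / 3; }
          }
          if (len == 0) digits[len++] = '0';
          while (len > 0) putchar(digits[--len]);
          putchar('\n');
      }

      int main(void) {
          balanced_ternary(8);    /* prints +0-  :  9 - 1     */
          balanced_ternary(-5);   /* prints -++  : -9 + 3 + 1 */
          return 0;
      }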

  • @hcblue
    @hcblue 6 years ago +4

    Professor Brailsford is like the nerd grandad I never had.

  • @rogeriorogerio1007
    @rogeriorogerio1007 6 years ago

    Another great video from the Professor

  • @jms019
    @jms019 6 years ago +1

    In follow-up(s) you could also talk about EBCDIC and how it came about, i.e. its evolution from numerical cards, with 9+9+8 avoiding adjacent holes, as I have only recently learnt

  • @Noi5ee
    @Noi5ee 5 years ago

    Having watched a lot of these videos in the past weeks, "hardcore" seems to be one of Brailsford's favorite words.

  • @thomasvnl
    @thomasvnl 6 years ago +18

    There is demand! We demand an explanation!

  • @brandonkruger9040
    @brandonkruger9040 2 years ago

    Kudos to Computerphile for preserving the knowledge.

  • @simeongeorgiev1107
    @simeongeorgiev1107 5 years ago

    I have read this in articles, where it took so much more time to learn, and even though I knew most of the stuff, I am pleased to find some more interesting info. Very well explained! Tnx :)

  • @83vbond
    @83vbond 2 years ago

    Incredibly informative. Thank you!

  • @thomasre8073
    @thomasre8073 5 years ago

    A remarkable person. I like his humorous way of telling things, adding in a whole bunch of experience.

  • @Pedritox0953
    @Pedritox0953 4 years ago

    Love your histories, Prof Brailsford

  • @wp5355
    @wp5355 4 years ago

    Excellent Presentation!!

  • @nitram_nosnibor
    @nitram_nosnibor 2 years ago +1

    My word, YouTube knows what I like. This was so damn interesting. Definitely subscribing and looking forward to watching the other videos.

  • @NLBoots
    @NLBoots 6 years ago +1

    Yes, please continue the story about IBM 8-bit machines... fascinating.

  • @gestaltlabart
    @gestaltlabart 6 years ago

    The hardware side is important too. For addressing memory, each power of 2 gets a dedicated wire which turns on and off - and that can be processed effectively with simple gates.
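
  A toy software model of the decoding described above (an illustration only, not real hardware): a 3-to-8 decoder turns three address wires into exactly one active select line, using nothing more than a compare-and-AND per line.

      #include <stdio.h>

      /* Model a 3-to-8 address decoder: 3 address wires select exactly
         one of 8 lines, as simple gate logic would in hardware. */
      int main(void) {
          for (unsigned addr = 0; addr < 8; addr++) {
              unsigned a0 = addr & 1, a1 = (addr >> 1) & 1, a2 = (addr >> 2) & 1;
              printf("addr %u%u%u ->", a2, a1, a0);
              for (unsigned line = 0; line < 8; line++) {
                  /* A line is active when every address bit matches the
                     corresponding bit of its index: an AND of (possibly
                     inverted) address wires. */
                  unsigned sel = (((line >> 0) & 1) == a0)
                               & (((line >> 1) & 1) == a1)
                               & (((line >> 2) & 1) == a2);
                  printf(" %u", sel);
              }
              printf("\n");
          }
          return 0;
      }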

  • @ahwabanmukherjee5065
    @ahwabanmukherjee5065 1 year ago

    I'm just beginning to learn about computers... I aspire to one day understand every word this man says.

  • @LMacNeill
    @LMacNeill 6 years ago

    I see a new video with Professor Brailsford, and I *immediately* have to click on it. I so look forward to his videos!

  • @Adam-ce5mq
    @Adam-ce5mq 6 years ago +1

    I've always wondered why the kilobyte (1000/1024 bytes) was decided upon as the next amount of bytes worth naming. ~1000 bytes seems an arbitrary amount of bytes to be concerned about in a binary world. Is it just the spectre of base 10 hanging over the binary world? As usual, Professor Brailsford is an absolute pleasure to watch.

    • @MRCAGR1
      @MRCAGR1 6 years ago +1

      Adam Feather: IIRC it is the nearest power of 2 to 1000. 1000 in SI nomenclature is kilo, hence "kilo" is used in computing parlance, with an upper-case K to distinguish it from 1000. Likewise mega is used as the prefix to represent 10^6; the binary equivalent is 1024^2, again with an upper-case M. Coincidentally, 1024 is 2^10.

    • @Adam-ce5mq
      @Adam-ce5mq 6 years ago

      Thanks for taking the time to reply. It feels like you're reiterating my intuitions on why ~1000 bytes was where the stick was put in the ground to signify the first quantity of bytes we consider of import: we're from a base-10 world and already think in terms of thousands. 1000 just feels like an arbitrary number of bytes to put the next label on from a purely base-2 perspective.
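
  A tiny C illustration of the arithmetic in this thread: 2^10 = 1024 is the power of two nearest 1000, which is the whole reason the SI prefix "kilo" was borrowed for it.

      #include <stdio.h>

      int main(void) {
          /* Powers of two bracketing 1000: 2^9 = 512, 2^10 = 1024. */
          printf("2^9 = %d, 2^10 = %d\n", 1 << 9, 1 << 10);
          printf("1 KiB = %d bytes; 1 kB (SI) = %d bytes\n", 1 << 10, 1000);
          return 0;
      }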

  • @backgroundkiller
    @backgroundkiller 6 years ago +1

    As always, a very good video! I love to hear about the historical facts.
    Could you possibly make a video about Konrad Zuse and his Z series of (to be) computers?

  • @rillloudmother
    @rillloudmother 2 years ago

    Prof Brailsford is one of the few folks on YouTube who speak fast enough to stop me from putting it at 1.5x

  • @JohnDoe-rv4dm
    @JohnDoe-rv4dm 2 years ago

    Wow! What a narrator! What a talent!

  • @cra0kalo
    @cra0kalo 6 years ago

    I wish he was my grandpa; I would literally sit there and listen to all his stories. ❤

  • @mikeklaene4359
    @mikeklaene4359 6 years ago +1

    Keep in mind that IBM's 2nd-generation systems - like the 1401 - used a 6-bit character via the Binary Coded Decimal (BCD) method.
    It was not until the 3rd-generation 360 systems that the 8-bit byte became standard and characters were encoded using EBCDIC - Extended Binary Coded Decimal Interchange Code.
    Sorry, but the 650 business systems predate even me!
    Starting with the 360 series, IBM offered the 'Decimal Arithmetic' feature that packed two digit nibbles into one byte: the value 12345+ would be stored internally as HEX 'F1F2F3F4F5' in character mode (think COBOL 'PICTURE 99999'), but as packed decimal it would be stored as HEX '12345F'.
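
  A minimal C sketch of the zoned-to-packed conversion in the comment above, assuming already-validated digits and the unsigned 0xF sign nibble from the example (S/360 code also used 0xC and 0xD for signed values): 0xF1F2F3F4F5 becomes 0x12345F.

      #include <stdio.h>

      /* Convert zoned-decimal digits (one per byte, e.g. EBCDIC 0xF1..0xF5)
         to packed decimal: two digits per byte, sign nibble last. */
      void zoned_to_packed(const unsigned char *zoned, int ndigits,
                           unsigned char *packed) {
          int nbytes = ndigits / 2 + 1;     /* e.g. 5 digits -> 3 bytes */
          int d = ndigits - 1;              /* walk digits right to left */
          packed[nbytes - 1] = (unsigned char)(((zoned[d] & 0x0F) << 4) | 0x0F);
          for (int i = nbytes - 2; i >= 0; i--) {
              unsigned char lo = (d >= 1) ? (zoned[d - 1] & 0x0F) : 0;
              unsigned char hi = (d >= 2) ? (zoned[d - 2] & 0x0F) : 0;
              packed[i] = (unsigned char)((hi << 4) | lo);
              d -= 2;
          }
      }

      int main(void) {
          unsigned char zoned[5] = {0xF1, 0xF2, 0xF3, 0xF4, 0xF5}; /* "12345" */
          unsigned char packed[3];
          zoned_to_packed(zoned, 5, packed);
          for (int i = 0; i < 3; i++) printf("%02X", packed[i]);   /* 12345F */
          printf("\n");
          return 0;
      }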

  • @skun406
    @skun406 4 years ago +4

    11:10 there's a demand, please go on :)

  • @kosan3688
    @kosan3688 6 years ago

    I would happily pursue a career in computer engineering if every computer engineer spoke with such eloquence.

  • @jeevachaithanyansivanandan

    Such a brilliant presentation... I need the second video. Where is it??

  • @a.d.1882
    @a.d.1882 4 years ago +1

    I’d give Professor Brailsford multiple bottles of fine red wine and enjoy letting him talk all day 😀

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 5 years ago +1

    9:12 DEC were able to do the same sort of thing, a little bit later, at a fraction of the cost: the PDP-11 was their first byte-addressable machine.

  • @jerryschwinn7234
    @jerryschwinn7234 6 years ago +3

    0:56 "Waffly bits of text" this is too funny.

    • @gordonrichardson2972
      @gordonrichardson2972 6 years ago

      Funny, but true! Early computers were very expensive. You don't spend $10,000+ to do word processing...

  • @x3ICEx
    @x3ICEx 6 years ago

    The subtitles are imperfect (some words missing or spelled incorrectly) but otherwise very detailed. A true CC (closed captions) experience, thank you.

  • @TheNovum
    @TheNovum 6 years ago

    Great story. Waiting for the end.

  • @feldinho
    @feldinho 6 years ago +3

    Prof. Brailsford is the Prof. Poliakoff of computing.

  • @tymothylim6550
    @tymothylim6550 3 years ago

    Thank you very much for this video! It is very interesting and educational :)

  • @kd1s
    @kd1s 6 years ago +1

    The way I know it, a byte is eight bits, and everything in my world is based on that - KB, MB, GB, PB. And yeah, BCD is a biggie too - it's how my TRS-80 with an 8-bit Z-80 could represent 65536 with decent precision. Now of course on a 64-bit PC laptop, the max addressable by 64 bits is 18446744073709551616. Big difference there.
    And micro character sets have been 5, 6, 7, and 8 bits in the past. No idea what they are now - but if Windows Alt sequences are any indication, Unicode uses lots more bits.

  • @bikalimark
    @bikalimark 6 years ago +17

    The David Attenborough of Computer Science

    • @BrianJ001
      @BrianJ001 5 years ago

      The prof knows what he is talking about - David Attenborough is a champagne socialist with a climate-catastrophe mantra. As wrong as wrong can be!

  • @PbPomper
    @PbPomper 2 years ago

    What a great story teller!

  • @SilverPsycheMetal
    @SilverPsycheMetal 6 years ago

    Wonderful, just wonderful!

  • @frankie5781
    @frankie5781 5 years ago +7

    "Hardcore numerical operations."
    ~ Prof. Brailsford 2k18

  • @jordancobb509
    @jordancobb509 3 years ago

    This man is a walking encyclopedia of computing knowledge. I would love to have a beer or two with him.

  • @suntexi
    @suntexi 4 years ago

    When I started programming, it was on an IBM discrete-transistor 1401, in which addressing was decimal. There were no fixed words, and instructions were delimited by word marks. The 8-bit byte consisted of a 6-bit BCD character, a word-mark bit and a parity bit. When I started programming 360-series computers, the idea of EBCDIC characters and words of 2 bytes seemed a bit alien.

  • @ShubhamRaj-mu8ol
    @ShubhamRaj-mu8ol 3 years ago

    Boredom was nibbling away at me when I started searching YouTube a bit and discovered this. My very first reaction was to click for a byte of this information.

  • @fsmoura
    @fsmoura 6 years ago

    Awesome video!

  • @jonp3674
    @jonp3674 6 years ago

    Tell us more about the quantum chemistry work! That sounds awesome.

  • @lawrencedoliveiro9104
    @lawrencedoliveiro9104 5 years ago +2

    9:00 “Sheer speed” of IBM’s hardware!? Somewhere, Seymour Cray is going “HAHAHAHAHAHAHA!!!”

  • @Gooberpatrol66
    @Gooberpatrol66 6 years ago

    I love these stories.

  • @General12th
    @General12th 6 years ago +4

    Dr. Brailsford is awesome. +++

  • @vinicus508
    @vinicus508 2 years ago

    This man makes me so confused on first watch, and mind-blown on second watch.