
Do I need to know hex conversion for ICND1?

dmw Member Posts: 81 ■■□□□□□□□□
I don't recall seeing it anywhere, but I've seen it in some practice tests. The practice tests cover both ICND1 and ICND2, and I was thinking/hoping hex conversion was on ICND2 only.
Rebooting computers since 1999

Comments

  • networker050184 Mod Posts: 11,962
    I don't see anything about it on the blueprint, but it's not anything difficult, so you might as well cover it to be on the safe side. If you can get subnetting down, you should have no problem converting binary to hex.
    An expert is a man who has made all the mistakes which can be made.
  • TheShadow Member Posts: 1,057 ■■■■■■□□□□
    Once you realize that hex is shorthand for binary, it's irrelevant anyway; it just keeps your eyes from crossing when reading 1s and 0s.

    0x3f674b98 is a lot easier than 111111011001110100101110011000b

    Windows shows MAC addresses and most errors in hex; Linux uses it for everything, so I suppose if you are at that point you might as well pick it up along with the other notations, since you will also see it in the routers. Turn the Windows calculator to scientific mode and practice for an hour; that should be enough.
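    A quick sketch of why the shorthand works (Python here, purely for illustration; any language with binary formatting would do). Each hex digit maps to exactly one 4-bit group:

        # Each hex digit corresponds to exactly one 4-bit group (a nibble),
        # so converting between hex and binary is a per-digit table lookup.
        value = 0x3F674B98

        bits = format(value, "032b")                        # 32-bit binary form
        nibbles = [bits[i:i + 4] for i in range(0, 32, 4)]  # split into 4-bit groups
        print(nibbles)
        # ['0011', '1111', '0110', '0111', '0100', '1011', '1001', '1000']
        #    3       f       6       7       4       b       9       8

        print(hex(value))          # 0x3f674b98
        print(format(value, "b"))  # 111111011001110100101110011000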

    Long ago this was an employment test question in the mainframe world:
    Write 100 in the four standard IT number systems.

    decimal 100
    octal 0100
    hex 0x100 or x100 or 100h
    binary 100b

    Which led to the joke: there are 10 kinds of people, those who understand binary and those who don't. The joke, of course, is syntactically incorrect unless you place it only in the context of binary (it should be 10b).
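    To make the notation point concrete, here is a sketch in Python (chosen only because it accepts similar base markers; note it spells octal as 0o100 rather than a bare leading zero, and uses a 0b prefix instead of a trailing b). The same three digits name four different quantities:

        print(100)    # decimal: 100
        print(0o100)  # octal:    64
        print(0x100)  # hex:     256
        print(0b100)  # binary:    4

        # The joke's "10 kinds of people" only works read as binary:
        print(int("10", 2))  # 2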

    Just remember that 4 bits can represent 16 values, 0 to 15. Anything above 9 was a stroke digit: mainframes using gas-filled Nixie displays placed a slash bar over 0-5 to represent 10-15. Realizing this was a pain to type, IBM, Burroughs and the rest of the seven dwarfs switched to A-F to represent 10-15, giving the correct printed hex-decade notation, or hexadecimal (6 + 10). Originally it was sexadecimal, but that was back when Ricky and Lucy were sleeping in twin beds with a nightstand between them. Change was demanded and the industry quietly complied.
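    For illustration, here are the sixteen nibble values and the A-F digits that replaced the stroke digits (a throwaway Python loop, just to show the mapping):

        # Every 4-bit value 0-15 prints as a single hex digit; 10-15 become a-f.
        for n in range(16):
            print(f"{n:2d} -> {n:x}")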
    Who knows what evil lurks in the heart of technology?... The Shadow DO
  • tiersten Member Posts: 4,505
    TheShadow wrote: »
    Realizing this was a pain to type, IBM, Burroughs and the rest of the seven dwarfs switched to A-F to represent 10-15, giving the correct printed hex-decade notation, or hexadecimal (6 + 10).
    Citation for that?
  • TheShadow Member Posts: 1,057 ■■■■■■□□□□
    tiersten wrote: »
    Citation for that?
    Moldy paper by now; this was pre-public-Internet, my friend. I don't normally admit to my age, but would you believe I was there? I worked for Burroughs (now Unisys) in the '70s and '80s as a hardware design engineer.

    Those ancient tidbits were passed on as part of the newbie training classes. I actually worked on some of the first chip RAM implementations, using 1103 1K×1-bit DRAMs in the B2700, B3700 and B6700 mainframes.

    My very first assigned task was fixing a bug in an ASCII-to-EBCDIC translator in a piece of halfway logic called DIDDLEDIC, where I learned more than I ever wanted to know about hex and Boolean algebra in the pre-full-digital-schematic world. I made my bones and advanced.

    The stroke digits were called undigits by Burroughs, and that caught on like the Uncola 7 Up commercials.

    I also still have several new Nixie tubes, 1103 DRAMs and a core memory plane in the garage, if that helps the veracity of my statements.
    Who knows what evil lurks in the heart of technology?... The Shadow DO
  • tiersten Member Posts: 4,505
    TheShadow wrote: »
    Moldy paper by now; this was pre-public-Internet, my friend. I don't normally admit to my age, but would you believe I was there? I worked for Burroughs (now Unisys) in the '70s and '80s as a hardware design engineer.

    Those ancient tidbits were passed on as part of the newbie training classes. I actually worked on some of the first chip RAM implementations, using 1103 1K×1-bit DRAMs in the B2700, B3700 and B6700 mainframes.
    Interesting! I was just curious because I couldn't find where hexadecimal actually first started to be used. It's remarkably hard to search for.
    TheShadow wrote: »
    My very first assigned task was fixing a bug in an ASCII-to-EBCDIC translator in a piece of halfway logic called DIDDLEDIC, where I learned more than I ever wanted to know about hex and Boolean algebra in the pre-full-digital-schematic world. I made my bones and advanced.
    Eww, EBCDIC. I've had more than my share of "fun" with EBCDIC, but that was with the more recent AS/400 models.
    TheShadow wrote: »
    I also still have several new Nixie tubes, 1103 DRAMs and a core memory plane in the garage, if that helps the veracity of my statements.
    Nixie tube clocks seem to be in fashion yet again, so they might be fairly valuable :)