I'm racking my brain trying to make sense of this ...
The book (Odom, ICND2) gives the formula for calculating cost as:
ref-bw / int-bw = OSPF cost
It's only a brief paragraph, but so far it's just not adding up for me.

He also uses the example of serial interfaces defaulting to 1544, which he says rounds down to a cost of 64??? I can't even get that to add up on the old calculator. I'm sure that once I get the math I'll eventually master it, but I'm just not seeing it right now. Any help would be much appreciated!
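
From poking around, I suspect the catch is the units: the default reference bandwidth is 100 Mbps, which is 100,000 kbps, and the serial default of 1544 is also in kbps, so 100,000 / 1544 = 64.77, and OSPF rounds that down to 64. Here's the quick Python sanity check I put together (the ospf_cost function and the numbers are just my own sketch, not anything out of the book):

# OSPF cost = reference bandwidth / interface bandwidth, both in kbps,
# rounded down to an integer with a floor of 1
def ospf_cost(ref_bw_kbps, int_bw_kbps):
    return max(1, ref_bw_kbps // int_bw_kbps)

print(ospf_cost(100_000, 1544))     # T1 serial, default ref-bw -> 64
print(ospf_cost(100_000, 10_000))   # 10 Mbps Ethernet          -> 10
print(ospf_cost(100_000, 100_000))  # FastEthernet              -> 1

Those at least match the default costs I keep seeing quoted, so maybe that's the right reading.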
The reason I ask is that the Boson practice exam has a question about this:
Which of the following statements is true about how OSPF derives the cost of an interface?
Two of the answer choices hinge on whether changing the ref-bw using
auto-cost reference-bandwidth 1000 or
auto-cost reference-bandwidth 100
would increase the previously calculated cost by a factor of 10.
Naturally I pulled out the book to reference the section, but I'm still lost on this calculation.
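
For what it's worth, here's how I'd test the factor-of-10 idea if my reading of the formula above is right (again, my own sketch, not from the book or from Boson): auto-cost reference-bandwidth 1000 bumps the reference from 100,000 kbps to 1,000,000 kbps, so every cost scales up by 10 before the rounding, while auto-cost reference-bandwidth 100 just restates the default and shouldn't change anything.

def ospf_cost(ref_bw_kbps, int_bw_kbps):
    return max(1, ref_bw_kbps // int_bw_kbps)

t1 = 1544     # serial default bandwidth in kbps
fe = 100_000  # FastEthernet bandwidth in kbps

print(ospf_cost(100_000, t1), ospf_cost(1_000_000, t1))  # 64 -> 647 with ref-bw 1000
print(ospf_cost(100_000, fe), ospf_cost(1_000_000, fe))  # 1  -> 10  with ref-bw 1000

If that's right, the FastEthernet cost goes up exactly 10x (1 to 10), but the T1 goes from 64 to 647 rather than 640 because of the rounding down, so it's only roughly a factor of 10. Is that the distinction the exam question is getting at, or am I still missing something?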