
What Is a Gigabit?

By R. Kayne
Updated: May 16, 2024

A gigabit is a unit of measurement used in computing, equal to one billion bits of data. A bit is the smallest unit of data, and it takes eight bits to store a single character of text; these 8-bit units are known as bytes. The difference between a gigabit and a gigabyte, then, is that the latter is eight times greater, or eight billion bits.
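To make the arithmetic concrete, here is a short Python sketch using the figures from the paragraph above:

```python
BITS_PER_BYTE = 8

gigabit = 10 ** 9                    # one billion bits
gigabyte = BITS_PER_BYTE * gigabit   # eight billion bits

# A gigabyte holds eight times as many bits as a gigabit.
print(gigabyte // gigabit)  # 8
```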

Storage capacity is normally indicated in bytes rather than bits. You are unlikely to hear someone describe a 200-gigabyte drive as a 1,600-gigabit drive. Instead, bits are typically used to describe data transfer rates (DTRs), or how fast bits of information can move between devices, such as modems or FireWire and Universal Serial Bus (USB) ports.

Two numerical systems are used with computers: the decimal system and the binary system. The decimal system counts a kilo as 1,000, while the binary system counts it as 1,024. The binary value arises because computers address data in powers of two, and 2^10 = 1,024 is the power of two closest to 1,000. For simplicity, and when referring to data transfer speeds, the more typical designation is the decimal system, as follows:

  • 1000 bits = 1 kilobit
  • 1000 kilobits = 1 megabit
  • 1000 megabits = 1 gigabit

  • 1000 bytes = 1 kilobyte
  • 1000 kilobytes = 1 megabyte
  • 1000 megabytes = 1 gigabyte
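The decimal ladder above can be expressed directly in code. This brief Python sketch defines the prefixes and a hypothetical helper for converting a raw bit count to gigabits:

```python
# Decimal (SI) prefixes: each step up is a factor of 1,000.
KILO = 1000
MEGA = KILO * 1000
GIGA = MEGA * 1000

def bits_to_gigabits(bits):
    """Convert a raw bit count to gigabits (decimal system)."""
    return bits / GIGA

print(bits_to_gigabits(10 ** 9))  # 1.0
```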

As an aside, the binary system based on multiples of 1,024 also uses different terminology: the kilobit becomes the kibibit; the megabit, the mebibit; and the gigabit, the gibibit.
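The binary prefixes diverge from their decimal counterparts a little more at each step, as a quick Python comparison shows:

```python
KIBIBIT = 1024        # binary "kilo": 2 ** 10
MEBIBIT = 1024 ** 2   # binary "mega"
GIBIBIT = 1024 ** 3   # binary "giga"

GIGABIT = 10 ** 9     # decimal "giga"

# A gibibit is roughly 7.4% larger than a gigabit.
print(GIBIBIT / GIGABIT)  # 1.073741824
```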

Getting back to the more familiar decimal designations, abbreviations can often cause confusion. For example, an Internet provider might advertise speeds of 1,500 kbps, while a potential customer might assume the abbreviation refers to kilobytes. Typically, measurements in bytes use a capital "B," as in "kBps" or "KBps," while an all-lowercase abbreviation such as "kbps" refers to bits. Kilobit, megabit, and gigabit may also be abbreviated as Kbit, Mbit, and Gbit.
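The practical consequence of the bits-versus-bytes confusion is easy to quantify. A minimal Python sketch, using the article's 1,500 kbps example:

```python
def kilobits_to_kilobytes_per_sec(kbps):
    """An advertised rate in kilobits per second, as kilobytes per second."""
    return kbps / 8

# A "1500 kbps" connection moves only 187.5 kilobytes each second --
# an eighth of what a customer reading "kilobytes" might expect.
print(kilobits_to_kilobytes_per_sec(1500))  # 187.5
```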

The gigabit is not often used to describe data transmission rates, as most consumer devices push information at slower kilobit and megabit speeds. A notable exception is fiber optic cable. In a press release dated 19 November 2007, Verizon announced it had successfully transmitted a video broadcast along fiber optic cable at a whopping 100 gigabits per second (Gbps). For comparison, typical Ethernet networks of the era had a maximum throughput of 100 megabits per second (Mbps). As of winter 2007, Verizon was installing fiber optic service (FiOS) across the U.S. to provide television, digital phone, and Internet services.
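To put the 100 Gbps figure in perspective, this hypothetical Python sketch compares transfer times for a one-gigabyte file (the file size is an illustrative assumption, not from the article):

```python
FILE_SIZE_BITS = 8 * 10 ** 9   # a one-gigabyte file, expressed in bits

def transfer_seconds(bits, bits_per_second):
    """Idealized transfer time, ignoring protocol overhead."""
    return bits / bits_per_second

print(transfer_seconds(FILE_SIZE_BITS, 100 * 10 ** 6))  # 80.0 seconds at 100 Mbps
print(transfer_seconds(FILE_SIZE_BITS, 100 * 10 ** 9))  # 0.08 seconds at 100 Gbps
```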

As technology advances and data transmission rates increase, the average computer user will no doubt become familiar with the gigabit. Until then, most of us will remain locked into slower kilobit and megabit speeds, looking forward to jumping that next hurdle.

Discussion Comments
By anon317087 — On Jan 31, 2013

I've only seen "terabyte" with one "r", "tera-" meaning 1 trillion (American numbering system).

By anon54025 — On Nov 26, 2009

K=Kelvin

k=kilo

m=milli

M=Mega

b=bit

B=Byte

mb=millibit

Mb=Megabit

mB=milliByte

MB=MegaByte

Gb=Gigabit

GB=Gigabyte

By anon19650 — On Oct 16, 2008

Yes. 1,000 Gigabytes = 1 Terrabyte (decimal system)

or 1,024 Gigabytes = 1 Terrabyte (binary system)

By anon19615 — On Oct 15, 2008

i think 1000 gigabytes(gb)= 1 terrabyte(tb)

i think

https://www.easytechjunkie.com/what-a-gigabit.htm