Computer Jargon - Megabyte, gigabyte and megahertz

The world of computing is, in this respect, like any other profession: it has a set of terms and words specific to itself. Jargon, the term for words specific to a particular profession or group, is confusing to those who are not familiar with it. Medical professionals, for instance, use jargon; a myocardial infarction is medical jargon for a heart attack. While jargon can seem confusing and obfuscating, it is necessary: it keeps language standard no matter who is speaking.

Those who seek a Bachelor of Information Technology must learn a specific jargon set when discussing computers, computing, hardware and software. Learning these terms is a high priority for students seeking computing degrees. The terms defined below refer specifically to computer hardware, not software. Software refers to programs installed on a computer, such as Microsoft’s Office suite or Apple’s iTunes. Hardware refers to the physical components of the computer: the monitor, case, hard drive and motherboard. Understanding these terms is vital to earning a Bachelor of Information Technology degree.

The terms megabyte and gigabyte refer to the size of a computer’s memory and hard drive. Think of the two this way: computer memory, or RAM, is like a table, and the larger the table, the more open programs a user can have running at once. A hard drive is like a closet: the larger the closet, the more one can put into it. So the larger the hard drive, the more software and files it can store.

A BYTE

To fully comprehend a megabyte, one must first understand a byte, and to understand a byte, one must understand a bit. A bit is the basic unit of binary code, the series of zeroes and ones that represents information in a computer. The computer reads information as ones and zeroes and then displays it so the user can see it. A bit is a single value, either a zero or a one; a byte is eight bits. So, if 010 is three bits of information, 01010011 is one byte of information. A kilobyte, for instance, is 1,000 bytes of information.
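
To see the bit-to-byte relationship concretely, here is a minimal Python sketch (the byte 01010011 is the example from the paragraph above; everything else is illustrative):

    bits = "01010011"      # the eight bits from the example above: one byte
    print(len(bits))       # 8 -- a byte is eight bits
    value = int(bits, 2)   # read the eight bits as a binary number
    print(value)           # 83
    print(chr(value))      # 'S' -- the same byte interpreted as a text character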

MEGABYTE

The prefix mega- comes from the SI (Système international d’unités, French for the International System of Units) and denotes ten to the power of six, so a megabyte is approximately 1,000,000 bytes of information. In other words, if you imagine a set of one million bytes, you have a megabyte. In computer jargon, a megabyte is abbreviated as MB. Many standard video cards sold today have memory measured in MB.
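
The arithmetic behind the prefix is simple enough to check directly; this short Python sketch follows from the definitions above, not from any particular hardware:

    megabyte = 10 ** 6               # mega- denotes ten to the power of six
    print(megabyte)                  # 1000000 bytes
    bits_per_byte = 8
    print(megabyte * bits_per_byte)  # 8000000 bits in one megabyte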

GIGABYTE

A gigabyte is a measure of storage one SI prefix greater than a megabyte. The prefix giga- denotes ten to the power of nine, so a gigabyte is approximately 1,000,000,000 bytes of information. Most computers sold in the U.S. today measure RAM and hard drive space in gigabytes.
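
Putting the two prefixes together, here is a short conversion sketch in the same vein (the 500 GB drive is a hypothetical figure, not taken from the article):

    gigabyte = 10 ** 9           # giga- denotes ten to the power of nine
    megabyte = 10 ** 6
    print(gigabyte // megabyte)  # 1000 -- megabytes per gigabyte
    drive_gb = 500               # hypothetical hard drive size in GB
    print(drive_gb * gigabyte)   # 500000000000 bytes of storage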

MEGAHERTZ

Computers compute; the computer’s CPU, or central processing unit, does the computing. That is, the computer takes in information, processes it and puts information out. How quickly the CPU works through this processing is its clock speed, measured in hertz, or more specifically in megahertz (MHz). One megahertz is one million cycles per second. A larger number of megahertz means a faster clock speed, and a faster clock speed means the CPU can do things, such as run programs, more quickly. In other words, a computer bought today takes less time to start than a computer bought three years ago partly because of its higher clock speed.
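
To make the cycles-per-second arithmetic concrete, here is one last sketch (the 800 MHz clock speed is an illustrative value, not a figure from the article):

    mhz = 800                          # hypothetical CPU clock speed in MHz
    cycles_per_second = mhz * 10 ** 6  # one megahertz = one million cycles per second
    print(cycles_per_second)           # 800000000 cycles each second
    nanoseconds_per_cycle = 1e9 / cycles_per_second
    print(nanoseconds_per_cycle)       # 1.25 ns per clock cycle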
