If you are entering the world of computing and looking at processors to buy, you will have come across GHz, or Gigahertz, many times. They are one and the same thing, and no, it is not a food seasoning: it is a unit used very frequently in computer science and engineering.
So the least we can do at this point is explain what this unit measures and why it is so widely used today. Perhaps afterwards you will be clearer about many things you come across every day in the world of electronics.
What is a GHz or Gigahertz
GHz is the abbreviation of a unit used in electronics called Gigahertz. It is not really a base unit, but a multiple of the Hertz: specifically, we are talking about 10⁹ Hertz, that is, one billion Hertz.
So what we really have to define is the Hertz, the base unit from which the Kilohertz (kHz), Megahertz (MHz) and Gigahertz (GHz) derive. The unit is named after Heinrich Rudolf Hertz, from whose surname its name comes.
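As a quick sketch, these multiples are just powers of ten; we can write them out in a few lines of Python (the constant names are ours, purely for illustration):

```python
# Multiples of the Hertz, as powers of ten
HERTZ = 1
KILOHERTZ = 10**3   # kHz: one thousand Hertz
MEGAHERTZ = 10**6   # MHz: one million Hertz
GIGAHERTZ = 10**9   # GHz: one billion Hertz

# A 4.2 GHz processor clock expressed in plain Hertz
print(4.2 * GIGAHERTZ)  # 4200000000.0
```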
He was a German physicist who discovered how electromagnetic waves propagate in space. So this unit really comes from the world of waves, not purely from computer science.
One Hertz represents one cycle per second; in fact, it was not until around 1970 that "Hertz" fully replaced "cycles per second" in common use. In case you don't know, a cycle is simply the repetition of an event per unit of time, which in this case is the oscillation of a wave.
So a Hertz measures the number of times a wave repeats per unit of time, whether the wave is sound or electromagnetic. And this also extends to the vibrations of solids or to ocean waves.
If we blow across a sheet of paper parallel to its surface, we will notice that it begins to undulate, repeating the pattern every so often: every few seconds, or thousandths of a second if we blow hard.
The same happens with waves. We call this magnitude frequency (f), and it is the inverse of the period (T), which is measured in seconds (s).
Putting it all together, we can define a Hertz as one oscillation of a particle (of a wave, of paper, of water...) in a period of one second.
Here we can see the shape of a wave and how it repeats over a period. In the first image we have 1 Hz, because in one second it undergoes only one oscillation.
And in the second image, in a single second it oscillates 5 full times. Imagine then how many oscillations 5 GHz would be.
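The frequency/period relation just described can be sketched in a couple of lines of Python (a toy illustration, not from the original article):

```python
def frequency_hz(period_seconds: float) -> float:
    """Frequency f is the inverse of the period T: f = 1 / T."""
    return 1.0 / period_seconds

# One full oscillation per second -> 1 Hz
print(frequency_hz(1.0))   # 1.0
# Five oscillations fit in one second when each takes 0.2 s -> 5 Hz
print(frequency_hz(0.2))   # 5.0
```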
The GHz in computing
Now that we really know what a Hertz is and where it comes from, it’s time to apply it to computing.
The Hertz measures the operating frequency of an electronic chip; for us, the best-known example is the processor.
So, transferring the definition to it, one Hertz corresponds to one clock cycle per second, and in a simplified model, one operation the processor can do per second. This is, roughly, how the speed of a processor is measured.
The processor of a computer (and of other electronic devices) is a chip responsible for performing the operations sent from main memory in the form of instruction code generated by programs.
Each program is subdivided into tasks or processes, and these in turn into instructions, which are executed one by one by the processor.
The more Hertz a processor has, the more operations or instructions it can carry out in one second. This frequency is also commonly called "clock speed", since the whole system is synchronized by a clock signal so that each cycle lasts the same time and information transfers stay in step.
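As a rough sketch of what a clock speed means in numbers (assuming the simplified one-operation-per-cycle model above):

```python
def cycles_per_second(clock_ghz: float) -> float:
    """Clock cycles per second for a given clock speed in GHz."""
    return clock_ghz * 1e9

def cycle_time_seconds(clock_ghz: float) -> float:
    """Duration of one cycle: the inverse of the frequency."""
    return 1.0 / cycles_per_second(clock_ghz)

# A 3 GHz processor completes three billion cycles per second...
print(cycles_per_second(3.0))     # 3000000000.0
# ...so each cycle lasts about a third of a nanosecond
print(cycle_time_seconds(3.0))
```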
The evolution of GHz
Gigahertz have not always been everywhere; in fact, almost 50 years ago engineers only dreamed of ever describing the frequency of their processors with this unit.
The beginning was not bad at all: the first microprocessor implemented on a single chip was the Intel 4004, a small cockroach-shaped package launched in 1971 that revolutionized the market after those mammoth computers based on vacuum tubes, which did not even have RGB lighting.
Exactly, there was a time when RGB did not exist, imagine. The fact is that this chip was capable of processing 4-bit words at a frequency of 740 kHz; not bad, by the way.
In 1978, after a few intermediate models, the Intel 8086 arrived: a no less than 16-bit processor running at 5 to 10 MHz, and still shaped like a cockroach.
It was the first processor to implement the x86 architecture, which our processors still use today. Incredible. This architecture was so good at handling instructions that it marked a before and after in computing.
There have also been other architectures, such as the IBM Power9 for servers, but the vast majority of personal computers continue to use x86.
The DEC Alpha, a RISC design, was among the first chips demonstrated breaking the 1 GHz barrier; in the consumer market, AMD's Athlon (launched in 1999) reached 1 GHz in March 2000, just days ahead of Intel's Pentium III.
What is a MHz or Megahertz
(MegaHertZ) One million cycles per second. MHz is used to measure the transfer speed of electronic circuits, including chips, buses and the computer's internal clock.
A one-megahertz clock (1 MHz) means that some number of bits (1, 4, 8, 16, 32 or 64) can be manipulated at least one million times per second.
A two-gigahertz clock (2 GHz) means at least two billion times. The "at least" is because multiple operations often take place in a single clock cycle.
Both megahertz (MHz) and gigahertz (GHz) are used to measure CPU speed. For example, in raw clock rate a 1.6 GHz computer manipulates data internally (calculates, compares, copies) twice as fast as an 800 MHz machine.
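The clock-rate comparison above can be checked with a tiny calculation (raw clock ratio only; as the next section explains, real-world speed depends on much more):

```python
def clock_ratio(mhz_a: float, mhz_b: float) -> float:
    """How many times faster machine A's clock ticks than machine B's."""
    return mhz_a / mhz_b

# 1.6 GHz = 1600 MHz, versus an 800 MHz machine
print(clock_ratio(1600, 800))  # 2.0
```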
Why Isn’t It Faster?
Internal cache and CPU architecture, plus the speed of the RAM, storage and network, all contribute to the computer's actual performance and overall throughput.
Users are often dismayed to find only incremental gains after buying a so-called "faster" computer.
In addition, newer versions of software are sometimes slower than previous editions, and a faster computer is often required just to maintain the same performance level as with the older software. See instructions per second and Hertz.
MHz and GHz Are the Heartbeat
When referring to CPU speed, MHz and GHz measure the raw, steady pulses that drive the circuits in a chip.
Megahertz (MHz) and gigahertz (GHz) express the CPU's clock speed, and the number of bits (8, 16, etc.) is the width of the CPU's registers. The combination of speed and width determines the inherent processing performance of the CPU chip. Parallel channels from the CPU to external devices are also measured by speed and width; serial channels, however, are rated by speed alone.
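As a naive illustration of "speed times width" (an upper bound that ignores caches, instruction mix and everything else discussed above; the 4004 figures come from earlier in the article, the 3 GHz chip is hypothetical):

```python
def raw_bits_per_second(clock_hz: float, register_bits: int) -> float:
    """Naive throughput bound: clock rate multiplied by register width."""
    return clock_hz * register_bits

# Intel 4004: 4-bit registers at 740 kHz
print(raw_bits_per_second(740_000, 4))   # 2960000.0
# A hypothetical modern 64-bit chip at 3 GHz
print(raw_bits_per_second(3e9, 64))      # 192000000000.0
```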