Everything out there in this universe is complex, but it is presented to us in a simple way, with all the inner complexities hidden through a popular philosophy known as abstraction. For example, God can be considered an abstraction for the unsolved mysteries of science. "I believe in God". Computers were made, sorry, evolved to this stage, to cover the complexity of applications through one of the pillars of abstraction: speed. Normally, when two different applications are run, the one that runs faster is considered simple and the other complex to the naked eye. But what about the underlying hardware on which the two applications run? A simple application A running on an old Pentium will crawl, whereas a comparatively complex application B running on an Intel i7 will get its task done in the blink of an eye, given the same input to both. So if the hardware is hidden from your eyes, will you say "B is simpler than A"? :-) This is what I mean by abstracting the complexity of an application using speed. So whatever the application is, the actual "computer", which is the processor, decides how fast the application runs.
The need for applications that help us in our day-to-day activities keeps increasing. The use of both soft real-time and hard real-time applications in our daily life is inevitable. These applications also have to be "optimized" enough to speed them up, so that we don't get annoyed by the slow, hanging processes that once did a constant check on our temper. By the word optimized I mean both hardware and software optimizations. Simple applications and programs can be optimized easily; their performance can be boosted enough through simple software optimization alone, for example by 'using Quick sort instead of Bogo sort'.
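To make the 'Quick sort instead of Bogo sort' point concrete, here is a minimal Python sketch of the idea (a toy example of mine, not from any real HPC code): both functions sort the same small list, and the timing printout shows how a better algorithm alone is a huge software optimization.

```python
import random
import time


def bogo_sort(items):
    """Shuffle until the list happens to be sorted -- factorial expected time."""
    while any(items[i] > items[i + 1] for i in range(len(items) - 1)):
        random.shuffle(items)
    return items


def quick_sort(items):
    """Simple recursive quicksort -- O(n log n) on average."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)


if __name__ == "__main__":
    # Even a tiny 9-element list makes the gap obvious: quicksort finishes
    # instantly, while bogosort can churn through millions of shuffles.
    data = [random.randint(0, 100) for _ in range(9)]

    start = time.perf_counter()
    quick_sort(list(data))
    print("Quick sort: %.6f s" % (time.perf_counter() - start))

    start = time.perf_counter()
    bogo_sort(list(data))
    print("Bogo sort:  %.6f s" % (time.perf_counter() - start))
```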
But what if the application is really very complex and runs for days together even with maximum linear (discussed later) software optimization? Scientific applications such as weather forecasting, fluid dynamics simulations, crash modeling of a Mercedes, or the calculations that justify the presence of water on Mars exhibit exactly this scenario. Software optimizations alone are not enough to make these kinds of applications run faster, so we go for a specialized computer to run them, known as a supercomputer. This is the main problem domain that 'High Performance Computing' or 'High Throughput Computing' (there are technical differences between HPC and HTC; we will discuss them soon) actually deals with. Thus HPC is the backbone of the Darwinism of computers. What was once considered a supercomputer is now right before you: yes, you are reading this article with the help of a 1950s supercomputer. Computers evolve through HPC. Don't worry, today's supercomputers will be in the hands of your grandchildren :-). HPC is the mantra behind the trailing number of 'O's in your Google search result. To put it simply, HPC is the field of designing complex hardware and software systems for crunching large amounts of numbers and algorithms, along with rigorous optimizations.
Where is HPC used? Everywhere would be the certain answer. Yes, as I said above, when someone searches for the word 'hair' on Google or Bing :-), he or she uses HPC. Bio sciences, the military, energy, government labs, weather forecasting, EDA, and so on. With enough of an outline of what HPC is, we can delve into the next level of HPC from the next post, where we will look at the world's first supercomputer. If you have any doubts or queries, please feel free to leave a comment.