How does a Search Engine work?

You’ve been using search engines for a long time now.

You probably used one to find this article. But do you know how they work? This post will give you a basic understanding of how search engines work to find all the information you wouldn’t be able to find on your own.

Developing a search engine is complicated and demands an enormous amount of resources. Long before Google started hosting vast amounts of data in Gmail and Google Drive, the company needed to invest heavily in infrastructure just to sustain its initial search service. Suffice it to say, most people cannot design and maintain their own web search engines, and there is a reason only a few key players dominate the arena. Yet regardless of which search provider you turn to, the process for how search engines work is largely the same.

1. Crawling

First, developers create Internet bots capable of reading hyperlinks and HTML. These web crawlers browse the Internet looking for websites to index. They are provided with a list of seed URLs to start with, and the bot scans all of the hyperlinks on each of these pages and adds them to the list of URLs to visit next. Crawlers can only download a certain number of pages within a given time period, which is why search results sometimes point to websites that have since been deleted or removed. The websites these web crawlers visit are indexed according to the information the bots can gather by reading each site’s HTML.
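
The sketch below is a minimal, purely illustrative Python version of this idea; real crawlers respect robots.txt, politeness delays, duplicate detection, and billions of pages, none of which is modeled here.

# A toy crawler: fetch a page, collect its hyperlinks, and queue them for later visits.
from urllib.request import urlopen
from urllib.parse import urljoin
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, max_pages=10):
    to_visit, seen, pages = list(seed_urls), set(), {}
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue                       # unreachable or deleted pages are simply skipped
        pages[url] = html                  # hand the page text to the indexer
        collector = LinkCollector()
        collector.feed(html)
        to_visit.extend(urljoin(url, link) for link in collector.links)
    return pages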

2. Indexing

Indexing speeds up the process of retrieving relevant webpages. Searching an index is a substantially less intensive and time-consuming task than searching all of the possible websites. Imagine having to send out new web crawlers to discover the answers to every search query. It would be akin to sending explorers out into the wilderness blind rather than using a map.

Indexes must be stored, and considerable time and effort goes into keeping indexes up-to-date, but the trade-off in speed makes the effort worth it.

3. Searching

Searching is the most visible step of the process. When you type a term into Google, Bing, Yahoo, DuckDuckGo, or any number of search engines, they search their respective indexes for the best results. This information is then displayed on search engine results pages (SERPs).
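
The following is a heavily simplified Python sketch of the indexing and searching steps. The whitespace tokenizer and the "rank by intersection" lookup are stand-ins for the far more sophisticated text analysis and ranking that real engines use.

# Build an inverted index: each word maps to the set of pages that contain it.
def build_index(pages):                     # pages: {url: page text}
    index = {}
    for url, text in pages.items():
        for word in text.lower().split():
            index.setdefault(word, set()).add(url)
    return index

# Answer a query by intersecting the posting sets of its terms.
def search(index, query):
    results = None
    for term in query.lower().split():
        postings = index.get(term, set())
        results = postings if results is None else results & postings
    return sorted(results or [])

pages = {"a.html": "green LED light", "b.html": "incandescent light bulb"}
index = build_index(pages)
print(search(index, "light bulb"))          # ['b.html']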

4. Making Money

Since search engines are free, how were some able to grow into such massive companies? Like much of the Internet, search engines make money by selling ad space. Some advertisers pay to have their products or services ranked higher in search results. Some engines, such as Google, run search-related ads alongside their general search results. As search engines became the most popular way of finding information on the internet, they also became the most popular way of advertising online. Potential consumers may not read the same websites, but they do turn to the same search engines.

Final Thoughts

The process is immensely complex, and there is a great demand out there for people who understand the intricacies of how search engines work. This guide just scratches the surface, but there’s a wealth of information out there. Thanks to search engines, finding that information is easier than ever.

LEDs: The Beginning of the End for Traditional Light Bulbs

I’ve been fascinated by the light emitted by an LED, and that fascination is what turned my work toward electrical materials.

In the beginning, there was darkness.

Then came fire.

It wasn’t until the 19th century that artificial electric light was first generated. The big leap came in the 1880s, when Thomas Edison lit homes with the incandescent bulb. For the next 130 years, incandescents ruled the nights, the roads, and especially our homes.

But now the incandescent light bulb, one of the most venerable inventions of its era, is deemed too inefficient for our own.

The stage has been set for the imminent death of the incandescent light bulb, and the rest of the world is following suit. Many stores across the world have already stopped stocking the good old bulbs.

The days of the traditional incandescent bulb look numbered because these electricity-sapping glass orbs have fallen out of favor with environmentally conscious governments and consumers.
Moving to more efficient lighting is one of the lowest-cost ways to reduce electricity use and greenhouse gases. In fact, it will save households money through lower power bills. Ninety percent of the energy that an incandescent light bulb burns is wasted as heat. What a waste!

[Image: a 5 mm green LED]

And waiting in the wings is a new breed of hi-tech light based on the humble LED (light-emitting diode), the small lights found in everything from TV remote controls to bike lights. Not only do they promise to solve the bulb’s environmental woes, their proponents say they will also respond intelligently to your surroundings and even influence the way we behave.

Already, the efficiency and long life of LEDs have made them popular.

Of course, the death warrant for the incandescent bulb has been signed before.

These tiny lights were invented by GE in the early 1960s and were initially only available in red, a property that defined the look of early pocket calculators and digital watches. Over the years, however, more colors have appeared.

People still use vacuum tubes for some applications, and similarly incandescent bulbs may never go away completely. But it is not a question of if, but of when LED lighting will be the norm throughout the world.

Will you ever use incandescent light bulbs again?

How does electricity turn into software?

This is a great question. I asked it myself when I used to play Dave on the PC as a 10-year-old kid, and I went on a long college journey trying to find an answer.

It’s really complex, but let me dumb it down a bit (actually a lot).

Let us start right from the bottom:

Matter is composed of atoms, atoms have electrons, and the flow of these electrons is what we call electricity.

Now, to make use of these electrons, the processor and the memory have transistors which can store and release electricity as needed. Values are stored in units of 1 (5 volts) and 0 (0 volts).

An 8-bit number is then represented with 8 transistors. So the 8-bit representation of the number 3 will be 0000 0011.

How is that achieved in hardware? Keep 8 transistors side by side (grouped into registers and memory units). Make the first 6 transistors hold 0 V and the next 2 transistors hold 5 V.
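
If you want to see those bit patterns without any hardware, a couple of lines of Python will print them (purely illustrative):

# Print the 8-bit binary pattern of a few small numbers.
for n in (3, 5, 255):
    print(n, "->", format(n, "08b"))   # 3 -> 00000011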

Now imagine billions of these transistors in today’s PC.

Now, an organization of such registers and memory units makes up the CPU and RAM.

To make it easy to compute using the CPU, machine code was developed. This language is what essentially runs on the CPU. What do I mean by “run”? It means: keep flipping bits. If I want to perform 2+3, in machine code I would store 2 in one register and 3 in another register. Then I would take these values to an adder unit, which would do a mathematical add and give me the result in another register. This is what sample machine code might look like:

80 02 F3
80 03 F4
88 F3 F4 F5

Obviously, no one understood anything with this. So we came up with an ingenious system to make it human-readable. This is called assembly language. The following piece of code represents the machine code above:

MOV 2, REG A
MOV 3, REG B
ADD REG A, REG B, REG C (add A and B and store in C)

where MOV = 80
REG A = F3
REG B = F4
REG C = F5
ADD = 88

Voila, the first coding language!
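
As a toy illustration, the few lines of Python below apply exactly the mapping above and turn the three mnemonic lines back into the hex bytes shown earlier (the opcodes are, of course, just as made-up here as they are in this post):

# Translate the toy assembly into the toy machine code using the opcode table above.
OPCODES = {"MOV": "80", "ADD": "88",
           "REG A": "F3", "REG B": "F4", "REG C": "F5",
           "2": "02", "3": "03"}

def assemble(line):
    mnemonic, operands = line.split(" ", 1)
    parts = [mnemonic] + [op.strip() for op in operands.split(",")]
    return " ".join(OPCODES[p] for p in parts)

for line in ["MOV 2, REG A", "MOV 3, REG B", "ADD REG A, REG B, REG C"]:
    print(assemble(line))
# prints: 80 02 F3 / 80 03 F4 / 88 F3 F4 F5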

Now, assembly is too hard for humans to remember and code properly in. So compilers were developed that convert a high-level language like C into assembly language.

So, a C representation of the above mentioned assembly would be:
{
int a = 2, b = 3;
int c = a + b;
}

Just as people can write poems in English but not in sign language, a more expressive language was required in which people could write better programs. Those programs are then compiled to assembly, which flips bits in registers, which in turn affects transistors, which affect the flow of electrons.

With this newfound expressiveness, people wrote operating systems to maximize hardware usage.

Everything from your keyboard input to mouse to desktop to windows to sound is a program written in such expressive languages, running on top of the OS.

Let’s walk the other way now, from software down to electrons.

When you type Google into the browser and hit the Enter key, an HTTP request is sent from your browser (the client) to Google (the server).

In your own computer, the browser is a program written in C/C++

This gets compiled to assembly (actually, the browser is already compiled; you’re just giving inputs to the compiled browser).

The operating system (Windows, Linux, etc.) and device drivers are all already compiled to assembly and are running on your machine.

When the browser’s assembly gets its turn to run on the CPU, it runs that assembly code.

This assembly code flips bits in registers and memory.

The registers and memory are composed of transistors

Transistors control the flow of electrons and hence electricity.

I’ve oversimplified things. I did not even talk about caches, coherency, consistency in multiprocessor systems, schedulers, micro-architectures, register files, bridges, the GPU, how the display works, how the BIOS works, what init is, or what it even means to say something is a “program”. Nor did I talk about state machines, ALUs, pipelines, power supplies, how current is measured, clocks, system ticks, HDLs, control logic, digital circuits like mux/demux, decoders, crypto, security firewalls, etc.

There are tons of other things happening, but mostly it’s different software programs interacting with each other.

Computers are man-made miracles of the highest order. No single person could have thought of all this. It has taken more than 50 years and millions of smart people to get to this point.

I have not touched even 1% of the actual detail. So, what are you thinking?

P.S. – The post has some simplifications and inaccuracies. For example, it is not strictly true that one transistor represents one bit; in practice a flip-flop, itself built from several transistors, does that job, and other memory circuits do the same.

Why does a 250 GB hard disk have less than 233 GB?


This was a question one of my friends asked me after buying a new 250 GB hard disk. I had never really asked that question myself, so I couldn’t answer it instantly. After a few days of searching for information I realized the truth. So, here I’m going to lay out the facts the way I told my friend.

A hard disk manufacturer’s way of counting space is different from the way software counts it. To a hard disk manufacturer, 1 GB is 1000 MB, 1 MB is 1000 KB, and so on.
Software, on the other hand, counts space in powers of 2 (2^1, 2^2, 2^3, and so on), so a KB is 2^10 bytes, which is 1024 bytes.

Let’s do a little math.
To the manufacturer, a 250 GB hard disk is:
250 x 1000 x 1000 x 1000 = 250000000000 Bytes

From the computer’s point of view, 250000000000 bytes is:
250000000000/(1024*1024*1024) = 232.83 GB

Now you can see why your 250 GB hard disk shows only about 232.83 GB of space in total. And the bigger the drive (500 GB and up), the bigger the apparent shortfall.
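
The same arithmetic in a few lines of Python, in case you want to check other drive sizes (a simple sketch):

# Convert a marketing "GB" (powers of 10) into the "GB" your software reports (powers of 2).
def marketed_gb_to_reported_gb(gb):
    bytes_total = gb * 1000**3          # manufacturer: 1 GB = 10^9 bytes
    return bytes_total / 1024**3        # software:     1 GB = 2^30 bytes

for size in (250, 500, 1000):
    print(size, "GB on the box ->", round(marketed_gb_to_reported_gb(size), 2), "GB reported")
# 250 GB on the box -> 232.83 GB reported
# 500 GB on the box -> 465.66 GB reported
# 1000 GB on the box -> 931.32 GB reported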

The bottom line is that you need to think the way the computer thinks.

So, don’t be mad at the Manufacturer for the missing space.

Enjoy Disking !!!

VLC Mayhem!

Installing VLC Media Player could void your warranty!

Yes, you read that right.

Take this as a warning: if you are using VLC player and have the volume set to maximum for a prolonged period of time, it could damage the laptop speakers, and replacing the speakers will not be covered by the hardware warranty. So if you do use VLC, keep the volume down.

Be aware that you can be denied warranty service on the speakers just by having the player, whether you use it or not. If you contact Tech Support about a speaker problem and they do a scan and find VLC then you could be denied service.

VLC has a feature that can make the audio sound louder than it does with other players. VLC achieves this by using a process that creates hard clipping. Clipping has always been known to be dangerous to small speakers and the speakers in a laptop are tiny. Dell has tested VLC and verified that the speakers can be damaged after several hours of using VLC.

You would think that companies would have a buffer to keep that from happening. They can dim a screen to protect your monitor, but no way to keep a speaker from blowing?!

Not their fault. I am not throwing blame or anything. But that kind of stinks.

They probably don’t literally blow. My guess is that they burn out or fail mechanically from the heat produced by an unnaturally high average amplitude.

I played some short samples of a pop tune with VLC Player and simultaneously recorded them into a graphic audio editor, and made some screen shots of the amplitude waveforms. These amplitude graphs represent the relative energy of the audio signal. The highest peaks represent the greatest amplitude, which increases as the waveform moves away from the center crossing line in either up or down direction. The center “zero crossing” line represents silence.

Example 1 shows VLC playing the tune normally with its volume set at 100%.

[Screenshot: example 1 waveform]

Example 2 is VLC Player playing the tune with the volume set to 200%.

[Screenshot: example 2 waveform]

VLC can’t make the laptop’s amplifier produce more wattage than it was made to produce, so where does the increased volume come from?

The program seems to increase the signal’s amplitude beyond the 0 dB limit — super-saturation. That would definitely increase the volume, but the problem is that signals over 0 dB create very harsh digital distortion — basically loud noise. To fix that, everything in the signal over 0 dB is simply chopped off, or clipped.

It is called “clipping” because of the way the graph looks after parts of the signal have been hacked off. It is a form of distortion because the waveform has been ruined by hacking off parts of it. If you play a song with VLC Player set to 200% volume you can easily hear the distortion in the audio signal.
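
You can reproduce the effect numerically without any audio gear. The sketch below, using NumPy purely for illustration, doubles the amplitude of a sine wave and chops off everything past full scale, which is exactly the peaks-into-plateaus flattening described here.

# Doubling the gain and clipping at full scale turns sine peaks into plateaus (hard clipping).
import numpy as np

t = np.linspace(0, 1, 1000, endpoint=False)
signal = np.sin(2 * np.pi * 5 * t)          # a clean test tone, peaks at +/-1.0 (0 dB)
boosted = 2.0 * signal                      # "200% volume": peaks would reach +/-2.0
clipped = np.clip(boosted, -1.0, 1.0)       # everything past full scale is chopped off

print("share of samples stuck at full scale:", np.mean(np.abs(clipped) >= 1.0))
print("average amplitude before:", round(float(np.mean(np.abs(signal))), 3),
      "after:", round(float(np.mean(np.abs(clipped))), 3))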

Here are very extreme close-ups of examples 1 & 2. The waveform in example 1 resembles a sine wave more or less.

[Screenshot: close-up of example 1]

Example 2 is the same slice of music but with VLC Player’s volume set to 200%. The maximum amplitude is the same as in example 1, but now there is much more of the signal near maximum so the average amplitude is much greater here.

In example 1, the waveform was forming peaks near the maximum. In example 2 the waveform is creating plateaus near the maximum, which means that it is now starting to look more like a square wave — typical of hard clipping.

[Screenshot: close-up of example 2]

The makers of VLC Player say there is nothing about the player that can damage laptop speakers, and for all I know they may be right or they may be wrong. However, these graphs demonstrate that the player creates hard clipping when the volume level is set to 200% and dramatically increases the average amplitude.

The bottom line is that there is a whole lot of extra energy that the speakers have to convert into mechanical energy and heat without breaking down.

Wikipedia has an explanation of why hard clipping can damage speakers in its article on audio clipping. It also mentions another problem with hard clipping that can destroy speakers.

With this visual aid, you should now be convinced of the nasty things that VLC Player can do to your laptop speakers.

So, will you ever use VLC media player again?

The Google Story

Let me start off with a few questions.
Who are the Google Guys?
Why did they start Google?
The answers to these and other compelling questions make up the content of this eminently readable, highly entertaining account of the birth and phenomenal growth of one of today’s leading technology companies: GOOGLE.

In 1998, after reluctantly dropping out of the doctoral program at Stanford University, Sergey Brin and Larry Page founded Google on some very basic principles that remain at the heart of the company’s success today.

To quote, “Google’s transcendent and seemingly human qualities give it special appeal to an amazingly wide range of computer users, from experts to novices, who trust the brand that has become an extension of their brains.”

Google is so innately “human” because the programmers behind its functionality have remained true to the founders’ vision of a search engine whose focus is entirely on the end user. They favor “pull” technology and marketing versus “push” and believe that the quality of their product will compel their users to “tell a friend”.

Again to quote, “Google grew in popularity and recognition without spending a dime.”

Of course, Google eventually needed to attract investment capital, and it did, but Brin and Page have never compromised their integrity and vision. The result is a company that went public in 2004 at a price of $85 a share and now trades well over $450 a share.
Boom! That’s sheer success in its shares.

Though they are millionaires several times over, the founders have remained personally involved in nearly every aspect of the business.
To illustrate the sense of humor that pervades the entire corporate culture: in August 2005, Google sold 14,159,265 additional shares in a secondary stock offering.
Why the unusual number of shares?
They represent the first eight digits after the decimal point of pi (3.14159265), completely appropriate for a company run by two mathematicians.
Now, that’s passion for mathematics and algorithms.

From an auspicious misspelling (of the word “googol”) to an entity that for millions of people around the world has become synonymous with the Internet, Google has transformed the way we search for information. And the founders’ “Don’t Be Evil” motto continues to ensure that they won’t sell out to the mass marketers who want to capture the hapless web searcher; their marketing and advertising strategy is unique in the industry.

Thus it is that what began as a misspelling of a very large number (a “googol”) has been so embraced by the world that it has become a verb in the dictionary, as we Google our way around the Internet on a daily basis.

So, Google your way into Google in everything that is Google.

GOOOOOOOOOOOOOOOOOOOOGLE !!!!!!!!!

Your computer may already be Hacked !

Yes, you read that right: your PC may be hacked without your knowledge.

It’s the Information Age apocalypse: What if, no matter how hard you tried, every computer on the market — from PCs to smartphones to fridges to cars — came pre-loaded with an irremovable backdoor that allowed the government or other nefarious agents to snoop on your data, behavior, and communications?

Believe it or not, we already have the technology to do this. It’s called a hardware backdoor, and it’s a lot like a software virus that grants backdoor access to your computer — but the code resides in the firmware of a computer chip. In short, firmware is software that is stored in non-volatile memory on a computer chip and is used to initialize a piece of hardware’s functionality. In a PC, the BIOS is the most common example of firmware — but in the case of wireless routers, a whole Linux operating system is stored in firmware.

Hardware backdoors are lethal for three reasons:

a) They can’t be removed by conventional means (antivirus, formatting)

b) They can circumvent other types of security (passwords, encrypted filesystems)

c) They can be injected at manufacturing time.

Called Rakshasa (which are unrighteous spirits in Hindu and Buddhist mythoi), this backdoor is persistent, very hard to detect, portable, and because it’s built using open-source tools (Coreboot, SeaBIOS, and iPXE) it could be used by governments and still grant them plausible deniability.

You can easily try this out with the above tools, and it works like magic!

The architecture of this backdoor is shown below:

[Diagram: Rakshasa backdoor architecture]

To infect a computer with Rakshasa, Coreboot is used to re-flash the BIOS with a SeaBIOS and iPXE bootkit. This bootkit is gentle, and because it’s crafted out of legitimate, open-source tools, it’s very hard for anti-malware software to flag it as malicious. At boot time, the bootkit fetches malware over the web using an untraceable wireless link if possible (via a hacker parked outside), or HTTPS over the local network. Rakshasa’s malware payload then proceeds to disable the NX (no-execute) bit, remove anti-SMM protections, and disable ASLR (address space layout randomization).

Because the same basic chips are used time and time again, Rakshasa works on 230 Intel-based motherboards. It is also possible to load Rakshasa into the firmware of another piece of hardware — a network card, for example — and then have Rakshasa automatically transfer itself to the BIOS. Furthermore, the bootkit can be used to create a fake password prompt for Truecrypt and BitLocker, potentially rendering full-disk encryption useless. Finally, the Rakshasa bootkit even allows the remote flashing of the original BIOS — perfectly covering your tracks.

Rakshasa can be installed by anyone with physical access to your hardware — either at manufacturing time, or in the office with a USB stick.

It is this last point that has been causing some political unrest in the US, and the rest of the Western world. As you undoubtedly know, China is very nearly the sole producer of all electronic goods. It would be very, very easy for the Chinese government to slip a hardware backdoor into the firmware of every iPad, smartphone, PC, and wireless router.

There is hope, though: the BIOS’s replacement, UEFI, as adopted by newer systems such as Windows 8 machines, should be a lot more secure. UEFI offers low-level security through firmware signing — if the signatures don’t match up (i.e. they’ve been modified by Rakshasa), then the system doesn’t boot. This might only be a partial fix, though: I don’t know if UEFI will prevent other, non-BIOS/UEFI chips from being backdoored.

So, with hackers having an edge over your personal gadgets and your information at stake, are you prepared to fight back?

Smoke and Mirrors in F1

When an avid motorsport fan sees a Formula 1 car zipping through a 5-km lap in less than one and a half minutes, he/she can be forgiven for thinking that the 2.4-litre V8 engine is all there is to the vehicle.

In fact, hidden not just under the hood but elsewhere as well is a maze of embedded electronics. Forget about the 0-300 km/h acceleration in 8 seconds and the 300-0 km/h deceleration that such a car is capable of; these devils would not even be able to start without the electronics.

Electronics has become so pervasive in racing cars that the FIA, the governing body that regulates F1 racing, has felt compelled to intervene in order to ensure that technology does not take away the main purpose of racing — the test of a driver’s skill.

Special electronics in this 36 million dollar beast control everything, from the fuel consumption modes of the engine to technology that helps in drag reduction. Indeed, electronics plays a big part in the delivery of fluids to a racer who undergoes severe dehydration.

The monitoring, analyzing and controlling of F1 cars are all electronics-based.

The many compact sensors, the lightning-fast data processing, the quick communications, and the intelligent programs aboard these cars complement the razor-sharp reflexes of the driver behind the wheel. And, the servers in the mission control rooms of Formula 1 teams make it possible for a driver to focus on racing strategy on the go.

The McLaren brain

Every F1 car has at least 250 sensors, monitoring about 1,500 parameters and generating about 1.5 gigabytes of data over a race weekend. Wow, that’s a lot of data!

Information such as the fuel consumption per lap, track temperature, humidity, the wear of tyres, and a host of other factors are continually monitored — from the practice sessions till the race on Sunday afternoon.

Not many know that McLaren, of the McLaren-Mercedes team, developed the brains that run modern F1 racing cars. Each of the 24 cars participating in each F1 season since 2008 has used the Electronic Control Unit (ECU) developed by McLaren Electronics.

The ECU is a sealed and tamper-proof computer aboard every F1 car. The FIA mandates that all F1 cars deploy ECUs.

So, what’s an ECU?

The ECU is the nervous system of an F1 car. It comprises seven printed circuit boards (PCBs), each with up to 32 layers, which makes the unit very compact.

The ECUs use undisclosed microprocessors manufactured by Freescale Semiconductor (previously a unit of Motorola). These fingernail-sized chips are the silicon brains of F1 cars. The microprocessors are clocked at extremely high frequencies in order to keep the car in control at high speeds.

Just imagine: if 2 to 3 GHz is the frequency required to run your home PC, think about what it takes to run these magnificent beasts. Simply mind-boggling!

For instance, the ECU, based on the inputs from the array of sensors, determines the quantity of fuel that is to be injected into the engine, at the rate of 1,000 injections per second.

Rugged electronics

The electronics in these cars, which account for only 5 per cent of the car’s weight, have to be ruggedized to bear the extreme shock, heat and vibration they undergo while surfing over the asphalt. The same level of electronics, or even more sophisticated, can be found in many industrial establishments; the primary difference is that the circuits inside F1 cars endure the whole 300-km race and more under extreme conditions that few industries frequently see.

After all, if even a single solder joint snaps, the consequences can be dire for the driver and his car. Extensive vibration and shock testing, along with heat-resistance testing, is performed on these ECUs to ensure they can withstand the extremes they face during a race weekend.

Constant communication

Lapping at 300 km/h, the cars need to be in constant communication with mission control. Along the race track are multiple antennas that receive telemetry signals from the cars and let the driver stay in touch with mission control. Handing communication off from one antenna to the next as the car moves along the track, and multipath interference from signals received by more than one antenna simultaneously, pose another set of problems in coordinating F1 cars with their control rooms. High-speed communication with intelligent arbitration, built on fast communication circuits, takes care of this challenge.

Formula 1 on roads

F1 cars may not be seen on our roads any time soon, but the formula behind these cars is trickling down as high-end technology in the latest road cars. F1 is the fountain of numerous contemporary technologies that have made modern cars safer, more efficient and more convenient.

Power-steering, anti-lock and regenerative braking systems, onboard telemetry, the use of fuel blends and the evolution of advanced tyre rubber composites are some of the results of the advanced technology being incorporated in mainstream cars.

Regenerative braking

Kinetic Energy Recovery Systems (KERS) in F1 cars use regenerative braking in order to charge batteries using an electric generator. When additional acceleration is needed, this energy is deployed, usually by powering electrical motors. Switching between KERS and engine drive is meticulously controlled by the ECU. The hybrid cars available in the market today deploy this feature using a mini-computer, which makes these cars more fuel-efficient.

Electronic Stability Programme (ESP) in cars on the road today controls the traction of vehicles, preventing them from skidding off curves.

F1 racing is not merely an adrenaline-pumping sport, but an arena to showcase high-end technology. The F1 circuit is thus the laboratory that makes it possible for these advanced technologies to be brought into the wider automobile market.

So, when you watch these cars racing on the asphalt, spare a thought for the brains that go into racing them.

Enjoy Racing!!!!

Space Tech!!

[Image: an actual snapshot from Mars]

Space technology may seem high-tech, but in reality it is quite the inverse.

Flabbergasted? There is no reason to be.

The computer aboard the Mars rover may appear rudimentary, but it is truly special.

The landing of Curiosity, NASA’s Mars rover, and the initial pictures from the surface of the fourth planet, have swelled the curiosity of people across the globe. The dangerous landing, dubbed the “seven minutes of terror”, is an unprecedented technological feat considering that the spacecraft carrying Curiosity had travelled more than 560 million km from Earth.

Curiosity has 17 cameras and a host of other equipment. It has chemical cameras, spectrometers, weather probes, radiation meters, rock drillers — literally a laboratory on aluminium wheels mounted on a titanium chassis. Remote-controlling the craft, which has been entrusted with the task of observing, analysing and reporting on the Martian surface and its ambience, is a major challenge, especially because of the large distance involved. Commanding the rover from here on a real-time basis is not feasible because a signal would take anywhere between roughly 6 and 45 minutes for a single round trip, depending on where the two planets are in their orbits. Thus, Curiosity mainly relies on its own onboard intelligence, communicating back to its commanding station, NASA’s Jet Propulsion Laboratory, only about once a day.

Who does the thinking?

Interestingly, Curiosity’s computer is less powerful than the average Android-based smartphone available in the market!

The RAD750 board (microprocessor, RAM, flash memory and other circuitry) is the onboard computer for Curiosity and, in terms of computational power, is humble compared to the terrestrial electronics around us. It is clocked at 200 MHz, has 256 MB of RAM and 2 GB of flash memory to store the data it captures. By way of comparison, the average low-end Android smartphone available today is powered by nothing less than a processor clocked at 600 MHz, with at least 256 MB of RAM and support for 32 GB of flash memory.

So, why does the $2.6 billion project rely on computational power generated by a device that is available in a sub-$100 smartphone?

Space-grade electronics is different from terrestrial electronics. Temperature and radiation effects can destroy electronics. For instance, the computer aboard Curiosity, the RAD750, can operate in temperatures ranging from –55 degrees Celsius to 125 degrees Celsius, certainly not the range in which a smartphone or a music player would be expected to operate.

While the atmosphere shields us from hazardous radiation emanating from the sun, high-energy radiation can ionize electronic components aboard Curiosity, which would lower performance. They are also vulnerable to ‘bit-flips’. Software programmes are stored on a computer as sequences of zeroes and ones (binary) in memory registers. Radiation can cause these bits to flip: when a zero becomes one, or a one becomes zero, the information contained in the programme is altered, rendering it useless.
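
To see why a single flipped bit matters, here is a tiny illustration (ordinary Python, not flight code): XOR-ing a stored value with a one-bit mask models a radiation-induced upset.

# A single-event upset modeled as an XOR with a one-bit mask.
value = 0b00000011            # the number 3 as stored in memory
flipped = value ^ (1 << 7)    # radiation flips the highest of the 8 bits
print(value, "->", flipped)                                 # 3 -> 131
print(format(value, "08b"), "->", format(flipped, "08b"))   # 00000011 -> 10000011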

The harsh conditions in space demand that space-grade electronics be manufactured in a way that ruggedizes them, a process known as ‘radiation hardening’, which in turn limits the computational power of the circuitry. The RAD750 is essentially a radiation-hardened version of IBM’s PowerPC 750 and is among the most powerful processors available for space applications.

Real-time operation

However, these relatively low-end computers can handle highly computation-intensive tasks. It is just that the computer in space is not made to run many tasks at a time. It toggles between mission-critical tasks such as its own health monitoring and safe navigation and, maybe, switching on one or two payloads like the camera or the spectrometer. The coordination of tasks, and the real-time response to conditions the rover faces, is handled by a real-time operating system, commonly known as an RTOS. Curiosity runs VxWorks by Wind River, which is considered one of the most reliable RTOSes available.
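
Curiosity’s real flight software is proprietary VxWorks code; purely to illustrate the idea of toggling between a critical task and one or two payloads, a toy cooperative loop might look like the sketch below (all the names here are invented).

# Toy illustration of alternating a critical task with payload tasks.
# This is NOT VxWorks or flight software -- just the scheduling idea in miniature.
import itertools

def check_health():
    return "health: OK"

def drive_camera():
    return "camera: frame captured"

def run_spectrometer():
    return "spectrometer: sample read"

payloads = itertools.cycle([drive_camera, run_spectrometer])
for tick in range(4):                 # each "tick" stands in for one scheduler slot
    print(tick, check_health())       # the critical task runs every slot
    print(tick, next(payloads)())     # one payload gets whatever time is left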

Constraints in space

In a mission such as Curiosity, two of the major challenges are the limited bandwidth for communication and the time consumed by the cycle of ground analysis.

While operating through interplanetary distances, communication bandwidth and power pose major design constraints. To have ample bandwidth, signals must be transmitted at high frequencies, and in order to transmit signals at high frequency, more power is required.

On a 110-watt rover such as Curiosity, which is powered by the heat generated by the radioactive decay of plutonium dioxide, onboard power is still a precious commodity. This is one reason why the rover relies crucially on its own computer.

Moreover, with a flash memory of only 2 GB and numerous experiments to perform and report back, the storage on Curiosity would be exhausted within a few tasks. This problem is solved by evacuating the data back to the commanding station on Earth, either by transmitting high-power signals through high-gain antennas to the Deep Space Network (a worldwide network of large antennas on Earth for interplanetary communication), or by handing the data over at lower frequency and lower power to the neighbouring Mars Reconnaissance Orbiter spacecraft orbiting Mars, which in turn transmits it to the commanding station on Earth.

The case of Curiosity is a classic illustration of how raw computational power is not everything — even in the age of supercomputers.