The Future of Computers - Optical Computers

An optical computer (also called a photonic computer) is a device that performs its computations using photons of visible light or infrared (IR) beams rather than electrons in an electric current. The computers we use today use transistors and semiconductors to control electricity, but computers of the future may use crystals and metamaterials to control light.

An electric current generates heat in computer systems, and as processing speed increases, so does the amount of electricity required; this extra heat is extremely damaging to the hardware. Photons, however, generate substantially less heat than electrons at a given size scale, which makes the development of more powerful processing systems possible. By applying some of the advantages of visible and/or IR networks at the device and component scale, a computer might someday be developed that can perform operations significantly faster than a conventional electronic computer.

Coherent light beams, unlike electric currents in metal conductors, pass through each other without interfering; electrons repel each other, while photons do not. For this reason, signals over copper wires degrade rapidly with distance, while signals carried over fiber optic cables do not suffer this problem. Several laser beams can be transmitted so that their paths intersect with little or no interference among them, even when they are confined essentially to two dimensions.
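The linearity described above can be sketched numerically: two ideal light waves crossing the same point simply add, and each signal can be recovered from the sum unchanged. A minimal illustration in plain Python (idealized sine-wave fields, not a physical simulation; the amplitudes and frequencies are made up):

```python
import math

def beam(amplitude, freq, t):
    """Idealized field of a single light beam at time t."""
    return amplitude * math.sin(2 * math.pi * freq * t)

t = 0.3
a = beam(1.0, 5.0, t)      # signal A
b = beam(0.5, 9.0, t)      # signal B crossing the same region
combined = a + b           # linear superposition: no cross-terms appear

# Subtracting one known beam recovers the other, essentially exactly.
assert abs((combined - b) - a) < 1e-12
```

Electrons in a shared wire, by contrast, interact, which is why electrical signals cannot be overlaid this freely.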

Electro-Optical Hybrid computers

Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. However, optoelectronic devices lose about 30% of their energy converting electrons into photons and back again, and this conversion also slows the transmission of messages.

Pure Optical Computers

All-optical computers eliminate the need for such conversions. These computers would use multiple frequencies to send information throughout the computer as light waves and packets, with no electron-based systems and no conversion from electrical to optical signals, greatly increasing speed.

The Future of Computers - Quantum nanocomputers

A quantum computer uses quantum mechanical phenomena, such as entanglement and superposition, to process data. Quantum computation aims to use the quantum properties of particles to represent and structure data, and to use quantum mechanics to perform operations on that data.

The quantum mechanical properties of atoms or nuclei allow these particles to work together as quantum bits, or qubits. These qubits together form the computer's processor and memory; because they can interact with each other while remaining isolated from the external environment, they can perform certain calculations much faster than conventional computers. By computing many different numbers simultaneously and then interfering the results to get a single answer, a quantum computer can perform a large number of operations in parallel, making it far more powerful than a digital computer of the same size.
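The "interfering the results" idea can be sketched without any quantum hardware: a single qubit is just a pair of complex amplitudes, and applying the same gate twice makes those amplitudes cancel or reinforce. A toy simulation in plain Python (a Hadamard gate on one qubit; no quantum library involved):

```python
import math

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state (a, b):
    it rotates |0> into an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities are the squared amplitude magnitudes."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1 + 0j, 0 + 0j)     # start in the definite state |0>
qubit = hadamard(qubit)       # superposition: ~50/50 measurement odds
qubit = hadamard(qubit)       # amplitudes interfere: the |1> path cancels
print(probabilities(qubit))   # probability of |0> is back to ~1.0
```

The second gate makes the two computational paths interfere destructively for |1> and constructively for |0>, which is the mechanism a real quantum algorithm exploits at much larger scale.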

A quantum nanocomputer would work by storing data in the form of atomic quantum states or spin. Technology of this kind is already under development in the form of single-electron memory (SEM) and quantum dots. In quantum mechanical terms, waves would store the state of each nanoscale component, and information would be stored as the spin orientation or state of an atom. With the correct setup, constructive interference would reinforce the wave patterns that held the right answer, while destructive interference would suppress the wrong ones.

The energy state of an electron within an atom, represented by the electron energy level or shell, can theoretically represent one, two, four, eight, or even 16 bits of data. The main problem with this technology is instability. Instantaneous electron energy states are difficult to predict and even more difficult to control. An electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking an atom can cause one of its electrons to jump to a higher energy state.
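The bit counts above can be made concrete: encoding k bits in a single electron's energy state requires 2^k reliably distinguishable levels, which is why the instability problem becomes crippling as k grows. A toy calculation (illustrative arithmetic only, not a physical model):

```python
import math

def levels_needed(bits):
    """Distinguishable energy levels required to encode `bits` bits."""
    return 2 ** bits

# The bit counts mentioned in the text, and the level counts they imply.
for bits in (1, 2, 4, 8, 16):
    print(f"{bits:>2} bits -> {levels_needed(bits)} distinguishable levels")
```

Sixteen bits would demand 65,536 distinguishable levels, every one of which must be readable and writable without the electron spontaneously decaying or absorbing a stray photon.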

I promise to write a series of articles on quantum issues very shortly...

The Future of Computers - Mechanical nanocomputers

Pioneers such as Eric Drexler proposed, as far back as the mid-1980s, that nanoscale mechanical computers could be built via molecular manufacturing: mechanically positioning atoms or molecular building blocks one atom or molecule at a time, a process known as “mechanosynthesis.”

Once assembled, the mechanical nanocomputer, other than being greatly scaled down in size, would operate much like a complex, programmable version of the mechanical calculators used during the 1940s to 1970s, preceding the introduction of widely available, inexpensive solid-state electronic calculators.

Drexler's theoretical design used rods sliding in housings, with bumps that would interfere with the motion of other rods. The result would be a mechanical nanocomputer that encodes information using tiny mobile components called nanogears.

Drexler and his collaborators favored designs resembling miniature versions of Charles Babbage's 19th-century Analytical Engine: mechanical nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-scale wheels spinning on shafts and bearings.

This reliance on moving parts has made mechanical nanocomputer technology controversial, and some researchers even consider it unworkable. All the problems inherent in Babbage's apparatus, according to the naysayers, are magnified a millionfold in a mechanical nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even proposed the evolution of nanorobots that could operate, or be controlled by, mechanical nanocomputers.

The Future of Computers - Chemical and Biochemical nanocomputers

Chemical Nanocomputer

In general terms, a chemical computer is one that processes information by making and breaking chemical bonds, storing logic states or information in the resulting chemical (i.e., molecular) structures. In a chemical nanocomputer, computing is based on chemical reactions (bond breaking and forming): the inputs are encoded in the molecular structure of the reactants, and the outputs can be extracted from the structure of the products. In other words, these computers use the interactions between different chemicals and their structures to store and process information.


These computing operations would be performed selectively among molecules taken just a few at a time, in volumes only a few nanometers on a side. To create a chemical nanocomputer, then, engineers need to be able to control individual atoms and molecules so that they can be made to perform reliable calculation and data-storage tasks. The development of a true chemical nanocomputer will likely proceed along lines similar to genetic engineering.


The Future of Computers - Electronic nanocomputers

Thanks to our fifty years of experience with electronic computing devices, including the extensive research and industrial infrastructure built up since the late 1940s, advances in nanocomputing are likely to come from this direction; electronic nanocomputers appear to present the easiest and most likely path for continuing nanocomputer development in the near future.
Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. The main difference is one of physical scale. More and more transistors are squeezed into silicon chips with each passing year; witness the evolution of integrated circuits (ICs) capable of ever-increasing storage capacity and processing power.

The ultimate limit to the number of transistors per unit volume is imposed by the atomic structure of matter. Most engineers agree that technology has not yet come close to pushing this limit. In the electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary microprocessors might be called nanodevices.

How it works

The power and speed of computers have grown rapidly because of rapid progress in solid-state electronics dating back to the invention of the transistor in 1948. Most important, there has been exponential increase in the density of transistors on integrated-circuit computer chips over the past 40 years. In that time span, though, there has been no fundamental change in the operating principles of the transistor.
Even microelectronic transistors no more than a few microns (millionths of a meter) in size are bulk-effect devices. They still operate using small electric fields imposed by tiny charged metal plates to control the mass action of many millions of electrons.

Although electronic nanocomputers would not use the traditional transistor for their components, they would still operate by storing information in the positions of electrons.

At the current rate of miniaturization, conventional transistor technology will reach a minimum size limit in a few years. At that point, small-scale quantum mechanical effects, such as the tunneling of electrons through barriers of matter or electric fields, will begin to dominate the effects that permit a mass-action semiconductor device to operate. Still, an electronic nanocomputer would continue to represent information in the storage and movement of electrons.
Nowadays, most electronic nanocomputers are fabricated as microscopic circuits using nanolithography.

The Future of Computers - Nanocomputers

Scientific discussion of the development and fabrication of nanometer-scale devices began in 1959 with an influential lecture by the late, renowned physicist Richard Feynman. Feynman observed that it is possible, in principle, to build and operate submicroscopic machinery. He proposed that large numbers of completely identical devices might be assembled by manipulating atoms one at a time.

Feynman's proposal sparked an initial flurry of interest but it did not broadly capture the imagination of the technical community or the public. At the time, building structures one atom at a time seemed out of reach. Throughout the 1960s and 1970s advances in diverse fields prepared the scientific community for the first crude manipulations of nanometer-scale structures. The most obvious development was the continual miniaturization of digital electronic circuits, based primarily upon the invention of the transistor by Shockley, Brattain, and Bardeen in 1948 and the invention of the integrated circuit by Noyce, Kilby, and others in the late 1950s. In 1959, it was only possible to put one transistor on an integrated circuit. Twenty years later, circuits with a few thousand transistors were commonplace.

Scientists are trying to use nanotechnology to make very tiny chips, electrical conductors and logic gates. Using nanotechnology, chips can be built up one atom at a time, so no space is wasted, enabling much smaller devices to be built. With this technology, logic gates would be composed of just a few atoms, electrical conductors (called nanowires) would be merely an atom thick, and a data bit would be represented by the presence or absence of a single electron.

A nanocomputer is a computer whose physical dimensions are microscopic; smaller than the microcomputer, which is smaller than the minicomputer. (The minicomputer is called "mini" because it was a lot smaller than the original (mainframe) computers.)

A component of nanotechnology, nanocomputing will, as suggested or proposed by researchers and futurists, give rise to four types of nanocomputers: electronic, chemical and biochemical, mechanical, and quantum.

Keep reading the next posts…


What is nanotechnology?

Nanotechnology (sometimes shortened to "nanotech") is the study of manipulating matter on an atomic and molecular scale. In other words, it is the engineering of tiny machines: the projected ability to build things from the bottom up, using techniques and tools being developed today to make complete, highly advanced products.


In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale. A nanometer (nm) is one-billionth of a meter, smaller than the wavelength of visible light and a hundred-thousandth the width of a human hair.

As small as a nanometer is, it's still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom's nucleus is much smaller, about 0.00001 nm.
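The scales above are easier to keep straight when laid out side by side. A quick sanity check in Python, using the round figures from the text (the atom and nucleus sizes are the approximate values quoted above, not precise constants):

```python
NM = 1e-9  # one nanometer, in meters

# Approximate sizes from the text, all converted to meters.
scales = {
    "centimeter": 1e-2,
    "millimeter": 1e-3,
    "micrometer": 1e-6,
    "nanometer": NM,
    "atom (diameter)": 0.1 * NM,
    "atomic nucleus": 0.00001 * NM,
}

for name, meters in scales.items():
    print(f"{name:>18}: {meters:.0e} m  =  {meters / NM:g} nm")
```

Each step down the list is a factor of a thousand or more, which is why intuition built at everyday scales fails so badly here.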


Generally, nanotechnology deals with structures sized between 1 and 100 nanometers in at least one dimension, and involves developing materials or devices within that size range. Quantum mechanical effects are very important at this scale, which is in the quantum realm.

The Future of Computers - Overview

Computers' Evolution

The history of computers and computer technology thus far has been a long and fascinating one, stretching back more than half a century to the first primitive computing machines. These machines were huge and complicated affairs, consisting of row upon row of vacuum tubes and wires, often filling several rooms.

In the past twenty years, there has been a dramatic increase in the processing speed of computers, network capacity and the speed of the internet. These advances have paved the way for revolutions in fields such as quantum physics, artificial intelligence and nanotechnology, and they will have a profound effect on the way we live and work; the virtual reality we see in movies like The Matrix may actually come true in the next decade or so.
Today's computers operate using transistors, wires and electricity. Future computers might use atoms, fibers and light. Take a moment to consider what the world might be like if computers the size of molecules became a reality. These are the types of computers that could be everywhere, but never seen: nano-sized bio-computers that could target specific areas inside your body; giant networks of computers in your clothing, your house, your car. Entrenched in almost every aspect of our lives, and yet you may never give them a single thought.

As anyone who has looked at the world of computers lately can attest, the size of computers has been reduced sharply, even as the power of these machines has increased at an exponential rate. In fact, the cost of computers has come down so much that many households now own not only one, but two, three or even more, PCs.
As the world of computers and computer technology continues to evolve and change, many people, from science fiction writers and futurists to computer workers and ordinary users have wondered what the future holds for the computer and related technologies. Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.

Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.

Nanotechnology is another important part of the future of computers, expected to have a profound impact on people around the globe. Nanotechnology is the process whereby matter is manipulated at the atomic level, providing the ability to “build” objects from their most basic parts. Like robotics and artificial intelligence, nanotechnology is already in use in many places, providing everything from stain resistant clothing to better suntan lotion. These advances in nanotechnology are likely to continue in the future, making this one of the most powerful aspects of future computing.

In the future, the number of tiny but powerful computers you encounter every day will number in the thousands, perhaps millions. You won't see them, but they will be all around you. Your personal interface to this powerful network of computers could come from a single computing device that is worn on or in the body.

Quantum computers are also likely to transform the computing experience, for both business and home users. These powerful machines are already on the drawing board, and they are likely to be introduced in the near future. The quantum computer is expected to be a giant leap forward in computing technology, with exciting implications for everything from scientific research to stock market predictions.

Moore's law

Visit any site on the web writing about the future of computers and you will most likely find mention of Moore's Law. Moore's Law is not a strictly adhered-to mathematical formula, but a prediction made by Intel co-founder Gordon Moore in a 1965 paper. He noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years".

Moore predicted that computing technology would increase in capability while decreasing in cost, describing a long-term trend in the history of computing hardware: more specifically, that innovation would allow the number of transistors that can be placed inexpensively on an integrated circuit to double approximately every two years. The trend has continued for more than half a century and is not expected to stop until 2015 or later.
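The two-year doubling is simple compound growth, so it's easy to project. A quick sketch in Python, seeded with the Intel 4004's roughly 2,300 transistors in 1971 (the projection itself is the idealized law, not historical data):

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count assuming one doubling
    every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Seed: ~2,300 transistors (Intel 4004, 1971); then project each decade.
for year in (1971, 1981, 1991, 2001, 2011):
    print(year, round(transistors(2300, 1971, year)))
```

Five doublings per decade multiplies the count by 32, so four decades takes 2,300 transistors past the two-billion mark, which is roughly what shipping processors actually reached.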

A computer transistor acts like a small electronic switch. Just like the light switch on your wall, a transistor has only two states, On or Off. A computer interprets this on/off state as a 1 or a 0. Put a whole bunch of these transistors together and you have a computer chip. The central processing unit (CPU) inside your computer probably has around 500 million transistors.
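The on/off switches described above become numbers the same way binary place value works on paper. A minimal illustration (the particular switch pattern is arbitrary):

```python
# A row of transistor states, most significant first.
switches = [1, 0, 1, 1, 0, 1]   # "on" = 1, "off" = 0

value = 0
for state in switches:
    value = value * 2 + state    # shift left one place, append the new bit

print(value)                     # 0b101101 read as an ordinary integer
```

Six switches give 64 possible values; a CPU simply does this at the scale of hundreds of millions of switches at once.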

Shrinking transistor size not only makes chips smaller, but faster. One benefit of packing transistors closer together is that the electronic pulses take less time to travel between transistors. This can increase the overall speed of the chip. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well.

Not everyone agrees that Moore's Law has been accurate throughout the years, (the prediction has changed since its original version), or that it will hold true in the future. But does it really matter? The pace at which computers are doubling their smarts is happening fast enough for me.

Thanks to the innovation and drive of Gordon Moore and others like him, computers will continue to get smaller, faster and more affordable.

Who Are Anonymous?

Anonymous' organization

Anonymous isn’t truly an organization; not in any traditional sense. They are a large, decentralized group of individuals who share common interests and web haunts. There are no official members, guidelines, leaders, representatives or unifying principles. Rather, Anonymous is a word that identifies the millions of people, groups, and individuals on and off of the internet who, without disclosing their identities, express diverse opinions on many topics.

Being "Anonymous" is much more a quality or a self-definition than a membership. Each project under the Anonymous banner may have a whole different set of instigators. Leadership, when it exists, is informal and carried out in chat channels, forums, IM and public calls to action online. No one's meeting in a board room.

The name Anonymous itself is inspired by the perceived anonymity under which users post images and comments on the Internet. Usage of the term Anonymous in the sense of a shared identity began on imageboards.

History of Anonymous Hacktivism

We are Anonymous. We are legion. We do not forgive. We do not forget. Expect us.

Long before they became vigilantes in the Wikileaks cyberwars, Anonymous was conducting large-scale “raids” against their enemies.

The Habbo Hotel raids

Probably the first time 4chan users banded together under the moniker Anonymous was in order to harass the users of Habbo Hotel, a cartoonish social network designed as a virtual hotel.
As early as 2006, Anonymous would "raid" Habbo, blocking its usual users from moving around. The first major raid is known as the "Great Habbo Raid of '06," and a subsequent raid the following year is known as the "Great Habbo Raid of '07." There appears to be some connection to news of an Alabama amusement park banning a two-year-old toddler with AIDS from entering the park's swimming pool.

Habbo Hotel

Users signed up to the Habbo site with avatars of a black man wearing a grey suit and an Afro hairstyle and blocked entry to the pool, declaring it "closed due to AIDS," flooding the site with internet sayings, and forming swastika-like formations. Then, when all their black cartoon avatars got banned, they'd call Habbo racist. This was all done "for the lulz," or just for fun.
At this point, Anonymous’ actions had not taken on a political bent. Some members of Anon would argue it was better that way.

The future of cyberspace

Is cyberspace infinite?

Of all the things we now take for granted, cyberspace is near the top of the list. The promise of the Internet for the twenty-first century is to make everything always available to everyone everywhere. All of human culture and achievement, the great and the not so great, may, one day soon, be just a click away.

When one is online, cyberspace can seem a lot like outer space or, to use the latest jargon, 'the cloud'. It appears infinite and ethereal. The information is simply out there. If, instead, we thought more about the real-world energy and the real estate that the Internet uses, we would start to realize that things are not so simple. Cyberspace is in fact physical space. And the longer it takes us to change our concept of the Internet—to see quite clearly its physical there-ness—the closer we'll get to blogging our way to oblivion.