
Cyberspace’s Ecological Impact

Electricity consumption in data centers worldwide doubled between 2000 and 2005, but the pace of growth slowed between 2005 and 2010. This slowdown was the result of the 2008 economic crisis, the increasing use of virtualization in data centers, and the industry's efforts to improve energy efficiency. Even so, the electricity consumed by data centers globally in 2010 amounted to 1.3% of worldwide electricity use. Power consumption is now a major concern in the design and implementation of modern infrastructures, because energy-related costs have become an important component of the total cost of ownership of this class of systems.

Thus, energy management is now a central issue for servers and data center operations, focusing on reducing all energy-related costs: investment, operating expenses and environmental impacts. Improving energy efficiency is a major problem in cloud computing, since it has been calculated that powering and cooling a data center accounts for 53% of its total operational expenditure. Yet the pressure to provide services without any failure leads to continued over-scaling of systems at every level of the power hierarchy, from the primary feed sources to the supporting infrastructure. In order to cover worst-case situations, it is normal to over-provision Power Distribution Units (PDUs), Uninterruptible Power Supply (UPS) units, and so on. For example, it has been estimated that power over-provisioning in Google data centers is about 40%.

Cyberspace

Furthermore, in an attempt to ensure the redundancy of power systems, banks of diesel generators are kept running permanently so that the system does not fail even during the few moments these backup systems would take to start up. These giant generators run continuously to guarantee high availability in the event of a failure of any critical system, emitting large quantities of diesel exhaust, i.e., pollution. It is estimated that only about 9% of the energy consumed by data centers is actually used in computing operations; everything else is essentially wasted keeping the servers ready to respond to any unforeseen power failure.

When we connect to the Internet, cyberspace can seem a lot like outer space in the sense that it appears infinite and ethereal; the information is just out there. But if we think about the real-world energy and the physical space the Internet occupies, we begin to understand that things are not so simple. Cyberspace has a very real expression in physical space, and the longer we take to change our behavior toward the Internet, and to see its physical footprint clearly, the further down the path of destroying our planet we will go.


Cyberspace's Social Impact

Despite being fashionable and frequently mentioned, only a few people seem to know what the "cloud" really is. A recent study by Wakefield Research for Citrix shows that there is a huge difference between what U.S. citizens do and what they say when it comes to cloud computing. The survey of more than 1,000 American adults was conducted in August 2012 and showed that few average Americans know what cloud computing is.

For example, when asked what "the cloud" is, the most common response was that it is an actual cloud, the sky or something related to the weather (29%). Fifty-one percent of respondents believe stormy weather can interfere with cloud computing, and only 16% were able to link the term with the notion of a computer network used to store, access and share data from Internet-connected devices. Moreover, 54% of respondents claimed to have never used the cloud, when in fact 95% of those who said so are actually using cloud services today via online shopping, banking, social networking and file sharing.

Cloud Computing

What these results suggest is that the cloud is indeed transparent to users, fulfilling one of its main purposes, which is to provide content and services easily and immediately. However, the lack of knowledge about the computing model that supports all of our everyday activities leads to growing disengagement, with a consequent erosion of concern for content security and privacy.
In reality, cyberspace is not an aseptic place filled only with accurate and useful information. The great interest of cyberspace lies precisely in the social vitality it allows, based on a growing range of multimedia services. Its fascination comes from acting as a technological booster for the proliferation of all forms of sociability, an instrument of connectivity. Cyberspace is therefore not a purely cybernetic thing, but a living, chaotic, and uncontrolled entity.

Beyond these concerns, others equally serious are emerging. By analyzing our daily use of these new technological tools, we can conclude that the growth of the Internet is suffocating the planet. We have to treat the CO2 emissions produced by our online activities as real costs to the planet.
We can start by showing some awareness of the problem, restricting our uploads and even removing some. Why not? What about reducing the number of photos we keep on Facebook and Instagram? Keeping them permanently available consumes energy! If no one cares about our videos on YouTube, why not delete them? At least move them somewhere they do not need to keep consuming energy.

We may have to go further still: if awareness and self-discipline are not enough, we must consider the possibility of charging for the sharing of large volumes of personal information. It is perhaps the only way to get most people to stop making unconscious use of the cloud, clogging it by dumping huge amounts of useless information into cyberspace. The goal is not to limit access to information, which should always remain open, but rather to encourage its proper and conscientious use.


Scientists replicate brain using a chip

 
Scientists are getting closer to the dream of creating computer systems that can replicate the brain. Researchers at the Massachusetts Institute of Technology (MIT) have designed a computer chip that mimics how the brain's neurons adapt in response to new information. Such chips could eventually enable communication between artificially created body parts and the brain, and could also pave the way for artificial intelligence devices.

There are about 100 billion neurons in the brain, each of which forms synapses - the connections between neurons that allow information to flow - with many other neurons. The ability of these synapses to strengthen or weaken in response to activity is known as plasticity, and it is believed to underpin many brain functions, such as learning and memory.
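To make the idea of plasticity a little more concrete, here is a minimal Hebbian-style sketch in Python (a toy illustration only, not the analog circuitry of the MIT chip): a synaptic weight strengthens when the two neurons it connects are active together, and slowly decays otherwise.

```python
# Minimal Hebbian plasticity sketch: the synaptic weight grows when the
# pre- and post-synaptic neurons fire together, and slowly decays otherwise.
# Illustrative toy model, not the circuit implemented on the MIT chip.

def update_weight(weight, pre_active, post_active,
                  learning_rate=0.1, decay=0.01):
    if pre_active and post_active:
        weight += learning_rate * (1.0 - weight)   # strengthen toward 1.0
    else:
        weight -= decay * weight                   # gentle decay
    return weight

w = 0.2
activity = [(1, 1), (1, 1), (1, 0), (0, 0), (1, 1)]  # (pre, post) spike pairs
for pre, post in activity:
    w = update_weight(w, pre, post)
    print(f"pre={pre} post={post} -> weight={w:.3f}")
```

Repeated co-activity drives the weight up, which is the basic intuition behind "neurons that fire together wire together."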

Brain

Bacteria Inspire Robotics


Researchers at Tel Aviv University have developed a computational model that better explains how bacteria move in a swarm -- and this model can be applied to human-made technologies, including computers, artificial intelligence, and robotics. The team of scientists has discovered how bacteria collectively gather information about their environment and find an optimal path to growth, even in the most complex terrains.

Studying the principles of bacterial navigation will allow researchers to design a new generation of smart robots that form intelligent swarms, aid in the development of medical micro-robots used to diagnose or deliver medications inside the body, or "decode" systems used in social networks and throughout the Internet to gather information on consumer behavior.

Bacteria
Simulated interacting agents collectively navigate towards a target (credit: American Friends of Tel Aviv University)
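The team's actual model is not reproduced here, but the flavor of collective navigation can be illustrated with a toy agent-based sketch: each simulated "bacterium" takes a step biased toward higher nutrient concentration, feels a weak pull toward the group, and adds some random tumbling. All parameters and names below are illustrative assumptions.

```python
import numpy as np

# Toy swarm-navigation sketch (illustrative only, not the Tel Aviv model):
# agents climb a nutrient gradient while weakly cohering with their neighbors.

rng = np.random.default_rng(0)
target = np.array([8.0, 8.0])              # nutrient source
pos = rng.normal(0.0, 1.0, size=(50, 2))   # 50 agents starting near the origin

def gradient(p):
    """Unit direction of increasing nutrient concentration (toward the target)."""
    d = target - p
    return d / (np.linalg.norm(d, axis=1, keepdims=True) + 1e-9)

for step in range(200):
    toward_food = gradient(pos)
    swarm_pull = pos.mean(axis=0) - pos            # weak cohesion with the group
    noise = rng.normal(0.0, 0.3, size=pos.shape)   # tumbling / randomness
    pos += 0.1 * toward_food + 0.02 * swarm_pull + 0.05 * noise

print("mean distance to target:", np.linalg.norm(pos - target, axis=1).mean())
```

Even with noisy individual steps, the group as a whole converges on the target, which is the qualitative point the researchers make about swarming bacteria.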

A new quantum state of matter?


Researchers at the University of Pittsburgh have made advances in better understanding correlated quantum matter by studying topological states in order to advance quantum computing, a method that harnesses the power of atoms and molecules for computational tasks.

Through his research, W. Vincent Liu and his team have been studying orbital degrees of freedom and nano-Kelvin cold atoms in optical lattices (a set of standing wave lasers) to better understand new quantum states of matter. From that research, a surprising topological semimetal has emerged.

quantum

Since the discovery of the quantum Hall effect by Klaus von Klitzing in 1980, researchers like Liu have been particularly interested in studying topological states of matter, that is, properties that remain unchanged under continuous deformations or distortions such as bending and stretching. The quantum Hall effect showed that when a magnetic field is applied perpendicular to the direction a current is flowing through a metal, a voltage develops in the third perpendicular direction. Liu's work has yielded similar yet remarkably different results.

"We never expected a result like this based on previous studies," said Liu. "We were surprised to find that such a simple system could reveal itself as a new type of topological state -- an insulator that shares the same properties as a quantum Hall state in solid materials."
"This new quantum state is very reminiscent of quantum Hall edge states," said Liu. "It shares the same surface appearance, but the mechanism is entirely different: This Hall-like state is driven by interaction, not by an applied magnetic field."

Liu says this liquid matter could potentially lead toward topological quantum computers and new quantum devices for topological quantum telecommunication. Next, he and his team plan to measure quantities for a cold-atom system to check these predicted quantum-like properties.

Quantum cryptography breached?

Quantum cryptography has been pushed onto the market as a way to provide absolute security for communications and, as far as we know, no current quantum cryptographic system has been compromised in the field. It is already used in Swiss elections to ensure that electronic vote data is securely transmitted to central locations.

Quantum cryptography relies on the concept of entanglement. With entanglement, some statistical correlations are measured to be larger than those possible in experiments based purely on classical physics. Cryptographic security works by using the correlations between entangled photon pairs to generate a common secret key. If an eavesdropper intercepts the quantum part of the signal, the statistics change, revealing the presence of an interloper.

The general approach can be summed up as follows: if you can fool a detector into thinking a classical light pulse is actually a quantum light pulse, then you might just be able to defeat a quantum cryptographic system. But even then the attack should fail, because quantum entangled states have statistics that cannot be achieved with classical light sources—by comparing statistics, you could unmask the deception.

But there's a catch here. I can make a classical signal that is perfectly correlated to any signal at all, provided I have time to measure said signal and replicate it appropriately. In other words, these statistical arguments only apply when there is no causal connection between the two measurements.

You might think that this makes intercepting the quantum goodness of a cryptographic system easy. But you would be wrong. When Eve intercepts the photons from the transmitting station run by Alice, she also destroys the photons. And even though she gets a result from her measurement, she cannot know the photons' full state. Thus, she cannot recreate, at the single photon level, a state that will ensure that Bob, at the receiving station, will observe identical measurements.
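To see why a naive intercept-and-resend attack is detectable in theory, here is a toy simulation in the spirit of a prepare-and-measure (BB84-style) protocol rather than the entanglement-based scheme described above. When Eve measures every photon in a randomly chosen basis and resends it, roughly a quarter of the bits that Alice and Bob later compare disagree, exposing her. All figures and parameters are illustrative.

```python
import random

# Toy BB84-style intercept-resend simulation (illustrative sketch, not a model
# of any commercial system). Eve measures each photon in a random basis and
# resends it; the induced ~25% error rate is what reveals her presence.

def run(n_photons=20000, eve_present=True, seed=1):
    rng = random.Random(seed)
    errors = sifted = 0
    for _ in range(n_photons):
        alice_basis = rng.choice("+x")
        bit = rng.randint(0, 1)

        if eve_present:
            eve_basis = rng.choice("+x")
            # Wrong basis -> Eve's outcome (and the photon she resends) is random.
            bit_sent = bit if eve_basis == alice_basis else rng.randint(0, 1)
            basis_sent = eve_basis
        else:
            bit_sent, basis_sent = bit, alice_basis

        bob_basis = rng.choice("+x")
        bob_bit = bit_sent if bob_basis == basis_sent else rng.randint(0, 1)

        if bob_basis == alice_basis:       # only same-basis rounds are kept
            sifted += 1
            errors += (bob_bit != bit)
    return errors / sifted

print("error rate without Eve:", run(eve_present=False))
print("error rate with Eve:   ", run(eve_present=True))
```

The detector-blinding attack described next works precisely because it sidesteps this statistical test instead of trying to beat it.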


That is the theory anyway. But this is where the second loophole comes into play. We often assume that the detectors are actually detecting what we think they are detecting. In practice, there is no such thing as a single photon, single polarization detector. Instead, what we use is a filter that only allows a particular polarization of light to pass and an intensity detector to look for light. The filter doesn't care how many photons pass through, while the detector plays lots of games to try and be single photon sensitive when, ultimately, it is not. It's this gap between theory and practice that allows a carefully manipulated classical light beam to fool a detector into reporting single photon clicks.

Since Eve has measured the polarization state of the photon, she knows what polarization state to set on her classical light pulse in order to fool Bob into recording the same measurement result. When Bob and Alice compare notes, they get the right answers and assume everything is on the up and up.
The researchers demonstrated that this attack succeeds with standard (but not commercial) quantum cryptography equipment under a range of different circumstances. In fact, they could make the setup outperform the quantum implementation for some particular settings.

(Adapted from ArsTechnica)

Software to Prevent Child Abuse

Investigators estimate that there are currently more than 15 million photographs and videos of child abuse victims circulating on the Internet or in the Darknet. By the time this material has been tracked down and deleted, pedophiles have long since downloaded it to their computers. Seeking and tracking hundreds of thousands of illegal media files on a suspect's computer was a tedious and extremely time-consuming process for investigators, until now.

Researchers from the Fraunhofer Institute have come up with an automated assistance system, called “desCRY”, that can detect child-pornographic images and video even among large volumes of data.
 
desCRY search results

The desCRY software uses novel pattern-recognition processes to search digital photos and videos for illegal content, no matter how well hidden it may be. The heart of the software consists of intelligent pattern-recognition algorithms that automatically analyze and classify images and video sequences, combining technologies such as facial and skin-tone recognition with contextual and scene analyses to identify suspicious content.

The software searches all of the files on a computer, e-mail attachments and archives included, and offers many types of filtering, allowing for a wide variety of search options. It can perform content-based sorting and filtering, so that investigators can, for instance, sort files by person, object or location.
The algorithms use up to several thousand characteristics describing properties such as color, texture and contours to analyze whether an image depicts child abuse. Running on a standard PC, the system classifies up to ten images per second, drastically accelerating the investigation work.

Quantum Cloning Advances

Quantum cloning is the process that takes an arbitrary, unknown quantum state and makes an exact copy without altering the original state in any way. Perfect quantum cloning is forbidden by the laws of quantum mechanics, as shown by the no-cloning theorem. Though perfect cloning is not possible, it is possible to perform imperfect cloning, where the copies have a non-unit fidelity with the state being cloned.

The quantum cloning operation is the best way to make copies of quantum information, so cloning is an important task in quantum information processing, especially in the context of quantum cryptography. Researchers are seeking ways to build quantum cloning machines that work at the so-called quantum limit. Quantum cloning is difficult because the laws of quantum mechanics only allow for an approximate copy—not an exact copy—of an original quantum state to be made, since measuring such a state prior to cloning would alter it. The first cloning machine relied on stimulated emission to copy quantum information encoded into single photons.

Scientists in China have now produced a theory for a quantum cloning machine able to produce several copies of the state of a particle at atomic or sub-atomic scale, i.e., a quantum state. The work comes from a team at Henan Universities in China, in collaboration with a team at the Institute of Physics of the Chinese Academy of Sciences. The advance could have implications for quantum information processing methods used, for example, in message encryption systems.

In this study, researchers have demonstrated that it is theoretically possible to create four approximate copies of an initial quantum state, in a process called asymmetric cloning. The authors have extended previous work that was limited to quantum cloning providing only two or three copies of the original state. One key challenge was that the quality of the approximate copy decreases as the number of copies increases.
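As a point of reference for how copy quality degrades, the well-known optimal fidelity for symmetric universal 1-to-M qubit cloning (the Gisin-Massar bound) is F = (2M + 1) / (3M). The asymmetric machines discussed here trade fidelity between copies rather than sharing it equally, but the downward trend with the number of copies is the same. A quick computation of the symmetric bound:

```python
# Optimal fidelity of symmetric universal 1 -> M qubit cloning
# (Gisin-Massar bound). The asymmetric cloners discussed in this post
# distribute fidelity unevenly among copies, but the overall trend of
# decreasing quality with more copies is the same.

def cloning_fidelity(m: int) -> float:
    return (2 * m + 1) / (3 * m)

for m in range(1, 6):
    print(f"{m} copies: optimal fidelity = {cloning_fidelity(m):.4f}")
# 1 -> 1.0000 (trivial), 2 -> 0.8333, 3 -> 0.7778, 4 -> 0.7500, 5 -> 0.7333
```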

The authors were able to optimize the quality of the cloned copies, thus yielding four good approximations of the initial quantum state. They have also demonstrated that their quantum cloning machine has the advantage of being universal and is therefore able to work with any quantum state, ranging from a photon to an atom. Asymmetric quantum cloning has applications in analyzing the security of message encryption systems based on shared secret quantum keys.

The Future of Computers - Artificial Intelligence

What is Artificial Intelligence?


The term “Artificial Intelligence” was coined in 1956 by John McCarthy for the Dartmouth Conference, and he defined it as the science and engineering of making intelligent machines.

Nowadays it is a branch of computer science that aims to make computers behave like humans. The field is often defined as the study and design of intelligent agents, where an intelligent agent is a system that perceives its environment and takes actions that maximize its chances of success.

This new science was founded on the claim that a central property of humans, intelligence—the sapience of Homo Sapiens—can be so precisely described that it can be simulated by a machine. This raises philosophical issues about the nature of the mind and the ethics of creating artificial beings, issues which have been addressed by myth, fiction and philosophy since antiquity.

Artificial Intelligence includes programming computers to make decisions in real-life situations (for example, some "expert systems" help physicians diagnose diseases based on symptoms), programming computers to understand human languages (natural language processing), programming computers to play games such as chess and checkers (game playing), programming computers to hear, see and react to other sensory stimuli (robotics), and designing systems that mimic human intelligence by attempting to reproduce the types of physical connections between neurons in the human brain (neural networks).
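As a minimal illustration of the expert-system idea mentioned above, a handful of rules can map observed symptoms to candidate diagnoses. This is a hypothetical toy with made-up rules, nothing like a real medical system:

```python
# Toy rule-based "expert system" sketch: purely illustrative, with invented
# rules; real diagnostic systems use far richer knowledge bases and inference.

RULES = [
    ({"fever", "cough", "fatigue"}, "possible flu"),
    ({"sneezing", "runny nose"}, "possible common cold"),
    ({"fever", "stiff neck", "headache"}, "possible meningitis - seek care"),
]

def diagnose(symptoms):
    """Return every rule whose required symptoms are all present."""
    findings = [label for required, label in RULES if required <= symptoms]
    return findings or ["no rule matched - more information needed"]

print(diagnose({"fever", "cough", "fatigue", "headache"}))
print(diagnose({"sneezing"}))
```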


The Future of Computers - Optical Computers

An optical computer (also called a photonic computer) is a device that performs its computation using photons of visible light or infrared (IR) beams, rather than electrons in an electric current. The computers we use today use transistors and semiconductors to control electricity, but computers of the future may use crystals and metamaterials to control light.


An electric current creates heat in computer systems, and as processing speed increases, so does the amount of electricity required; this extra heat is extremely damaging to the hardware. Photons, however, create substantially less heat than electrons on a given size scale, making the development of more powerful processing systems possible. By applying some of the advantages of visible and/or IR networks at the device and component scale, a computer might someday be developed that can perform operations significantly faster than a conventional electronic computer.

Coherent light beams, unlike electric currents in metal conductors, pass through each other without interfering; electrons repel each other, while photons do not. For this reason, signals over copper wires degrade rapidly while fiber optic cables do not have this problem. Several laser beams can be transmitted in such a way that their paths intersect, with little or no interference among them - even when they are confined essentially to two dimensions.


Electro-Optical Hybrid Computers


Most research projects focus on replacing current computer components with optical equivalents, resulting in an optical digital computer system processing binary data. This approach appears to offer the best short-term prospects for commercial optical computing, since optical components could be integrated into traditional computers to produce an optical/electronic hybrid. However, optoelectronic devices lose about 30% of their energy converting electrons into photons and back, and this conversion also slows down the transmission of messages.

Pure Optical Computers



All-optical computers eliminate the need for such switching. These computers would use multiple frequencies to send information throughout the computer as light waves and packets, having no electron-based systems and needing no conversion from electrical to optical signals, which greatly increases speed.

The Future of Computers - Quantum nanocomputers

A quantum computer uses quantum mechanical phenomena, such as entanglement and superposition, to process data. Quantum computation aims to use the quantum properties of particles to represent and structure data, and to use quantum mechanics to work out how to perform operations on this data.



The quantum mechanical properties of atoms or nuclei allow these particles to work together as quantum bits, or qubits. These qubits form the computer’s processor and memory; they can interact with each other while remaining isolated from the external environment, which enables them to perform certain calculations much faster than conventional computers. By computing many different numbers simultaneously and then interfering the results to get a single answer, a quantum computer can perform a large number of operations in parallel and ends up being much more powerful than a digital computer of the same size.



A quantum nanocomputer would work by storing data in the form of atomic quantum states or spin. Technology of this kind is already under development in the form of single-electron memory (SEM) and quantum dots. By means of quantum mechanics, waves would store the state of each nanoscale component and information would be stored as the spin orientation or state of an atom. With the correct setup, constructive interference would emphasize the wave patterns that held the right answer, while destructive interference would prevent any wrong answers.
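A tiny state-vector sketch (an illustration only, not a design for any nanocomputer) shows both ingredients mentioned above: a superposition created from a single qubit, and destructive interference that suppresses the "wrong" outcome when the same operation is applied twice.

```python
import numpy as np

# Single-qubit state-vector sketch: |0> -> Hadamard -> equal superposition,
# and a second Hadamard makes the |1> amplitudes interfere destructively,
# returning the qubit to |0> with certainty.

ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)   # Hadamard gate

superposition = H @ ket0
print("after one H :", superposition, "-> P(0), P(1) =", np.abs(superposition) ** 2)

back_to_zero = H @ superposition
print("after two H :", back_to_zero, "-> P(0), P(1) =", np.abs(back_to_zero) ** 2)
```

After one Hadamard the measurement probabilities are 50/50; after the second, the amplitudes for |1> cancel and the qubit is measured as 0 every time, which is the constructive/destructive interference the paragraph describes.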



The energy state of an electron within an atom, represented by the electron energy level or shell, can theoretically represent one, two, four, eight, or even 16 bits of data. The main problem with this technology is instability. Instantaneous electron energy states are difficult to predict and even more difficult to control. An electron can easily fall to a lower energy state, emitting a photon; conversely, a photon striking an atom can cause one of its electrons to jump to a higher energy state.



I promise to write a series of articles on quantum issues very shortly...

The Future of Computers - Mechanical nanocomputers

Pioneers such as Eric Drexler proposed as far back as the mid-1980s that nanoscale mechanical computers could be built via molecular manufacturing, through a process of mechanically positioning atoms or molecular building blocks one at a time, known as “mechanosynthesis.”


Once assembled, the mechanical nanocomputer, other than being greatly scaled down in size, would operate much like a complex, programmable version of the mechanical calculators used during the 1940s to 1970s, preceding the introduction of widely available, inexpensive solid-state electronic calculators.

Drexler's theoretical design used rods sliding in housings, with bumps that would interfere with the motion of other rods. It was a mechanical nanocomputer that used tiny mobile components, called nanogears, to encode information.


Drexler and his collaborators favored designs resembling miniature versions of Charles Babbage's 19th-century Analytical Engine: mechanical nanocomputers that would calculate using moving molecular-scale rods and rotating molecular-scale wheels, spinning on shafts and bearings.


For this reason, mechanical nanocomputer technology has sparked controversy and some researchers even consider it unworkable. All the problems inherent in Babbage's apparatus, according to the naysayers, are magnified a million fold in a mechanical nanocomputer. Nevertheless, some futurists are optimistic about the technology, and have even proposed the evolution of nanorobots that could operate, or be controlled by, mechanical nanocomputers.

The Future of Computers - Chemical and Biochemical nanocomputers

Chemical Nanocomputer


In general terms, a chemical computer is one that processes information by making and breaking chemical bonds, and it stores logic states or information in the resulting chemical (i.e., molecular) structures. In a chemical nanocomputer, computing is based on chemical reactions (bond breaking and forming); the inputs are encoded in the molecular structure of the reactants, and the outputs can be extracted from the structure of the products. In these computers, the interaction between different chemicals and their structures is used to store and process information.

nanotubes

These computing operations would be performed selectively among molecules taken just a few at a time, in volumes only a few nanometers on a side. To create a chemical nanocomputer, engineers therefore need to be able to control individual atoms and molecules so that they can be made to perform controllable calculations and data-storage tasks. The development of a true chemical nanocomputer will likely proceed along lines similar to genetic engineering.
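The idea that inputs live in reactant structures and outputs in product structures can be caricatured in a few lines of code. This is a conceptual toy with invented molecule names, not real chemistry: the presence of two reactant molecules acts like an AND gate whose output is the product molecule.

```python
# Toy "chemical logic" sketch: inputs are the presence of reactant molecules,
# the output is the presence of a product. Purely conceptual - real chemical
# computing depends on actual reaction kinetics, not a lookup table.

REACTIONS = {
    frozenset({"A", "B"}): "C",   # A + B -> C  (behaves like an AND gate)
}

def react(present_molecules):
    products = set()
    for reactants, product in REACTIONS.items():
        if reactants <= present_molecules:
            products.add(product)
    return products

print(react({"A", "B"}))   # {'C'}  -> logical 1
print(react({"A"}))        # set()  -> logical 0
```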

nanotubes

The Future of Computers - Electronic nanocomputers

Thanks to fifty years of experience with electronic computing devices, including the extensive research and industrial infrastructure built up since the late 1940s, electronic nanocomputers appear to present the easiest and most likely direction in which to continue nanocomputer development in the near future.
Electronic nanocomputers would operate in a manner similar to the way present-day microcomputers work. The main difference is one of physical scale. More and more transistors are squeezed into silicon chips with each passing year; witness the evolution of integrated circuits (ICs) capable of ever-increasing storage capacity and processing power.

The ultimate limit to the number of transistors per unit volume is imposed by the atomic structure of matter. Most engineers agree that technology has not yet come close to pushing this limit. In the electronic sense, the term nanocomputer is relative; by 1970s standards, today's ordinary microprocessors might be called nanodevices.

How it works


The power and speed of computers have grown rapidly because of rapid progress in solid-state electronics dating back to the invention of the transistor in 1947. Most important, there has been an exponential increase in the density of transistors on integrated-circuit computer chips over the past 40 years. In that time span, though, there has been no fundamental change in the operating principles of the transistor.
Even microelectronic transistors no more than a few microns (millionths of a meter) in size are bulk-effect devices. They still operate using small electric fields imposed by tiny charged metal plates to control the mass action of many millions of electrons.

Although electronic nanocomputers will not use the traditional concept of the transistor for their components, they will still operate by storing information in the positions of electrons.

At the current rate of miniaturization, conventional transistor technology will reach a minimum size limit in a few years. At that point, small-scale quantum mechanical effects, such as the tunneling of electrons through barriers made from matter or electric fields, will begin to dominate the effects that permit a mass-action semiconductor device to operate. Still, an electronic nanocomputer will continue to represent information in the storage and movement of electrons.
Nowadays, most electronic nanocomputers are created through microscopic circuits using nanolithography.

The Future of Computers - Nanocomputers

Scientific discussion of the development and fabrication of nanometer-scale devices began in 1959 with an influential lecture by the late, renowned physicist Richard Feynman. Feynman observed that it is possible, in principle, to build and operate submicroscopic machinery. He proposed that large numbers of completely identical devices might be assembled by manipulating atoms one at a time.


Feynman's proposal sparked an initial flurry of interest, but it did not broadly capture the imagination of the technical community or the public. At the time, building structures one atom at a time seemed out of reach. Throughout the 1960s and 1970s, advances in diverse fields prepared the scientific community for the first crude manipulations of nanometer-scale structures. The most obvious development was the continual miniaturization of digital electronic circuits, based primarily upon the invention of the transistor by Shockley, Brattain, and Bardeen in 1947 and the invention of the integrated circuit by Noyce, Kilby, and others in the late 1950s. In 1959, it was only possible to put one transistor on an integrated circuit. Twenty years later, circuits with a few thousand transistors were commonplace.


Scientists are trying to use nanotechnology to make very tiny chips, electrical conductors and logic gates. Using nanotechnology, chips can be built up one atom at a time, with no wasted space, enabling much smaller devices to be built. With this technology, logic gates would be composed of just a few atoms, electrical conductors (called nanowires) would be merely an atom thick, and a data bit would be represented by the presence or absence of an electron.

A nanocomputer is a computer whose physical dimensions are microscopic; smaller than the microcomputer, which is smaller than the minicomputer. (The minicomputer is called "mini" because it was a lot smaller than the original (mainframe) computers.)

A component of nanotechnology, nanocomputing will, as suggested or proposed by researchers and futurists, give rise to four types of nanocomputers: electronic, chemical and biochemical, mechanical, and quantum nanocomputers.


Keep reading the next posts…

Nanotechnology

What is nanotechnology?


Nanotechnology (sometimes shortened to "nanotech") is the study of manipulating matter on an atomic and molecular scale; in other words, it is the engineering of tiny machines — the projected ability to build things from the bottom up, using techniques and tools being developed today to make complete, highly advanced products.

nanotechnology


In order to understand the unusual world of nanotechnology, we need to get an idea of the units of measure involved. A centimeter is one-hundredth of a meter, a millimeter is one-thousandth of a meter, and a micrometer is one-millionth of a meter, but all of these are still huge compared to the nanoscale. A nanometer (nm) is one-billionth of a meter, smaller than the wavelength of visible light and a hundred-thousandth the width of a human hair.

As small as a nanometer is, it's still large compared to the atomic scale. An atom has a diameter of about 0.1 nm. An atom's nucleus is much smaller -- about 0.00001 nm.

nanotechnology

Generally, nanotechnology deals with structures sized between 1 and 100 nanometers in at least one dimension, and involves developing materials or devices within that size range. Quantum mechanical effects are very important at this scale, which is in the quantum realm.

The Future of Computers - Overview

Computers' Evolution


The history of computers and computer technology thus far has been a long and fascinating one, stretching back more than half a century to the first primitive computing machines. These machines were huge and complicated affairs, consisting of row upon row of vacuum tubes and wires, often encompassing several rooms.



In the past twenty years, there has been a dramatic increase in the processing speed of computers, network capacity and the speed of the Internet. These advances have paved the way for revolutions in fields such as quantum physics, artificial intelligence and nanotechnology, and they will have a profound effect on the way we live and work; the virtual reality we see in movies like The Matrix may actually come true in the next decade or so.
Today's computers operate using transistors, wires and electricity. Future computers might use atoms, fibers and light. Take a moment to consider what the world might be like if computers the size of molecules became a reality. These are the types of computers that could be everywhere, but never seen: nano-sized bio-computers that could target specific areas inside your body, giant networks of computers in your clothing, your house, your car - entrenched in almost every aspect of our lives, and yet you may never give them a single thought.



As anyone who has looked at the world of computers lately can attest, the size of computers has been reduced sharply, even as the power of these machines has increased at an exponential rate. In fact, the cost of computers has come down so much that many households now own not just one, but two, three or even more PCs.
As the world of computers and computer technology continues to evolve and change, many people, from science fiction writers and futurists to computer workers and ordinary users have wondered what the future holds for the computer and related technologies. Many things have been pictured, from robots in the form of household servants to computers so small they can fit in a pocket. Indeed, some of these predicted inventions have already come to pass, with the introduction of PDAs and robotic vacuum cleaners.

Understanding the theories behind these future computer technologies is not for the meek. My research into quantum computers was made all the more difficult after I learned that in light of her constant interference, it is theoretically possible my mother-in-law could be in two places at once.

Nanotechnology is another important part of the future of computers, expected to have a profound impact on people around the globe. Nanotechnology is the process whereby matter is manipulated at the atomic level, providing the ability to “build” objects from their most basic parts. Like robotics and artificial intelligence, nanotechnology is already in use in many places, providing everything from stain resistant clothing to better suntan lotion. These advances in nanotechnology are likely to continue in the future, making this one of the most powerful aspects of future computing.

In the future, the number of tiny but powerful computers you encounter every day will number in the thousands, perhaps millions. You won't see them, but they will be all around you. Your personal interface to this powerful network of computers could come from a single computing device that is worn on or in the body.

Quantum computers are also likely to transform the computing experience, for both business and home users. These powerful machines are already on the drawing board, and they are likely to be introduced in the near future. The quantum computer is expected to be a giant leap forward in computing technology, with exciting implications for everything from scientific research to stock market predictions.

Moore's law


Visit any site on the web writing about the future of computers and you will most likely find mention of Moore's Law. Moore's Law is not a strictly adhered-to mathematical formula, but a prediction made by Intel co-founder Gordon Moore in 1965, in a paper where he noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965, and predicted that the trend would continue "for at least ten years".



Moore predicted that computing technology would increase in value while decreasing in cost, describing a long-term trend in the history of computing hardware. More specifically, innovation would allow the number of transistors that can be placed inexpensively on an integrated circuit to double approximately every two years. The trend has continued for more than half a century and is not expected to stop until 2015 or later.

A computer transistor acts like a small electronic switch. Just like the light switch on your wall, a transistor has only two states, On or Off. A computer interprets this on/off state as a 1 or a 0. Put a whole bunch of these transistors together and you have a computer chip. The central processing unit (CPU) inside your computer probably has around 500 million transistors.
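As a back-of-the-envelope illustration of what doubling every two years means, a few lines of code project the transistor count forward. The starting figure of 500 million is just the rough number quoted above, and the starting year is an assumption based on when this post was written; this is arithmetic, not a forecast of any real product line.

```python
# Rough Moore's-law projection: start from ~500 million transistors
# (the ballpark CPU figure quoted above) and double every two years.
# Purely illustrative arithmetic.

transistors = 500_000_000
year = 2012  # assumed approximate time of writing

for _ in range(5):
    year += 2
    transistors *= 2
    print(f"{year}: ~{transistors / 1e9:.1f} billion transistors")
```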

Shrinking transistor size not only makes chips smaller, but faster. One benefit of packing transistors closer together is that the electronic pulses take less time to travel between transistors. This can increase the overall speed of the chip. The capabilities of many digital electronic devices are strongly linked to Moore's law: processing speed, memory capacity, sensors and even the number and size of pixels in digital cameras. All of these are improving at (roughly) exponential rates as well.



Not everyone agrees that Moore's Law has been accurate throughout the years (the prediction has changed since its original version), or that it will hold true in the future. But does it really matter? The pace at which computers are doubling their smarts is fast enough for me.

Thanks to the innovation and drive of Gordon Moore and others like him, computers will continue to get smaller, faster and more affordable.

The future of cyberspace


Is cyberspace infinite?


Of all the things we now take for granted, cyberspace is near the top of the list. The promise of the Internet for the twenty-first century is to make everything always available to everyone everywhere. All of human culture and achievement, the great and the not so great, may, one day soon, be just a click away.
cyberspace

When one is online, cyberspace can seem a lot like outer space or, to use the latest jargon, 'the cloud'. It appears infinite and ethereal. The information is simply out there. If, instead, we thought more about the real-world energy and the real estate that the Internet uses, we would start to realize that things are not so simple. Cyberspace is in fact physical space. And the longer it takes us to change our concept of the Internet—to see quite clearly its physical there-ness—the closer we'll get to blogging our way to oblivion.
cyberspace