A Brief History Of Information

 

Information's natural state used to be one of motion, of activity. Information is generated by interactions; information is interaction. Without comparison, without a context, without interaction, there is nothing.

There is no temperature with only hot. There is no darkness without light. In quantum physics a particle has no identity until it meets another and its superposed wave function collapses.

It is like that with all information. Information cannot exist without context.


Life appeared and information could be stored in useful form. A cratered moon records the history of the impacts, but it does so by accident, as a by-product of the events, not by design.


And along came one of the greatest inventions of mankind: writing, the freezing of information.

Not in a perfect, complete form, as that would require freezing and including all of the information's context, which would have to be all information everywhere; it would have to be everything, and if that were done the information would of course cease to exist.

With writing, information could survive in an incomplete form pretty much permanently, degraded only by its physical media and the outside world's ability to access it, altered through changes in language, customs and culture as well as physical accessibility. From the human perspective, though, written information becomes, for most practical purposes, solid.

"In the Phaedrus, Plato argued that the new arrival of writing would revolutionize culture for the worst. He suggested that it would substitute reminiscence for thought and mechanical learning for the true dialect of the living quest for truth by discourse and conversation." Marhall McLuhan1954 From "Essential McLuhan"Edited by Eric McLuhan & Frank Zingrone p 285.

The only time it would thaw was when someone read it and re-introduced it into the dynamic environment of their mind.


This changed again with the advent of computers, when the frozen information could be manipulated in chunks. Assembly-line manipulation became possible. Databases could be organized and re-organized, yielding new information in their relationships at every turn. Programs and procedures could be devised with the confidence that the machines would tirelessly follow them to the letter. Impossibly boring manipulations with potentially exciting results became first promised, then practical, then routine.

The primitive PCs of the '80s and early '90s empowered the individual. They enabled us to do, well, more. More of what was previously segregated into specialist fields. We could publish magazines from home! We could solve amazing equations, do our own complex financial planning! Design like there was no tomorrow. Write print-quality letters. We became, in effect, our own secretaries.

"As technology advances, it reverses the characteristics of every situation again and again. The age of automation is going to be the age of "do it yourself"." McLuhan in 1957 Edited by Eric McLuhan & Frank Zingrone "Essential McLuhan" Routledge 1997 p283.

 

 

 



 


A Brief History Of Computation: Moore's Law

Speed matters: Computers double in speed every 18 months.

That's Moore's Law, named after Gordon Moore (co-founder of Intel in 1968). "Moore is widely known for "Moore's Law," in which he predicted that the number of transistors that the industry would be able to place on a computer chip would double every year. In 1975, he updated his prediction to once every two years. While originally intended as a rule of thumb in 1965, it has become the guiding principle for the industry to deliver ever-more-powerful semiconductor chips at proportionate decreases in cost." From his Executive Bio.

"In 1965, Gordon Moore was preparing a speech and made a memorable observation. When he started to graph data about the growth in memory chip performance, he realized there was a striking trend. Each new chip contained roughly twice as much capacity as its predecessor, and each chip was released within 18-24 months of the previous chip. If this trend continued, he reasoned, computing power would rise exponentially over relatively brief periods of time. Moore's observation, now known as Moore's Law, described a trend that has continued and is still remarkably accurate. It is the basis for many planners' performance forecasts. In 26 years the number of transistors on a chip has increased more than 3,200 times, from 2,300 on the 4004 in 1971 to 7.5 million on the Pentium® II processor." From http://www.intel.com/intel/museum/25anniv/hof/moore.htm

Ray Kurzweil takes it further, arguing that it has been this way for a lot longer than we first thought: the trend did not start when it was noticed, in the sixties.

To understand the implications of this exponential growth, let's go back to 1984, the birth of the Macintosh, a year dear to me, the year computers, and we, got liberated from the text-based interface. Let's say we put one dollar in the bank back then. And let's say this bank gave interest in line with the speed of computer evolution. We are in 1999 (at the time of writing, anyway), and that was 1984, so that's 15 years, right? OK, that's seven and a half times our dollar has doubled in value. It started at $1 and in 24 months it was worth $2, then 4, 8, 16, 32, 64, 128, finally to be worth $256 at the next count ($192 now).

So guess what kind of money we are looking at for the next couple of years? Have a look at the table below, which is based on numbers taken from "The Age of Spiritual Machines" by Ray Kurzweil.

Current new machines costing a thousand US dollars or so have the processing power of an insect brain. In ten years we will be able to spend the same amount of money and get the processing power of a mouse brain (with a bank balance then at 8,192 dollars). We will be able to buy the equivalent processing power of a human brain in 2023 (with over half a million in the bank). In 2060 we will be able to get a machine with the processing power of all human brains. Our bank balance would be 137,216 million dollars at that point!

 

Year      1984   1986   1988   1990   1992   1994   1996   1998   2000   2002
$         1      2      4      8      16     32     64     128    256    512
Capacity: insect brain around 2000

Year      2004     2006     2008     2010     2012     2014     2016     2018     2020     2022
$         1,024    2,048    4,096    8,192    16,384   32,768   65,536   131,072  262,144  524,288
Capacity: mouse brain around 2010; human brain around 2023

Year      2024   2026   2028   2030   2032   2034   2036   2038   2040   2042
$         1 m    2 m    4 m    8 m    16 m   33 m   67 m   134 m  269 m  536 m

Year      2046       2048       2050       2052       2054       2056       2058       2060
$         1,072 m    2,144 m    4,288 m    8,576 m    17,152 m   34,304 m   68,608 m   137,216 m
Capacity: all human brains around 2060
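
The dollar column is nothing more than repeated doubling: one doubling every 24 months, starting from a single 1984 dollar. A minimal Python sketch of that arithmetic, reproducing the first three rows of dollar figures above (the insect, mouse and human-brain labels are Kurzweil's capacity estimates, not something the arithmetic produces):

    # One dollar in 1984, doubling every 24 months, as in the table above.
    start_year, start_value = 1984, 1

    for year in range(start_year, 2044, 2):
        doublings = (year - start_year) // 2
        print(f"{year}: ${start_value * 2 ** doublings:,}")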

 

Every decade someone predicts the slowdown of Moore's Law, but so far it's just kept on going. New Scientist magazine (http://www.newscientist.com) has published an interesting article in which Moore's Law is pushed to the extremes of our current understanding of science:

"'People have been claiming the law is about to break down every decade since it was formulated,' says Seth Lloyd, a physicist based at MIT. 'But they've all been wrong. I thought, let's see where Moores law has to stop and can go no further." From The Last Computer. 2 September 2000 page 26.

"To begin with, he wasn't too concerned with the details of how the ultimate computer might work- those can be sorted out by the engineers of the future. Instead he stuck to considering basic physical quantities such as energy, volume and temperature (Nature vol 406 p1047). The speed of a computer Lloyd realized, is limited by the total energy available to it. The argument for this is rather subtle. A computer performs a logical operation by flipping a '0' to a '1' or vice versa. But there is a limit to how fast this can be done because of the need to change a physical state representing a '0' to a state representing a '1'."

"In the quantum world any object, including a computer, is simply a packet of waves of various frequencies all superimposed. Frequency is linked to energy by Planck's constant, so if the wave packet has a wide range of energies, it is made up of a large range of different frequencies. As these waves interfere with one another, the overall amplitude can change very fast. On the other hand, a small energy spread means a narrow range of frequencies , and much slower change in state."

"Because a computer can't contain negative energies, the spread in energy of a bit cannot be greater then its total energy. In 1998, Norman Margolus and Lev Levtin of MIT calculated that the minimum time for a bit to flip is Planck's constant divided by four times the energy."

"Lloyd had built on Margolus's work by considering a hypothetical 1-kilogram laptop. Then the maximum energy available is a quantity famously given by the formula E=mc2 If this mass-energy were turned into a form such as radiant energy, you'd have 1017 joules in photons, says Lloyd."

The article goes on to imagine a computer which would be able to carry out 10⁵¹ operations a second. Compare that to a current computer's processing capacity of about 10¹² operations a second.
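
For what it's worth, that 10⁵¹ figure can be checked on the back of an envelope from the quantities quoted above: the mass-energy of a one-kilogram computer and the h/(4E) flip time. A rough sketch, using nothing beyond standard constants and that formula:

    # Rough check of the "ultimate laptop" figure quoted above:
    # a 1 kg computer with total energy E = m*c^2 and a minimum
    # bit-flip time of t = h / (4*E).
    h = 6.626e-34   # Planck's constant, J*s
    c = 3.0e8       # speed of light, m/s
    m = 1.0         # mass of the hypothetical laptop, kg

    E = m * c ** 2          # ~9e16 J, the article's ~10^17 joules
    t_flip = h / (4 * E)    # ~1.8e-51 s per bit flip
    print(f"operations per second: {1 / t_flip:.1e}")  # ~5e50, of order 10^51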

As for memory, the article jumps straight to Boltzmann's constant. "What limits memory? The short answer is entropy."

"Entropy is intimately connected to information, because information needs disorder: a smooth, ordered system has almost no information content"

"Entropy is linked to the number of distinguishable states a system can have by the equation inscribed on Boltsmann's headstone S=K ln W. Entropy (S) is the natural logarithm of the number of states (W) multiplied by Boltzmann's constant (k). Equally, to store a lot of information you need a lot of indistinguishable states."

"To register one bit of information you need two states, one representing 'on,' the other 'off.' Similarly, 2 bits require 4 states and so on. In short, both the entropy of a system and its information content are proportional to the logarithm of both states."

Discussing the speed of information transfer, constrained by the speed of light, versus information capacity, the article even touches on making the computer into a black hole. But that radical solution would be a one-off computer, "exploding with the answer to its calculation." The 1-kilogram non-black-hole computer is capable of storing 10³¹ bits.

The temperature of the computer? About a billion degrees.

Estimated time to build the ultimate computer within the realm of our current understanding of the laws of physics? 200 years...

 

 



 

 

 

A Brief History Of The Internet

1957 - The USSR launches the first Sputnik satellite into orbit around the Earth. As a countermeasure and tactical advancement, the US Department of Defense forms the Advanced Research Projects Agency [ARPA], prompted by military advisors' unease about the existing forms of communication in battle or disaster conditions.
1961 - Leonard Kleinrock publishes his theories on packet switching.
1962 - J.C.R. Licklider of MIT writes a series of memos in August discussing his "Galactic Network" concept.
1965 - Paul Baran writes a paper, "On Distributed Communications Networks".
1965 - ARPA funds a fact-finding study on a "network of time-sharing computers".
1967 - ARPA Principal Investigators hold their semi-annual meeting at the University of Michigan.
1968 - After Roberts and the DARPA-funded community have refined the overall structure and specifications for the ARPANET, DARPA releases an RFQ. The contract to formulate and build the network hardware and programs is won by BBN.
1969 - The University of California, Los Angeles, contracted by ARPA to be the Network Measurement Centre, goes live on 1 September.
Doug Engelbart's group at SRI, contracted by ARPA to be the Network Information Centre, hooks up on 1 October, becoming the second machine on the network and breathing life into the network for the first time.
1972 - The International Network Working Group (INWG) is formed to help establish protocols.
1974 - Vint Cerf and Bob Kahn publish a paper outlining the Transmission Control Program (TCP) for the first time, allowing for an inter-net: networking across different networks.
1989 - Tim Berners-Lee writes the initial proposal, including HTTP (HyperText Transfer Protocol), for what will become the World Wide Web.
1990 - The ARPANET is decommissioned and ceases to exist. The first GUI Web browser is developed by Tim Berners-Lee.
1993 - Mosaic takes the Internet by storm as the best browser around (and the only easy-to-use one). Both Microsoft's Internet Explorer and Netscape were originally based on Mosaic.
2000 - The Internet is a part of daily life for most people in the developed world.

"Web sites appear faster than Starbucks outlets, showing up at a rate of more than 4,400 per day. As of 1999 there were 3.6 million sites. The number of Web pages, which may be the best gauge of size, has also skyrocketed in the last 12 months. NEC Research says there are around 1.5 billion Web pages, an 88 percent increase from 1998. This suggests 1.9 million Web pages are created each day. IDC expects the number to hit 8 billion in 2002, exceeding the world's population."

"Domain-name registrar Network Solutions recently said it registered 4.7 million new domain names in 1999. That's more than double NSI's registrations in 1998, and brings total registrations to about 8.1 million. While this indicates extraordinary growth and a lot of new sites in the pipeline, the Online Computer Library Center found only 3.6 million sites in September, less than half of what NSI has registered to date. Moreover, the OCLC reports that almost one-third of Web sites are either transitory or unfinished, suggesting that a large portion of the registered names are just placeholders."

"The Internet Software Consortium's January crawl of the Web found the number of Internet domain hosts, or live IP addresses at any one time, has climbed to 72.4 million, an increase of 67 percent in the last year. Top-level domains are experiencing growth across the spectrum, with the .coms seeing a 105-percent boost. Although its lead has slid 4 percent since 1998, the U.S. is still the definitive leader in site ownership, with 55 percent of all public sites. The Western states account for the majority of those sites. But only a handful of sites get any attention: Alexa Internet found that 80 percent of Web traffic goes to only 15,000, or 0.4 percent, of all sites. " From David Lake at The Standard.

 

 

 



 

 

 

resulting in: The Information Explosion

"More information has been produced in the last 30 years than in the previous 5,000. About 1,000 books are published internationally every day, and the total of all printed knowledge doubles every eight years" according to Peter Large (Information Anxiety).

Information waits for no man.

Vannevar Bush raised the alarm in The Atlantic Monthly as early as 1945: "Thus far we seem to be worse off than ever before - for we can enormously extend the record; yet even in its present bulk we can hardly consult it."

And the wired world certainly hasn't slowed down the information onslaught, as Evan I. Schwartz's mind-boggling numbers bring it home in his book Webonomics: "Roy Williams, a researcher at the California Institute of Technology's Center for Advanced Computing Research, estimates that all the information from all of human history stored on paper in the world today amounts to about 200 petabytes. A byte roughly equals a printed character. So a petabyte is about one quadrillion (or thousand trillion) characters. That figure includes all the paper in corporate filing cabinets, all government archives, all homes, all schools, universities, and libraries."

"By the year 2000, Williams estimates, the amount of online information that will have accumulated in just a the few decades leading up to the new millennium will be about two and a half times that amount now on paper. "

As Marshall McLuhan put it on The Best of Ideas on CBC Radio in 1967: "One of the effects of living with electric information is that we live habitually in a state of information overload. There's always more than you can cope with".

Richard Saul Wurman, in his book What-If, Could Be, puts it in maybe even apter, but bleaker, terms: "Everyone spoke of an information overload, but what there was in fact was a non-information overload."

Trying to stay afloat is not the answer; you need to start swimming in the information.

 

 

 



 

 

 

The Next Step: Liquid

Information has changed.

Originally, information could only exist directly; then it could be frozen, or stored, as discussed earlier.

When the information then gets melted, when it gets digitized, it doesn't revert to its earlier state; it becomes liquid. It doesn't behave in any previously inherent ways; it gathers a new, relative identity.

Previously its identity was in relation to its physical world. In cyberspace there are new relationships, relationships with fewer physical constraints. Depending on the forces it interacts with, it can go anywhere, be processed in any way and change into anything. This is both good and bad.

With computers, you can make music from a picture, paint a picture from sound. You can treat any information in any way to interact with any other information.

The Liquid Information Company was founded not to make things more liquid, but to integrate the aspects and characteristics of the physical world, including those of people, into the liquiscape.

Digital information has characteristics far beyond its name, the ones and zeros, just as you have characteristics beyond your atoms, even your genes. It has characteristics emergent from its environment, which is games and word processors, databases, the Internet and more. Currently, the information is getting liquid within specific environments. The emergent characteristics are constrained by its history and by how we are used to interacting with what we perceive that kind of information to be: the email, the game, the word processor etc. At the same time there is a mad rush to integrate these different environments: watching TV on your computer monitor, getting email on your mobile phone, surfing the web on TV etc.

The point though, from the human perspective and not the machine's, is that integration and flow, greater liquidity, are all well and fine, but it'll only be an information tidal wave if the real world and the human are ignored.

Our job is to increase the liquification not just of information, but of how it behaves in relation to us: real-world humans, the real-world constraint computers forgot.

 







 

©1995-2001 The Liquid Information Company