Thursday 11 August 2011

Future of the Web: Semantic Web

The Semantic Web is an idea of World Wide Web inventor Tim Berners-Lee that the
Web as a whole can be made more intelligent and perhaps even intuitive about how to
serve a user's needs. Berners-Lee observes that although search engines index much of
the Web's content, they have little ability to select the pages that a user really wants or
needs. He foresees a number of ways in which developers and authors, singly or in
collaborations, can use self-descriptions  and other techniques so that context-
understanding programs can selectively find what users want.
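The following small sketch (added for illustration; it is not from the article) shows, in Python, one way a context-understanding program could use machine-readable "self-descriptions". Each page is described by simple subject-property-value statements, a simplified stand-in for Semantic Web formats such as RDF; the URLs and property names here are invented for the example.

    # Toy "self-descriptions": each page states what it is about and for whom.
    triples = [
        ("http://example.org/page1", "topic",    "quantum computing"),
        ("http://example.org/page1", "audience", "beginner"),
        ("http://example.org/page2", "topic",    "gardening"),
        ("http://example.org/page2", "audience", "expert"),
    ]

    def pages_about(topic):
        # Select pages whose self-description says they cover the given topic.
        return [s for (s, p, o) in triples if p == "topic" and o == topic]

    print(pages_about("quantum computing"))   # ['http://example.org/page1']

A real Semantic Web agent would work with standardized vocabularies and query languages rather than a hand-written list, but the selection step is the same idea: match descriptions against the user's need instead of scanning page text.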

Who invented the Web & Why?

"CERN is a meeting place for physicists from all over the world, who collaborate on
complex physics, engineering and information handling projects. Thus, the need for the
WWW system arose "from the geographical dispersion of large collaborations, and the
fast turnover of fellows, students, and visiting scientists," who had to get "up to speed
on projects and leave a lasting contribution before leaving."
CERN possessed both the financial and computing resources necessary to start the
project. In the original proposal, Berners-Lee outlined two phases of the project:
First, CERN would "make use of existing software and hardware as well as
implementing simple browsers for the user's workstations, based on an analysis of the
requirements for information access needs by experiments."
Second, they would "extend the application area by also allowing the users to add new
material."
Berners-Lee expected each phase to take three months "with the full manpower
complement": he was asking for four software engineers and a programmer. The
proposal talked about "a simple scheme to incorporate several different servers of
machine-stored information already available at CERN." 

What is Home Page of a web site?

1) For a Web user, the home page is the first Web page that is displayed after starting a
Web browser like Netscape's Navigator or Microsoft's Internet Explorer. The browser is
usually preset so that the home page is a page chosen by the browser manufacturer.
However, you can set it to open to any Web site. For example, you can specify that
"http://www.yahoo.com" or "http://whatis.com" be your home page. You can also
specify that there be no home page (a blank space will be displayed), in which case you
choose the first page from your bookmark list or enter a Web address.
2) For a Web site developer, a home page is the first page presented when a user selects
a site or presence on the World Wide Web. The usual address for a Web site is the home
page address, although you can enter the address (Uniform Resource Locator) of any
page and have that page sent to you. 

What is a Web site?

A Web site is a related collection of World Wide Web (WWW) files that includes a
beginning file called a home page. A company or an individual tells you how to get to
their Web site by giving you the address of their home page. From the home page, you
can get to all the other pages on their site. For example, the Web site for IBM has the
home page address http://www.ibm.com. IBM's home page address leads to
thousands of pages, but a Web site can also be just a few pages.

URL

URL (Uniform Resource Locator, previously Universal Resource Locator) - pronounced
YU-AHR-EHL or, in some quarters, UHRL -  is the address of a file (resource)
accessible on the Internet. The type of file or resource depends on the Internet
application protocol. Using the World Wide Web's protocol, the Hypertext Transfer
Protocol (HTTP), the resource can be an HTML page (like the one you're reading), an
image file, or any other file supported by HTTP. The URL contains the name of the
protocol required to access the resource, a  domain name that identifies a specific
computer on the Internet, and a path name (hierarchical description of a file location) on
the computer.
On the Web (which uses the Hypertext Transfer Protocol), an example of a URL is:
       http://www.ietf.org/rfc/rfc2396.txt
This URL describes a Web page to be accessed with an HTTP (Web browser) application
that is located on a computer named www.ietf.org. The path name for the specific file on
that computer is /rfc/rfc2396.txt.
An HTTP URL can point to any Web page or individual file, not just a home page.
Examples:
http://dawn.com
http://www.vu.edu.pk
http://www.smeda.org.pk
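As a quick aside (not part of the original text), Python's standard urllib shows how such a URL breaks into the three parts described above - protocol, domain name, and path:

    from urllib.parse import urlparse

    parts = urlparse("http://www.ietf.org/rfc/rfc2396.txt")
    print(parts.scheme)   # 'http'             - the protocol used to access the resource
    print(parts.netloc)   # 'www.ietf.org'     - the domain name of the computer
    print(parts.path)     # '/rfc/rfc2396.txt' - the path name of the file on that computer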

World Wide Web

Browser 
A browser is an application program that provides a way to look at and interact with all
the information on the World Wide Web. The word "browser" seems to have originated
prior to the Web as a generic term for user interfaces that let you browse (navigate
through and read) text files online. By the time the first Web browser with a graphical
user interface was generally available (Mosaic, in 1993), the term seemed to apply to Web
content, too. Technically, a Web browser is a client program that uses the Hypertext
Transfer Protocol (HTTP) to make requests of Web servers throughout the Internet on
behalf of the browser user.
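As an illustration (added here, not from the original text), the sketch below uses only Python's standard http.client to make the same kind of HTTP request a browser issues on the user's behalf; the exact response depends on the server (many servers now redirect plain HTTP to HTTPS, so the status may be a redirect rather than 200):

    import http.client

    conn = http.client.HTTPConnection("www.ietf.org")   # contact the Web server
    conn.request("GET", "/rfc/rfc2396.txt")             # ask for one resource by its path
    response = conn.getresponse()
    print(response.status, response.reason)             # e.g. 200 OK, or a redirect code
    print(response.read(200))                           # the first bytes of the returned page
    conn.close()

A graphical browser does the same thing, then renders the bytes it receives (HTML, images, and so on) instead of printing them.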

Quantum Computing with Molecules

by Neil Gershenfeld and Isaac L. Chuang
Factoring a number with 400 digits--a numerical feat needed to break some security
codes--would take even the fastest supercomputer in existence billions of years. But a
newly conceived type of computer, one that exploits quantum-mechanical interactions,
might complete the task in a year or so, thereby defeating many of the most
sophisticated encryption schemes in use. Sensitive data are safe for the time being,
because no one has been able to build a practical quantum computer. But researchers
have now demonstrated the feasibility of this approach. Such a computer would look
nothing like the machine that sits on your desk; surprisingly, it might resemble the cup of
coffee at its side. 
Several research groups believe quantum computers based on the molecules in a liquid
might one day overcome many of the limits facing conventional computers. Roadblocks
to improving conventional computers will ultimately arise from the fundamental physical
bounds to miniaturization (for example, because transistors and electrical wiring cannot
be made slimmer than the width of an atom). Or they may come about for practical
reasons--most likely because the facilities for fabricating still more powerful microchips
will become prohibitively expensive. Yet  the magic of quantum mechanics might solve
both these problems. 

World Wide Web – 1989

"CERN is a meeting place for physicists from all over the world, who collaborate on
complex physics, engineering and information handling projects. Thus, the need for the
WWW system arose "from the geographical dispersion of large collaborations, and the
fast turnover of fellows, students, and visiting scientists," who had to get "up to speed
on projects and leave a lasting contribution before leaving." 
CERN possessed both the financial and computing resources necessary to start the
project. In the original proposal, Berners-Lee outlined two phases of the project: 
First, CERN would "make use of existing software and hardware as well as
implementing simple browsers for the user's workstations, based on an analysis of the
requirements for information access needs by experiments." 
Second, they would "extend the application area by also allowing the users to add new
material." 
Berners-Lee expected each phase to take three months "with the full manpower
complement": he was asking for four software engineers and a programmer. The
proposal talked about "a simple scheme to incorporate several different servers of
machine-stored information already available at CERN."
Launched in 1989, the WWW quickly gained great popularity among Internet users. For
instance, at 11:22 a.m. on April 12, 1995, the WWW server at the SEAS of the University
of Pennsylvania responded to 128 requests in one minute.

Apple Macintosh – 1984

Apple introduced the Macintosh to the nation on January 22, 1984. The original
Macintosh had 128 kilobytes of RAM, although this first model was simply called
"Macintosh" until the 512K model came out in September 1984. The Macintosh retailed
for $2495. It wasn't until the Macintosh that the general population really became aware
of the mouse-driven graphical user interface.

IBM PC – 1981

On August 12, 1981, IBM released its new computer, the IBM PC. The "PC" stood for
"personal computer," making IBM responsible for popularizing the term "PC".
The first IBM PC ran on a 4.77 MHz Intel 8088 microprocessor. It came equipped with
16 kilobytes of memory, expandable to 256 KB, along with one or two 160 KB floppy
disk drives and an optional color monitor. The price tag started at $1,565,
which would be nearly $4,000 today. 

Cray 1 – 1976

It looked like no other computer before, or for that matter, since. The Cray 1 was the
world's first "supercomputer," a machine that leapfrogged existing technology when it
was introduced in 1976.
And back then, you couldn't just order up fast processors from Intel. "There weren't any
microprocessors," says Gwen Bell of The Computer Museum History Center. "These
individual integrated circuits that are on the board performed different functions." 
Each Cray 1, like the one preserved at The Computer Museum History Center, took months to
build. The hundreds of boards and thousands of wires had to fit just right. "It was really
a hand-crafted machine," adds Bell. "You think of all these wires as a kind of mess, but
each one has a precise length."

Altair 8800 – 1975

By 1975 the market for the personal computer was demanding a product that did not
require an electrical engineering background, and so the first mass-produced and
mass-marketed personal computer (available either as a kit or assembled) was welcomed with
open arms. Developers Edward Roberts, William Yates and Jim Bybee spent 1973-74
developing the MITS (Micro Instrumentation and Telemetry Systems) Altair 8800. It was
priced at $375 and contained 256 bytes of memory (not 256 KB), but had no keyboard, no
display, and no auxiliary storage device. Later, Bill Gates and Paul Allen wrote their first
product for the Altair -- a BASIC interpreter. (The Altair itself was reportedly named after
a destination mentioned in a Star Trek episode.)

Intel 4004 – 1971

The 4004 was the world's first universal microprocessor. In the late 1960s, many
scientists had discussed the possibility of a computer on a chip, but nearly everyone felt
that integrated circuit technology was not yet ready to support such a chip. Intel's Ted
Hoff felt differently; he was the first person to recognize that the new silicon-gated MOS
technology might make a single-chip CPU (central processing unit) possible.
Hoff and the Intel team developed such an architecture with just over 2,300 transistors in
an area of only 3 by 4 millimeters. With its 4-bit CPU, command register, decoder,
decoding control, control monitoring of machine commands and interim register, the
4004 was one heck of a little invention. Today's 64-bit microprocessors are still based on
similar designs, and the microprocessor is still the most complex mass-produced product
ever with more than 5.5 million transistors performing hundreds of millions of
calculations each second - numbers that are sure to be outdated fast. 

ARPANET – 1969

The Advanced Research Projects Agency was formed with an emphasis on
research, and thus was not oriented solely toward military products. The formation of this
agency was part of the U.S. reaction to the then Soviet Union's launch of Sputnik in
1957 (ARPA draft, III-6). ARPA was assigned to research how to utilize its
investment in computers via Command and Control Research (CCR). Dr. J.C.R.
Licklider was chosen to head this effort.
Developed for the US DoD Advanced Research Projects Agency
60,000 computers connected for communication among research organizations and
universities

Compiler - 1952

Grace Murray Hopper, an employee of Remington Rand, worked on the UNIVAC. She
took up the concept of reusable software in her 1952 paper entitled "The Education of a
Computer" and developed the first software that could translate symbols of higher-level
computer languages into machine language - the compiler.

UNIVAC 1 – 1951

The first commercially successful electronic computer, UNIVAC I, was
also the first general-purpose computer - designed to handle both numeric and textual
information. It was designed by J. Presper Eckert and John Mauchly. The
implementation of this machine marked the real beginning of the computer era.
Remington Rand delivered the first UNIVAC machine to the U.S. Bureau of the Census in
1951. This machine used magnetic tape for input.
first successful commercial computer
design was derived from the ENIAC (same developers)
first client = U.S. Bureau of the Census
$1 million
48 systems built.

Floppy Disk – 1950

Invented at the Imperial University in Tokyo by Yoshiro Nakamatsu.

Transistor – 1947

The first transistor was invented at Bell Laboratories on December 16, 1947 by John
Bardeen and Walter Brattain, working in the group led by William Shockley. This was
perhaps the most important electronics event of the 20th century, as it later made possible
the integrated circuit and microprocessor that are the basis of modern electronics. Prior to
the transistor (a contraction of "transfer resistor"), the only alternative for its current-regulation
and switching functions was the vacuum tube, which could only be miniaturized to a certain
extent and wasted a lot of energy in the form of heat.
Compared to vacuum tubes, it offered:
smaller size
better reliability
lower power consumption
lower cost

ENIAC – 1946

ENIAC (Electronic Numerical Integrator And Computer). The U.S. military sponsored
the research; it needed a calculating device for writing artillery-firing tables (the
settings used for different weapons under varied conditions for target accuracy).
John Mauchly was the chief consultant and J. Presper Eckert was the chief engineer.
Eckert was a graduate student studying at the Moore School when he met John Mauchly
in 1943. It took the team about one year to design the ENIAC and 18 months and
$500,000 in tax dollars to build it.
The ENIAC contained 17,468 vacuum tubes, along with 70,000 resistors and 10,000
capacitors.

Harvard Mark 1 – 1943

Howard Aiken and Grace Hopper designed the MARK series of computers at Harvard
University. The MARK series of computers began with the Mark I in 1944. Imagine a
giant roomful of noisy, clicking metal parts, 55 feet long and 8 feet high. The 5-ton
device contained almost 760,000 separate pieces. Used by the US Navy for gunnery and
ballistic calculations, the Mark I was in operation until 1959. 
The computer, controlled by pre-punched paper tape, could carry out addition,
subtraction, multiplication, division and reference to previous results. It had special
subroutines for logarithms and trigonometric functions and used 23 decimal place
numbers. Data was stored and counted mechanically using 3000 decimal storage wheels,
1400 rotary dial switches, and 500 miles of wire. Its electromagnetic relays classified the
machine as a relay computer. All output was displayed on an electric typewriter. By
today's standards, the Mark I was slow, requiring 3-5 seconds for a multiplication
operation.

ABC – 1939

The Atanasoff-Berry Computer was the world's first electronic digital computer. It was
built by John Vincent Atanasoff and Clifford Berry at Iowa State University during 1937-
42. It incorporated several major innovations in computing including the use of binary
arithmetic, regenerative memory, parallel processing, and separation of memory and
computing functions.

Vacuum Tube – 1904

A vacuum tube is just that: a glass tube surrounding a vacuum (an area from which all
gases have been removed). What makes it interesting is that when electrical contacts are
put on the ends, you can get a current to flow through that vacuum.
A British scientist named John A. Fleming made a vacuum tube known today as the diode.
At the time, the diode was known as a "valve."

The “Turing test”

A test proposed to determine if a computer has the ability to think. In 1950, Alan Turing
(Turing, 1950) proposed a method for determining if machines can think. This method is
known as The Turing Test.
The test is conducted with two people and a machine. One person plays the role of an
interrogator and is in a separate room from the machine and the other person. The
interrogator knows the person and the machine only as A and B, and does not know
which is the person and which is the machine.
Using a teletype, the interrogator can ask A and B any question he/she wishes. The aim
of the interrogator is to determine which is the person and which is the machine.
The aim of the machine is to fool the interrogator into thinking that it is a person. If the
machine succeeds, then we can conclude that machines can think.

Evolution of Computing

Turing Machine – 1936
Introduced by Alan Turing in 1936, Turing machines are one of the key abstractions
used in modern computability theory, the study of what computers can and cannot do. A
Turing machine is a particularly simple kind of computer, one whose operations are
limited to reading and writing symbols on a tape, or moving along the tape to the left or
right. The tape is marked off into squares, each of which can be filled with at most one
symbol. At any given point in its operation,  the Turing machine can only read or write
on one of these squares, the square located directly below its "read/write" head.
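To make the abstraction concrete, here is a minimal Python sketch of such a machine (added for illustration; the transition-table format, the state name, and the little bit-flipping machine below are invented for this example and are not taken from Turing's paper):

    def run_turing_machine(transitions, tape, state, blank="_", max_steps=1000):
        # transitions maps (state, symbol) -> (new_state, symbol_to_write, move),
        # where move is "L" or "R". The machine halts when no rule applies.
        tape = dict(enumerate(tape))   # the squares of the tape, indexed by position
        head = 0                       # the read/write head starts on square 0
        for _ in range(max_steps):
            symbol = tape.get(head, blank)
            if (state, symbol) not in transitions:
                break                  # no applicable rule: the machine halts
            state, write, move = transitions[(state, symbol)]
            tape[head] = write                 # write a symbol on the current square
            head += 1 if move == "R" else -1   # move one square right or left
        return "".join(tape[i] for i in sorted(tape))

    # Example machine: scan right, flipping 0s and 1s, and halt at the first blank square.
    flip = {
        ("scan", "0"): ("scan", "1", "R"),
        ("scan", "1"): ("scan", "0", "R"),
    }
    print(run_turing_machine(flip, "10110", "scan"))   # prints 01001

Even this toy example shows the essential ingredients: a finite table of rules, a tape of squares, and a head that reads, writes, and moves one square at a time.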

Introduction

Charles Babbage (1791-1871)
Creator of the Analytical Engine - the first general-purpose digital computer (1833)
The Analytical Engine was not built until 1943 (in the form of the Harvard Mark I)

The Analytical Engine
A programmable, mechanical, digital machine
Could carry out any calculation
Could make decisions based upon the results of the previous calculation
Components: input; memory; processor; output

Ada, Countess of Lovelace (1815-52)
Babbage: the father of computing
Ada: the mother?
Wrote a program for computing the Bernoulli numbers on the Analytical Engine - the
world's 1st computer program (a short modern sketch of this computation appears after this list)
Ada: A programming language specifically designed by the US Dept of Defense for
developing military applications was named Ada to honor her contributions towards
computing
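As an aside for modern readers (added here, not part of the original notes), the quantity Ada's program computed can be reproduced in a few lines of Python. The sketch below uses the standard recurrence for the Bernoulli numbers and exact fractions; it is only an illustration of the mathematics, not a reconstruction of her Analytical Engine program:

    from fractions import Fraction
    from math import comb   # available in Python 3.8+

    def bernoulli(n):
        # Bernoulli numbers B_0 .. B_n from the recurrence
        # sum_{j=0}^{m} C(m+1, j) * B_j = 0 for m >= 1, with B_0 = 1.
        B = [Fraction(1)]
        for m in range(1, n + 1):
            s = sum(comb(m + 1, j) * B[j] for j in range(m))
            B.append(-s / (m + 1))
        return B

    print(bernoulli(8))   # B_1 = -1/2, B_2 = 1/6, B_4 = -1/30, odd ones after B_1 are 0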
A lesson that we all can learn from Babbage's life
Charles Babbage had huge difficulties raising money to fund his research
As a last resort, he designed a clever mathematical scheme along with Ada, the Countess
of Lovelace
It was designed to increase their odds while gambling. They bet money on horse races
to raise enough money to support their research experiments
Guess what happened at the end? They lost every penny that they had.