in the early 1970s. The network was originally developed by a consortium of research colleges and universities and the federal government, which was looking for a way to share research data, provide a secure means of communication, and back up defense facilities. The original network was called ARPAnet, sponsored by the Department of Defense's Advanced Research Projects Agency (ARPA). It was replaced in the 1980s by the current network, which originally was not very user-friendly and was used mostly by techies. The Internet's popularity exploded with the development of the World Wide Web and the software programs that made it much more user-friendly to explore.

The Internet works on a set of software standards, the first of which, TCP/IP, was developed in the 1970s. The entire theory behind the Internet and TCP/IP, which enables computers to speak to each other over the Internet, was to create a network with no central controller. The Internet is not like a string of Christmas lights, where if one light in the string goes out, the rest of the lights stop functioning. Rather, if one computer in the network is disabled, the rest of the network continues to perform because traffic is rerouted around it.
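The rerouting idea can be sketched with a toy mesh network: take one machine offline and a route between the remaining machines still exists. The network layout, node names, and search helper below are invented for illustration; real Internet routing protocols (such as BGP) are far more elaborate.

```python
from collections import deque

def find_path(links, start, goal, down=frozenset()):
    """Breadth-first search for any route from start to goal,
    skipping nodes that are currently down."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for nxt in links.get(node, []):
            if nxt not in seen and nxt not in down:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no surviving route

# A small mesh: every node can reach the others more than one way.
links = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}

print(find_path(links, "A", "D"))              # ['A', 'B', 'D']
print(find_path(links, "A", "D", down={"B"}))  # rerouted: ['A', 'C', 'D']
```

With node B disabled, traffic from A to D simply takes the alternate path through C; only when every intermediate node is down does the route disappear.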

Each computer on the Internet has an Internet Protocol (IP) address. Similar to a postal address, it consists of a series of numbers (e.g., 155.48.178.21), and it tells the network where to deliver e-mail messages and data. When a user types in a URL (e.g., www.babson.edu), computers on the Internet called domain name system (DNS) servers convert the domain name to an IP address. The message or data to be sent is broken into a series of packets. Each packet contains the IP address of the sender, the IP address of the recipient, the packet number within the message (e.g., 7 of 23), and a piece of the data itself. Depending on Internet traffic, these packets may travel different routes to the destination IP address. The receiving computer then reassembles the packets into the complete message.
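A minimal sketch of that packetizing scheme, assuming a made-up packet format: the field names and the 10-character chunk size are invented for illustration, and real TCP/IP headers carry far more than this.

```python
def packetize(sender, recipient, data, size=10):
    """Split data into numbered packets, each carrying both IP addresses."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    total = len(chunks)
    return [
        {"src": sender, "dst": recipient,
         "seq": n, "total": total, "payload": chunk}
        for n, chunk in enumerate(chunks, start=1)
    ]

def reassemble(packets):
    """Packets may arrive in any order; sort by sequence number."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return "".join(p["payload"] for p in ordered)

packets = packetize("155.48.178.21", "93.184.216.34",
                    "Hello from the early Internet!")
packets.reverse()  # simulate out-of-order arrival
print(reassemble(packets))  # Hello from the early Internet!
```

Because each packet records its own sequence number ("7 of 23"), the receiver can rebuild the message no matter which routes the individual packets took.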

The second standard that makes the Internet work is Hypertext Markup Language (HTML). Using a Web browser, the computer converts the HTML or other markup into the image the user sees on the monitor. This language allows data to be displayed on the user's screen; it also allows a user to click on a link and jump to a new page on the Internet. While HTML, along with its set of tags, remains the underlying language of the World Wide Web and is powerful in its own right, it is not dynamic and has its limitations: it was designed to display data. Therefore, languages such as JavaScript, Java, and Perl, which create animation, perform calculations, build dynamic Web pages, and access and update databases on the host's Web server, were developed to complement HTML.

One such complementary language is Extensible Markup Language (XML), which allows users to define tags for their documents that describe what the data is. XML, an open standard overseen by the World Wide Web Consortium, allows a user to define the content (the data) of a document separately from its formatting, which is done with HTML tags. Just as a browser is required to understand how to handle HTML tags, an XML-aware application is needed to handle XML tags. XML allows information to be shared between different computers, operating systems, and applications without requiring conversion.
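A short sketch of that content/format separation, using Python's standard-library `xml.etree.ElementTree` as the XML-aware application: the tags in the document below describe what the data *is* (a book, a title, a price), and the consuming program decides what to do with it. The tag names and values are made up for illustration.

```python
import xml.etree.ElementTree as ET

# The tags describe the meaning of the data, not its appearance.
doc = """
<catalog>
  <book>
    <title>A History of the Internet</title>
    <price currency="USD">29.95</price>
  </book>
</catalog>
"""

root = ET.fromstring(doc)
for book in root.iter("book"):
    title = book.findtext("title")
    price = book.find("price")
    print(f"{title}: {price.get('currency')} {price.text}")
```

Any XML-aware program on any operating system can read this same document and extract the price as a price, which is what makes XML useful for exchanging data between otherwise incompatible systems.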

Bringing the Use of Web Technologies In-House<br />

Internet technology has radically changed the manner in which corporate information systems process their data. In the early and mid-1990s, corporate information systems used distributed processing techniques. Using this method, some of the processing would take place on the central computer (the
