
8.1 Introduction

The vision of the Internet and World Wide Web goes back to an article by Vannevar Bush in the 1940s. Bush was an American scientist who had done work on submarine detection for the US Navy. He designed and developed the differential analyser, a mechanical analogue computer whose function was to solve ordinary differential equations. It was funded by the Rockefeller Foundation and developed by Bush and others at MIT in the early 1930s. Bush supervised Claude Shannon at MIT, and Shannon’s initial work was to improve the differential analyser.

Bush became director of the Office of Scientific Research and Development, and he developed a win-win relationship between the US military and universities. He arranged large research funding for the universities to carry out applied research to assist the US military. This allowed the military to benefit from the early exploitation of research results, and it also led to better facilities and laboratories at the universities. It fostered close links and cooperation with universities such as Harvard and Berkeley, and this cooperation would eventually lead to the development of the ARPANET by DARPA.

Bush outlined his vision of an information management system called the ‘memex’ (memory extender) in a famous essay ‘As We May Think’ [Bus:45]. He envisaged the memex as a device electronically linked to a library and able to display books and films. The essay describes a proto-hypertext computer system, and it influenced the later development of hypertext systems (Fig. 8.1).

Fig. 8.1 Vannevar Bush

A memex is a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.

It consists of a desk, and while it can presumably be operated from a distance, it is primarily the piece of furniture at which he works. On the top are slanting translucent screens, on which material can be projected for convenient reading. There is a keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk.

Bush predicted that:

Wholly new forms of encyclopedias will appear, ready made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified.

This description motivated Ted Nelson and Douglas Engelbart to independently formulate ideas that would become hypertext. Tim Berners-Lee would later use hypertext as part of the development of the World Wide Web.

8.2 The ARPANET

There were approximately 10,000 computers in the world in the 1960s. These were expensive machines ($100K–$200K) with limited processing power. They contained only a few thousand words of magnetic memory, and programming and debugging were difficult. Further, communication between computers was virtually nonexistent.

However, several computer scientists had dreams of worldwide networks of computers, in which every computer around the globe would be interconnected with all of the other computers in the world. For example, Licklider wrote memos in the early 1960s on his concept of an intergalactic network. This concept envisaged that everyone around the globe would be interconnected and able to access programs and data at any site from anywhere.

The US Department of Defense founded the Advanced Research Projects Agency (ARPA) in the late 1950s. ARPA embraced high-risk, high-return research, and Licklider became the head of its computer research program. He developed close links with MIT, UCLA and BBN Technologies. The concept of packet switching was invented in the 1960s, and several organisations including the National Physical Laboratory (NPL), the RAND Corporation and MIT commenced work on its implementation.

The early computers had different standards for data representation, and the standard employed by each computer needed to be known for communication. This led to recognition of the need for common standards in data representation, and a US government committee developed ASCII (American Standard Code for Information Interchange) in 1963. This was the first universal standard for data, and it allowed machines from different manufacturers to exchange data. The standard allowed a 7-bit binary number to stand for a letter in the English alphabet, an Arabic numeral or a punctuation symbol. The use of 7 bits allowed 128 distinct characters to be represented. The development of the IBM System/360 mainframe standardised the use of the 8-bit byte, and machines with 12-bit or 36-bit words gradually became obsolete.
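
As an illustration of the encoding just described, the short sketch below (not part of the original text; Python is used purely for illustration) prints the 7-bit ASCII codes of a few characters. The letter ‘A’, for example, is represented by the value 65, or 1000001 in binary.

    # Illustrative sketch of 7-bit ASCII encoding.
    for ch in ['A', 'a', '7', '?']:
        code = ord(ch)                            # ASCII code of the character
        print(ch, code, format(code, '07b'))      # value and its 7-bit binary form

    # All 128 ASCII characters fit into 7 bits (codes 0-127).
    assert all(ord(c) < 128 for c in 'Hello, World!')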

The first wide area network connection was created in 1965. It involved the connection of a computer at MIT to a computer in Santa Monica via a dedicated telephone line. This showed that a telephone line could be used for data transfer. ARPA recognised the need to build a network of computers in the mid-1960s, and this led to the ARPANET project in 1966 which aimed to implement a packet-switched network with a network speed of 56 Kbps. ARPANET was to become the world’s first packet-switched network.

BBN Technologies was awarded the contract to implement the network. The first two nodes were based at UCLA and SRI with plans for a total of 19 nodes. The network management was performed by interconnected ‘Interface Message Processors’ (IMPs) in front of the major computers. The IMPs eventually evolved to become the network routers that are used today.

The team at UCLA called itself the Network Working Group and saw its role as developing the Network Control Protocol (NCP). This was essentially a rule book that specified how the computers on the network should communicate. The first host-to-host connection was made between a computer at UCLA and a computer at the Stanford Research Institute (SRI) in late 1969. Several other nodes were added to the network until it reached its target of 19 nodes in 1971.

The Network Working Group developed the telnet protocol and the file transfer protocol (FTP) in 1971. The telnet program allowed the user of one computer to remotely log in to another computer. The file transfer protocol allowed the user of one computer to send files to or receive files from another computer. A public demonstration of ARPANET was made in 1972, and it was a huge success. One of the earliest demos was of Weizenbaum’s ELIZA program, a famous AI program that allowed a user to conduct a typed conversation with an artificially intelligent machine (a simulated psychiatrist) at MIT.

The viability of packet switching as a standard for network communication had been clearly demonstrated. Ray Tomlinson of BBN Technologies developed a program that allowed electronic mail (e-mail) to be sent over the ARPANET. Over 30 institutions were connected to the ARPANET by the early 1970s.

8.3 TCP/IP

ARPA was renamed DARPA (Defense Advanced Research Projects Agency) in 1973. It commenced a project to connect seven computers on four islands using a radio-based network, and a project to establish a satellite connection between sites in Norway and the United Kingdom. This led to a need to interconnect the ARPANET with other networks. The key problem was to investigate ways of achieving convergence between the ARPANET, radio-based networks and satellite networks, as these all had different interfaces, packet sizes and transmission rates. Therefore, there was a need for a network-to-network connection protocol.

An International Network Working Group (INWG) was formed in 1973. The concept of the transmission control protocol (TCP) was developed at DARPA by Bob Kahn and Vint Cerf, and they presented their ideas at an INWG meeting at the University of Sussex in England in 1974 [KaC:74]. TCP allowed cross-network connections, and it began to replace the original NCP protocol used in the ARPANET.

TCP is a set of network standards that specify the details of how computers communicate, as well as the standards for interconnecting networks and computers. It was designed to be flexible and provides a transmission standard that deals with physical differences in host computers, routers and networks. It is designed to transfer data over networks which support different packet sizes and which may sometimes lose packets. It allows the inter-networking of very different networks which then act as one network.

The new protocol standards were known as the transmission control protocol (TCP) and the Internet protocol (IP). TCP details how information is broken into packets and reassembled on delivery, whereas IP is focused on sending the packet across the network. These standards allow users to send e-mail or to transfer files electronically, without needing to concern themselves with the physical differences in the networks. TCP/IP consists of four layers (Table 8.1).

Table 8.1 TCP layers

The Internet protocol (IP) is a connectionless protocol that is responsible for addressing and routing packets. It breaks large packets down into smaller packets when they are travelling through a network that supports smaller packets. A connectionless protocol means that a session is not established before data is exchanged, and packet delivery with IP is not guaranteed as packets may be lost or delivered out of sequence. An acknowledgement is not sent when data is received, and the sender or receiver is not informed when a packet is lost or delivered out of sequence. A packet is forwarded by the router only if the router knows a route to the destination. Otherwise, it is dropped. Packets are dropped if their checksum is invalid or if their time to live is zero. The acknowledgement of packets is the responsibility of the TCP protocol. The ARPANET employed the TCP/IP protocols as a standard from 1983.
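
The forwarding behaviour just described can be summarised in a short sketch. The fragment below is purely illustrative (the packet fields and routing table are invented for the example and do not represent a real router interface): a packet is dropped if its checksum is invalid, its time to live has expired or no route to its destination is known; otherwise the time to live is decremented and the packet is forwarded. Acknowledgement and retransmission are then provided above this layer by TCP.

    # Illustrative sketch of simplified IP forwarding (hypothetical fields and routes).
    routes = {'10.0.0.0/8': 'eth1', '192.168.1.0/24': 'eth0'}   # destination prefix -> interface

    def forward(packet):
        if not packet['checksum_ok']:            # corrupt packets are dropped
            return 'dropped: bad checksum'
        if packet['ttl'] <= 1:                   # time to live exhausted
            return 'dropped: TTL expired'
        interface = routes.get(packet['dest_prefix'])
        if interface is None:                    # no known route to the destination
            return 'dropped: no route'
        packet['ttl'] -= 1                       # decrement TTL and forward
        return 'forwarded via ' + interface

    print(forward({'checksum_ok': True, 'ttl': 5, 'dest_prefix': '10.0.0.0/8'}))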

8.4 Birth of the Internet

The use of ARPANET was initially limited to academia and to the US military, and in the early years, there was little interest from industrial companies. It allowed messages to be sent between the universities that were part of ARPANET. There were over 2,000 hosts on the TCP/IP-enabled network by the mid-1980s.

A decision was taken to shut down the ARPANET by the late 1980s, and the National Science Foundation (NSF) had commenced work on the NSFNET in the mid-1980s. This network consisted of multiple regional networks connected to a major backbone. The original links in NSFNET were 56 Kbps, but these were upgraded to 1.544 Mbps T1 links in 1988. The NSFNET T1 backbone initially connected 13 sites, but this number increased due to growing academic and industrial interest from around the world. The NSF realised that the Internet had significant commercial potential.

The Internet became more international, with sites in Canada and several European countries connected. DARPA formed the Computer Emergency Response Team (CERT) to deal with any emergency incidents arising from the operation of the network.

The independent not-for-profit company, Advanced Network Services (ANS), was founded in 1991. It installed a new network (ANSNET) that replaced the NSFNET T1 network and operated over T3 (45 Mbps) links. It was owned and operated by a private company rather than the US government, with the NSF focusing on research aspects of networks rather than the operational side.

The ANSNET network had a distributed architecture and was operated by commercial providers such as Sprint, MCI and BBN. The network was connected by major network exchange points, termed network access points (NAPs). There were over 160,000 hosts connected to the Internet by the late 1980s.

8.5 Birth of the World Wide Web

The World Wide Web was invented by Tim Berners-Lee in 1990 at CERN in Switzerland. CERN is a key European centre for research in the nuclear field, and it employs several thousand physicists and scientists. Berners-Lee obtained a degree in physics in the mid-1970s at Oxford University in England. His parents had been involved in the programming of the Ferranti Mark I computer in the 1950s.

The invention of the World Wide Web was a revolutionary milestone in computing. It transformed the use of the Internet from mainly academic use to the point where it is now an integral part of people’s lives. Users could now surf the web, that is, hyperlink among the millions of computers in the world and obtain information easily. It is revolutionary in that:

  • No single organisation is controlling the web.

  • No single computer is controlling the web.

  • Millions of computers are interconnected.

  • It is an enormous marketplace of billions of users.

  • The web is not located in one physical location.

  • The web is a space and not a physical thing.

One of the problems that scientists at CERN faced in late 1989 was keeping track of people, computers, documents and databases. The centre had many visiting scientists who spent several months there, as well as a large pool of permanent staff. There was no efficient and effective way to share information among scientists.

A visiting scientist might need to obtain information or data from a CERN computer or to make the results of their research available to CERN. Berners-Lee developed a program called ‘Enquire’ to assist with information sharing and with keeping track of the work of visiting scientists. He returned to CERN in the mid-1980s to work on other projects and devoted part of his free time to considering solutions to the information sharing problem.

He built on several existing inventions such as the Internet, hypertext and the mouse. Hypertext was invented by Ted Nelson in the 1960s, and it allowed links to be present in text. For example, a document such as a book contains a table of contents, an index and a bibliography. These are all links to material that is either within the book itself or external to the book. The reader of a book is able to follow the link to obtain the internal or external information. The mouse was invented by Doug Engelbart in the 1960s, and it allowed the cursor to be steered around the screen.

The major leap that Berners-Lee made was essentially a marriage of the Internet, hypertext and the mouse into what has become the World Wide Web. His vision and its subsequent realisation benefited CERN and the wider world.

He created a system that gives every web page a standard address called the universal resource locator (URL). Each page is accessible via the Hypertext Transfer Protocol (HTTP), and the page is formatted with the Hypertext Markup Language (HTML). Each page is viewed using a web browser. The key features of Berners-Lee’s invention are presented in Table 8.2.

Table 8.2 Features of World Wide Web

Berners-Lee invented well-known terms such as URL, HTML and World Wide Web, and he wrote the first browser program that allowed users to access web pages throughout the world. Browsers are used to connect to remote computers over the Internet and to request, retrieve and display the web pages on the local machine.
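
The basic request-response exchange that a browser performs can be illustrated in a few lines. The sketch below is illustrative only (the URL is just an example) and uses Python’s standard library to issue an HTTP GET request for a URL and receive the HTML of the page, which a browser would then render.

    # Illustrative sketch of what a browser does behind the scenes:
    # request the resource identified by a URL over HTTP and receive its HTML.
    from urllib.request import urlopen

    url = 'http://example.com/'                  # example URL only
    with urlopen(url) as response:
        print(response.status)                   # e.g. 200 when the page is found
        html = response.read().decode('utf-8', errors='replace')

    print(html[:80])                             # the start of the returned HTML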

The early browsers and information retrieval tools included Gopher, a menu-based system developed at the University of Minnesota, and Mosaic, a graphical browser developed at the University of Illinois. These were replaced in later years by Netscape, which dominated the browser market until Microsoft developed Internet Explorer. The development of graphical browsers led to the commercialisation of the World Wide Web.

The World Wide Web creates a space in which users can access information easily in any part of the world. This is done using only a web browser and simple web addresses. The user can then click on hyperlinks on web pages to access further relevant information that may be on an entirely different continent. Berners-Lee is now the director of the World Wide Web Consortium, and this MIT-based organisation sets the software standards for the web.

8.6 Applications of the World Wide Web

Berners-Lee realised that the World Wide Web offered the potential to conduct business in cyberspace, rather than the traditional way of buyers and sellers coming together to do business in the marketplace:

Anyone can trade with anyone else except that they do not have to go to the market square to do so.

The growth of the World Wide Web has been phenomenal since its invention. Exponential growth rate curves became a feature of newly formed Internet companies and their business plans. The World Wide Web has been applied to many areas including:

  • Travel industry (booking flights, train tickets and hotels)

  • E-marketing

  • Online shopping (e.g. www.amazon.com)

  • Portal sites (such as Yahoo)

  • Recruitment services (such as www.jobserve.com)

  • Internet banking

  • Online casinos (for gambling)

  • Newspapers and news channels

  • Social media (Facebook)

The prediction in the early days was that the new web-based economy would replace traditional bricks and mortar companies. It was expected that most business would be conducted over the web, with traditional enterprises losing market share and going out of business. Exponential growth of e-commerce companies was predicted, and the size of the new web economy was estimated to be in trillions of US dollars.

New companies were formed to exploit the opportunities of the web, and existing companies developed e-business and e-commerce strategies to adapt to the brave new world. Companies providing full e-commerce solutions were concerned with the selling of products or services over the web to either businesses or consumers. These business models are referred to as business-to-business (B2B) or business-to-consumer (B2C). The characteristics of e-commerce websites are presented in Table 8.3.

Table 8.3 Characteristics of e-commerce

8.7 Dot-Com Companies

The success of the World Wide Web was phenomenal, and it led to a boom in the formation of ‘new economy’ businesses. These businesses were conducted over the web and included the Internet portal company, Yahoo; the online bookstore, Amazon; and the online auction site, eBay. Yahoo provides news and a range of services, and most of its revenue comes from advertisements. Amazon initially sold books but now sells a collection of consumer and electronic goods. eBay brings buyers and sellers together in an online auction space.

Boo.com was an online fashion company that failed dramatically in 2000. Pets.com was an online pet supplies and accessory company that lasted only a short time in business. Priceline.com is an online travel firm that offers airlines and hotels a service to sell unfilled seats or rooms cheaply. ETrade.com is an online share dealing company. Some of these new technology companies were successful and remain in business. Others were financial disasters due to poor business models, poor management and poor implementation of the new technology.

Some of these technology companies offered an Internet version of a traditional bricks and mortar company, with others providing a unique business offering. For example, eBay offered an Internet auction site to consumers worldwide, which was a totally new service and quite distinct from traditional auctioneering.

Yahoo was founded by David Filo and Jerry Yang who were students at Stanford in California. It was used by them as a way to keep track of their personal interests and the corresponding websites on the Internet. Their list of interests grew over time and became too long and unwieldy. Therefore, they broke their interests into a set of categories and then subcategories, and this is the core concept of the website.

There was a lot of interest in the site from other students, family and friends and a growing community of users. The founders realised that the site had commercial potential, and they incorporated it as a business in 1995. The company launched its initial public offering (IPO) a year later in April 1996, and it was valued at $850 million (or approximately 40 times its annual sales).

Yahoo is a portal site and offers free e-mail accounts to users, a search engine, news, shopping, entertainment, health and so on. The company earns much of its revenue from advertising (including the click-through advertisements that appear on a Yahoo web page). It also earns revenue from services such as web hosting, web tools, larger mailboxes and so on.

Amazon was founded by Jeff Bezos in 1995 as an online bookstore. Its product portfolio diversified over time to include the sale of CDs, DVDs, toys, computer software and video games. Its initial focus was to build up the ‘Amazon’ brand throughout the world, and its goal was to become the world’s largest bookstore. It initially sold books at a loss by giving discounts to buyers in order to build market share. It was very effective in building its brand through advertisements, marketing and discounts.

It has become the largest online bookstore in the world and has a sound business model, with a very large product catalogue, a well-designed website with good searching facilities, good checkout facilities and good order fulfilment. It also developed an associate model, which allows its associates to receive a commission for purchases of Amazon products made through the associate site.

eBay was founded in California by Pierre Omidyar in 1995. Omidyar was a French-American programmer born in Paris, and he moved to the United States with his parents. He later studied computing and worked as a programmer at Claris prior to setting up eBay. The eBay site brings buyers and sellers together and allows buyers to bid for items. The company earns revenue by charging a commission for each transaction. The IPO of eBay took place in 1998 and was highly successful.

Millions of items are listed, bought and sold on eBay every day. The sellers are individuals and international companies who are selling their products and services. Any legal product that does not violate the company’s terms of service may be bought or sold on the site. A buyer makes a bid for a product or service and competes against several other bidders. The highest bid is successful, and payment and delivery is then arranged. The revenue earned by eBay includes fees to list a product and commission fees that are applied whenever a product is sold. It is an international company with a presence in over 20 countries.

There have been a number of strange offerings on eBay. One man offered one of his kidneys for auction, as part of the market for human organs. Other unusual cases have been towns that have been offered for sale (as a joke). Any product listings that violate eBay’s terms of service are removed from the site as soon as the company becomes aware of them. The company also has a fraud prevention mechanism which allows buyers and sellers to provide feedback on each other and to rate each other following a transaction. The feedback may be positive, negative or neutral, and relevant comments may be included. This helps to reduce fraud, as unscrupulous sellers or buyers will receive negative ratings and comments.

Priceline was founded by Jay Walker and offers a service to airlines and hotels to sell unfilled seats or rooms cheaply. It was valued at $10 billion at its IPO, despite the fact that unlike airlines it had no assets and was actually selling flights at a loss.

8.7.1 Dot-Com Failures

Several of the companies formed during the dot-com era were successful and remain in business today. Others had inappropriate business models or poor management and failed in a spectacular fashion. This section considers some of the dot-com failures and highlights the reasons for failure.

Webvan.com was an online grocery business based in California. It delivered products to a customer’s home within a 30-min period of their choosing. The company expanded to several other cities before it went bankrupt in 2001. Many of its failings were due to management, as the business model was reasonable. The management was inexperienced in the supermarket and grocery business, and the company spent excessively on infrastructure. It had been advised to build up an infrastructure to deliver groceries as quickly as possible, rather than to develop partnerships with existing supermarkets. It built warehouses, purchased a fleet of delivery vehicles and top-of-the-range computer infrastructure before running out of money.

Boo.com was founded in 1998 by Ernst Malmsten and others as an online fashion retailer based in the United Kingdom. The company spent over $135 million of shareholder funds in less than 3 years and went bankrupt in 2000. Its website was poorly designed for its target audience and went against many of the accepted usability conventions of the time. The website was designed in the days before broadband, when most users employed 56K modems. However, its design included the latest Java and Flash technologies, and for most users the first page of the initial release of the site took several minutes to load. The navigation of the website was inconsistent and changed as the user moved around the site. The net effect was that, despite extensive advertising by the company, users were not inclined to use the site.

Other reasons for failure included poor management and leadership, lack of direction, lack of communication between departments, spiralling costs left unchecked, and the hiring of staff and contractors in large numbers leading to crippling payroll costs. Further, a large number of products were returned by purchasers, and no postage charge was applied for this service, so the company incurred a significant cost in covering postage for these returns. The company went bankrupt in 2000, and an account of its formation and collapse is given in the book Boo Hoo [MaP:02]. This book is a software development horror story, and the poor software development practices employed are evident from the fact that, while up to 18 contractor companies were working to develop the website, the developers were working without any source code control mechanism in place.

Pets.com was an online pet supply company founded in 1998 by Greg McLemore. It sold pet accessories and supplies. It had a well-known advertisement that asked why one should shop at an online pet store; the answer was: ‘Because Pets Can’t Drive!’. Its mascot (the Pets.com sock puppet) was well known. It launched its IPO in February 2000, just before the dot-com collapse.

Pets.com made investments in infrastructure such as warehousing and vehicles. It needed a critical mass of customers in order to break even, and its management believed that it needed $300 million of revenue to achieve this. They expected that this would take a minimum of 4–5 years, and, therefore, there was a need to raise further capital. However, following the dot-com collapse, there was negative sentiment towards technology companies, and it became apparent that the company would be unable to raise further capital. Attempts to sell the company were unsuccessful, and it went into liquidation 9 months after its IPO.

Kozmo.com was founded by Joseph Park and Yong Kang in New York in 1998 as an online company that promised free 1-h delivery of small consumer goods. It provided point-to-point delivery, usually by bicycle, and did not charge a delivery fee. Its business model was deeply flawed, as it is expensive to offer point-to-point delivery of small goods within a 1-h period without charging a delivery fee. The company argued that it could make savings to offset the delivery costs as it did not require retail space. It expanded into several cities in the United States and raised about $280 million from investors. It had planned to launch an IPO, but this was abandoned due to the dot-com collapse. The company ceased trading in 2001.

8.7.2 Business Models

A business model converts a technology idea or innovation into a commercial reality and needs to be appropriate for the company and its intended operating market. A company with an excellent business idea but a weak business model may fail, whereas a company with an average business idea but an excellent business model may be quite successful. Several of the business models in the dot-com era were deeply flawed, and the eventual collapse of many of these companies was predictable. Chesbrough and Rosenbloom [ChR:02] have identified six key components of a business model (Table 8.4).

Table 8.4 Characteristics of business models

8.7.3 Bubble and Burst

Netscape was founded as Mosaic Communications by Marc Andreessen and Jim Clark in 1994. It was renamed Netscape in late 1994. Its initial public offering in 1995 demonstrated the incredible valuations placed on the new Internet companies. The company had planned to issue the shares at $14 but decided at the last minute to issue them at $28. The share price reached $75 later that day. This was followed by what became the dot-com bubble, in which there were a large number of public offerings of Internet stock, and the value of these stocks reached astronomical levels. Eventually, reality returned to the stock market when it crashed in April 2000, and share values returned to more realistic levels.

The vast majority of these companies were losing substantial sums of money, and few expected to deliver profits in the short term. Financial measures such as the balance sheet, the profit and loss account and the price-to-earnings ratio are normally employed to estimate the value of a company. However, investment bankers argued that there was a new paradigm in stock market valuation for Internet companies. This paradigm suggested that the potential future earnings of the stock be considered in determining its appropriate value. This was used to justify the high prices of shares in technology companies, as frenzied investors rushed to buy these overpriced and overhyped stocks. Common sense seemed to play no role in decision making. The dot-com bubble included features such as:

  • Irrational exuberance on the part of investors

  • Insatiable appetite for Internet stocks

  • Incredible greed from all parties involved

  • A lack of rationality and common sense by all concerned

  • Traditional methods of company valuation not employed

  • Interest in making money rather than in building the business first

  • A herd mentality among investors

  • Questionable decisions by Federal Reserve Chairman Alan Greenspan

  • Questionable analysis by investment firms

  • Conflict of interest for investment banks

  • Market had left reality behind

There were winners and losers in the boom and collapse. Many made a lot of money from the boom, while others, including pension funds and life assurance funds, made significant losses. The investment banks typically earned 5–7% commission on each successful IPO, and it was therefore in their interest not to question the boom too closely. Those who bought early and sold early obtained a good return. Those who kept their shares for too long suffered losses.

The full extent of the boom can be seen in the rise and fall of the value of the Dow Jones and NASDAQ from 1995 through 2002 (Fig. 8.2).

Fig. 8.2 Dow Jones (1995–2002)

The extraordinary rise of the Dow Jones from a level of 3,800 in 1995 to 11,900 in 2000 represented an increase of over 200% in 5 years, or approximately 26% compound annual growth during this period. The rise of the NASDAQ over this period was even more dramatic. It rose from a level of 751 in 1995 to 5,000 in 2000, representing a 566% increase over the period. This is equivalent to a compound annual growth rate of approximately 46% (Fig. 8.3).

Fig. 8.3 NASDAQ (1995–2002)

The fall of the indices was equally dramatic, especially in the case of the NASDAQ. It peaked at 5,000 in March 2000 and fell to 1,200 (a 76% drop) by September 2002. It had become clear that Internet companies were rapidly going through the cash raised at their IPOs, and analysts noted that a significant number would be out of cash by the end of 2000. Therefore, these companies would either go out of business or would need to go back to the market for further funding. This led to questioning of the hitherto relatively unquestioned business models of many of these Internet firms. Funding is easy to obtain when stock prices are rising at a rapid rate. However, when prices are static or falling, with negligible or negative returns to the investor, funding dries up. The actions of the Federal Reserve in raising interest rates to curb inflationary pressures also helped to correct the irrational exuberance of investors. However, it would have been more appropriate to have taken this action 2–3 years earlier.
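
The growth and decline figures quoted above can be checked with a little arithmetic, since the compound annual growth rate is (end value / start value)^(1/years) − 1. The sketch below is illustrative only and uses the approximate index values given in the text.

    # Illustrative check of the compound growth rates quoted in the text.
    def cagr(start, end, years):
        return (end / start) ** (1 / years) - 1

    print('Dow Jones 1995-2000: {:.0%}'.format(cagr(3800, 11900, 5)))   # approx. 26%
    print('NASDAQ    1995-2000: {:.0%}'.format(cagr(751, 5000, 5)))     # approx. 46%
    print('NASDAQ fall 2000-02: {:.0%}'.format((1200 - 5000) / 5000))   # approx. -76%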

Some independent commentators had recognised the bubble, but their comments and analysis had been largely ignored. These included the Financial Times and the Economist, as well as some commentators in the investment banks. Investors rarely queried the upbeat analysis coming from Wall Street and seemed to believe that the boom would never end. They seemed to believe that rising stock prices would be a permanent feature of the US stock markets. Greenspan had argued that it is difficult to identify a bubble until after the event, and that even if the bubble had been identified, it could not have been corrected without causing a contraction. Instead, the responsibility of the Fed, according to Greenspan, was to mitigate the fallout when it occurred.

There have, of course, been other stock market bubbles throughout history. For example, in the 1800s, there was a rush on railway stock in England, leading to a bubble and the eventual collapse of railway stock prices in the 1840s. There has been a recent devastating property bubble and collapse (2002–2009) in the Republic of Ireland. The failures of the Irish political class, the Irish Central Bank and financial regulator, the Irish banking sector with its irresponsible lending policies, and the media in not questioning the bubble are deeply disturbing. Its legacy will remain in the country for many years and will require resilience in dealing with its aftermath.

8.8 Facebook and Twitter

Facebook is a social media site that was founded by Mark Zuckerberg and other Harvard students in 2004. It allows users to add a personal profile, to add photos or albums of photos, to add other users as friends and to send messages to other friends. It was initially restricted to Harvard University students, but it is now widely used throughout the world. It is the most widely used social media site, with in excess of 600 million users.

Facebook allows users to set their own privacy settings, which allow them to choose which users may see specific parts of their profile. Most of its revenue is generated from advertising such as banner ads.

Twitter is a social networking and mini-blogging service that was founded by Jack Dorsey in 2006. It allows users to send and receive short messages of up to 140 characters, called tweets, and it has around 200 million users. It has been described as the SMS of the Internet.

The use of Twitter fluctuates with important external events leading to a spike in its use. Twitter messages are public, but senders may also send short private messages that are visible just to their followers.

8.9 E-Software Development

An organisation that conducts part or all of its business over the World Wide Web will need to ensure that its website is fit for purpose. It needs to be of high quality, reliable and usable. Web applications are quite distinct from other software systems in that:

  • They may be accessed from anywhere in the world.

  • They may be accessed by many different browsers.

  • They form part of a distributed system with millions of servers and billions of users.

  • The usability and the look and feel of the application are key concerns.

  • The performance and reliability of the website are key concerns.

  • Security threats may occur from anywhere.

  • A large number of transactions may occur at any time.

  • There are strict availability constraints (typically 24 × 7 × 365).

Rapid application development (RAD) or joint application development (JAD) lifecycles are often employed for website development. The spiral life cycle is used rather than the waterfall model, as the requirements for web-based systems are often not fully defined at project initiation.

The spiral is, in effect, a reusable prototype: the customer examines the current iteration and provides feedback to the development team, and this feedback is addressed in the next iteration of the spiral. The approach is to partially implement the system, as this leads to a better understanding of the requirements, which then feeds into the next development cycle. The process repeats until the requirements and the product are fully complete. The spiral model is shown in Fig. 8.4.

Fig. 8.4 Spiral life cycle model

There are various roles involved in web-based software development including content providers who are responsible for providing the content on the web, web designers who are responsible for graphic design of the website, programmers who are responsible for the implementation and administrators who are responsible for administration of the website.

Sound software engineering practices need to be employed to design, develop and test the website. The project team needs to produce similar documentation to that of the waterfall model, except that the chronological sequence of delivery of the documentation is more flexible. Joint application development is important, as it allows early user feedback to be received on the look and feel and the correctness of the application. The approach is often ‘design a little, implement a little, and test a little’.

Various technologies are employed in web development. These include HTML, which is used to design simple web pages; the Common Gateway Interface (CGI), which is often employed to send a completed form to a server; and cookies, which enable the server to store client-specific information on the client’s machine. Other popular technologies are Java, JavaScript, VBScript, ActiveX and Flash. There are tools such as Dreamweaver to assist with website design.
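
As an illustration of the cookie mechanism mentioned above, the sketch below uses Python’s standard library (the cookie name and value are invented for the example) to show how a server asks the browser to store client-specific information and how that information is returned and parsed on a later request.

    # Illustrative sketch of the cookie mechanism (hypothetical name and value).
    from http.cookies import SimpleCookie

    # Server side: ask the browser to store client-specific information.
    outgoing = SimpleCookie()
    outgoing['session_id'] = 'abc123'
    outgoing['session_id']['path'] = '/'
    print(outgoing.output())                     # Set-Cookie: session_id=abc123; Path=/

    # On a later request, the browser returns the cookie in a header,
    # which the server parses to recognise the returning client.
    incoming = SimpleCookie('session_id=abc123')
    print(incoming['session_id'].value)          # abc123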

Testing plays an important role in assuring quality, and various types of web testing include:

  • Static testing

  • Unit testing

  • Functional testing

  • Browser compatibility testing

  • Usability testing

  • Security testing

  • Load/performance/stress testing

  • Availability testing

  • Post-deployment testing

The purpose of post-deployment testing is to ensure that the performance of the website remains good, and this is generally conducted as part of a service level agreement (SLA). Service level agreements typically include a penalty clause if the availability of the system or its performance falls below defined parameters. Consequently, it is important to identify potential performance and availability issues as early as possible, before they become a problem.
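
A very simple form of post-deployment monitoring is sketched below. It is illustrative only (the URL, timeout and response-time threshold are example values, not part of any particular SLA): the check requests the home page and flags responses that fail or exceed the agreed response time.

    # Illustrative sketch of a simple post-deployment availability check.
    import time
    from urllib.request import urlopen
    from urllib.error import URLError

    URL = 'http://example.com/'                  # example site only
    THRESHOLD_SECONDS = 2.0                      # example response-time threshold

    def check_once():
        start = time.time()
        try:
            with urlopen(URL, timeout=10) as response:
                ok = (response.status == 200)
        except URLError:
            ok = False
        elapsed = time.time() - start
        if not ok:
            print('ALERT: site unavailable')
        elif elapsed > THRESHOLD_SECONDS:
            print('ALERT: slow response ({:.1f}s)'.format(elapsed))
        else:
            print('OK ({:.1f}s)'.format(elapsed))

    check_once()   # in practice this would be run on a schedule, e.g. every minute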

Most websites operate 24 × 7 × 365, and there is the potential for major financial loss in the case of a major outage of an electronic commerce website.

8.10 E-Commerce Security

The World Wide Web consists of unknown users and suppliers with unpredictable behaviour operating in unknown countries around the world. These users and websites may be friendly or hostile, and the issue of trust arises:

  • Is the other person who they claim to be?

  • Can the other person be relied upon to deliver the goods on payment?

  • Can the other person be trusted not to inflict malicious damage?

  • Is financial information kept confidential on the server?

Hostility may manifest itself in various acts of destruction. For example, malicious software may attempt to format the hard disk of the local machine, and if successful, all local data will be deleted. Other malicious software may attempt to steal confidential data from the local machine, including bank account or credit card details. A denial-of-service attack occurs when a website is overloaded by a malicious attack, leaving users unable to access the website for an extended period of time.

The display of web pages on the local client machine may involve the downloading of programs from the server and running the program on the client machine (e.g. Java Applets). Standard HTML allows the static presentation of a web page, whereas many web pages include active content (e.g. Java Applets or ActiveX). There is a danger that a Trojan horse may be activated during the execution of active content.

Security threats may be from anywhere (e.g. client side, server side, transmission) in an e-commerce environment, and therefore a holistic approach to security is required. Internal and external security measures need to be considered. Internal security is generally implemented via procedures and access privileges.

It is essential that the user is confident in the security provided, as otherwise they will be reluctant to pass credit card details over the web for purchases. This has led to technologies such as Secure Sockets Layer (SSL) and Secure HTTP (S-HTTP) to ensure security.
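
The sketch below illustrates how a client establishes an encrypted connection with certificate verification using Python’s standard library. It is illustrative only (the host name is an example), and it uses TLS, the successor to the SSL protocol mentioned above.

    # Illustrative sketch: an encrypted, certificate-verified connection to a web server.
    import socket
    import ssl

    hostname = 'example.com'                     # example host only
    context = ssl.create_default_context()       # verifies the server's certificate

    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            print(tls.version())                 # e.g. TLSv1.3
            print(tls.getpeercert()['subject'])  # identity asserted by the certificate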

8.11 Review Questions

  1. Describe the development of the Internet.

  2. Describe the development of the World Wide Web and its key constituents.

  3. Describe the applications of the World Wide Web.

  4. Describe the key constituents of an electronic commerce site.

  5. Describe a successful dot-com company that you are familiar with. What has made the company successful?

  6. Describe a dot-com failure that you are familiar with. What caused the company to fail?

  7. Discuss the key components of a business model.

  8. Discuss software development in a web environment.

8.12 Summary

This chapter considered the evolution of the Internet from the early work on packet switching and ARPANET, to the subsequent development of the TCP/IP network protocols that specify how computers communicate and the standards for interconnecting networks and computers.

TCP/IP provides a transmission standard that deals with physical differences in host computers, routers and networks. It is designed to transfer data over networks which support different packet sizes and which may sometimes lose packets. TCP details how information is broken into packets and reassembled on delivery, whereas IP is focused on sending the packet across the network.

The invention of the World Wide Web by Tim Berners-Lee was a revolutionary milestone in computing. It transformed the Internet from mainly academic use to commercial use and led to a global market of consumers and suppliers. Today, the World Wide Web is an integral part of people’s lives.

The growth of the World Wide Web was exponential, and the boom led to the formation of many ‘new economy’ businesses. These new companies conducted business over the web as distinct from the traditional bricks and mortar companies. Some of these new companies were very successful (e.g. Amazon) and remain in business. Others were financial disasters due to poor business models, poor management and poor implementation of the new technology.