Peer-to-peer (P2P) Computing - An Overview

Introduction

In general, P2P describes an environment in which computers connect to one another in a distributed fashion, without a centralized control point to route or connect data traffic. A true peer-to-peer network thus brings two or more computers into direct communication so they can interact without involving other computers in their workflow. Because the computers can shift roles, acting as client, server, or both, they cannot be described simply by the client-server model. Programs commonly thought of as peer-to-peer, such as Napster and ICQ, do not meet these strict requirements because they relied on centralized servers to connect their user communities. They are actually hybrids, combining features of both peer-to-peer and client-server designs.

A simple example of a P2P implementation is the ad hoc mode of IEEE 802.11b. In this mode, computers join a wireless network as peers, and every computer can communicate directly with every other computer without first going through a centralized server. Another excellent example of true peer-to-peer networking is the everyday telephone system. One person initiates a call; once it is answered, there is no asymmetry: both parties can speak, listen, or do both at the same time, and either can end the conversation simply by hanging up. If the telephone followed the client-server model, we would need two telephones: one for incoming calls and one for outgoing calls.
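
This role symmetry can be sketched in a few lines of Python. The sketch below is illustrative only (the port number and messages are arbitrary choices): a single process both listens for an incoming "call" and places one, just as either telephone party can talk or listen.

```python
import socket
import threading

ready = threading.Event()

def answer(port):
    """Server role: accept one incoming connection and echo it back."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("127.0.0.1", port))
        srv.listen()
        ready.set()                      # tell the caller we are reachable
        conn, _ = srv.accept()
        with conn:
            conn.sendall(b"echo: " + conn.recv(1024))

def call(port, message):
    """Client role: connect to a peer and exchange one message."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect(("127.0.0.1", port))
        cli.sendall(message)
        return cli.recv(1024)

# The same program listens (server role) in the background
# while it places a call (client role).
threading.Thread(target=answer, args=(9901,), daemon=True).start()
ready.wait()
reply = call(9901, b"hello")
print(reply.decode())  # -> echo: hello
```

In a pure client-server design these two halves would live in separate programs; here, as in a real peer, they share one process.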

Peer-to-Peer Computing Applications

A P2P application has to deal with several computers (peers) running on different physical machines. These peers may run a daunting variety of operating systems and versions, and work in varied security, application, and tool environments. For any P2P application to survive in such a distributed environment, peers need a viable mechanism to uniquely identify and communicate with one another. Since most P2P networks are ad hoc, the network must dynamically discover peers and track when they join or leave the group. Security becomes an issue as soon as peers connect: sharing files and other computing resources is bound to attract hackers and other mischief makers eager to disrupt and subvert the participating computers. Security has therefore become a major bottleneck for general deployment of P2P networks. Research communities, realizing the significance P2P networks will have in Internet computing, have started work on proper and effective authentication and efficient authorization mechanisms.
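
The identity and membership requirements above can be sketched as follows. All class and method names here are invented for illustration: each peer gets a globally unique identifier (so peers on different machines never collide), and a group object tracks joins and departures so the ad hoc membership can be inspected at any moment.

```python
import time
import uuid

class Peer:
    """A peer with a name and a collision-free identifier."""
    def __init__(self, name):
        self.name = name
        # uuid4 is unique across machines without any central coordination
        self.peer_id = uuid.uuid4().hex

class PeerGroup:
    """Tracks which peers currently belong to the ad hoc group."""
    def __init__(self):
        self.members = {}                 # peer_id -> (name, join time)

    def join(self, peer):
        self.members[peer.peer_id] = (peer.name, time.time())

    def leave(self, peer):
        self.members.pop(peer.peer_id, None)

    def roster(self):
        return sorted(name for name, _ in self.members.values())

group = PeerGroup()
alice, bob = Peer("alice"), Peer("bob")
group.join(alice)
group.join(bob)
group.leave(alice)                        # peers depart at any time
print(group.roster())                     # -> ['bob']
```

A real system would replace the dictionary with network messages (announcements and timeouts), but the bookkeeping problem is the same.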

The rise of Napster and its technical descendants such as Gnutella and Morpheus has caught everyone's attention. These products rely on a partially distributed architecture to connect millions of users, letting them share music, videos, images, and other files. File sharing is one viable killer application of P2P networking. Companies are working around the clock to design and develop applications that use P2P networking to create ad hoc work groups and virtual communities. Combined with wireless technologies such as the Wireless Application Protocol (WAP) or IEEE 802.11, P2P can also become the foundation for location-based services; such services could, for example, let students at a resort form a virtual play group and communicate with one another. P2P networks can also share computing cycles among computers.

The Gnutella protocol follows the strict peer-to-peer computing model: there are no Gnutella servers. When a Gnutella servant (a node that acts as both server and client) initializes, it knows nothing about the Gnutella network; it bootstraps by connecting to one or more known hosts and then discovers further peers through the protocol's Ping and Pong descriptors.
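
The Gnutella v0.4 descriptor header is 23 bytes: a 16-byte descriptor ID, a payload type (0x00 for Ping, 0x01 for Pong, 0x80 for Query), a time-to-live, a hop count, and a little-endian payload length. A sketch of building and relaying such a header, assuming the v0.4 layout:

```python
import struct
import uuid

PING, PONG, QUERY = 0x00, 0x01, 0x80

def make_header(payload_type, ttl=7, hops=0, payload_len=0):
    """Pack a 23-byte Gnutella descriptor header (little-endian)."""
    return struct.pack("<16sBBBI", uuid.uuid4().bytes,
                       payload_type, ttl, hops, payload_len)

def relay(header):
    """What a servant does before forwarding: TTL - 1, Hops + 1."""
    did, ptype, ttl, hops, plen = struct.unpack("<16sBBBI", header)
    if ttl <= 1:
        return None                       # descriptor dies here
    return struct.pack("<16sBBBI", did, ptype, ttl - 1, hops + 1, plen)

ping = make_header(PING)                  # a fresh servant floods a Ping
print(len(ping))                          # -> 23
hop1 = relay(ping)
_, _, ttl, hops, _ = struct.unpack("<16sBBBI", hop1)
print(ttl, hops)                          # -> 6 1
```

The TTL/Hops discipline is what keeps the flooded Ping from circulating forever: after at most seven relays the descriptor is dropped.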

P2P Content Distribution

A primary application of this computing model is the emerging content distribution model, in which Net content is distributed using peer-to-peer (P2P) networking concepts.

The ever-growing popularity of the Net has resulted in an explosion of content: audio files, video files, e-books and more. Though the current 'centralized content' distribution model, in which content is stored at a central server to be accessed by clients, gives providers some control, it has built-in weaknesses.

If an organization hosts a bandwidth-intensive content file, for example a digitized movie, it has to transmit millions of data bytes in response to each customer request. To keep system performance from deteriorating as download requests shoot up, the content provider has to increase bandwidth capacity, processing power and so on, which raises the delivery cost. Some sites also witness a sudden surge in traffic when popular web sites post messages or articles that refer to them; this sudden increase in the number of hits is known as the Slashdot effect.
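
A back-of-the-envelope calculation shows why centralized delivery cost grows linearly with demand. All figures below are illustrative assumptions, not measurements:

```python
movie_size_gb = 0.7          # one digitized movie, roughly 700 MB
requests_per_day = 10_000    # download requests hitting the server

# Every byte of every download leaves the central server, so total
# egress is simply file size times request count.
egress_gb_per_day = movie_size_gb * requests_per_day

# Convert to an average sustained link rate (GB -> Mbit, over 86,400 s).
avg_mbits_per_sec = egress_gb_per_day * 8 * 1000 / 86_400

print(f"{egress_gb_per_day:,.0f} GB/day")      # -> 7,000 GB/day
print(f"{avg_mbits_per_sec:,.0f} Mbit/s avg")  # -> 648 Mbit/s avg
```

Doubling the audience doubles the egress bill; a Slashdot-style spike concentrates that load into a few hours, which is what brings sites down.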

These issues point to a major shortcoming of the centralized content distribution method: there is an inverse relationship between the system's efficiency and demand. As more and more people use the system, it becomes less and less efficient.

To get around this content distribution bottleneck, Net scientists proposed P2P-based content distribution. The P2P model lets computers exchange content directly with one another over the Net without the help of a central server. The central server continues to provide the content, but peers requesting the content also start distributing it: if you have already downloaded a file from the server, the P2P method allows others near you to download the file from your PC instead of the original server. If others besides you have the same file stored on their PCs, the system automatically pulls the content from these peer machines simultaneously. Because the file is downloaded from multiple sources, it reaches the target machine quickly without any additional drain on the Net's resources. The beauty of the system is that whoever downloads a file becomes a download source, so the system becomes more efficient as its user base grows.
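
The multi-source idea can be sketched as follows. This is a toy model under simplifying assumptions: peers are plain dictionaries, and "fetching" is a lookup rather than a network transfer, but the core mechanism, splitting the file into chunks and pulling each chunk from whichever peer holds it, is the same.

```python
def fetch_from_swarm(chunk_ids, peers):
    """Pull each chunk from the first peer that has it, then reassemble."""
    assembled = {}
    for cid in chunk_ids:
        for peer in peers:
            if cid in peer["chunks"]:
                assembled[cid] = peer["chunks"][cid]
                break
        else:
            raise LookupError(f"no peer holds chunk {cid}")
    return b"".join(assembled[cid] for cid in chunk_ids)

original = b"P2P content distribution spreads the load."
# Split the 42-byte "file" into six 8-byte chunks.
chunks = {i: original[i*8:(i+1)*8] for i in range((len(original) + 7) // 8)}

# Three peers, each holding only part of the file; together they cover
# all of it, so no single server is needed.
peers = [
    {"name": "peer-a", "chunks": {0: chunks[0], 3: chunks[3]}},
    {"name": "peer-b", "chunks": {1: chunks[1], 4: chunks[4]}},
    {"name": "peer-c", "chunks": {2: chunks[2], 5: chunks[5]}},
]

restored = fetch_from_swarm(range(6), peers)
print(restored == original)  # -> True
```

A real client would fetch the chunks in parallel from several peers at once, which is where the speedup comes from.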

Kontiki 

Kontiki is a system that implements a P2P content distribution network delivering video, audio, software and other documents. To experiment with it, download the software from http://www.kontiki.com/kontikitv/installing.html and install it. After invoking the program, select content providers from the available list, which includes news channels, virus information providers and software distributors. Once the download links are selected, the software automatically starts downloading from multiple sources. Because Kontiki makes every user's PC both a server and a receiver, any resource downloaded to your desktop turns the PC into a server that can transmit that file.

BitTorrent (http://bitconjurer.org/BitTorrent/index.html) is another product worth trying in this regard. The site also features a graphical demonstration of P2P content distribution concepts.
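
One problem any such system must solve is trust: a chunk fetched from a stranger's PC may be corrupt or malicious. BitTorrent's answer is per-piece hashing, sketched below. The piece size here is tiny for illustration (real torrents use 256 KB and up), and the helper names are our own:

```python
import hashlib

PIECE = 16  # bytes per piece; deliberately tiny for this demo

def piece_hashes(data):
    """Publisher side: record the SHA-1 of every fixed-size piece."""
    return [hashlib.sha1(data[i:i + PIECE]).hexdigest()
            for i in range(0, len(data), PIECE)]

def verify_piece(index, piece, hashes):
    """Downloader side: accept a piece only if its hash matches."""
    return hashlib.sha1(piece).hexdigest() == hashes[index]

data = b"A file shared across many untrusted peers!"
hashes = piece_hashes(data)          # published alongside the file

good = data[0:PIECE]                 # an honest peer's copy of piece 0
bad = b"x" * PIECE                   # a corrupted or forged piece
print(verify_piece(0, good, hashes))  # -> True
print(verify_piece(0, bad, hashes))   # -> False
```

Because the hash list comes from the original publisher, a downloader can safely assemble a file from pieces supplied by complete strangers.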

AimingClick 

While going through a document in a Windows application, you may come across an unfamiliar word or phrase. If it is a technical term that needs further exploration, you might use a search engine to collect links to sites with more information on it; if you only need its meaning, you would access an on-line dictionary service. So, depending on the context, one has to seek help from the appropriate service. AimingClick, available at http://www.aimingtech.com/aimingclick/home.htm, is a program that helps you find the meaning of an unfamiliar word or phrase in a document.

Domain Name Analyzer 


Domain Name Analyzer, available at http://softnik.com/products/domain.htm, is an elegant free Windows program that helps in finding a suitable domain name. If you have a set of domain names in mind for your new site, just type them all into the program's input box and select the button that starts the domain availability check. A list of the available domain names then appears on the screen. Apart from the analyzer, the site also hosts 'Watch My Domains', a program that helps you track domain-related details.

The rise of file-sharing networks (first-generation peer-to-peer applications) such as Napster and Gnutella shouldn't just make the entertainment industry nervous. In theory, if everyone could get their products and services from other people's PCs running on a widely connected network, the need for centralized servers or services would go away. For this reason, many ASPs have shied away from the support and revenue model offered by emerging next-generation peer-to-peer (P2P) applications. But this is a mistake: P2P applications may well be the catalyst for new revenue sources in two areas, hosted business systems and wireless services.

Before discussing the new revenue streams, let's define what we mean by P2P applications. The best way to think of a P2P application is as a shared desktop for projects. New companies such as Groove Networks and NextPage are working on systems that allow PCs to host and participate in shared projects using the Internet as a synchronization platform rather than a hosting platform. The interest in these applications stems from two common Internet complaints: security and reliability. Many companies don't want their collaboration services hosted because they're concerned about properly restricting access, and they want to be able to work whether a connection exists or not. P2P applications overcome these objections by relying on local data storage to hold copies of the project data. They also provide interfaces not only to normal communication services such as e-mail but also to up-and-coming collaboration services (instant messaging, application sharing, and file sharing). While none of these services is new, their presentation is: each is used in the context of an individual's view into shared projects or tasks. Sharing a file with other participants involves simply marking the file as shared; the system then takes care of distribution automatically when the host system is online.

ASPs can make money in this environment in two ways. First, they can add value to P2P applications by making sure the applications link properly with other hosted applications. Many ASPs already host key elements of these P2P platforms (such as e-mail), and by making these hosted elements accessible from P2P applications, the ASP can maintain its hosted revenue stream. But the real opportunity lies in integrating P2P applications with hosted line-of-business applications. Most analysts agree that the industry will only realize the real value of P2P applications when organizations can integrate group discussions with corporate data. For example, an international engineering company would benefit from discussions about a new product only if the actual engineering drawings and the financial information needed to validate cost and production assumptions were part of the discussion. For this to be possible, someone (probably the ASP and the software vendor) will have to work with P2P application vendors to make sure this information is easily accessible from the P2P application.

The second way ASPs can make money from P2P applications is by becoming the service provider that delivers the P2P application to wireless devices. One of the significant features promised by P2P vendors is access to the "shared desktop" from anywhere, with different views based on the receiving device. Since one of the key benefits touted by P2P vendors is the ability to work offline, it only makes sense to extend this offline experience to non-PC devices as well. Both IBM (Lotus Sametime Everyplace) and Microsoft (Mobile Information Server; the next release enhances Windows CE integration) are working on ways to extend P2P applications to non-PC devices. Since these systems have long deployment cycles, ASPs need to start testing and evaluating now if they want to be ready by the time the dominant P2P applications come to market, which should be some time in mid-2002.
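
The "mark it shared, distribute when online" model described above can be sketched as follows. This is a hypothetical illustration, and every class and method name is invented: files marked shared while the host is offline simply queue up, and reconnecting triggers the deferred distribution.

```python
class SharedDesktop:
    """Toy model of a P2P shared desktop's offline file sharing."""

    def __init__(self):
        self.online = False
        self.pending = []        # files marked shared while offline
        self.distributed = []    # files actually pushed out to peers

    def mark_shared(self, filename):
        self.pending.append(filename)
        if self.online:
            self._flush()        # online: distribute immediately

    def connect(self):
        self.online = True
        self._flush()            # catch up on anything queued offline

    def _flush(self):
        while self.pending:
            self.distributed.append(self.pending.pop(0))

desk = SharedDesktop()
desk.mark_shared("drawings.dwg")   # queued: the host is offline
desk.mark_shared("costs.xls")
desk.connect()                     # going online triggers distribution
print(desk.distributed)            # -> ['drawings.dwg', 'costs.xls']
```

The local queue is exactly the "local data storage" that lets these applications tolerate missing connectivity: nothing is lost while offline, and synchronization is automatic once a connection exists.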