P2P in B2B

By Carl Lehmann

In recent months, peer-to-peer (P2P) computing has been much hyped, promising to simplify information search and retrieval, better enable resource sharing, and help integrate software applications now distributed across computer networks.

Simply defined, P2P is an IT architectural precept that links and exploits all available networked technology (including CPUs, storage, files, data, metadata, and network availability). This enables users, applications, and systems to centrally find and locally bind all network resources (i.e., transfer, translate, or integrate them).
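To make the precept concrete, the short Python sketch below shows the pattern in miniature (the registry, peer names, and addresses are hypothetical, not any vendor's product): a central registry indexes which peer holds which resource, and the requesting peer then binds to that peer directly.

from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class PeerResource:
    resource_id: str     # e.g., a file, data set, or service name
    peer_address: str    # where the resource actually lives on the network

class CentralRegistry:
    """The central 'find' step: peers publish what they offer."""
    def __init__(self) -> None:
        self._index: Dict[str, PeerResource] = {}

    def publish(self, resource: PeerResource) -> None:
        self._index[resource.resource_id] = resource

    def find(self, resource_id: str) -> Optional[PeerResource]:
        return self._index.get(resource_id)

def bind_locally(resource: PeerResource) -> str:
    """The local 'bind' step: connect peer to peer to transfer, translate, or integrate."""
    return "connecting directly to %s for %s" % (resource.peer_address, resource.resource_id)

# One peer publishes a price list; another finds it centrally and binds locally.
registry = CentralRegistry()
registry.publish(PeerResource("price-list-q3", "supplier-a.example.com:9000"))
hit = registry.find("price-list-q3")
if hit is not None:
    print(bind_locally(hit))

The central step stays lightweight (it holds only an index), while the heavy work of moving or translating the resource happens at the edge, directly between the two peers.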

We believe that P2P computing, when centrally coordinated, can have dramatic implications for solving complex, costly IT problems, including data discovery, acquisition, and life-cycle management. This will prove true first for problems associated with business-to-business (B2B) commerce, managing supply chains, enabling service chains, and managing content.

Through 2003, Sun, Microsoft, Intel, and a handful of emerging vendors like Thinkstream, XDegrees, Consilient, and Groove Networks will lead the way by refining P2P architectural principles and applications. Early-adopter Global 2000 companies will experiment with P2P during this time to improve acquisition and management of distributed data in internal portals, external B2B initiatives, and corporate-wide knowledge sharing programs.

By 2005, P2P will be a common IT architectural model within large distributed networks and B2B integration initiatives (such as supply or service chain planning).

We do not believe that Napster, Gnutella, and other services such as United Devices (which enables CPU sharing) exemplify sound distributed P2P technology. They are little more than file indexing and sharing tools, riddled with security and control flaws and acting more like viruses than reliable distributed IT frameworks. P2P is a maturing architectural approach, originally conceived in mature computing environments (e.g., Unix, TCP/IP) and driven by the pervasive deployment of network infrastructures and servers (mainly the Internet).

Unfortunately, from an e-business market perspective, too many vendors claiming to offer P2P solutions are confusing and fragmenting the market. We believe this will continue through 2003, as hastily designed and relatively narrow P2P interpretations (tailored to specific markets or content) focus on early niche markets (by offering, say, improved searching in financial or legal databases). Larger vendors, meanwhile, will try to fold P2P into their latest computing paradigms (e.g., Microsoft's .Net, Sun's Project Jxta, or Intel's P2P Working Group).

From a technical perspective, the control of processes, workflow, and business rules, along with overall system performance, has not been fully thought through and will require more rigorous analysis. But P2P's greatest technical concern is security, whether of data, applications, or entire systems.

In its nascent state, P2P is inherently an environment of unstable connectivity and unpredictable network addressing. This chaotic state complicates authentication, authorization, and access management. To be secure, P2P solutions must be predicated on or integrated with public key infrastructure technology and process management. These, however, will not be pervasive parts of the architecture until 2004. (Early thought leaders such as Thinkstream have begun to address these security issues.)
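The details will vary by vendor, but the basic idea can be sketched in a few lines of Python (the third-party "cryptography" package and the record contents below are our illustration, not a prescribed implementation): a publishing peer signs its metadata record with a private key, and a receiving peer verifies the signature before trusting or binding to the record.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# In a real PKI the key pair would be anchored in a certificate authority;
# here one is generated locally for illustration only.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

record = b'{"resource_id": "price-list-q3", "peer": "supplier-a.example.com"}'
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)

# The publishing peer signs the metadata record it contributes to the network.
signature = private_key.sign(record, pss, hashes.SHA256())

# A receiving peer verifies the record before trusting or binding to it.
try:
    public_key.verify(signature, record, pss, hashes.SHA256())
    print("metadata record authenticated")
except InvalidSignature:
    print("record rejected: signature does not verify")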

Another P2P limitation is that it assumes metadata (data about data) will be pervasive, timely, and accurate. Indeed, structuring metadata within a central repository to find relevant content spread across a distributed network eliminates the costs and management problems associated with more resource-intensive approaches (such as content aggregation common in B2B procurement, or index “spidering” common to most search strategies).

Metadata, if well designed, is smaller and lighter than the data itself, lending itself to rapid search and retrieval and helping to simplify problems associated with B2B linking and integration. (For example, many companies have, in setting up supply chain management systems, had difficulty integrating the infrastructures of their various suppliers.)
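A rough Python sketch of the contrast (the catalog entries and field names are invented for illustration): peers contribute small metadata records to a central catalog, queries run against those records alone, and only the matching documents are then fetched directly from the peers that hold them.

# A central catalog holds lightweight metadata records, not the documents themselves.
catalog = [
    {"title": "Q3 supplier price list", "type": "price-list",
     "owner": "supplier-a", "location": "peer://supplier-a/prices/q3.xml"},
    {"title": "Standard purchase-order terms", "type": "contract",
     "owner": "buyer-co", "location": "peer://buyer-co/legal/po-terms.pdf"},
]

def search(catalog, **criteria):
    """Return the locations of documents whose metadata matches every criterion."""
    return [
        record["location"]
        for record in catalog
        if all(record.get(field) == value for field, value in criteria.items())
    ]

# Only the small metadata records are scanned centrally; the matching documents
# are then retrieved directly from the peers that hold them.
print(search(catalog, type="price-list", owner="supplier-a"))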

However, at this point metadata is neither common nor standard. Therefore, for P2P in B2B applications to be pragmatic, companies must enforce discipline in systems, data, and metadata design. Pending improvements in distributed directory services and format standards (known by acronyms such as UDDI, RDF, and ebXML) will also be needed to ensure metadata availability and integrity.
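As one illustration of the direction these standards point in, the Python sketch below expresses a metadata record as RDF triples using the rdflib library (the library and the vocabulary are our choice for illustration only):

from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

B2B = Namespace("http://example.com/b2b#")   # hypothetical shared vocabulary

g = Graph()
g.bind("b2b", B2B)

doc = URIRef("peer://supplier-a/prices/q3.xml")
g.add((doc, RDF.type, B2B.PriceList))
g.add((doc, B2B.owner, Literal("supplier-a")))
g.add((doc, B2B.validThrough, Literal("2001-09-30")))

# Emit the metadata in Turtle syntax (rdflib 6+ returns a string here).
print(g.serialize(format="turtle"))

Once trading partners agree on such a vocabulary, any peer can interpret another peer's metadata without custom translation code.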

Additionally, P2P must exploit principles inherent in existing hierarchical and distributed networks. Companies considering P2P as an architecture should evaluate their system designs, whether hierarchical systems that centrally find resources or distributed systems that locally bind with required resources (e.g., for data translation or transfer).

Business Impact: Peer-to-peer computing architecture can streamline e-business processes and leverage investments in unused or underused IT infrastructure. It can also lower integration costs among commerce chain partners.

Bottom Line: Technical flaws in the nascent peer-to-peer computing market will soon be overcome. By 2005, peer-to-peer will become an embedded architectural precept in e-business infrastructures.

Carl Lehmann is an analyst for META Group, an IT consulting firm based in Stamford, Conn.
