Frameworks, Standards Do The Grunt Work

Adhering to conventional development practices isn't always possible—or advisable—when long-term corporate goals are at stake
Posted February 1, 2000

Paul Tindall


While we all strive for the perfect solution to a given business problem, we must remember that we cannot always find it in the corporate application landscape. We must strike a constant balance between the functionality we deliver, the resources required to do so, and the time needed to arrive at completion. These constraints invariably lead to conflicts in decision-making among managers, architects, and developers. That said, how do we look at the possible solutions within those constraints and make an appropriate decision?

Figure 1: Commercial frameworks let you focus on business logic

The underlying goal of corporate application development is not necessarily to remain pure to industry or theoretical trends simply for purity's sake. Companies entrust us with their resources with the expectation that we will add value to their business process. Good design decisions are an investment we make as developers to meet these expectations. There is no added value in constantly reworking previously built functionality. Our development efforts should always keep long-term goals in mind, even if it means being a little impure at times.

The term enterprise development, while a buzzword in the software industry these days, is an ambiguous one. Enterprise application development is by no means a new concept to business, as mainframes have existed for some 20 years or so, performing the daily chores to take care of corporate informational needs. It is, though, a new concept to some software developers, many of whom may have had to implement only simple departmental productivity applications up to this point.

But now, corporate IS departments are becoming decentralized, with functionality moving away from mainframes and toward organizations and their servers. At the same time, the development tools used to build applications are becoming more powerful and easier to use. In fact, some of today's smaller applications will become the de facto enterprise applications of tomorrow.

Because of this shift, organizational-level IS leaders, designers, and implementers must build the next generation of corporate solutions, because they best understand organization-level business processes and informational needs.

Coupled with this change in IS philosophy and shift in application development responsibilities is the globalization of corporations in terms of geography, business activities, and the underlying information driving the corporate business processes. Corporate competitive advantages are becoming more defined by the ability to transfer knowledge into systems and information to enforce best practices across the globe.

The unified model
Still one other factor helps to define the enterprise application—that of flexibility in architecture. As a company fuels its growth through mergers and acquisitions, it must meld disparate business processes into a unified model. This invariably affects the applications running the enterprise, since part of the impetus for combining companies is to eliminate overlapping functions.

Corporations soon realize that information from disparate systems becomes more valuable when taken in aggregate. Thus, the term enterprise development takes form in terms of interfacibility—the ability to gather information from other applications coupled with the ability to provide information to other applications. Some would also call this feature interoperability. Examples of such systems may be the corporate Enterprise Resource Planning (ERP) system, in-house applications supporting other organizations, or third-party applications implemented by the company.

Another feature of enterprise applications is that of extensibility. While it can be rather easy to throw an application together that meets the needs of today, it is more difficult to anticipate the needs of tomorrow and design accordingly. If we follow an incremental develop-and-deploy approach, we need to make sure that for every step forward, we will not have to take a few steps backward with the next release. Expanding our mode of thinking a bit more, we realize that once we implement a successful application within an individual organization, most likely other organizations will want to follow suit once we demonstrate the benefits. If we design accordingly, it should be trivial to replicate the majority of an application to meet new business needs. Expanding our thinking yet again, we realize that as a company goes through mergers and acquisitions, we may need to enhance the business processes within our application. Again, if we design accordingly, this should not be an issue. This leads us to define application extensibility within the realm of enterprise development.

At yet another level, the corporation begins hooking its enterprise applications together in modes beyond just simple information sharing. Whether they are internal or external to the company, few applications can drive business processes in isolation. As such, they must begin working together within the context of some business workflow. Thus, the term enterprise development takes on a collaborative definition. As an example, one system, in the course of providing its functionality, may signal other systems into action, and this in turn may signal still other systems. While human interaction may be required somewhere in the process, it is not mandatory.

Because there are users with differing roles across the user base, no single user typically exercises the entire breadth of functionality provided by an enterprise application. The application is multifaceted in that it means different things to different people. There may be many human interfaces, both of the input and output variety. There are information generators as well as consumers. In most cases, the number of consumers far outweighs the number of generators, since it is this dispersal of information and knowledge that drives such applications.

Thus, we have a series of attributes that help define what an enterprise application really entails. To summarize, an enterprise application has the following features:

  • Support for many sites, geographies, organizations, and users.
  • Extensibility by design, because it will need enhancement over its lifetime.
  • Two-way interoperability with other systems.
  • Collaboration capabilities with systems both inside and outside the company.
  • More functionality than a single user can exercise.

Although these attributes are applicable to any application, they become mandatory when we face the rigors of the enterprise.

Patterns and frameworks
Other ambiguous terms abound when speaking of enterprise development, most notably patterns and frameworks. Both are critical to successful enterprise development, but they have different meanings. A pattern is the design of a core functional element in an abstract form—though it extends beyond pure theory, as it typically evolves from ideas and techniques proven out in repeated, real-world situations. There are many industry-accepted patterns for implementing a variety of tasks across a diverse range of development tools and technologies. Because we typically implement patterns in an object-oriented language, patterns and object-orientation share a common modeling methodology.
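To make the distinction concrete, consider a classic pattern such as Observer. The sketch below (editor-supplied, in Python rather than the article's VB6, purely for compactness) shows the abstract design; a framework would be a reusable, hardened implementation of designs like this one.

```python
# Hypothetical sketch of the Observer pattern: a subject notifies its
# registered observers of each event. The abstract shape is the
# "pattern"; a reusable, production-grade implementation of it would
# belong to a "framework".

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        # Push the event to every registered observer.
        for observer in self._observers:
            observer.update(event)

class AuditLog:
    """A concrete observer that records every event it is told about."""
    def __init__(self):
        self.events = []

    def update(self, event):
        self.events.append(event)

orders = Subject()
log = AuditLog()
orders.attach(log)
orders.notify("order-created")
orders.notify("order-shipped")
print(log.events)  # → ['order-created', 'order-shipped']
```

A framework vendor would ship a hardened Subject and its plumbing; the application developer would supply only concrete observers such as AuditLog.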

A framework is the tangible, reusable implementation of multiple patterns on a given platform using one of these development tools and technologies. A framework may also define the necessary communication and distribution mechanisms to make the pieces work together. Frameworks have existed for quite some time in commercial form. In the not-too-distant past, they came to us as Fourth Generation Languages (4GLs) used to develop client-server applications. Alternatively, they existed in the form of source-level GUI and I/O libraries meant to deliver applications in a consistent, cross-platform manner. Before that, they came in the form of mainframe development and control software such as IBM's CICS and JCL tools. Now, they manifest themselves in several incarnations:

Commercial frameworks. One incarnation of a modern-day framework is that of modeling tools and accompanying source-code generators. Here, an application or application component is first defined using a standard or proprietary modeling language. With a few mouse clicks after the model is complete, the tool generates source code. Some tools produce database schemas as well. The framework may realize itself as a set of runtime components referenced by the source code, as framework source code interspersed with the application code, or as a combination of the two. Some of the more sophisticated tools can even generate for multiple deployment languages (Visual Basic, C++, Java, etc.), database servers (SQL Server, Oracle, etc.), and distribution architectures (COM/DCOM, CORBA, Java RMI, etc.).

Some commercial frameworks extend beyond the infrastructure side and actually begin to layer on some of the business process functionality. Examples include IBM's San Francisco Project, which attempts to define a core set of frameworks across several business domains. For some time, Oracle has provided business frameworks for accounting, manufacturing, and other popular problem domains.

Application servers. The term "application server" is itself ambiguous, because such a server attempts to implement some or all of the components that make up an enterprise application. In this form, the application server not only embodies the hardware and operating system, but also inherently defines a framework via its programming model. This model typically rests upon selected design patterns implemented by the application server vendor. This form of framework has similarities to the modeling approach in that support exists for multiple development languages, database servers, and distribution architectures. Some in the industry say a full-function application server is simply a reincarnation of the mainframe on updated hardware.

Custom frameworks. With the emergence of enterprise development tools and components, it is not too difficult a task to develop a framework suited to a specific business process or organization. Microsoft has provided a suite of server products, development tools, and distribution technologies to enable the development of a custom framework for enterprise applications. The official moniker for this is the Microsoft Distributed interNet Applications (Microsoft DNA) architecture. While DNA is Microsoft's attempt to fully define the tools, technologies, and implementation details needed to build such applications, it is not itself a framework.

Microsoft DNA. Whether you are a devout promoter, a casual user, or merely an observer, Microsoft is a major player in the enterprise development market. No other set of tools and technologies allows you to have a dynamic, database-driven website up and running in a short amount of time. No other set of tools and technologies allows you to build a strong, multi-tier application in a short amount of time. No other company provides the breadth of online support and technical information that Microsoft does. But while Microsoft has provided the tools, guidelines, and sample applications, it is not the definitive source on how to build multi-tier applications. It is merely one of several sources that should be taken into consideration.

The decision process
The framework decision process can be rather complex, depending on the individual project. Unfortunately, it is probably the most important decision to make at the outset of an application development project. The project team spends considerable time and money on software licensing, developer training, and so forth before the actual start of the software project. A bad decision at this stage can wreak havoc later. But the capabilities of the development staff are only one factor. For a given framework, there are learning curves, development and deployment costs, and feature lists to consider. A common issue with commercial framework solutions is that vendors, in trying to tailor functionality to the lowest common denominator of their potential customer base, may expend a significant amount of effort perfecting a feature you find unnecessary at the expense of a feature you value.

A commercial framework should provide between 40 percent and 60 percent of an application's functionality. While this sounds appealing, it is hard to determine the level of difficulty encountered or success rate at implementing the remaining functionality required by the application. In addition, the 80/20 rule applied to application development says that 20 percent of the time is spent implementing 80 percent of the functionality, and 80 percent of the time is spent implementing 20 percent of the functionality. In most cases, the former 80 percent represents the template functionality of the application (e.g., database interaction, network access, system services, client user interface design, etc.). The latter 20 percent represents the functionality that is both more difficult to implement and what gives the application its character and competitive advantage, along with the look and feel that matches the business process flow.

Put another way, this 20 percent represents the value-added business logic embedded within the application. Where a commercial framework's effort savings reside—in the 80 percent realm or the 20 percent realm—should drive the decision to use it. For example, if the effort savings reside completely in the 80 percent template functionality area, it probably does not offer significant value. If, on the other hand, it covers the 20 percent value-added functionality, then it is probably worth a look. The former category indicates horizontal frameworks, while the latter is where vertical-based frameworks reside. (Good vertical frameworks typically implement up to 60 percent of an application's code.)
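A quick, illustrative calculation (editor-supplied) shows why the distinction matters. Assuming the 80/20 split above, a framework's schedule savings depend heavily on which realm it covers:

```python
# Illustrative schedule math under the article's 80/20 assumption:
# the template 80% of functionality consumes 20% of the time, and the
# value-added 20% of functionality consumes 80% of the time.

def time_saved(template_coverage, value_added_coverage):
    """Fraction of total schedule saved by a framework that implements
    the given fractions of template and value-added functionality."""
    return template_coverage * 0.20 + value_added_coverage * 0.80

# A horizontal framework covering ALL template work saves only 20%:
print(time_saved(1.0, 0.0))   # → 0.2
# A vertical framework covering half the value-added work saves 40%:
print(time_saved(0.0, 0.5))   # → 0.4
```

Under these admittedly rough assumptions, covering all of the template functionality saves less schedule time than covering just half of the value-added functionality.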

Tools and technologies
While you don't technically need to master the following technologies to understand frameworks, you will need to at least be comfortable with them:

Windows NT Networking. While it may seem strange to make an apparently obvious statement about NT Networking as a core component of an enterprise framework, it is still worth mentioning because of several key features. Most important, NT Networking represents an integrated security model. If properly configured, a user needs only to log in to the network once to gain access to areas beyond the network. Since the other server products that make up this framework run atop NT Server, they have access to this same security mechanism. This makes it easier on both the end-user, who does not have to remember another set of passwords, and the developer, who does not have to implement a log-in- and password-management process. NT Networking also can support various network configurations, including Wide Area and Dial-Up networking.

SQL Server. In any large-scale application, it is important to have a database server that can meet performance and load-handling requirements. It is also important to have a database server with sufficient online backup facilities, recovery features, transaction logging, two-phase commits, triggering, stored procedures, and so forth. Small-scale database systems simply will not hold up to the extreme needs of managing enterprise-level data. Additionally, advanced features such as integrated replication and an administrative API are highly desirable. While there are several server options here, SQL Server 6.x/7.0 meets these requirements handily. In addition, SQL Server offers a graphical user interface in the form of the SQL Enterprise Manager, eliminating the need to use a query console window to perform administrative and developmental tasks. SQL Server also exposes the underpinnings of the Enterprise Manager in the form of SQL-DMO (SQL Distributed Management Objects). This object model can be invaluable when it comes to automating complex administrative tasks on the server. These may include activities such as setting up a new server, or simply running a weekly re-index and recompile of the views and stored procedures that need to follow a certain processing order.

Additionally, SQL Server has an SQL Executive component. This component is responsible for managing the replication tasks, backups, restores, and so forth. The SQL Executive can also manage tasks that are external to SQL Server with its ability to call the NT command processor.

COM/DCOM. The COM architecture is the foundation for Microsoft's OLE and ActiveX technologies. COM is both a formal specification and a binary implementation. Technically, any platform can implement COM, not just Win32. It is so ubiquitous on the Win32 platform because Microsoft has provided the reference (and hence the standard) implementation of the specification. On the Win32 platform specifically, COM relies on Microsoft's Dynamic-Link Library (DLL) mechanism. The DLL architecture allows for a high level of runtime modularity (as opposed to source-code-level modularity), allowing binary modules to load in and out of a process address space at runtime. COM, and hence our framework, relies heavily on this dynamic nature to support long-term flexibility over the life of the application.

Any programming language that can access the Win32 COM API and implement a virtual function table can generate a COM class. Visual Basic is such a language, allowing a developer to build these types of classes while simultaneously hiding the gory implementation details.

DCOM takes COM across process boundaries. While applications frequently implement DCOM boundaries on a single physical machine, it is really a technology meant for communicating between machines. DCOM adds the necessary functionality to make a client application think it is simply invoking a local COM object, when it is really invoking a COM-style proxy locally that invokes the object remotely. There are some optimizations in the DCOM engine to minimize the effects of remote invocation, since COM's original design did not account for network latency. DCOM also adds a modicum of a security infrastructure to ensure that only privileged clients can invoke a given object.
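The proxy idea at the heart of DCOM is language-neutral. As a rough, editor-supplied sketch (Python standing in for the COM plumbing; all class names here are hypothetical), a local proxy exposes the same interface as the real object while intercepting and forwarding each call:

```python
# Hypothetical sketch of the DCOM proxy idea: the client holds what
# looks like a local object, but every method call is intercepted and
# forwarded, much as a DCOM proxy marshals calls over the network.

class OrderService:
    """The 'remote' object. In real DCOM this would live on a server."""
    def place_order(self, item, qty):
        return f"ordered {qty} x {item}"

class RemoteProxy:
    """Looks like the target locally; forwards every call to it."""
    def __init__(self, target):
        self._target = target
        self.calls = []  # stand-in for the marshaling/wire layer

    def __getattr__(self, name):
        method = getattr(self._target, name)
        def forward(*args, **kwargs):
            self.calls.append(name)          # "marshal" the call
            return method(*args, **kwargs)   # return the "unmarshaled" result
        return forward

service = RemoteProxy(OrderService())
print(service.place_order("widget", 3))  # → ordered 3 x widget
print(service.calls)                     # → ['place_order']
```

The client code never learns whether service is the real object or a stand-in, which is exactly the location transparency DCOM aims for.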

Visual Basic 6.0, Enterprise Edition. The capabilities of Visual Basic 6.0 (VB6) extend far beyond form design. It enables developers to build custom ActiveX controls that encapsulate core business process flows into a component that can run in a variety of locations. VB6 also enables developers to create ActiveX Dynamic Link Libraries (DLLs) that can likewise run in a variety of locations. It can also host components created by other development tools.

VB development extends beyond simply the user interface and client machine, allowing us to develop modules that run on a server as part of a distributed application. As for ease of development, VB6 has all sorts of goodies within the Integrated Development Environment. Features such as IntelliSense can help the developer finish a variable reference after just the first few letters are typed, or show the calling convention for a native or user-defined function or method. VB6 also has a feature known as the Class Builder Utility, a very simple class modeler and code generator that can save significant time in generating well-formed class modules. The IDE also performs auto-correction of the code, color-coding of keywords and comment blocks, and block indenting. While these features may seem minor, it is within the IDE that developers will spend most of their time during the coding phase, so every productivity improvement counts.

Internet Explorer 4/5. Internet Explorer 4/5 (IE4/5) has been adopted as the standard browser by many companies for a multitude of reasons. Using standard HTTP form processing techniques, the browser will work in conjunction with Internet Information Server, using Active Server Pages (ASP) to support simple data management. VB-based client applications, or browser-hosted ActiveX controls, implement complex data management that is too difficult to implement using the HTTP form approach.

Microsoft Transaction Server. Microsoft Transaction Server (MTS) provides several functions that may not be apparent from its name. First, it is a DCOM surrogate, improving the management and administration of DCOM components on a server. Second, it is a transaction coordinator, assisting in performing disparate database transactions as a group and rolling them back as a group if any part fails. Third, MTS is a resource-pooling manager, allowing multiple logical objects to run in the context of a pool of physical ones. It also provides database connection pooling for the DCOM libraries to minimize the performance issues associated with login and connection.
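The second function, coordinating a group of operations with all-or-nothing semantics, can be sketched generically. The Python fragment below (an editor-supplied illustration of the coordination idea, not MTS's actual API) runs a series of steps and undoes the completed ones if any later step fails:

```python
# Illustrative all-or-nothing coordinator: run each (do, undo) pair in
# order; if any step raises, undo the completed steps in reverse.
# MTS performs this kind of coordination across real database work.

def run_as_group(steps):
    done = []
    try:
        for do, undo in steps:
            do()
            done.append(undo)
    except Exception:
        for undo in reversed(done):
            undo()  # roll back what already completed
        return False
    return True

def fail():
    raise RuntimeError("third system unavailable")

ledger = []
steps = [
    (lambda: ledger.append("debit A"),  lambda: ledger.remove("debit A")),
    (lambda: ledger.append("credit B"), lambda: ledger.remove("credit B")),
    (fail, lambda: None),
]
ok = run_as_group(steps)
print(ok, ledger)  # → False []
```

Because the third step fails, the first two are undone and the ledger ends up exactly as it started.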

Figure 2: Object-rich frameworks cut coding drudgery

Internet Information Server 4.0. IIS is the foundation for ASP, a VBScript-based environment for the dynamic generation of browser-agnostic HTML pages. In addition, IIS and MTS integrate tightly when the two are running on the same physical machine, bypassing some of the normal activation processes to improve overall performance.

Visual InterDev 6.0. Visual InterDev has a powerful IDE much like Visual Basic, allowing for more rapid development of ASP pages than can be done in a conventional text editor (which up until release 6.0 was the primary path). In addition, Visual InterDev provides debug facilities for stepping through some server-side pages during generation, or through the completed page on the client that may have some embedded scripting code itself.

OLEDB/ADO. Database access is foundational to any enterprise application. While many applications may still use ODBC or other forms of legacy driver methods, OLEDB and ADO are the most appropriate choices for new application development or significant refreshes to existing applications. In addition to providing access to an RDBMS, OLEDB/ADO is the foundation upon which Microsoft plans to allow access to other structured data such as network directory services. Additionally, ADO provides a mechanism to represent structured data created by your application and can serve as a temporary storage space or a transport mechanism.
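That last role, application-created structured data held in temporary storage, can be illustrated with a modern stand-in. The editor-supplied sketch below uses an in-memory SQLite table in place of a disconnected ADO recordset; the API is entirely different, but the idea of staging structured data outside the main database is the same:

```python
# Conceptual stand-in (not ADO itself): an in-memory table used as
# temporary storage for application-created structured data, the role
# the article describes for a disconnected ADO recordset.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE staging (sku TEXT, qty INTEGER)")
con.executemany("INSERT INTO staging VALUES (?, ?)",
                [("A-100", 5), ("B-200", 2)])

# The staged rows can be queried, aggregated, or shipped elsewhere.
total = con.execute("SELECT SUM(qty) FROM staging").fetchone()[0]
print(total)  # → 7
```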

XML and the MSXML Parsing Engine. Extensible Markup Language (XML) is one of the hottest topics among enterprise developers. Similar to HTML, XML is a textual format for representing structured information. The difference between HTML and XML is that the former represents format and the latter represents data.

While XML is a straightforward specification, its flexibility makes the development of a parser a non-trivial task. IBM has offered a publicly available Java-based parser for some time. It has only been with the release of IE5 that Microsoft has provided a standalone COM-based parser in the form of MSXML.DLL. Now that Microsoft has provided us with this invaluable tool, we can divert our attention from trying to build a complex parser and begin creating value-added solutions with it. XML is a data transfer mechanism, with multiple roles in providing a data conduit between processes within a system (P2P), processes across systems (S2S interfaces), and processes across businesses (B2B interfaces).

What is powerful about MSXML is its COM basis, which gives it the ability to run within Visual Basic, ASP, and IE. Even more powerful is the fact that data formatted as XML in a Windows-based COM environment is readable by a Unix-based Java XML reader in another environment.
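Any conforming parser can read the same XML text, whatever produced it. The editor-supplied sketch below uses Python's standard xml.etree as a stand-in for MSXML's DOM; the document content is invented for illustration:

```python
# The same XML text is readable by any conforming parser on any
# platform. Python's xml.etree here stands in for MSXML's COM DOM.
import xml.etree.ElementTree as ET

document = """<order id="1042">
  <item sku="A-100" qty="5"/>
  <item sku="B-200" qty="2"/>
</order>"""

root = ET.fromstring(document)
skus = [item.get("sku") for item in root.findall("item")]
print(root.get("id"), skus)  # → 1042 ['A-100', 'B-200']
```

A VB client using MSXML, a Java reader on Unix, or this Python script would all recover the same order id and item list from the identical text.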

CDONTS. Collaborative Data Objects for NT Server (CDONTS) provides many features, one being SMTP (Simple Mail Transport Protocol) capability that bypasses MAPI (Mail API). This is important because MAPI requires the use of a mail service such as Exchange, which adds overhead in terms of administration and performance. While there is a similar CDO (non-NT server) version, it lacks an SMTP-based messaging engine. Fortunately, CDONTS can be run on an NT Workstation development machine. In production mode, CDONTS can be used with both IIS and MTS to provide server-side mail processing for collaboration and notification activities.
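As a rough modern equivalent of the notification messages CDONTS sends over SMTP, the editor-supplied sketch below builds a mail message with Python's standard email library; actually sending it would require smtplib and a reachable SMTP host, so that step is omitted here. All addresses and text are invented:

```python
# Modern stand-in for CDONTS-style SMTP notification: construct an
# RFC 822-style message with the standard library. Sending it would
# use smtplib against an SMTP host, which is omitted here.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "batch-job@example.com"
msg["To"] = "ops@example.com"
msg["Subject"] = "Nightly replication complete"
msg.set_content("All publication tasks finished without error.")

print(msg["Subject"])  # → Nightly replication complete
```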

Paul Tindall takes a contrarian's approach to development, hence the title of his book, Developing Enterprise Applications—An Impurist's View, from which this article is adapted. Tindall, formerly an application development manager at Compaq, recently joined, an e-business incubator. Contact him at

© 1999 FAWCETTE TECHNICAL PUBLICATIONS, all rights reserved.
