A decade ago, everyone was excited about a new technology architecture that was going to revolutionize the way business is conducted in corporate America. It would provide a new paradigm for information processing that would facilitate collaboration and information sharing across a vast number of systems and organizations. What was this new technology? Client/server computing.
Now the chorus sings again about the latest revolutionary technology, the World Wide Web. You learned in 8th grade Social Studies that those who do not learn from the mistakes of the past are doomed to repeat them. With this in mind, we are now poised on the edge of the next technological precipice. There have been numerous systems development failures using the client/server architecture, but there also have been many successes. By understanding the strengths of the client/server architecture, you will be able to apply them in your Active Server Pages development.
There are two major keys to the successful implementation of any new technology: a solid understanding of the foundations of the technology and a framework for its implementation in your business. Throughout this book, you will learn about the tools and techniques to meet this new challenge (opportunity) head-on and how to leverage this experience in your own development.
This chapter provides a brief overview of the client/server architecture and how it has evolved over the years.
The client/server revolution of the early eighties was a boon to developers for a number of reasons. Looking at its implementation in the past enables you to leverage the inherent strengths of client/server in your ASP development.
Scripting provides a simple yet powerful method of adding dynamic content to your Web site.
The choices you make as you decide where to place functionality, on the client or on the server, will expand your application options.
Do you remember the first time that you ever used a PC database? For many of you, it was dBase. dBase and those programs like it (Paradox, FoxPro, and Access) provide a quick and easy way to create two-tier client/server applications. In the traditional two-tier client/server environment, much of the processing is performed on the client workstation, using the memory space and processing power of the client to provide much of the functionality of the system. Field edits, local lookups, and access to peripheral devices (scanners, printer, and so on) are provided and managed by the client system.
In this two-tier architecture, the client has to be aware of where the data resides and what the physical data looks like. The data may reside on one or more database servers, on a mid-range machine, or on a mainframe. The formatting and display of the information is handled by the client application as well. The server(s) routinely provide only access to the data. The ease and flexibility with which these two-tier products create new applications continues to drive many smaller-scale business applications.
The three-tier, later to be called multi-tier, architecture grew out of this early experience with "distributed" applications. As two-tier applications percolated from individual and departmental units up to the enterprise, it was found that they did not scale very easily. And in our ever-changing business environment, the scalability and maintainability of a system are primary concerns. Another factor that contributed to the move from two-tier to multi-tier systems was the wide variety of clients within a larger organization. Most of us do not have the luxury of having all of our workstations running the same operating system, much less the same version of it. This drove a logical division of the application components, the database components, and the business rules that govern the processes the application supports.
In a multi-tier architecture, as shown in Figure 3.1, each of the major pieces of functionality is isolated. The presentation layer is independent of the business logic, which in turn, is separated from the data access layer. This model requires much more analysis and design on the front-end, but the dividends in reduced maintenance and greater flexibility pay off time and again.
Figure 3.1: Multi-tier architecture supports enterprise-wide applications.
Imagine a small company a few years back. They might produce a product or sell a service, or both. They are a company with a few hundred employees in one building. They need a new application to tie their accounting and manufacturing data together. It is created by a young go-getter from accounting. (Yes, accounting.) He creates an elegant system in Microsoft Access 1.0 that supports the 20 accounting users easily (they all have identical hardware and software). Now, move forward a few years: The company continues to grow, and they purchase a competitor in another part of the country. They have effectively doubled their size, and the need for information sharing is greater than ever. The Access application is given to the new acquisition's accounting department, but alas, they all work on Macintosh computers. The CIO is faced with a number of challenges and opportunities at this juncture. She could purchase new hardware and software for all computer users in her organization (yikes!), or she could invest in creating a new application that will serve both user groups. She decides on the latter.
A number of questions come to mind as she decides which path to take.
A few years ago, you might have suggested using a client/server cross-platform development toolkit or a 4GL/database combination that supports multiple operating systems. Today, the answer will most likely be an intranet application. A multi-tier intranet solution provides all of the benefits of a cross-platform toolkit without precluding a 4GL/database solution. If created in a thoughtful and analysis-driven atmosphere, the multi-tier intranet option provides the optimal solution. Designed correctly, the intranet application will provide the company with the flexibility of the client/server model without the rigid conformance to one vendor's toolset or supported platform.
In her new model, the client application will be the browser, which will support data entry, local field edits, and graphical display of the data. The entry point to the database information will be the intranet Web server. The Web server will interact with a number of back-end data sources and business logic models through the use of prebuilt data access objects. These objects will be created and managed through server-side scripting on the Web server. This scenario can be implemented today with Active Server Pages, using the information, tools, and techniques outlined within this book.
The multi-tier architectures that businesses have been using effectively on their LANs and WANs can now be taken advantage of on the Internet and intranets. The roles of the client (a.k.a. the browser) and the server, when designed correctly, can combine the best of the traditional client/server architecture with the control and management found in more centralized systems.
Developing a multi-tier client/server system involves three basic steps:
Take a look at each of these steps; by the end of the following discussion, you will understand how to effectively use the C/S model in your Internet and intranet development.
The most important step, of course, is the first. Before undertaking any new development effort, you need to have a thorough understanding of the information your users require. From this, you can develop a firm, well-documented feature set. From these pieces of information, you can continue on and complete the functional specification for the new application.
It is always so tempting, with the advent of RAD (Rapid Application Development) tools, to write code first and to ask questions later. While this is a method that can be successful in small applications, it can lead to major problems when used in a more substantial systems development effort. Just remember, your users can have a system chosen from two of the following three attributes: fast, good, and cheap. The fast/cheap combination, however, has never been a good career choice.
You now have the idea, the specifications, and the will to continue. Now you can use the C/S model to complete your detail design and start development. But first, take a brief look at each of the steps (bet you're glad this isn't a 12-step process) and how the client and server component roles are defined.
In traditional C/S development, the choice of the communication protocol is the basis for the system. Choosing from the vast number of protocols and selecting appropriate standards is the first step. Specifying connectivity options and network component specs (routers, bridges, and so on) is again a vital decision when creating a system.
In the Internet world, these choices are academic: you will use the existing TCP/IP networking stack and the HTTP protocol for transport and communication.
Now you get to the heart of your application design decisions. Sadly, there are no quick and easy answers when you begin to choose the data stores that your application will interact with. What is important to remember is that the choices you make now will affect the system over its entire useful life. Making the correct choices concerning databases, access methods, and languages will determine the success or failure of your final product.
A very helpful way to think about your application is to break it down into the functions you wish to perform. Most client/server applications are built around a transaction processing model. This allows you to break the functions into discrete transactions and handle them from beginning to end. In the Internet world, it is very helpful to think of a Web page as a single transaction set. The unit of work that will be done by any one page, either a request for information or an action on data sent, can be considered a separate transaction. Using this model, it is easy to map these document-based transactions against your data stores. The Active Server Pages environment, through server-side scripting and data access objects, enables you to leverage this model and to create multi-tier client/server Internet applications.
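The page-as-transaction model can be sketched in plain JavaScript (used here only as an illustration; this is not the ASP object model, and the page names and data store are invented). Each page maps to exactly one handler that performs a complete unit of work against the data store:

```javascript
// Hypothetical sketch of the page-as-transaction model.
// Each "page" is one discrete, complete unit of work.
const dataStore = { orders: [{ id: 1, item: "skates" }] };

const transactions = {
  // A request-for-information transaction.
  "orders.asp": (store, params) =>
    store.orders.filter((o) => o.id === params.id),
  // A transaction that acts on data sent by the client.
  "addorder.asp": (store, params) => {
    store.orders.push({ id: params.id, item: params.item });
    return { ok: true };
  },
};

// One page request equals one transaction, handled beginning to end.
function handleRequest(page, params) {
  const txn = transactions[page];
  if (!txn) throw new Error("Unknown page: " + page);
  return txn(dataStore, params);
}
```

Because each page stands alone as a transaction, mapping document-based transactions against your data stores becomes a matter of deciding which store each handler talks to.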
If your application will be using legacy data from a database back-end or host-based computer, you need to have a facility for accessing that data. The ASP environment provides a set of component objects that enable connectivity to a number of DBMS systems. Through the use of scripting on the server, you can also create instances of other OLE objects that can interact with mid-range or mainframe systems to add, retrieve, and update information.
As you have already learned, one of the great benefits of the C/S architecture is its fundamental goal of providing a multi-platform client application. Never before has this been easier to achieve. With the advent of the WWW and the Internet browser, you can provide active content to users on a variety of platforms. While there has been a great movement toward standardization of HTML, there are many vendor-specific features found in browsers today. This means you have a couple of important choices to make, similar to the choices you had to make when creating traditional multi-platform client applications. When developing with traditional cross-platform toolkits, you have a number of options.
Code to the Lowest Common Denominator
This involves selecting and implementing the features available on all of the client systems you wish to support. This is a good way to support everyone, but you'll have to leave out those features within each system that make them unique. For example, you might want to implement a container control for your OS/2 application, but there is no similar control available on the Mac. As a consequence, this falls out of the common denominator controls list.
Create a Separate Application for Each Client
This option ensures that each client application makes full use of the features of the particular operating system. The big drawback, of course, is that you have multiple sets of client code to support. This might be achievable for the first version, but having to manage and carry through system changes to each code base can be a huge effort.
Share the Majority of the Client Code
This last option is a good choice in most scenarios. The majority of the code is shared between applications. You can then use conditional compilation statements to include code that is specific to any one client system. This is even easier when using a browser as the client. Within an HTML document, if a browser does not support a particular tag block, it will ignore it.
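The shared-code option can be sketched in JavaScript (the capability flag and page fragments are invented for illustration): one common code path generates most of the page, and a conditional section adds markup only for clients that can use it.

```javascript
// Sketch of the shared-code option: the bulk of the output is produced
// by code common to every client; a conditional branch adds a
// client-specific extra. The "supportsMarquee" flag is hypothetical.
function renderPage(title, client) {
  const parts = [];
  parts.push("<h1>" + title + "</h1>");       // shared by every client
  if (client.supportsMarquee) {
    parts.push("<marquee>Sale!</marquee>");   // client-specific section
  }
  parts.push("<p>Welcome to our store.</p>"); // shared by every client
  return parts.join("\n");
}
```

A browser client makes this approach even more forgiving: even if the conditional section were sent to every client, a browser that does not support the tag would simply ignore it.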
As stated laboriously in the preceding sections, client/server has been a buzzword for years now. Many definitions of this architecture exist, ranging from an Access application with a shared database to an all-encompassing transaction processing system across multiple platforms and databases. Throughout all of the permutations and combinations, some major themes remain consistent:
The client and the server have well-defined roles, the client requesting a service and the server fulfilling the service request.
The communication between the client and server (or the client-middleware-server) is a well-defined set of rules (messages) that governs all communication: a set of transactions that the client sends to be processed.
Due to the clearly defined roles and message-based communication, the server or service provider is responsible for fulfilling the request and returning the requested information (or completion code) to the client. The incoming transaction can be from a Windows client, an OS/2 machine, or a Web browser.
The client can send a transaction to a service provider and have the request fulfilled without having to be aware of the server that ultimately fulfills the request. The data or transaction might be satisfied by a database server, a mid-range data update, or a mainframe transaction.
I remember when I first started surfing the Web. One of my first finds was a wonderful and informative site offering the latest and greatest in sporting equipment. They had a very well-organized page with interesting sports trivia, updated scores during major sporting events, and a very broad selection of equipment and services. Over the next few months, I visited the site from time to time to see what was new and interesting in the world of sporting goods. What struck me was that the content did not seem to change over time. The advertisements were the same, the information provided about the products was the same, and much of the time, the 'updated' information was stale. Last summer, while looking for new wheels for my roller blades, I was surprised to find that the Christmas special was still running.
We surf the Web for a number of reasons: to find information, to view and purchase products, and to be kept informed. There is nothing worse than going to a fondly remembered site and being confronted with stale advertising or outdated information. The key to having a successful site is to provide up-to-date dynamic content.
Most of the information provided by current sites on the Internet consists of links between static informational pages. A cool animated GIF adds to the aesthetic appeal of a page, but the informational content and the way it is presented are the measures by which the site is ultimately judged.
To provide the most useful and entertaining content, you must be able to provide almost personal interaction with your users. You need to provide pre- and post-processing of information requests, as well as the ability to manage their interactions across your links. You must provide current (real-time) content and be able to exploit those capabilities that the user's browser exposes. One of the many components that is available in the Active Server Pages environment is an object through which you can determine the capabilities of the user's browser. This is just one of the many features you will be able to use to provide a unique and enjoyable experience for your users.
See "Using the Browser Capability Component" in Chapter 13 for more information about the Browser Capability Object.
A simple yet effective example of a page that changes with each hit is the hit counter. This capability, while easy to implement, will in itself show the user that the page is constantly changing. It is also very easy to have the date and time show up as a minor part of your pages. All of these little things (in addition, of course, to providing current information) help your Web site seem new and up-to-date each time it is visited.
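A hit counter is little more than a number incremented on each request. The JavaScript sketch below keeps the count in memory only (a real ASP counter would persist it, for example in the Application object or a file, which this sketch deliberately omits):

```javascript
// Minimal hit-counter sketch. The count lives in memory only, so it
// resets when the process restarts; a production counter would
// persist it somewhere durable.
let hitCount = 0;

function pageFooter(now) {
  hitCount += 1;
  return "You are visitor number " + hitCount +
         ". Page served on " + now.toDateString() + ".";
}
```

Each request bumps the count and stamps the page with the current date, exactly the kind of small, always-changing detail that makes a site feel alive.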
As you head into the next several chapters, you will be given the tools and techniques to provide dynamic content in your Internet and intranet applications.
There are a variety of tools available today that enable you to create Internet applications. The best of the new breed of tools, called scripting languages, enable you to add value to your Web pages by providing client-based functionality. You can perform field edits and calculations, write to the client window, and employ a host of other functions without having to take another trip to the server for additional information.
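A field edit of the kind just mentioned is simply a function the browser runs before a form is submitted, saving a round trip to the server. Here is a sketch in modern JavaScript (a 1997-era browser would use parseInt and isNaN instead of Number.isInteger; the field rules are invented):

```javascript
// Client-side field edit: check a quantity field in the browser so a
// bad value never costs a trip to the server. Rules are illustrative.
function validateQuantity(value) {
  const qty = Number(value);
  if (!Number.isInteger(qty)) return "Quantity must be a whole number.";
  if (qty < 1 || qty > 99) return "Quantity must be between 1 and 99.";
  return null; // null means the field passed the edit
}
```

A form's submit handler would call the function and block submission whenever it returns a message.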
What is so exciting about the newest scripting technology is that it is implemented not only on the client, but now also on the server. With Active Server Pages, you can leverage your knowledge of scripting on the server. In addition to the basic control and flow that many scripting languages provide, you also can access objects from within your scripts that provide additional functionality and power. These objects, discussed in Part III, provide you with the capability to communicate with those multiple tiers of information in the client/server model.
Take a quick process check: You know about the C/S multi-tier architecture and how to effectively use it on the Internet and intranet. You have a good understanding of the type of content that you must provide, and you have learned about the scripting, which can tie all of the pieces together and make them work. The next step is to decide what functionality should go where.
Obviously, a great chunk of the processing ultimately resides on the server. All access to databases and other internal data sources will be provided from the server. Inter-page linking and responses to user requests will also be handled on the server. The decision about what functionality to place in the browser client is the same one you faced with the challenge of supporting multiple operating systems in the traditional client/server environment:
This will provide the greatest guarantee that your active content can be viewed in its entirety on any browser.
By determining the capabilities of the browser as the information is requested, you can tailor the returned document to exploit the browser's capabilities.
Most sites that you visit today have a link to a text-only version of the document. This capability is important, not only to ensure that all users can get the information, but also to enable those users with less capable equipment to have a full and rich experience from your active content.
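Tailoring the returned document can be sketched as a server-side function that inspects a capability record and falls back toward a text-only page (the property names and page names here are hypothetical, loosely modeled on the kind of information a capabilities lookup reports):

```javascript
// Choose the richest page variant the requesting browser can handle,
// falling back to a text-only version. Capability names are invented.
function selectVariant(caps) {
  if (caps.frames && caps.vbscript) return "rich.asp";
  if (caps.tables) return "standard.asp";
  return "textonly.asp";
}
```

The text-only branch doubles as the fallback described above, so every user gets the information regardless of equipment.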
The scripting language you use on the client depends wholly on the capabilities of the browsers that request your pages. JavaScript is supported in the Netscape Navigator family of browsers. VBScript and JavaScript are supported in the Microsoft Internet Explorer browser. Given the remarkable changes to browser software over the past year, you can expect that the two major scripting dialects will be supported across all major browsers in the near future.
VBScript and JavaScript When used within the confines of Internet Explorer, VBScript and JavaScript are functionally equivalent. They both provide a rich, Basic-like scripting language that is interpreted by the browser at run-time to provide client-side intelligence and an enhanced user experience. VBScript is a subset of the popular Visual Basic language. For the legions of Visual Basic developers, VBScript is a natural progression and a tool for creating interactive Web pages. With the release of Active Server Pages, scripting has been taken to another level. Now you can use the same versatile scripting to add value to the server side of the process as well as to the client side.
Figuring out how to best employ scripting in an Internet environment can be a daunting task. You have learned how you can benefit from the experience of thousands of developers who have used the multi-tier architecture for creating enterprise-wide applications. You can create Internet applications with the same transaction-based, flexible, and client-neutral functionality that has been driving businesses for the past decade.
Here are some of the topics that are discussed in the coming chapters:
© 1997, QUE Corporation, an imprint of Macmillan Publishing USA, a Simon and Schuster Company.