Software

Software is at the heart of a biosurveillance system. Software underlies the collection, storage, algorithmic analysis, and display of data. Although biosurveillance systems will always include manual elements, the long-term trend will be to automate as much of the process as possible to increase reliability and decrease time latencies. Given the centrality of software in biosurveillance systems, it is important to understand the basics of software.

Modern software is designed and built from existing software components, similar to a modern house in which the windows and door assemblies come prehung. Examples of pre-fabricated (pre-fab) components are database management systems (e.g., Oracle, Microsoft SQL Server, MySQL), Web servers (e.g., Microsoft IIS, Apache), and application servers (e.g., WebSphere, JBoss, Tomcat). Just as in the construction of a house, the use of these pre-fab components increases the quality of the overall system and decreases its cost. Pre-fab components are used wherever possible, with only as much custom programming as necessary to knit them together.

Development of software is costly. It is so costly that humans systematically fail to appreciate how costly it is. On average, 71% of software development projects experience problems that result in delays and cost overruns (Standish Group, 2004). Nearly 20% of projects fail completely.1

The most effective way to mitigate this risk is to keep software projects small, because effort and cost do not scale linearly with project size. Small, tightly defined software projects have significantly fewer problems. In stark contrast, developing a biosurveillance system from scratch is a very large, poorly defined task. Building a biosurveillance system from components partly addresses this risk, but a far less risky approach is to acquire an existing biosurveillance system and extend and customize it to meet the organization's needs.

3.1. Software Selection

Organizations must select software: the pre-fab components from which to create a system and, possibly, the entire system itself. The key criteria are the software provider's track record, the functionality of the software, and a technical evaluation of the software. Neglecting to conduct due diligence on these criteria can result in significant expense and lost time. In the worst case, the result may prove useless to the organization.

The first criterion is the provider's track record with the software. References from those who use the software are the best way to assess a provider's track record. The goal of checking references is to determine whether users are satisfied with the software. When asking for references, the organization should also obtain the number of customers using the software. A large, well-satisfied user base is the best indicator that the provider's software performs well. It is also important to ascertain the financial position of the company or division providing the software. Large corporations frequently request financial statements from small software providers to confirm that the provider has the financial strength to continue to improve the software, fix defects, and provide support.

It is almost inevitable that an organization will work with a small provider offering new software. In this situation, there is a risk that the provider may not remain in business to support the product in the future. Many organizations therefore require a code escrow arrangement, which places a copy of the source code of the software into the hands of an escrow company. A code escrow protects the organization by ensuring that a copy of the source code is available in case the provider goes out of business or prematurely terminates ongoing support for the product.

Evaluation of software functionality includes a detailed assessment of the organization's needs against what the software provides. The evaluation should be done on a working product to demonstrate that the needs are truly met. It is insufficient to evaluate the behavior of sample or prototype software, which may demonstrate how the software could work, but not how it actually does work. The original target market for the software must also be included in an evaluation of functionality. As markets change, software providers adapt by redirecting their software to newer, emergent markets. When providers adapt their existing software to a new market, they may overlook functionality that would have been created had the software originally been designed for the new market. To compensate for the missing functionality, providers adopt the terminology of the new market, giving it a slightly different meaning to fit their software. When dealing with software that has been redirected, it is important to ensure that the terminology has not been redirected as well.

1 The FBI Trilogy project is a recent example of a large project failure. After consuming $170 million, the FBI cancelled the project outright in 2005 (Sharma, 2005; Charette, 2005).

The final criterion is a technical evaluation of the software. The technical evaluation should answer two questions: "Will the software work well in my organization?" and "Is the software designed to hold up over time?" To answer the first question, the acquiring organization should create a list of the pre-fab components and software languages it currently uses and supports and compare it to the list of components and languages used or required by the software. The software will not fit well in the organization's environment if there is a mismatch. For example, if an organization already uses Oracle as its database, the addition of new software that requires Microsoft SQL Server will increase the operational costs to the organization of supporting two database management systems and require additional technical skills. To answer the second question, the organization must understand the design of the software, which we discuss next.

3.2. Software Design

In the construction of a biosurveillance system, we are concerned not only with individual pieces of software but also with software systems. A software system is a group of software components that operate together for a common purpose. These components may operate on separate computers and even in different geographic regions, but they work according to a coherent design. Improperly designed software systems are frequently abandoned because of high maintenance costs.

A typical modern software system uses three layers (also called tiers): a database layer, a business layer, and a presentation layer (Figure 35.2). A layer is a conceptual way of grouping similar functions together so that each layer can be built and modified relatively independently of the others. The database layer stores data and responds to requests from the business layer. Requests to the database layer may be to create records, read records, update records, and delete records. The business layer is responsible for distributing data, processing data, and making requests to the database layer. Algorithmic analysis of biosurveillance data is an example of a function that would reside in the business layer. The presentation layer is responsible for displaying the data obtained from the business layer to the user. A map-generating program would reside in the presentation layer.
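To make the layering concrete, the following sketch in Java uses hypothetical names (CaseRecord, CaseStore, SurveillanceService, CaseView) to show how the three layers might be kept behind separate interfaces; it illustrates the layering principle only and is not drawn from any particular biosurveillance system.

// Hypothetical sketch of the three-layer separation; all names are illustrative.
import java.time.LocalDate;
import java.util.List;

// Simple data record passed between layers.
record CaseRecord(String id, String syndrome, LocalDate reportDate) {}

// Database layer: responds to create, read, update, and delete requests from the business layer.
interface CaseStore {
    void create(CaseRecord record);
    CaseRecord read(String id);
    void update(CaseRecord record);
    void delete(String id);
}

// Business layer: processes and distributes data, calling down to the database layer.
class SurveillanceService {
    private final CaseStore store;

    SurveillanceService(CaseStore store) {
        this.store = store;
    }

    // Example of an analytic function that belongs in the business layer.
    boolean exceedsDailyThreshold(String syndrome, int threshold, List<CaseRecord> todaysCases) {
        long count = todaysCases.stream()
                                .filter(c -> c.syndrome().equals(syndrome))
                                .count();
        return count > threshold;
    }
}

// Presentation layer: displays data obtained from the business layer (e.g., a table or map).
interface CaseView {
    void display(List<CaseRecord> cases);
}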

System designers make many decisions about which pre-fab components to include in these layers and how to arrange these components. The wide-scale adoption of this three-layer design, however, has resulted in standard approaches to the layers. The PHIN standard for the database layer, for example, recommends commercial off-the-shelf software (e.g., Oracle, SQL Server, or Sybase).

Figure 35.2 Three-layered design of software systems. Modern software design involves layering. This figure shows a three-tier design involving a database layer, a business layer for distributing and processing data, and a presentation layer for displaying data and results of analyses.

System designers often design the business and presentation layers together. This design practice does not violate the layering principle, because the resulting design still maintains the separation of the layers. There are two standard approaches for constructing the business and presentation layers: the Web-application approach and the desktop-application approach. Web-based applications use an Internet browser as the presentation layer; the browser accesses a Web server and Web application server in the business layer. Desktop applications locate the presentation layer on the user's local computer, and the business layer may reside on the user's local computer or on a central server. An advantage of the desktop-application approach is that the program can interact with items on the user's computer, including other applications such as Excel, Access, and Word. However, the desktop-application approach entails maintenance on each local desktop, which may involve significant expense if there are many users of the system, because the software must be installed and maintained on each machine. Web-based applications have the advantage that the technical staff can install software updates on the central servers.
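As a minimal illustration of the Web-application approach described above, the hypothetical servlet below runs on a Web application server (such as Tomcat) in the business layer and returns HTML that the user's browser, acting as the presentation layer, renders. It uses the standard (older javax.servlet) Java Servlet API, and the case count is a fixed placeholder rather than a real query.

import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Business-layer code running on the central server; the browser is the presentation layer.
public class CaseCountServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        // In a real system this value would come from the business layer's analysis of
        // data held in the database layer; a constant keeps the sketch self-contained.
        int todaysCaseCount = 42;

        response.setContentType("text/html");
        response.getWriter().println(
                "<html><body><p>Cases reported today: " + todaysCaseCount + "</p></body></html>");
    }
}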

3.3. Toolsets

A toolset is basically a programming language. Modern toolsets (e.g., the Microsoft .NET toolset) also include a set of pre-fab components that are commonly used to build software. These components make the difference between building a house with the old-fashioned hammer-and-nails approach and framing a house with a high-performance pneumatic nailer or preassembled walls. The components in a toolset vary in size and complexity. Some are small and simple, such as a tool for formatting data that are to be presented to a user on a screen. Others, such as Axis, which creates Web services, are quite large and complex.
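As an example of a small pre-fab component, the snippet below uses formatting classes that ship with the standard Java toolset to prepare a date and a rate for display on a screen; the values themselves are invented for illustration.

import java.text.NumberFormat;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DisplayFormatting {
    public static void main(String[] args) {
        // Pre-fab formatting components supplied by the toolset, not written by hand.
        DateTimeFormatter dateFormat = DateTimeFormatter.ofPattern("MMM d, yyyy");
        NumberFormat percentFormat = NumberFormat.getPercentInstance();
        percentFormat.setMaximumFractionDigits(1);

        LocalDate reportDate = LocalDate.of(2006, 3, 15);  // illustrative value
        double positivityRate = 0.0372;                    // illustrative value

        System.out.println("Report date: " + dateFormat.format(reportDate));
        System.out.println("Positivity rate: " + percentFormat.format(positivityRate));
    }
}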

The choice of toolset affects the cost and effort to build software initially and to add functionality or customize it later. The most important factors in choosing a toolset (or in evaluating the toolset being used by a provider of software that you are considering) are toolset popularity, experience, and productivity.

Toolset popularity is the number of software developers using the toolset. Popular toolsets often have articles, code examples, and complete solutions to common problems freely available on the Internet. This free support improves productivity. Toolsets used in the 1990s for constructing software systems are unpopular enough today that it is difficult to build or maintain a product based on them; these toolsets include FoxPro, Delphi, Progress, and PowerBuilder. The most popular toolsets at present are Java and Microsoft .NET.

Toolset experience is the measure of the years of experience that a particular software developer has using the toolset. Software developers with many years' experience using a particular toolset are significantly more efficient than developers who are new to it. The Microsoft .NET toolset faced this problem when it was initially released: few developers had experience with it.

Toolset productivity is the final factor to consider. Toolset productivity is the measure of how efficiently programmers can accomplish a programming task using the toolset. Research on software languages has demonstrated that programmers constructing software systems are much less productive using languages such as C and C++ than they are using Java or Microsoft .NET.

Current toolsets for constructing Web-based applications are LAMP, Java, and Microsoft .NET. Desktop applications are being constructed using Java or Microsoft .NET.

3.4. Open Source Versus Proprietary Software

Open source is a movement in the software community that has been growing for the past 20 years. The fundamental idea of open source is that software—and its source code—should be freely available.

The obvious benefit of open source to the biosurveillance community is free software. A less obvious—but perhaps more important—benefit derives from the availability of source code, which allows organizations to more easily tailor the software to meet their needs.

If a biosurveillance organization has access to the source code, it can modify the software to meet its needs. An organization with a need not anticipated by the software designer, say incorporating a new algorithm such as BARD, can modify the code so that the system can interact with BARD. Or it can ask or contract with another organization (such as BARD's developers) to do the work. In contrast, if a biosurveillance organization does not have access to the source code, it is at the mercy of the organization that owns the source code, whose development priorities and cost structures may not be compatible with the organization's schedule or budget. Even worse, the organization that developed the software may no longer exist.
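As a sketch of the kind of modification that source-code access makes possible, the hypothetical Java interface below shows one way a system could expose an extension point so that a new detection algorithm (such as BARD) could be plugged in; the names and structure are invented for illustration and do not describe BARD's actual interface.

import java.util.List;

// Hypothetical extension point; an organization with access to the source code could
// add such an interface and route surveillance data to any new algorithm behind it.
interface DetectionAlgorithm {
    // Returns a score indicating how anomalous the supplied daily counts are.
    double score(List<Integer> dailyCounts);
}

// Illustrative adapter that would wrap a call to an external algorithm such as BARD.
class BardAdapter implements DetectionAlgorithm {
    @Override
    public double score(List<Integer> dailyCounts) {
        // A real adapter would pass the counts to the external algorithm and return its
        // result; a placeholder value keeps this sketch self-contained.
        return 0.0;
    }
}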

The benefits of open source are so compelling that it is an emerging criterion for the selection of software in any industry.
