
Guidelines

Guideline One

A site with a database management system needs a Database Administrator (DBA).

The administrator is critical in any large database installation. The DBA checks performance statistics and memory usage, performs general maintenance and backup for the database, initiates traces, executes database utilities, sets buffer sizes, determines how often buffers are flushed, and, in general, understands how database settings and events affect overall performance.

The DBA also performs periodic tuning of the database, including:

* using monitoring tools

* allocating table spaces

* examining output of the query optimizer to ensure that the most efficient paths are being used

* updating table statistics

If a site experiences a performance problem, this usually means there is a bottleneck somewhere in the system. The DBA can help to isolate the bottleneck. Once the bottleneck is identified, the site can determine whether to apply more resources or correct the situation that is stressing the existing resources.

Guideline Two

Some COBOL operations are particularly stressful to a database. The more an application uses these operations, the more likely it is to slow the performance of the RDBMS.

The more you understand about your RDBMS and how it operates, the more you can help your COBOL applications to work efficiently with it.

File Input and Output

Consider these standard COBOL I/O operations:

READ
REWRITE
WRITE
DELETE
OPEN

Each has an associated cost in terms of database performance. Each asks the database to do something that takes time. So if there are I/O operations that are optional in your COBOL application, you may want to remove them.

For example, let's examine file OPENs.

Developers sometimes OPEN and CLOSE files unnecessarily, using the OPEN-CLOSE pairing as a way to demarcate each new procedure:

OPEN file-A
procedural code
CLOSE file-A

But it's important to know that file OPENs are expensive in terms of performance. If you can eliminate non-essential OPENs from portions of your application, you can often improve processing speed.
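
As an illustrative sketch (the file and paragraph names here are hypothetical), the same work can usually be done with one OPEN at program start and one CLOSE at program end:

```cobol
      * Open once at program start and close once at program end,
      * rather than wrapping each procedure in an OPEN/CLOSE pair.
       MAIN-LOGIC.
           OPEN I-O CUSTOMER-FILE
           PERFORM PROCESS-ORDERS
           PERFORM PROCESS-RETURNS
           CLOSE CUSTOMER-FILE
           STOP RUN.

       PROCESS-ORDERS.
      *    READ and REWRITE CUSTOMER-FILE here; no OPEN needed.
           CONTINUE.

       PROCESS-RETURNS.
           CONTINUE.
```

The file is opened a single time no matter how many procedures use it, so the database pays the OPEN cost once per run instead of once per procedure.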

READ operations can also affect performance. All COBOL I/O is based on key indexes. Examining the output of your query optimizer allows you to determine if the most efficient execution path is being used for READs. The execution path for a given query can change as your data changes, and as the size of the tables changes. It is also affected by database initialization parameters and any statistical information that you may have gathered on your tables. It might be helpful to know that, typically, the index path is the most efficient for Acu4GL applications.

Transactions

Large transactions are also very expensive. The main problem here is that the database will hold locks on indexes and rows throughout an entire transaction. Thus, the database is creating additional overhead for an extended period of time if the transaction is lengthy.

In addition, complex information tracking must take place to ensure that transactions can be successfully rolled back.

Often application designers decide to err on the side of safety when applying transaction management to a mature application. Which operations should be included in a single transaction? The safe approach is to group everything that is related into one transaction. But this scheme is expensive--even more so when a database is involved. The lengthier the transaction, the longer the locks are held and system resources are tied up. The extensive data verification in COBOL programs only prolongs this.

If performance is an issue, give some thought to dividing each transaction into smaller and more efficient subgroups.
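
For example (record and paragraph names hypothetical), a transaction that spans an entire batch can often be split into one transaction per logical unit of work, using the COBOL transaction verbs, so locks are held only briefly:

```cobol
      * One transaction per order rather than one transaction
      * around the whole batch: the locks on each order's rows
      * and indexes are released at each COMMIT.
       PERFORM UNTIL ORDER-EOF
           START TRANSACTION
           PERFORM UPDATE-ORDER-HEADER
           PERFORM UPDATE-ORDER-LINES
           COMMIT
           PERFORM READ-NEXT-ORDER
       END-PERFORM
```

Each order still commits or rolls back as a unit, but the database never holds locks for longer than one order's worth of work.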

Tables with multiple indexes

If you use tables with multiple indexes, keep in mind that when a record is updated, locks are placed on all of the indexes, and each index is rewritten during the course of the update. This is a costly process. There may be multiple columns per index, and multiple indexes per table. Each rewrite implies a certain amount of wait time.

There are two things you can do in this circumstance:

a. Restructure your data

b. Use the Vision file system for some of your data

Restructuring the data

The benefits of data restructuring may be significant.

For example, if you have any situations in which two indexes start out with the same columns, you may be able to improve performance appreciably by changing your data definition.

Suppose two indexes both start with MONTH, DAY, YEAR. These identical first three columns can cause the RDBMS's query optimizer to choose the wrong index, in which case you will generate a significant amount of unnecessary and unproductive overhead. Restructuring one of the indexes can make a difference.
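
Because Acu4GL derives the table's indexes from the COBOL key definitions, the restructuring can be done in the file's SELECT clause. As a sketch using the ACUCOBOL-GT split-key syntax (the file and field names are hypothetical), giving the alternate key a different leading column removes the ambiguity:

```cobol
       SELECT SALES-FILE
           ASSIGN TO "sales"
           ORGANIZATION IS INDEXED
      *    Primary key leads with MONTH, DAY, YEAR.
           RECORD KEY IS SALES-KEY
               = S-MONTH, S-DAY, S-YEAR, S-ORDER-NO
      *    Instead of repeating MONTH, DAY, YEAR at the front of
      *    the alternate key, lead it with a different column so
      *    the query optimizer can tell the two indexes apart.
           ALTERNATE RECORD KEY IS REGION-KEY
               = S-REGION, S-MONTH, S-DAY, S-YEAR.
```

With distinct leading columns, the optimizer has a clear basis for choosing between the two indexes.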

Using Vision files

If you cannot restructure your data but are finding certain operations to be too expensive, you might want to consider moving some data into the Vision indexed file system.

Guiding the data searches

You can guide the data searches that result from database queries, and thus improve performance, by making use of an external variable called a4gl_where_constraint. This process is explained in The WHERE Constraint topic.
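
A minimal sketch, assuming the variable is declared as an EXTERNAL data item (the picture size, constraint text, and file names here are hypothetical; see The WHERE Constraint topic for the exact declaration and semantics):

```cobol
       WORKING-STORAGE SECTION.
      * External variable examined by the Acu4GL runtime; its
      * contents further constrain the rows that the generated
      * database query may return.
       77  A4GL-WHERE-CONSTRAINT   PIC X(200) IS EXTERNAL.

       PROCEDURE DIVISION.
           MOVE "BALANCE > 0" TO A4GL-WHERE-CONSTRAINT
           READ CUSTOMER-FILE NEXT RECORD.
```

By narrowing the search before the database executes it, the constraint reduces the number of rows the RDBMS must examine and return.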

Guideline Three

A database can interact with your COBOL program in surprising ways. When you introduce a database to a mature COBOL application, your application may experience errors you have not seen before. Being aware of this possibility helps you to prepare your response.

For example, without the database your existing application may rarely approach the limit on the number of data files that can be open simultaneously. When you add a database, however, you significantly increase the likelihood of reaching it. This is because query processing often requires temporary tables. Also, the ORDER BY clause (used in SQL statements to sequence retrieved data) opens temporary work files. So you may find that these files cause you to reach the limit sooner. (Note that proper tuning of the query optimizer can reduce the number of temporary files required.)

When you upgrade to a new version of the RDBMS, be careful. The new software components may interact with your applications differently than their predecessors did. This is to be expected. It's important to rerun your pre-installation simulations (see Guideline Four) and determine whether your system resources are still adequate. Investigate any differences between the two simulations. You may have to make adjustments to compensate for the differences in the RDBMS versions.

If you notice a change in performance with a new release of your RDBMS, keep in mind that certain database settings can have a significant effect on speed. Fine-tuning your settings can make a difference.

Guideline Four

We cannot overemphasize the importance of planning ahead for growth. You need to be able to predict the system resources that your application will require when it reaches a full load, both in terms of users and database size.

Before you choose and install hardware, it is best to run a simulation. Your hardware vendor or RDBMS vendor can help you to simulate a large number of users on a given platform with a given RDBMS.

Setting up and running a simulation that includes your own application does cost money. But if you are moving into an installation of any size, the consequences of not knowing what to expect can be far greater than the cost of the simulation.

A potentially costly mistake is to test in your office with a small database and a small number of users. Databases respond differently to small amounts of data and few users than they do to large amounts of data and many users. Functionally, the database works differently as the load increases; you might think of this adjustment as switching into a different gear. Significant changes in the load on your database can lead to large increases in overhead. The behaviors and loads you will encounter as the database grows cannot be determined from a linear projection based on the smaller scenario.