Database Transition

We’ve been conducting an ongoing survey of associations’ IT issues and priorities, and it’s not surprising that the database comes out as the number one area in need of attention. This seems to be independent of the state of maturity each organisation has reached. In other words, the harder their database is working for them, the further they want to go with it.

At the low-maturity end we typically find multiple Access databases – the result of a piecemeal approach, a lack of database design experience and a limited budget. At the next level of maturity these have been consolidated and rationalised, but they are still imprisoned within the office. The next step is to make the data web-accessible, and that means moving it to a server configuration in a data centre.

The generic problem (emphasised by recent research) is that many associations have traditionally run their membership database on Access. However, they need much of the same data on their website. Typically this creates a need to synchronise multiple databases, with all the pain that involves.

In a recent project we looked at one association’s situation and, ideally, would have liked to wipe the slate clean and start again. However, we wanted to examine an alternative minimal-change approach that would be less expensive and less disruptive.

We decided to leave Access on the desktop as an administration and reporting tool while remoting the data itself to the (open source) web platform that most associations use. This makes the data available online (e.g. membership directory, bookings, renewals etc). It also increases security and allows for administration away from the office.

This is made possible by the availability of ODBC drivers for most common SQL databases, coupled with the (good but incomplete) standardisation of SQL. The connection between the two uses a dedicated internet port, with security provided by means of a password and, optionally, an SSL-encrypted tunnel.
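
As a minimal sketch of such a connection (the server name, credentials and table here are placeholders, and we assume the MySQL ODBC driver is installed), a client can talk to the remote database like this:

    import pyodbc

    # Hypothetical connection details -- substitute your own server,
    # credentials and database name. MySQL listens on port 3306 by
    # default; the password (plus an optional SSL tunnel) secures it.
    conn = pyodbc.connect(
        "DRIVER={MySQL ODBC 8.0 Unicode Driver};"
        "SERVER=db.example.org;"
        "PORT=3306;"
        "DATABASE=membership;"
        "UID=assoc_admin;"
        "PWD=secret;"
    )

    cursor = conn.cursor()
    cursor.execute("SELECT member_id, name FROM members LIMIT 5")
    for row in cursor.fetchall():
        print(row.member_id, row.name)

    conn.close()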

Access includes an export facility. Once the necessary tables have been exported to the remote database (MySQL in the case in hand), a few adjustments to the field definitions are needed. For example, an AutoNumber field in Access maps to an auto-incrementing integer key in MySQL.
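
As an illustration (reusing the cursor from the connection sketch above, against a hypothetical members table), the AutoNumber adjustment might look like this:

    # An Access AutoNumber column typically arrives in MySQL as a plain
    # integer; restore its behaviour by making it an auto-incrementing
    # primary key. Table and column names here are hypothetical.
    cursor.execute("ALTER TABLE members ADD PRIMARY KEY (member_id)")
    cursor.execute(
        "ALTER TABLE members MODIFY member_id INT NOT NULL AUTO_INCREMENT"
    )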

Once the data is on the remote server, it should be backed up and then deleted from the desktop. The tables can then be reconnected to the application using Access’ “linked table” feature.
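
Before deleting anything, it is worth confirming that the copy arrived intact. A minimal sketch of such a check (paths, driver names and table names are placeholders; the Access ODBC driver is Windows-only):

    import pyodbc

    # Compare row counts between the local Access file and the remote
    # MySQL database before removing the local copy.
    local = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\data\membership.accdb;"
    )
    remote = pyodbc.connect(
        "DRIVER={MySQL ODBC 8.0 Unicode Driver};SERVER=db.example.org;"
        "DATABASE=membership;UID=assoc_admin;PWD=secret;"
    )

    for table in ("members", "renewals", "events"):
        n_local = local.cursor().execute(
            f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        n_remote = remote.cursor().execute(
            f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        assert n_local == n_remote, f"{table}: {n_local} vs {n_remote}"

    print("Row counts match; safe to remove the local tables.")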

The biggest problem with this is that Access does some automagical stuff when forms use linked tables. It appears that ordinary queries tend to result in data buffering, which sometimes makes searches very slow. Access also supports “pass-through” queries and, although these overcome the buffering problem, they are read-only. We were not able to get around this limitation, although we suspect that a skilled Access hacker could probably script around it. The easier solution is to redesign the form(s) that do updates to use the linked tables directly.

While looking into this we built web interfaces to the application, and this is now the way the organisation uses its database. It’s very fast and works from any location. We’ve extended it with new functions, such as event booking, that allow direct member input. Access is still on hand for ad-hoc reporting as needed.
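
We won’t reproduce those interfaces here, but as a rough sketch of the idea (Flask is our stand-in choice of web framework; the endpoint, table and credentials are hypothetical), a web membership directory can be a thin layer over the same database:

    from flask import Flask, jsonify
    import pyodbc

    app = Flask(__name__)

    CONN_STR = (  # placeholder credentials, as in the earlier sketches
        "DRIVER={MySQL ODBC 8.0 Unicode Driver};SERVER=db.example.org;"
        "DATABASE=membership;UID=assoc_web;PWD=secret;"
    )

    @app.route("/members")
    def member_directory():
        # Read straight from the shared database, so the website and
        # the Access front end always see the same data.
        cursor = pyodbc.connect(CONN_STR).cursor()
        cursor.execute(
            "SELECT member_id, name, town FROM members ORDER BY name")
        return jsonify([
            {"id": r.member_id, "name": r.name, "town": r.town}
            for r in cursor.fetchall()
        ])

    if __name__ == "__main__":
        app.run()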

Finally, there is the matter of data integrity now that the database is multi-user. Concurrent updates can be controlled via a locking table, or via columns that signal to the applications that an update is already in progress, so that no two updates run against the same record at once. Locking can be approached in either a pessimistic fashion (lock the record before reading it) or an optimistic one (detect a conflicting change at write time).
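
As a minimal sketch of the optimistic approach (assuming a version column has been added to the hypothetical members table), each update succeeds only if the record hasn’t changed since it was read:

    def update_member_email(conn, member_id, new_email):
        # Optimistic locking: note the record's version when reading,
        # then make the update conditional on it being unchanged.
        cursor = conn.cursor()
        cursor.execute(
            "SELECT version FROM members WHERE member_id = ?", member_id)
        version = cursor.fetchone()[0]

        cursor.execute(
            "UPDATE members SET email = ?, version = version + 1 "
            "WHERE member_id = ? AND version = ?",
            new_email, member_id, version)
        conn.commit()

        # rowcount == 0 means another user got there first; the caller
        # should re-read the record and retry, or report a conflict.
        return cursor.rowcount == 1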

A writeup is available on our website (draft at time of writing).