Sunday, February 25, 2007

Project estimates - understanding the developer's reticence

Perhaps nothing is more frustrating for a developer than being asked to implement functionality that was architected in a context about which he or she has very little knowledge. Compounding the frustration, technology evolves so rapidly that knowledge and skills gained on an older version of an application are no longer relevant or meaningful in a newer one. When the rubber hits the road and it is time to implement, the timelines on a project plan that were once estimates suddenly become deadlines. Quite often the developer furnished those estimates based on experience with the older version.

There are good reasons for architects and infrastructure personnel to acquire the latest version of a software product despite the angst of their developers. These include access to the features available in the newer version, improved licensing agreements, better price-performance ratios and reduced support overhead. This situation played out once again at a recent client engagement, where the specific technology involved was Microsoft's SQL Server database. In this article I traverse the roadmap of the SQL Server product and Microsoft's database interfaces to elucidate the complexities that a software architect must keep in mind while setting expectations among his or her stakeholders.

A Brief History of SQL Server
SQL Server version 1.0 was introduced to the market through an alliance between Ashton-Tate, Sybase and Microsoft. At its core it was built on Sybase relational technology. At the time, the only way a developer could interface with the database was through a library called DBLib, which used a protocol called the Tabular Data Stream (TDS) to communicate with the database. Like many applications of its time, it suffered from poor error-checking and unexplained crashes.

To further its market penetration, Microsoft decided to couple SQL Server tightly with its Windows NT operating system, and to facilitate this it began to rewrite the database engine core. By 1995, SQL Server version 6.0 had been built and the partnership with Sybase had come to an end. Subsequently, Microsoft went on to release additional versions. Version 6.0 centralized the administrative capabilities of SQL Server and therefore found new converts among the support and administrative staff of many of its customers. By version 7.0, however, Microsoft had introduced its own architectural framework and eliminated all dependence on the original database engine core. The 2000 version came with additional features, including data warehousing support and Data Transformation Services (DTS), which enabled SQL Server to handle large-scale data movement of the kind its older BulkCopy facility could only partially support. The latest version, SQL Server 2005, adds Reporting Services, which eliminates the need for third-party reporting tools, and Integration Services, which improves on DTS's performance.

A Brief History of SQL Server Connectivity
To improve on the value delivered by its front-end technologies, Microsoft built a number of more mature interface technologies. These included the well-known Open Database Connectivity (ODBC) interface, the Joint Engine Technology (JET) and Object Linking and Embedding for databases (OLE DB). These interfaces allowed the developer to connect to multiple databases. ODBC provided an abstraction layer that mapped a common API onto the native API of each database. It received a favorable reception in the developer community and is still widely used today as an integral part of the Windows operating system. The performance of these early interfaces was poor, however, and they were never able to gain real entry into the enterprise market. Several additional interfaces were built, including DAO and the COM-based ADO. Each of these also suffered from increasingly complex layered designs and object hierarchies that hurt performance and hindered comprehension of the technology.
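
The ODBC model is easiest to see in code. Below is a minimal sketch in Python using the pyodbc module, which reaches SQL Server through an ODBC driver; the driver, server and database names and the query are hypothetical placeholders for illustration, not details from the engagement described above.

# Minimal sketch: querying SQL Server over ODBC via Python's pyodbc module.
# Driver, server and database names are hypothetical placeholders.
import pyodbc

connection_string = (
    "DRIVER={SQL Server};"       # the ODBC driver translates calls to the native API
    "SERVER=myserver;"           # hypothetical server name
    "DATABASE=mydb;"             # hypothetical database name
    "Trusted_Connection=yes;"    # Windows integrated authentication
)

conn = pyodbc.connect(connection_string)
cursor = conn.cursor()

# The same code works against any database that ships an ODBC driver;
# only the connection string needs to change.
cursor.execute("SELECT name FROM sys.databases")
for row in cursor.fetchall():
    print(row.name)

conn.close()

The abstraction here is exactly what the paragraph above describes: the application sees one common API, and the ODBC driver manager routes its calls to whichever native database library is configured.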

By the mid-to-late nineties, it became obvious that a whole new approach to database connectivity was needed. The spread of the Internet, the adoption of XML as the descriptive language of choice, and the need to create the next generation of COM led Microsoft to build a new database connection layer as part of its initiative to create the .NET environment. It traded the obscure component object models and layered complexity of its predecessors for speed and a simpler programming model. This interface was named ADO.NET. While the nomenclature was chosen to provide continuity of branding, it confused developers as much as it helped.

The Consequences of Change
Each of these evolutions improved performance and eliminated shortcomings of prior products. Accordingly, they were aggressively marketed by the vendor and eagerly acquired by its customers. Developers, however, were faced with the task of learning the new features while delivering to estimates they had made using their experience on prior projects. Inevitably, most products go through major overhauls at some point in their life cycle, and at each of these stages the underlying architecture, user interface and semantics change markedly. As a result, most development efforts require at least some research. An architect must take these overhauls into consideration and communicate their implications to the project manager, who might otherwise be unaware of the magnitude of risk a project faces on account of the use of a newer version.
