Saturday, February 6, 2016

2.2 Welcome to the jungle: the situation today

Let’s have a look at today’s system landscapes. Since the end of the mainframe era in the late 1980s, enterprise applications have been built in a client-server architecture. In its simplest configuration, two powerful computers share the job: one is responsible for the process logic and the user interaction, the other for storing the data in a secure and consistent way. We find a large variety of technologies, from hardware and operating systems to software. To simplify the picture, let’s assume for now that the application is written in Java and runs on a relational database system.
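To make that split concrete, here is a minimal sketch of such a two-tier setup, assuming the Java application talks to the database over JDBC; the connection URL, credentials, table and column names are invented for the example.

// Minimal sketch of the classic client-server split: the application server
// runs the process logic, the database server keeps the data. All names in
// the connection URL and the SQL statement are purely illustrative.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OrderClient {
    public static void main(String[] args) throws Exception {
        // The application tier holds the process logic and user interaction ...
        try (Connection db = DriverManager.getConnection(
                "jdbc:postgresql://db-host:5432/erp", "appuser", "secret")) {
            // ... while the database tier stores the data securely and consistently.
            try (PreparedStatement stmt = db.prepareStatement(
                    "SELECT id, amount FROM orders WHERE customer_id = ?")) {
                stmt.setLong(1, 4711L);
                try (ResultSet rs = stmt.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getLong("id") + " " + rs.getBigDecimal("amount"));
                    }
                }
            }
        }
    }
}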


At the beginning of this IT era, processor speed, the amount of available main memory, and the hard disk capacity for the database were the bottlenecks that limited the overall performance of the system. This picture changed in the 1990s, when disk space became cheaper and processors faster. We were able to store much more data, but now I/O speed became the bottleneck, as everything was still stored on slow magnetic hard disks. So users started to work around this problem by separating applications. At first, transactional systems were separated from analytical systems, and the latter received their own name: “data warehouse”. Then we separated within the transactional systems, then within the reporting systems, and so on. We kept adding more hardware, we did intelligent data modeling and indexing. In the end, our system landscape looked like this:


We have a lot of separate systems running on isolated data silos. The icons with the blue gear wheels represent dedicated reporting systems. As many of the systems need identical data, we copy it around in nightly batch jobs (indicated by the arrows). This leads to huge redundant data volumes and inconsistent data. All of this is extremely expensive, but after all it is still just technology.
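To illustrate what those arrows stand for, here is a hedged sketch of such a nightly batch job, reduced to its essence: read yesterday’s rows from the transactional system and insert a copy into a reporting silo. Connection URLs, credentials and table names are invented, and real landscapes typically use dedicated ETL tools rather than hand-written JDBC code.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class NightlyCopyJob {
    public static void main(String[] args) throws Exception {
        try (Connection source = DriverManager.getConnection(
                 "jdbc:postgresql://erp-host:5432/erp", "etl", "secret");
             Connection target = DriverManager.getConnection(
                 "jdbc:postgresql://dwh-host:5432/dwh", "etl", "secret");
             // Read everything created since yesterday from the transactional system ...
             PreparedStatement read = source.prepareStatement(
                 "SELECT id, customer_id, amount, created_at FROM orders "
                 + "WHERE created_at >= CURRENT_DATE - 1");
             // ... and write a copy of it into the reporting system.
             PreparedStatement write = target.prepareStatement(
                 "INSERT INTO orders_copy (id, customer_id, amount, created_at) "
                 + "VALUES (?, ?, ?, ?)")) {
            try (ResultSet rs = read.executeQuery()) {
                while (rs.next()) {
                    write.setLong(1, rs.getLong("id"));
                    write.setLong(2, rs.getLong("customer_id"));
                    write.setBigDecimal(3, rs.getBigDecimal("amount"));
                    write.setTimestamp(4, rs.getTimestamp("created_at"));
                    write.addBatch();
                }
            }
            write.executeBatch();
            // The same rows now live in two silos; every run adds more redundancy.
        }
    }
}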

The negative business impact of such landscapes is much more serious.
