Implementing Business Processes


Introduction to Business Process Engineering

Business Process Reengineering (BPR) began life in the early 1990s. Barothy, Peterhans, & Bauknecht (1995) give a good introduction to the state of Business Process Reengineering at the time, stating:

Over the last few years we have observed the emergence of a new field or phenomenon in MIS practice and research: Business Process Reengineering (BPR). Since the publication of Michael Hammer’s article “Reengineering Works: Don’t Automate, Obliterate” in 1990, reengineering became a new and hot “buzzword” in management. Despite a lack of clear understanding organisations of today cling to it as the ultimate panacea in order to realise major improvements in productivity, quality, time and profitability. They also see reengineering as a way to adapt their business to faster changing environment and as a new paradigm in the deployment of information technology (IT). In order to sell services, consulting companies never tire of glorifying success stories like the reengineering of Ford’s accounts payable, or Mutual Benefit Life’s insurance applications process both resulting in “order of magnitude” improvements. But companies also learn the hard way that the radical redesign of business processes, by fundamentally rethinking the way business is done, bears major risks, is a highly complex change task, and may easily end in failure.


How To Install GlassFish on Windows


1. Download GlassFish

Download the installation jar for GlassFish. I went with GlassFish V2, since it is the latest stable release, though I think this technique will work with any version listed on the download page.

2. Install it

The GlassFish V2 installation instructions show how to install it once the download is complete. The problem with the GlassFish installer is that it does not register GlassFish as a Windows service, so once you log out, the GlassFish instance will terminate.
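For reference, the install boils down to running the downloaded jar and then the bundled Ant setup script from a command prompt. Treat this as a sketch only — the jar name below is illustrative, so substitute the name of the build you actually downloaded:

rem unpack the installer (jar name is illustrative)
java -Xmx256m -jar glassfish-installer-v2ur2-b04-windows.jar

rem run the bundled Ant setup, which creates the default domain (domain1)
cd glassfish
lib\ant\bin\ant -f setup.xml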

3. Register as a Service

There are a few ways of doing this: the Hard Way, or my favourite, the Easy Way (a sketch of the manual approach follows below). Once the service is registered, GlassFish will continue to run after you log out, and it can be controlled through the Services applet in Control Panel, though continued use of asadmin is recommended.
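If you do want to go the manual route, the Hard Way amounts to wrapping the asadmin start and stop calls in the appservService.exe helper that ships in GlassFish's lib directory and registering that with sc. The following is only a sketch — the install path (C:\glassfish), the service name and the domain name are all examples, so adjust them for your machine:

rem register GlassFish as a Windows service (paths and names are illustrative)
sc create GlassFish binPath= "C:\glassfish\lib\appservService.exe \"C:\glassfish\bin\asadmin.bat start-domain domain1\" \"C:\glassfish\bin\asadmin.bat stop-domain domain1\"" start= auto DisplayName= "GlassFish V2"

Day to day you can still drive the domain directly from the command line:

asadmin start-domain domain1
asadmin stop-domain domain1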

Alternatively, use Ubuntu, where you can install GlassFish as easily as:

sudo apt-get install glassfish

Ubuntu wins again :-)

Adapting ITIL to Distributed Web Applications


Introduction to ITIL

The Information Technology Infrastructure Library (ITIL) version 1 was initially published by the Office of Government Commerce in the year 2000. ITIL is a broad framework of best practices which enterprises use to manage their IT operations. The library quickly grew to over 30 volumes, so when ITIL version 2 came to be released, a concerted effort was made to consolidate the processes it describes into logical sets. ITIL v3 continues in this vein by consolidating the material into five core titles:

  • Service Strategy
  • Service Design
  • Service Transition
  • Service Operation
  • Continual Service Improvement

Current Trends in SOA


Introducing Middleware

In many ways, humans have been integral to the operation of computer networks since the dawn of the computer age. By the early 1980s, computers had moved beyond governmental, military and research institutions, and were becoming more common among corporations.

At this stage of their evolution, computer applications were stand-alone. The finance applications of the world lived inside mainframes, and interacted with humans via green screen terminals. The order management systems of the world interacted likewise. Humans were the network, as information could only flow between systems through human intermediaries. For example, a clerk would run a report on one system and re-key the results into the other.

Humans, being intelligent, could enforce policies on the flow of information. They could decide what information was appropriate to flow between systems, how quickly it should flow, and how the flow should be achieved.

On the other hand, humans, being error-prone, caused this flow of information to be slow, laborious and costly. One day, while typing yet another report, a clerk dreamed of letting the computers talk to each other directly. Suddenly, the network revolution had begun.

While the idea was sound, the reality of facilitating this was difficult. Henning (2006) notes that “persuading programs on different machines to talk to each other was a nightmare, especially if different hardware, operating systems, and programming languages were involved: programmers either used sockets and wrote an entire protocol stack themselves or their programs didn’t talk at all”. What was needed was some sort of automated intermediary between systems.
