Lloyds TSB, one of the UK’s big four banks, has simplified the array of systems feeding into its business intelligence programs to make sure it gets timely and accurate information ready for analysis.

The bank said it had benefited from taking steps to ensure a more efficient data integration system and a better flow of data into the key warehouses that formed its business intelligence repositories, and advised other companies to consider a similar strategy.

“Typical approaches to data transfer involve reusing data extracts and sending more data than is necessary, including duplicated data, to the data warehouse,” said chief technology officer Chris Nottage, speaking at Butler Group’s Business Intelligence Symposium in London.

“It is more efficient to simplify the process with proper data integration and avoid waiting for files to be copied,” he continued. The data integration program should send the data once to the enterprise resource planning system and a central data warehouse, rather than to an array of business intelligence repositories such as operational data stores (ODS) and other systems.
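The approach described, extracting data once and routing it only to agreed central targets rather than duplicating extracts per reporting system, can be sketched roughly as follows. This is a minimal illustration; all names (`extract_customers`, the target list, and so on) are hypothetical, not Lloyds TSB's actual systems.

```python
def extract_customers():
    """Single extract from the source system (illustrative sample data)."""
    return [
        {"id": 1, "name": "Alice", "balance": 1200},
        {"id": 2, "name": "Bob", "balance": 340},
    ]

def load(target, rows):
    """Stand-in loader: in practice this would write to the target store."""
    print(f"loading {len(rows)} rows into {target}")

def integrate(extract, targets):
    # Extract once, then fan the same result out to the central targets,
    # instead of re-running or copying the extract per repository.
    rows = extract()
    for target in targets:
        load(target, rows)
    return len(targets)

# Central targets per the approach described: the ERP system plus one
# central warehouse, rather than many per-department repositories.
integrate(extract_customers, ["erp", "central_warehouse"])
```

The point of the sketch is that the extract runs exactly once, so no duplicated data is sent downstream however many targets are configured.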

A good data integration tool, he explained, would present the data flow clearly in a frequently updated browser interface, and would allow access to all data sources and warehouses so that the flow could be reconfigured without specific knowledge of any individual warehouse.
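The warehouse-agnostic reconfiguration Nottage describes is, in essence, a declarative flow definition: rerouting a feed means editing configuration, not warehouse-specific code. A minimal sketch, with entirely hypothetical flow and system names:

```python
# Hypothetical declarative flow map: each feed names a source and a
# target, with no warehouse internals exposed here.
flows = {
    "transactions": {"source": "core_banking", "target": "central_warehouse"},
    "ledger":       {"source": "core_banking", "target": "erp"},
}

def reroute(flows, flow_name, new_target):
    """Point a flow at a new destination without touching loader code.

    Returns an updated copy; the original configuration is left intact.
    """
    updated = dict(flows)
    updated[flow_name] = {**flows[flow_name], "target": new_target}
    return updated

# Changing where the transactions feed lands is a one-line config change.
updated = reroute(flows, "transactions", "erp")
```

Whoever edits the flow map never needs to know how any particular warehouse loads data; that knowledge stays inside the loaders.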

Nottage warned against rushing into such a project, advising businesses instead to start by choosing an easy-to-use, robust data integration tool, and to set up the data flow into the different warehouses one by one rather than going for a “big bang” approach.

Like a number of other speakers at the event, Nottage also questioned the necessity of real-time reporting, despite it being a buzzword in many firms. He said that, given the cost, it needed to be carefully evaluated, and that reports could still be usefully provided within a longer time frame.
