Integrate Applications Successfully: A Framework to Reduce Complexity

Given their broad functional scope, ERP/business accounting platforms sit at the very core of businesses globally. Depending on the type of business, these platforms can cover a wide range of operational and financial processes in a single application. However, for some industries, often those with highly complex or niche operational processes, integration of the core business/accounting platform with other solutions is required. As software consultants, or as system administrators within companies, we know the solution requirements and dependencies of the applications and business platforms we work with most. Yet the topic of integrations forces us to consider not only the needs of a single application, but the needs, functions and technical boundaries of others.

This article introduces a framework I use consistently to dissect all integration conversations, making it straightforward to produce integration working documents on projects and to deliver successful integration implementations. The framework is especially useful where no pre-built integration exists between applications, but it remains relevant for discussing data flows where the technical connection is already in place and integration is more of a configuration exercise.

The Framework


I break my framework down into five defined steps. Steps two to five are repeated as required for each data flow. Step one largely concerns drawing the architecture diagram(s) and making high-level design decisions; as such, it is revisited only infrequently, should a change in design or requirements be needed during the process.

Diagram

Diagramming is a stalwart of many business processes, and the value of visualising system architecture cannot be overstated. As a minimum, the primary diagram must be kept clear and uncluttered, avoiding technical language wherever possible, given the multiple stakeholders often involved in implementing and managing business platforms. Arrows between systems, bidirectional as needed, should be used to indicate data flow between applications. To add further categorisation I use three swim lanes, sketched in outline below. Front office is defined as line-of-business or customer-facing applications. Examples could include an ecommerce platform, an organisation’s self-developed proprietary platform (common in FinTech), an asset management platform (for management of large CAPEX assets) or customer support/customer success tools.

The middle office is the layer for core business accounting processes and their outputs. Central to this layer is the ERP/business platform, but other applications in use here might include invoice scanning tools (OCR applications) or reconciliation tools.

Finally, back office is largely synonymous with the “reporting layer”. I often find, however, that it is important to distinguish clearly between business intelligence tools, which are by nature current and historical reporting toolsets, and EPM (Enterprise Performance Management) tools, which are geared towards forward modelling and forecasting while still covering some historical analysis and trending.
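
To make the swim lanes concrete, here is a minimal sketch of the layered architecture expressed as data, in Python; every application name is hypothetical and used purely for illustration.

    # Illustrative model of the three swim lanes and the data flows
    # between applications. All application names are made up.

    SWIM_LANES = {
        "front_office":  ["Shopfront (ecommerce)", "SupportDesk (customer success)"],
        "middle_office": ["CoreERP (ERP/business platform)", "ScanDocs (invoice OCR)"],
        "back_office":   ["InsightBI (business intelligence)", "PlanEPM (forecasting)"],
    }

    # Each flow is (source, target, payload); bidirectional flows appear twice.
    DATA_FLOWS = [
        ("Shopfront", "CoreERP",   "sales orders"),
        ("ScanDocs",  "CoreERP",   "supplier invoices"),
        ("CoreERP",   "InsightBI", "posted transactions"),
        ("CoreERP",   "PlanEPM",   "actuals for forecasting"),
    ]

    for source, target, payload in DATA_FLOWS:
        print(f"{source} -> {target}: {payload}")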

Define

It’s now time to define the connection type, master data and transactions which need to flow between the various applications. The connection type might differ between project phases. Secure, real-time API integration is certainly a realisable goal in many scenarios, but it must be stressed that while this level of integration brings efficiency, the same net result (delivery of data from one application to another) can often be achieved using semi-automated flat-file methods in the first instance. The true business value often lies in delivering the data at an appropriately detailed level to meet the reporting and processing needs of the receiving application. This avoids users needing to access multiple systems, which often adds to confusion and certainly adds cost in terms of software licences.
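
As a sketch of such a first-phase, flat-file approach, the scheduled export below delivers the same data an API would, just less immediately. The file name, fields and sample records are assumptions for illustration.

    import csv
    from datetime import date

    # Hypothetical first-phase integration: export the day's sales orders
    # to a flat file that the receiving platform imports on a schedule.

    orders = [
        {"order_id": "SO-1001", "customer": "ACME Ltd", "amount": 250.00},
        {"order_id": "SO-1002", "customer": "Globex",   "amount": 990.50},
    ]

    filename = f"sales_orders_{date.today().isoformat()}.csv"
    with open(filename, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["order_id", "customer", "amount"])
        writer.writeheader()
        writer.writerows(orders)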

The transactions that need to be shared between applications are easily identified in integration projects. At this stage, it is important not to focus on mapping or aligning the transactions between source and receiving system; that comes in the next step. Here we are defining the data that needs to integrate. We must not forget the related master data present on a transaction, and how and when it is sent to the receiving application. Should a transaction carry a new customer name, for example, a new customer record may be created in the receiving system according to system defaults. Yet it might equally be required that certain customers, as they are created, are sent to the receiving system ahead of any transactions, in order to capture other required or dependent information on the master data record itself.
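
A minimal sketch of that “master data first” rule follows; the ReceivingSystem class is a tiny in-memory stand-in for whatever API the real platform exposes.

    # Sketch: ensure the customer master record exists in the receiving
    # system before posting its transactions.

    class ReceivingSystem:
        def __init__(self):
            self.customers = {}
            self.transactions = []

        def customer_exists(self, customer_id):
            return customer_id in self.customers

        def create_customer(self, customer):
            # Created ahead of transactions so dependent information
            # (payment terms, tax codes, etc.) can be captured on the record.
            self.customers[customer["id"]] = customer

        def post(self, transaction):
            self.transactions.append(transaction)

    def post_transaction(system, transaction):
        customer = transaction["customer"]
        if not system.customer_exists(customer["id"]):
            system.create_customer(customer)  # master data flows first
        system.post(transaction)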

Align

The easiest integration scenarios are those where transaction types in the source and receiving systems align. In this step we check for transaction type homogeneity, where possible, between both applications. For example, consider the requirement to integrate data between a best-of-breed Professional Services Automation (PSA) tool and an accounting platform with project functionality. Between the PSA system and the accounting platform we can align Projects, Project Tasks, Resources, Invoices and likely Journal Entries (revenue recognition/cost of sales), among other transaction types. Save perhaps for small discrepancies in available fields between source and receiving system, we can largely consider this a 1:1 integration of the transactions.
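
In practice, the output of this step can be as simple as a record-type map. The type names below are assumed for illustration and will vary by product.

    # Illustrative 1:1 mapping of PSA record types to their accounting
    # platform counterparts.

    PSA_TO_ACCOUNTING = {
        "Project":     "Project",
        "ProjectTask": "ProjectTask",
        "Resource":    "Employee",
        "Invoice":     "CustomerInvoice",
        "RevRecEntry": "JournalEntry",  # revenue recognition / cost of sales
    }

    def target_type(psa_type):
        try:
            return PSA_TO_ACCOUNTING[psa_type]
        except KeyError:
            # Non-homogeneous types need transformation (see the next section).
            raise ValueError(f"No direct mapping for {psa_type}")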

It is equally important to check transaction dependencies in both source and receiving systems, in particular the automatic actions triggered once a transaction or piece of data has been received. What related transactions or records has the receiving system created which might not be updated (depending on application functionality) if we push an update to the previously integrated transaction? Usually these dependencies are catered for through process change and/or increased rigour.

For example, consider the integration of a timesheet into an accounting platform where customer invoices are created and cost of sales journals posted from it. If a timesheet update were pushed through (perhaps the consultant made incorrect entries), the invoices could already have been issued to the customer, or the journals already posted with month-end activities nearing completion. It is therefore important to consider both process and functional dependencies in order to design a future-proof integration.
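
One way to cater for this, sketched below with assumed statuses and record shapes, is to refuse silent updates once downstream documents exist and to route them for review instead.

    # Sketch: decide whether a timesheet update can be applied
    # automatically, given the downstream records the receiving system
    # created from the original timesheet.

    def can_auto_update(invoice, journal):
        """Return (allowed, reason); invoice/journal may be None."""
        if invoice is not None and invoice["status"] == "ISSUED":
            # The customer already holds the invoice; correcting it is a
            # credit-note-and-reissue process decision, not a silent update.
            return False, "invoice already issued"
        if journal is not None and journal["period_closed"]:
            return False, "posting period closed"
        return True, "ok"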

Finally, we must note any transaction types or data that aren’t homogeneous between source and receiving system. Handling these transactions is covered in the next section.

Aggregation, Transformation and Validation 

Here we address the volumes, transformation logic and error routines needed to deliver a successful integration. Following the previous section, it might be the case that all transaction types and master data align between sending and receiving systems. However, real-time or identically mapped data may not be needed in the receiving application, or delivering it might demand database performance that comes with prohibitive commercial costs. We could therefore consider integration by aggregation, such as a daily or weekly aggregated feed. The level of aggregation is largely defined by the reporting frequency required in the receiving system: is a daily feed necessary when a weekly reporting model is followed?
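
A minimal sketch of aggregation, with illustrative field names, rolling line-level transactions up into one value per day and general ledger account:

    from collections import defaultdict

    # Sketch: summarise line-level transactions before sending, rather
    # than integrating every line in real time.

    transactions = [
        {"date": "2024-03-01", "gl_account": "4000", "amount": 120.00},
        {"date": "2024-03-01", "gl_account": "4000", "amount": 80.00},
        {"date": "2024-03-01", "gl_account": "5000", "amount": 45.50},
    ]

    daily_totals = defaultdict(float)
    for txn in transactions:
        daily_totals[(txn["date"], txn["gl_account"])] += txn["amount"]

    for (day, account), total in daily_totals.items():
        print(f"{day} GL {account}: {total:.2f}")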

In many instances the transaction or data types don’t align between source systems and receiving accounting platforms. Thus one of the most common topic areas and terms in integration conversations is “ETL”: Extract, Transform, Load. This covers planning for the extraction of data, its transformation into an appropriate form, and its loading into the receiving system as planned. ETL is often required when the source system is not “accounting aware”: it produces the data needed for accounting, but not necessarily in a format ready for an accounting platform. Perhaps, for example, a combination of two pieces of data defines the general ledger code for the underlying transaction. Equally, for reporting purposes, perhaps the combination of customer address and sales unit defines the region the transaction should be associated with. We therefore need to ensure that this transformation logic is coded into the integration.
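
A sketch of that transformation logic, with the rule tables and field names assumed purely for illustration:

    # Sketch: derive accounting attributes from combinations of source
    # fields during the "transform" stage.

    GL_CODE_RULES = {
        # (item type, sales channel) -> general ledger code
        ("subscription", "online"): "4000",
        ("subscription", "direct"): "4010",
        ("services",     "direct"): "4100",
    }

    REGION_RULES = {
        # (country from customer address, sales unit) -> reporting region
        ("GB", "Corporate"): "EMEA",
        ("US", "Corporate"): "AMER",
    }

    def transform(row):
        return {
            "amount":  row["amount"],
            "gl_code": GL_CODE_RULES[(row["item_type"], row["channel"])],
            "region":  REGION_RULES[(row["country"], row["sales_unit"])],
        }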

Where there are complex ETL requirements, middleware technology is often used as a toolset to configure and manage integrations. Decoupled from the dependencies of the sending and receiving applications, middleware can assist greatly in graphically configuring transformation logic. Different types of connection points can be modelled for source and receiving applications: for example, direct database extraction of data, followed by a transformation routine, with the result loaded into an accounting platform using a journal entry API.
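
What the middleware models graphically amounts to the chain sketched below; the connectors here are in-memory stand-ins, not real product APIs.

    # Sketch: a decoupled extract -> transform -> load pipeline.

    def run_pipeline(extract, transform, load):
        for row in extract():
            load(transform(row))

    # Example wiring with stand-in connectors:
    rows = [{"id": 1, "amount": 100.0}]
    run_pipeline(
        extract=lambda: rows,                                        # e.g. direct DB extraction
        transform=lambda r: {"account": "4000", "credit": r["amount"]},
        load=lambda entry: print("POST journal entry:", entry),      # e.g. journal entry API
    )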

Frequency and Error Handling

Finally, having considered all of the above, two key elements remain. Integration frequency must be considered with respect to the other activities and system load occurring in the receiving system as the integration runs. For example, if a daily feed of 1:1 transactions is needed, it might seem logical, given the volume of data, to start the integration after business hours in, say, the HQ location. However, other offices might be transacting in other time zones around the world, and a large data load could slow the system down for others running reports, entering transactions and so on. It might therefore be prudent to split the integration into three or four intraday runs rather than one single large load.
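
As a sketch of that scheduling decision (the run times are illustrative assumptions):

    from datetime import datetime, timezone

    # Sketch: rather than one large nightly load, run the same integration
    # at several quieter points in the day.

    RUN_TIMES_UTC = ("05:00", "11:00", "17:00", "23:00")

    def integration_due(now=None):
        now = now or datetime.now(timezone.utc)
        return now.strftime("%H:%M") in RUN_TIMES_UTC

    # A scheduler (cron, middleware, etc.) would invoke the integration at
    # each listed time, sending only what has accumulated since the last run.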

This article has thus far assumed full acceptance of data from one system to another, but what happens if something does not go to plan? It is important to have error handling routines. In middleware applications, error handling is usually in-built, configurable functionality; in hand-built integrations, it must be coded.

How do we handle the routine if a transaction has, for instance, a 90% match with the required fields, but one field has missing data? Should we allow that transaction to be posted, or should it error? Or should we accept the transaction with ‘9999’, or another conspicuous value, in the missing field and allow the missing data to be updated later? All these areas of validation must be considered and documented to ensure a successful integration.
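
A minimal sketch of one such routine, assuming a required-field list and the conspicuous ‘9999’ placeholder policy described above:

    # Sketch of a validation routine: reject outright, or accept with a
    # conspicuous placeholder and report the gap for later correction.

    REQUIRED_FIELDS = ("customer_id", "amount", "gl_code", "department")
    PLACEHOLDER = "9999"

    def validate(transaction, allow_placeholder=True):
        missing = [f for f in REQUIRED_FIELDS if not transaction.get(f)]
        if not missing:
            return transaction, []
        if not allow_placeholder:
            raise ValueError(f"Rejected: missing {missing}")
        for field in missing:
            # Conspicuous value keeps the gap visible and updatable later.
            transaction[field] = PLACEHOLDER
        return transaction, missing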

Conclusion

Every situation and integration scenario is different and must be considered with regard to business value and need. Two different companies, delivering functionally the same integration, could arrive at designs that look quite different, and each could rightly be viewed as a successfully integrated solution. What is important, however, is that the topic of integration is neither awkwardly avoided nor planned in an oversimplified fashion. It must be considered logically, and I have found that this framework, for me, does just that.