
While there are many challenges young companies might struggle with, they have certainly escaped one that is a blessing and a curse at the same time – legacy IT systems. Data is one of a company’s most valued assets, as knowledge (read: ‘data’) empowers better, more informed business decisions. Moreover, chances are your organization already has most of the knowledge it needs. The only caveat is that it might be inaccessible and therefore pretty useless.

In fact, according to various estimates, up to 97% of collected data is stored away and never used again, delivering zero value to the organization. Furthermore, a large portion of it comprises so-called ‘dark data’ – data that is redundant, faulty, or too old to hold any value, or that has simply been forgotten.

And – here’s a surprising twist – this has an unpleasant effect on our climate. According to a recently published study by the American company Veritas Technologies, if we continue collecting and archiving all this ‘data waste’, we’ll end up with 5.8 million tons of CO2 being pumped into the atmosphere this year alone.

So, the main question is: how can we ‘reactivate’ legacy systems and, in doing so, not only put all the available data to use but also do something good for our climate?

How Did We Get Here?

Most organizations are fully aware that ‘oldie’ is not always ‘goldie’. In today’s world, companies increasingly rely on digital technologies to develop new products, increase operational efficiency, reduce costs, and – most importantly – ensure that customers have a satisfying experience.

So how come many companies still end up with sometimes decades-old systems that either run in parallel or are simply redundant, driving up maintenance costs and potentially posing security risks and compliance issues?

Sometimes, a not entirely thought-through data migration strategy is the culprit. A company decides to replace one mission-critical piece of technology with a better one. To avoid disruption to business operations, it lets both systems run in parallel “for a limited period” until the migration is complete – only to find out that not all data can be migrated as-is due to, for example, outdated and incompatible formats. Or that the business logic of the legacy system wasn’t understood as well as had been assumed, and the cumbersome source code is the only documentation the IT team has.
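To make the format problem concrete, here is a minimal sketch of what such a migration step can look like. The record layout below is entirely hypothetical (a fixed-width customer record with a `YYYYMMDD` date), but it illustrates the typical pattern: legacy exports first have to be parsed and normalized before they can be loaded into a modern system.

```python
import json

# Hypothetical fixed-width layout of a legacy customer record:
# columns 0-9 = customer id, 10-39 = name, 40-47 = date as YYYYMMDD.
FIELDS = [("customer_id", 0, 10), ("name", 10, 40), ("created", 40, 48)]

def parse_legacy_record(line: str) -> dict:
    """Slice one fixed-width line into named fields, trimming the padding."""
    record = {name: line[start:end].strip() for name, start, end in FIELDS}
    # Normalize the legacy YYYYMMDD date to ISO 8601 (YYYY-MM-DD).
    d = record["created"]
    record["created"] = f"{d[0:4]}-{d[4:6]}-{d[6:8]}"
    return record

# Example legacy export line, built programmatically to get the padding right.
legacy_line = "0000004217" + "Jane Doe".ljust(30) + "19991231"
print(json.dumps(parse_legacy_record(legacy_line)))
# → {"customer_id": "0000004217", "name": "Jane Doe", "created": "1999-12-31"}
```

In practice, the hard part is rarely the slicing itself but discovering the layout and the implicit business rules in the first place – which is exactly why undocumented legacy systems stall migrations.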

Another reason could be that the legacy system is – in the truest sense of the word – a legacy from a merger or acquisition. It’s a common issue: an acquired company brings outdated, inefficient, or redundant systems into “the new relationship”. Typically, both sides are aware of this and try to address the issue during the pre-acquisition due diligence process. Often, however, this results in patches and workarounds that merely paper over the limitations of such systems. Consequently, instead of solving the issue efficiently and for the long term, companies end up with incompatibilities among individual layers of the technology stack.

In the end, whether due to mergers and acquisitions, unforeseen issues during software replacement, or simply as a consequence of organic growth, the result is the same. Potentially valuable data – hence, knowledge – is locked away and inaccessible for use.

Creating Value From Legacy Data – the Whys and the Hows