Bridging the past and future: The role of legacy systems in advancing AI

26-10-2023 | Decommissioning of Legacy Systems, Enterprise Legacy System Application (ELSA)

“Many industries are looking at how classical deterministic or scientific-technical high-performance computing can be used in conjunction with AI or machine learning to come up with a blended model that’s more efficient”, says Addison Snell, CEO of Intersect360 Research.

Training an AI model is distinct from ‘inference’, the process of running the trained model on new data. Generative AI in particular requires larger datasets to enhance its algorithms.
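To make the distinction concrete, here is a minimal sketch using scikit-learn (an assumption on my part; the article names no specific tooling). Fitting the model is training; calling it on unseen inputs is inference.

```python
# Minimal illustration of the training/inference split with scikit-learn.
# The dataset and model choice are illustrative assumptions, not from the article.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, y_train, X_new = X[:900], y[:900], X[900:]  # hold out unseen inputs

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)         # training: the model learns its parameters

predictions = model.predict(X_new)  # inference: the trained model labels new data
print(predictions[:5])
```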

Key insights

First: “The one big lightbulb for me was to understand that there was no off-the-shelf, one-size-fits-all solution that’s available out there today unless you’re in a fairly commodity industry”, says Rui Lopes, director of new technology assessment at Elekta, a maker of precision radiation therapy solutions.

Second: The AI InfrastructureView 2021 benchmark survey identified a lack of purpose-built infrastructure as a common cause of AI project failures. This, along with a study showing that the computing requirements for large-scale AI models doubled every 10.7 months from 2016 to 2022, underscores the accelerating demands of AI technology.
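As a back-of-the-envelope check on that growth rate (my own arithmetic, not a figure from the study): doubling every 10.7 months compounds to roughly a hundredfold increase over the 2016–2022 window.

```python
# Back-of-the-envelope check: compute demand doubling every 10.7 months.
# The six-year window (2016-2022) is taken from the study cited above.
DOUBLING_PERIOD_MONTHS = 10.7
months = 6 * 12  # 2016 to 2022

growth_factor = 2 ** (months / DOUBLING_PERIOD_MONTHS)
print(f"Growth over {months} months: ~{growth_factor:,.0f}x")
# -> roughly 106x: about two orders of magnitude in six years
```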

John-David Lovelock, a senior analyst at Gartner, notes, “In 2023 and 2024, very little IT spending will come from generative AI”. Despite this, organisations continue to invest in AI and automation to bolster operational efficiency and bridge talent gaps in IT.

When diving into AI, a significant portion of organisations (43%) prioritise the collection, curation, and cleansing of data as an initial step, according to a 2022 AI Infrastructure Alliance survey.
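For a concrete sense of what that initial step involves, here is a minimal, hypothetical cleansing sketch using pandas; the survey prescribes no tooling, and the column names and rules below are invented for illustration.

```python
# Hypothetical data-cleansing sketch with pandas; the columns and rules
# are illustrative assumptions, not taken from the survey.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "country": ["DE", "de ", "DE", None],
    "revenue": [1200.0, 950.0, 950.0, -10.0],
})

df["country"] = df["country"].str.strip().str.upper()  # normalise categorical values
df = df.drop_duplicates()              # remove duplicates revealed by normalisation
df = df.dropna(subset=["country"])     # drop rows missing key fields
df = df[df["revenue"] >= 0]            # discard implausible values
print(df)
```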

Data longevity

Different sectors demand different data retention periods. In healthcare, data might be preserved for up to a century, reflecting patient life expectancies and the long-term utility of medical records. Conversely, traditional businesses might retain data for 15 to 20 years, a horizon aligned with economic cycles and global crises.
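To illustrate how such sector-specific rules might be encoded in practice (a hypothetical sketch; the retention periods simply mirror the figures above):

```python
# Hypothetical retention-policy check; the periods mirror the figures above.
from datetime import date

RETENTION_YEARS = {
    "healthcare": 100,   # up to a century, per the discussion above
    "traditional": 20,   # 15-20 years; the upper bound is used here
}

def past_retention(created: date, sector: str, today: date | None = None) -> bool:
    """Return True if a record is older than its sector's retention period."""
    today = today or date.today()
    age_years = (today - created).days / 365.25
    return age_years > RETENTION_YEARS[sector]

print(past_retention(date(1990, 1, 1), "traditional"))  # True: ~35 years old
print(past_retention(date(1990, 1, 1), "healthcare"))   # False: within 100 years
```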

Legacy systems and data preservation

Orlando Ribas Fernandes, CEO and co-founder of Fashable, draws an analogy: “Just as word processing programs became commonplace tools to increase general productivity, AI will become a common tool for organisations to increase innovation.”

A critical concern lies in the transition of IT systems, including ERPs. Insufficient attention during this migration phase can lead to premature data loss, undermining readiness for effective AI use.

Large corporations, owing to their vast reserves of historical data, are well positioned to spearhead AI advancements. A case in point is the transition within SAP’s customer base from the ECC system to the newer S/4HANA application. Yet legacy systems, vital for housing this invaluable historical data, face threats such as technical debt and heightened security exposure, as illustrated by the recently disclosed vulnerabilities in the curl library (CVE-2023-38545 and CVE-2023-38546).
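As a generic illustration of the preservation step in such a migration (a sketch only; it does not represent SAP’s tooling or ELSA’s implementation), one common pattern is to export legacy tables to an open format with a checksum before the old system is switched off:

```python
# Generic sketch: archive a legacy table to CSV with a checksum before
# decommissioning. Database, table, and file names are all hypothetical.
import csv
import hashlib
import sqlite3

# Stand-in legacy source; in practice this would be the legacy ERP database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO invoices VALUES (?, ?)", [(1, 120.0), (2, 75.5)])

cur = conn.execute("SELECT * FROM invoices")
headers = [col[0] for col in cur.description]
rows = cur.fetchall()

with open("invoices_archive.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(headers)
    writer.writerows(rows)

# Record a checksum so the archive's integrity can be verified years later.
with open("invoices_archive.csv", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(f"Archived {len(rows)} rows, sha256={digest}")
```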

Misinterpretations of data-privacy requirements have also given rise to drastic retention practices, often resulting in unnecessary mass deletions of data.

Solutions like ELSA

ELSA by TJC offers a tailored approach to managing legacy systems, enabling secure decommissioning while safeguarding essential data and upholding data privacy standards. This positions it as a vital asset in preserving the integrity of data that is essential for future AI opportunities. For more information on ELSA, see https://www.tjc-group.com/resource/elsa-by-tjc-for-legacy-system-decommissioning/

Conclusion

Currently, AI spending focuses on AI models, but analysts forecast a remarkable surge in generative AI budgets around 2025. For this potential to be fully realised, however, the information held in legacy systems must be protected and meticulously managed. Solutions like ELSA by TJC stand out, coupling modern technologies with a flexible, purpose-built infrastructure. Such solutions are instrumental in safeguarding legacy applications, ensuring that vast reservoirs of historical data are preserved and leveraged effectively as AI capabilities evolve and expand.

Right after publishing this article, I came across this recent article from McKinsey Digital: https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/the-data-dividend-fueling-generative-ai

Point no. 4 is directly related to this post: “focus on securing the enterprise’s proprietary data and protecting personal information while actively monitoring a fluid regulatory environment”. McKinsey’s article delves further into fuelling generative AI. The bottom line stays the same: No data, no fuel. No fuel, no dividends.