Make Your Data Work for You: Why Modernizing Core Technology Architectures is Critical


In this special guest feature, Manoj Aerroju, Senior Solutions Architect for TmaxSoft, discusses the virtues of rehosting, whereby existing mainframe applications move unchanged to a modern open system, SQL-based x86 environment, or the cloud. He also provides 10 reasons for considering rehosting. Manoj has more than 15 years of programming and architectural software experience with diverse and notable companies ranging from GE Capital to BMO Harris. At TmaxSoft, Manoj has been involved in conducting application assessments, requirement analysis and product demonstrations for both Tibero, a relational database, and OpenFrame, a legacy modernization solution. He received his bachelor of engineering from Osmania University in India and his Master of Information Systems Management from Carnegie Mellon University.

To maximize big data, organizations must update their legacy architecture. Mainframe rehosting offers 10 key advantages.

For nearly a decade, IDC has been chronicling the emergence and evolution of the “Third Platform,” built on cloud, mobile, big data/analytics and social technologies. This Third Platform is poised to engulf the enterprise landscape, forever changing the way enterprises operate and reshaping the global economy: According to the research company, “By 2019, Third Platform technologies and services will drive nearly 75 percent of IT spending – growing at twice the rate of the total IT market.”

One major part of the Third Platform is big data – including structured and unstructured data, machine data, and online and mobile data – which supplements organizational data and provides the basis for both historical and predictive perspectives.

Big data isn’t important because of its volume, but rather, because of its value: It allows an organization to take data from any source and analyze it to reduce costs and time; develop new products and optimize current offerings; and make smarter decisions.

Big data will fundamentally change the ways businesses compete and operate, because those that can exploit their data to the max and derive the highest value will have a significant competitive advantage. The performance gap between companies that can generate better data and put it to work for them, and those that cannot, will only widen in the years to come.

If an organization wants to rise to the challenges of the “digital transformation economy” and optimize its ability to analyze data to make better decisions and strategic business moves, it may be time for a change. Modernizing core technologies – including the mainframe architectures that have provided the very foundation of many businesses for years – is one way to begin that transformation.

Why Modernize?

You may be reluctant to modernize, and you’re not alone. CIOs worry that any kind of change is risky (the “Why fix what’s not broken?” mentality), or that the data accumulated over decades is too intertwined with the company’s monolithic mainframe software applications.

But sooner or later, legacy technology becomes a liability. Conversion costs rise as competitors with newer tech eat away at your markets. Qualified support personnel retire or move on, and your old vendors may no longer be available. Businesses that failed to adapt are left without the needed support for their big iron and COBOL or PLI applications. The IT department is flooded with complaints from unhappy mainframe users, while upgrade costs grow out of line with available budget. Additionally, the cost of third-party software to support applications continues to escalate.

Moreover, some legacy infrastructure only runs data extracts once a day, generally at night, so a business using its data the following day is working off old data – and in the digital age, making decisions based on even day-old data can impact operations and profitability.

At some point, change is no longer an option, but an imperative. Maintaining the status quo means accepting mediocre technology, which in turn means business performance is driven by outdated and unsupported mainframe infrastructures – placing the company in a risky position that will only get worse over time.

For a company looking to optimize its infrastructure for big data (as well as cloud, mobile and social), it’s not feasible to stick with the status quo – but many companies are concerned about the risks, time and cost of a complete rewrite.

There is another option, however: rehosting, whereby existing mainframe applications move unchanged to a modern open system, SQL-based x86 environment, or the cloud.
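To make the "SQL-based environment" idea concrete, the sketch below loads fixed-width mainframe-style records into a relational table where standard SQL tooling can reach them. The record layout, field names and sample data are hypothetical, invented purely for illustration; a real migration would be driven by the application's actual copybooks.

```python
import sqlite3

# Hypothetical fixed-width layout for an account record:
# (field name, start offset, length) -- illustrative only.
LAYOUT = [("acct_id", 0, 8), ("name", 8, 20), ("balance", 28, 9)]

def parse_record(line):
    """Slice one fixed-width record into a dict of trimmed fields."""
    return {name: line[start:start + length].strip()
            for name, start, length in LAYOUT}

def load_into_sql(records):
    """Load parsed records into an in-memory SQLite table."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (acct_id TEXT, name TEXT, balance REAL)")
    for line in records:
        r = parse_record(line)
        conn.execute("INSERT INTO accounts VALUES (?, ?, ?)",
                     (r["acct_id"], r["name"], float(r["balance"])))
    return conn

# Two sample records in the hypothetical 37-byte layout
sample = [
    "00000001JOHN SMITH          000123.45",
    "00000002JANE DOE            009876.50",
]
conn = load_into_sql(sample)
total = conn.execute("SELECT SUM(balance) FROM accounts").fetchone()[0]
```

Once data lands in a SQL store, reporting and analytics tools that could never talk to a proprietary mainframe dataset can query it directly.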

Why Rehost? Top 10 Reasons

For many businesses, rehosting can be a thoroughly cost-effective way to overhaul mainframe architectures to improve big data optimization. When performed properly, rehosting provides many of the benefits of rewriting, with significantly fewer risks and costs.

It can also act as a beneficial first step to a less complicated – and therefore less risky – source code rewrite. Where mainframes lock a customer into a limited and tightly coupled architecture, the loosely coupled architecture of an open system offers dynamic scalability, workload management and agility.

Security is also improved, since existing mainframe security is maintained, and additional safeguards provided by modern SQL databases can now be employed quickly and easily. Rehosting also negates the potential risks of migrating to some newer open-source frameworks where engineers might not yet have all the technical expertise needed to ensure there are no gaps during the migration.

And to meet the new challenges of the Third Platform, modernizing architecture by rehosting allows a business to collect, integrate and distribute data; optimize data warehouse performance; and offload legacy workloads.

If you’re still not convinced, here are 10 important reasons to look closely at rehosting:

1 – It helps fund necessary innovation. Rehosting has been proven to dramatically reduce infrastructure and operating costs. These funds then can be reallocated to innovation – which may include the rewriting of legacy apps or new ways of using data.

For example, GE Capital modernized its Portfolio Management System (PMS) and the data with a complete mainframe rehosting solution that moved mainframe applications into a multi-tiered, x86 environment. The modernization took about a year to complete and yielded astonishing results.

By moving the 71 million lines-of-code system from an aging mainframe environment to a modern, open Unix environment, GE Capital cut the annual run cost for the PMS system and related applications by 66 percent; improved PMS disaster recovery time by 240 percent; and shrank the overall application footprint by 78 percent. While the cost savings were enormous, the most positive result was moving to a platform that integrated easily with the rest of the business and supported growth and innovation.

2 – It can support bimodal or two-speed IT. Bimodal IT is a Gartner concept created to help CIOs understand that their IT estate must support both traditional and agile models of IT delivery in order for them to make informed decisions about infrastructure, processes, people and tools.

Investment in legacy Unix, mainframe and other proprietary systems has dwindled, and the demand for legacy server operating environments will continue to decline. Rehosting allows a bimodal system, providing a fast, flexible foundation to quickly respond to market change and future integration requirements.

3 – It allows a business to exploit the cloud. The benefits of storing and processing data in the cloud are undeniable. Organizations of all sizes need to be able to rapidly deploy, scale up and down quickly, and align costs to their specific big data application needs.

Rehosting is critical to helping an IT organization extend its modernized apps to the cloud. Your choice of rehosting solution should offer the option of running in the cloud.

4 – It’s less risky than rewriting. Re-engineering projects can take years. Rehosting is not only faster, but it also means no changes to the underlying business logic or user interface, with no negative impact on the enterprise. It requires minimal training, and the system operates in exactly the same way.

5 – It increases uptime and reliability. Rehosting gives the ability to configure Active clustering across your infrastructure, providing the foundation for the “five nines” of availability in this “always-on” world.
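The "five nines" figure has a precise meaning: 99.999 percent availability allows only a few minutes of downtime per year. A quick calculation makes the gap between availability tiers concrete:

```python
MINUTES_PER_YEAR = 365.25 * 24 * 60  # average year, including leap years

def max_downtime_minutes(availability):
    """Maximum allowable downtime per year, in minutes, for a given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

for label, a in [("three nines", 0.999),
                 ("four nines", 0.9999),
                 ("five nines", 0.99999)]:
    print(f"{label}: {max_downtime_minutes(a):.2f} min/year")
```

Three nines permits nearly nine hours of outage a year; five nines permits roughly five minutes – which is why clustering across redundant nodes, rather than a single highly tuned machine, is the usual path to that target.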

6 – It improves performance and manageability. The ability to dynamically scale your environment based on business demand eliminates the need to always provide resources for peak processing – even though they may only be required for short periods of time – while maintaining maximum service and reliability. You pay for what you use, with no impact on the business.
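The pay-for-what-you-use argument is easy to quantify. The toy numbers below are invented for illustration: a workload that needs 10 capacity units during a four-hour nightly batch window but only 2 units the rest of the day.

```python
def peak_provisioned_cost(peak_units, hours, rate):
    """Cost when capacity is fixed at the peak level around the clock."""
    return peak_units * hours * rate

def autoscaled_cost(hourly_demand, rate):
    """Cost when capacity tracks actual hourly demand."""
    return sum(hourly_demand) * rate

# Illustrative day: batch peak of 10 units for 4 hours, 2 units otherwise
demand = [10] * 4 + [2] * 20
fixed = peak_provisioned_cost(10, 24, rate=1.0)
scaled = autoscaled_cost(demand, rate=1.0)
```

With these assumed numbers, fixed peak provisioning costs three times what demand-tracking capacity does – the kind of gap that funds the innovation described in reason 1.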

Additionally, being able to scale more easily will support the growing volumes of data that inundate businesses on a daily basis.

7 – It helps identify system inefficiencies. Through rehosting, GE Capital identified that 78 percent of its source code was unused. Rehosting allows you to review and evaluate all of your legacy code for efficiency and usage patterns.
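One common way to find unused code during an assessment is static call-graph analysis: start from the known entry points and flag any program never reached. The sketch below assumes a simplified COBOL-style `CALL 'NAME'` syntax and a hypothetical four-program inventory; real assessments must also handle dynamic calls, JCL, and CICS transactions.

```python
import re

def find_callees(source):
    """Extract program names from COBOL-style CALL 'NAME' statements."""
    return set(re.findall(r"CALL\s+'([A-Z0-9-]+)'", source))

def unreachable_programs(sources, entry_points):
    """Programs never reached from any entry point (dead-code candidates)."""
    graph = {name: find_callees(src) for name, src in sources.items()}
    seen, stack = set(), list(entry_points)
    while stack:                      # depth-first walk of the call graph
        prog = stack.pop()
        if prog in seen or prog not in graph:
            continue
        seen.add(prog)
        stack.extend(graph[prog])
    return set(graph) - seen

# Hypothetical inventory: MAIN calls SUB1, SUB1 calls SUB2; OLDRPT is orphaned
sources = {
    "MAIN":   "PROCEDURE DIVISION. CALL 'SUB1'.",
    "SUB1":   "PROCEDURE DIVISION. CALL 'SUB2'.",
    "SUB2":   "PROCEDURE DIVISION. DISPLAY 'OK'.",
    "OLDRPT": "PROCEDURE DIVISION. CALL 'SUB2'.",
}
dead = unreachable_programs(sources, entry_points=["MAIN"])
```

Here `OLDRPT` is flagged because nothing on the path from `MAIN` ever invokes it – exactly the kind of finding that shrinks an application footprint during rehosting.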

8 – It enables a business to leverage existing workforce and skillsets. Mainframe experts still exist because so many enterprises are still running mission-critical systems on mainframes. With rehosting, IT departments can leverage their existing mainframe skills as well as the skills of the open systems teams.

9 – It helps increase agility and time to market. According to CGI, apart from vendor lock-in, organizations still dependent on mainframes are confronted by four realities: slow time-to-market; aging skills pools; lack of access to best-in-class software; and high maintenance costs.

These factors mean that enterprises are facing a decreased ability to be agile and quick to market, which in the modern digital economy is fatal.

10 – It allows a business to provide the best customer experience. We live in an always-on world with consumers expecting a highly personalized customer experience, especially in the virtual realm. This is one reason that digital transformation is so important for many CIOs: Rehosting allows you to unlock the value of your mainframe apps by exposing them as web services for mobile and digital applications, transforming the customer experience.
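Exposing a rehosted transaction as a web service typically means putting a thin HTTP facade in front of it. The sketch below is a minimal, hypothetical example: `legacy_balance_lookup` stands in for a call into the rehosted program, and the WSGI app wraps it in the JSON interface a mobile app would consume.

```python
import json
from wsgiref.util import setup_testing_defaults

def legacy_balance_lookup(acct_id):
    """Stand-in for invoking the rehosted mainframe transaction."""
    # Hypothetical data; a real facade would call the rehosted program.
    return {"00000001": 123.45}.get(acct_id)

def app(environ, start_response):
    """Minimal WSGI facade exposing the legacy lookup as JSON over HTTP."""
    acct_id = environ.get("PATH_INFO", "/").rsplit("/", 1)[-1]
    balance = legacy_balance_lookup(acct_id)
    if balance is None:
        start_response("404 Not Found", [("Content-Type", "application/json")])
        return [b'{"error": "unknown account"}']
    body = json.dumps({"acct_id": acct_id, "balance": balance}).encode()
    start_response("200 OK", [("Content-Type", "application/json")])
    return [body]

# Exercise the facade in-process, without starting a server
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/accounts/00000001"
status = []
resp = app(environ, lambda s, h: status.append(s))
```

The business logic stays untouched on the rehosted side; only this thin adapter layer is new, which is what keeps the customer-facing modernization low-risk.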

Meet the Big Challenges of Big Data With Rehosting

Big data’s primary value comes from the insights a business can pull out of all those numbers, because those insights give total visibility into the business at every level. But oftentimes, it’s not possible to maximize that data’s value on legacy infrastructure. Unless an organization modernizes the applications it uses, it will lack the performance necessary to take advantage of big data.

By modernizing core architectures, a business can integrate data across the enterprise, equipping corporate decision makers with the information they need to maintain and improve business performance, successfully scale as their businesses grow, and identify and implement new business initiatives and strategies.
