Ragupathy Krishnan Marimuthu, senior software engineer for a global technology leader, advises tech startups to focus on the business

By Ellen F. Warren

Ragupathy Krishnan (“Ragu”) Marimuthu, currently a Senior Software Engineer for a global “top three” company by market capitalisation, has worked for the past two decades in software development and engineering at some of the world's most recognisable technology and consulting organisations. Over the course of his career, he has earned multiple awards and recognition from tech leaders including Gallup, Philips, Dell, and Tesco for his innovative contributions to development systems and complex engineering projects.

As a senior-level specialist in Azure cloud computing platforms and artificial intelligence (AI), Ragu applies his robust business analytics and technical skills to designing solutions and developing high-performance applications built on .NET, cloud computing, cloud infrastructure, and cloud security that maximise efficiency across various sectors. His recent publications offer expert thought leadership on topics including enterprise transitions and the business benefits of microservice architecture.

Ragu earned a Master’s degree in Information Technology and Management from the University of Texas at Dallas (US). He previously received a Master’s in Software Science from Periyar University in Salem, India, where he completed his undergraduate education.

He achieved the Microsoft Certified Technology Specialist professional credential in 2007, and in 2015 earned the prestigious ITIL (Information Technology Infrastructure Library) Foundation Certificate in IT Service Management. He is honoured to serve as an awards judge for the Brandon Hall Group Technology & Ed Tech Excellence Awards.

Q: Ragu, at the rapid pace that new technologies are emerging, this must be a dynamic time to be a senior software engineer. What excites you most about how you are using AI and cloud computing right now? Can you give us some examples of transformative projects you’ve led, and how you are applying new tools?

A: Yes, tools and technology are evolving at a faster pace than we could have imagined. An abundance of data has become a focal point. Every system is flooded with data, and technology is evolving around that to distil highly processed, usable information from it, giving businesses information and analysis that can drive better decision-making and outcomes.

Most business decisions are now made based on mined data points. Technical specialists and senior engineers play a crucial role in developing systems that use cutting-edge technologies to create high-performance applications that better facilitate processing of the abundantly available data.

In this regard, there are basically two challenges. One is to create systems that can process data much faster and store the processed results instead of the raw data; this is a continuous improvement process, as the amount of data received keeps increasing. The second challenge is to adopt new technologies, which greatly help in development cycles but come with a steep learning curve that is further complicated by the need to constantly update to the newest iterations and tools.

For example, one of my recent projects used Bond libraries for high-performance serialisation/deserialisation of streaming data, which can reach as high as 20,000 events per second on a single virtual machine. I was involved in designing and delivering the project to replace the Bond libraries with Protobuf (Protocol Buffers) libraries for schematised data.

This mechanism can process the same throughput with much better CPU and memory usage, but some customisation is involved in achieving this benchmark. Because this is system-side application development, guidance and support are generally not available online, so it must be accomplished solely through a deep technical understanding of how this schematised data works. I was able to apply my advanced knowledge and experience in this area to complete the implementation successfully.
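The core idea behind schematised serialisers like Bond and Protobuf is that the sender and receiver agree on a fixed schema in advance, so the wire format can omit field names and encode records compactly. The following sketch illustrates that idea with Python's standard-library `struct` module rather than the actual Bond or Protobuf libraries; the three-field event schema is hypothetical, chosen purely for illustration.

```python
import struct

# Hypothetical event schema: (timestamp: uint64, level: uint8, value: float64).
# An agreed, fixed schema is what lets Bond/Protobuf skip per-message field
# names and encode records compactly -- this struct format plays that role here.
EVENT_FORMAT = struct.Struct("<QBd")  # little-endian: uint64, uint8, double

def serialise(timestamp: int, level: int, value: float) -> bytes:
    """Pack one event into its compact binary wire form."""
    return EVENT_FORMAT.pack(timestamp, level, value)

def deserialise(payload: bytes) -> tuple:
    """Unpack the wire form back into (timestamp, level, value)."""
    return EVENT_FORMAT.unpack(payload)

event = serialise(1_700_000_000, 3, 42.5)
print(len(event))            # 17 bytes: 8 + 1 + 8
print(deserialise(event))    # (1700000000, 3, 42.5)
```

Because the layout is fixed, there is no per-message parsing of field names, which is one reason schematised encodings can sustain tens of thousands of events per second on a single machine.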


Q: You have developed significant expertise in Azure Cloud. What kinds of benefits does Azure offer companies in various sectors? How does it further application development and innovation? Is Azure more suited for certain use cases than others?

A: Currently, I am involved in Azure Cloud infrastructure development. Like any other cloud, Azure offers an end-to-end solution to all businesses and individuals.

There is a myth that cloud technology is only for large-scale enterprises and not applicable to individuals and smaller organisations, but this is not the case. Azure Cloud can offer excellent services to individuals and small businesses. One significant benefit is that it greatly reduces initial setup and maintenance costs. Another advantage is that it greatly reduces the time to complete the business setup, enabling organisations to go to market faster.

Typically, in a non-cloud environment, a lot of infrastructure and human resources are involved in setting up an IT office. Constant software updates need to be managed, and security is the biggest threat to all companies. This is especially true for startups and small businesses that would need to invest heavily in infrastructure components, but using a cloud infrastructure like Azure can help solve these problems.

Utilising a cloud requires no physical office, software updates are managed by the Azure Cloud team, and applications can be secured at the click of a button through cloud services. Most importantly, out-of-the-box AI tools are available in Azure Cloud, and these can be used by all customers, with the option to customise the AI algorithms based on user needs. This makes Azure Cloud easily accessible for anyone to opt in.

Q: You recently published an article advising organisations, and especially tech startup enterprises, to adopt a business growth mindset, as opposed to diving into a technology transition. Tell us more about this approach. What guidance do you give to startup leaders? How should they vet new business tools to derive the best return on investment (ROI)?

A: Many people working within a large organisation come to believe that they are hindered by their limited resources. They may be strong in their own area, but they often don’t know about other segments, which creates hesitation and fear when it comes to starting their own business. Fortunately, we now have cloud computing with AI and machine learning (ML), and these tools can be useful in filling those gaps.

When someone with technical savvy starts a business, he or she must start thinking about the business aspects, even more than the technology behind it. If necessary, surround yourself with people who have knowledge in the areas where you lack expertise, and seek assistance from non-technical people in business processes and decision-making.

For startups, I advise against investing a lot of money in developing technology, as the money may go to waste if there is a shift in the business strategy that results in updating and/or changing the technology. This will ultimately add costs and time. Instead, a better strategy at the outset is to use off-the-shelf technologies and cloud computing, which eliminates the need to invest in building an internal tech infrastructure.

In my experience, I’ve found that a businessperson with a technical background tends to evaluate the feasibility and timeline of going to market based on the technology, rather than on the business opportunity. Focusing on the business is more important than the technology, as technology keeps evolving, and businesses can hire, outsource, or use cloud computing to leverage the technology and change direction if required. This will reduce the initial setup cost.

Q: You also advise businesses about transitioning from monolithic legacy systems to microservices architecture, both from decision-making and implementation perspectives. What are the factors companies should consider, and what are the advantages to be gained with a microservices platform? How are new cloud-based technologies facilitating the adoption of microservices architecture?

A: Cloud computing has changed the entire software ecosystem from development to deployment.

Architecture built around cloud computing plus microservices is totally different from a monolithic application. Yes, monolithic applications can also be run in the cloud, but they will not yield the real benefits of cloud computing, so migrating to microservices is a must.

Most companies that migrate an existing project to the cloud believe they are using microservices that can scale up and down, but in fact they typically consume more resources than they actually need.

If they break the entire project into smaller elements and then migrate the components to the cloud, each part can scale up and down based on the actual usage of that service. Companies should stop creating monolithic applications if they plan to scale the business. Monolithic applications can be created for a prototype or proof of concept, as they are faster to build and demo, but microservices are the solution for any production-level system.

IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service) can greatly help in developing and deploying microservices. Cloud service providers have their built-in infrastructure and software services which can be used out-of-the-box to enable certain key features like logging and tracing.

Kubernetes can help with scaling the applications. Notification services like AWS SNS (Amazon Web Services Simple Notification Service) and Azure Service Bus can help in connecting and communicating across multiple services.
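The per-service scaling mentioned above is what the Kubernetes Horizontal Pod Autoscaler automates: it periodically compares an observed metric (such as average CPU utilisation) against a target and adjusts the replica count. The sketch below implements the documented HPA formula in plain Python; the example numbers are illustrative.

```python
from math import ceil

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float) -> int:
    """Kubernetes Horizontal Pod Autoscaler formula:
    desired = ceil(current * currentMetric / targetMetric)."""
    return ceil(current_replicas * current_metric / target_metric)

# Four pods averaging 90% CPU against a 60% target -> scale out to six pods.
print(desired_replicas(4, 90, 60))   # 6
# Load drops to 20% -> scale back in to two pods.
print(desired_replicas(4, 20, 60))   # 2
```

Because the formula is applied per deployment, a microservice under heavy load scales out while its quieter neighbours stay small, which is exactly the benefit a monolith migrated wholesale to the cloud cannot realise.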

Q: Over the past 20 years, you have worked closely with global business analysts, specifically for complex projects in the tech and retail sectors. How has this unique experience enhanced your ability to analyse business needs and collaborate with different functional teams across a project’s development and implementation?

A: I started my career in software engineering focusing on client/server application development. While working primarily with Microsoft tools and technologies, I soon became interested in system-side application development. I am passionate about creating technology that other individuals and organisations can use, rather than simply building applications with existing tools.

I prepared for these use cases by working on multiple projects and with multiple organisations to understand various businesses and the solutions they provide. I also had excellent opportunities to work with organisations across the globe, including but not limited to the United Kingdom, Germany, the Netherlands, Singapore, and the USA. This gave me global exposure to a variety of projects and to the development of technology in those countries, organisations, and industry sectors.

After gaining more than ten years of experience, I earned my Master’s degree in the United States to start my career in the next-generation technology field. As planned, I got the opportunity to work on cloud, AI, and ML technologies, and I am now working with a cloud infrastructure team creating solutions for Azure Cloud to be used by other businesses and individuals. The global experience that I gained over the past two decades has helped me greatly in decision-making as I create next-generation system software applications.


Q: One of your engineering focus areas is designing telemetry applications, particularly with Azure and AI integrations. Can you tell us about how you develop and deploy this innovation? And how can businesses benefit from deploying telemetry applications?

A: I perform a critical role as part of a cloud computing technology team involved in creating cloud infrastructure projects. My focus area is designing and developing telemetry applications for cloud computing. In a non-cloud environment, applications are created and deployed on private servers, and their telemetry is always available on those servers at any point in time. However, this is totally different in cloud computing.

Individuals and businesses deploy their applications in any of the public clouds, like AWS, Azure, and Google Cloud Platform (GCP). These public cloud providers create infrastructure for businesses based on the configuration, and it is typically a public virtual machine that is deployed across the globe.

These virtual machines are not 100% available, because they keep getting destroyed and recreated based on the load. In this scenario, users’ telemetry information, such as logs, metrics, and traces, is lost when the virtual machines are destroyed or recreated.

I work on projects that collect this information, store it on a different virtual machine, and make it available to customers 24/7. The biggest challenge with this kind of project is the amount and frequency of the data. I use advanced system software development concepts, like in-process (in-proc) memory processing, to process the received data before it gets stored.

Additionally, this telemetry project removes the overhead for business users of building their own applications to store and retrieve data from the cloud. With my solution, a business can do this with just a few configuration settings and by enabling the necessary certificate to secure it.

Q: What are some of the other high performance applications you have developed recently? With such a diverse array of new applications in and entering the marketplace, how do businesses keep up? How do - or should - business and IT leaders evaluate which tools to leverage to drive process improvements and profitability?

A: Recently, I started working on an OpenTelemetry Protocol (OTLP) project for Azure Cloud. OTLP is an open-source protocol used for collecting and transmitting telemetry data. It can be configured to collect information over gRPC or HTTP, depending on the customer’s needs.

I work on the unified application, which runs on both Linux (Ubuntu, CentOS, Mariner) and Windows operating systems, and supports both x64 and ARM64 architectures. As OTLP is an open-source protocol, any individual or business can easily get help in understanding the OTLP architecture. They can publish the data by setting the cloud configuration, and they can start getting the logs, traces, and metrics for their application.

This directly reduces the development time needed to create such sophisticated infrastructure for cloud computing. As my project supports various authentication methods, like Key Vault and certificate-based authentication, it is highly secure. Currently, this application can process approximately 70,000 events per second per virtual machine.
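To make the data model concrete, the sketch below builds an approximate OTLP/HTTP JSON logs payload using only the standard library. The nesting, resourceLogs → scopeLogs → logRecords, follows the OTLP logs data model, but the exact field names and the `demo-service` resource attribute here are illustrative; the specification should be consulted before relying on them.

```python
import json
import time

def otlp_log_payload(message: str, severity: str = "INFO") -> str:
    """Build an (approximate) OTLP/HTTP JSON logs payload.
    Field names follow the OTLP logs data model but are illustrative."""
    record = {
        "timeUnixNano": str(time.time_ns()),
        "severityText": severity,
        "body": {"stringValue": message},
    }
    payload = {
        "resourceLogs": [{
            "resource": {"attributes": [
                # Hypothetical resource attribute identifying the emitter.
                {"key": "service.name", "value": {"stringValue": "demo-service"}}
            ]},
            "scopeLogs": [{"logRecords": [record]}],
        }]
    }
    return json.dumps(payload)

doc = json.loads(otlp_log_payload("disk nearly full", "WARN"))
print(doc["resourceLogs"][0]["scopeLogs"][0]["logRecords"][0]["severityText"])
```

A payload shaped like this could then be POSTed to an OTLP/HTTP endpoint, or the same records could be sent over gRPC, which is the transport choice the answer above refers to.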

Q: Much of your development work has supported companies in the retail sector. What is your forecast for how new technologies will impact this industry over the next five to ten years? What are the applications that retail business owners should be looking to adopt, and which will deliver the greatest ROI?

A: Yes, I have previously worked at several major retail companies. During that time, I primarily worked on projects in which I developed applications to crawl our competitors’ websites for price matching and providing offers. In recent years, third-party service providers have evolved to offer services that return the current price and price history of any product. Using this kind of service, rather than developing in-house applications, helps a business in two ways.

First, the service is readily available, whereas developing an in-house application is a time-consuming process even when the technical team is already in place. Getting this service from a third party frees the technical team to focus on other areas of business improvement. Second, a lot of effort is required to maintain web crawler applications. I worked at one of the largest retailers in the UK developing a web crawler application, and it worked well for almost six months.

Later, the company’s competitors drastically changed their applications, making it impossible for our in-house web crawler to grab the live price, which necessitated redesigning the project. This created a huge loss for the company. Alternatively, if this service had been purchased from a third party (e.g., CamelCamelCamel.com), the company might have saved a lot of time and resources.
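The fragility described above is easy to see in code. A typical crawler pins its extraction logic to a detail of the competitor's markup, such as a CSS class name; the moment the site is redesigned, the selector matches nothing. The following hypothetical sketch uses the standard-library `html.parser` against a hard-coded page fragment; the `"price"` class name is exactly the kind of assumption that breaks.

```python
from html.parser import HTMLParser

class PriceScraper(HTMLParser):
    """Hypothetical sketch: pull a price out of a product page by looking
    for a span with a specific class. If the site drops or renames the
    "price" class in a redesign, this scraper silently returns nothing."""

    def __init__(self):
        super().__init__()
        self._in_price = False
        self.price = None

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "price") in attrs:
            self._in_price = True

    def handle_data(self, data):
        if self._in_price and self.price is None:
            self.price = data.strip()
            self._in_price = False

page = '<div><span class="price">£12.99</span></div>'
scraper = PriceScraper()
scraper.feed(page)
print(scraper.price)   # £12.99
```

A third-party pricing service absorbs this maintenance burden across many sites at once, which is the trade-off the answer above is weighing.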

Q: As a senior leader, you bring strong leadership skills and are heavily involved in training other software engineers and teams and evaluating their work. How important is this to your role? What are some of the challenges and upsides you encounter in this function?

A: I have been involved in training new and lateral employees in both technical and subject matters. In the past two decades, technology has evolved along with education. Training college graduates 15 years ago was totally different from training the college graduates of today, thanks to the development of technology and social media.

Present college graduates are aware of the latest technological trends, such as AI, ML, and cloud computing. However, there is a challenge in training them to design and develop legacy applications.

Some legacy applications, such as banking and retail systems, are being upgraded, but a good number of applications are still running on legacy code bases. Though new graduates are knowledgeable in cutting-edge technologies, understanding legacy software application development remains essential.

Most notably, computer hardware has improved to the point where running inefficient code is now possible, even if it may ultimately lead to project failure. But ten years back, that was not the case: resources were limited, and any small memory leak could bring down the whole server, so quality was given great importance.

We are now in a business environment in which competition is growing fast, which changes the priority of a business. In the rush to bring a product to market, a business is now likely to adopt any working solution, as opposed to the most efficient solution. Because products need to be released as quickly as possible, quality takes the back seat. This creates issues in the long run.

Q: In addition to training your internal team members, you serve as a judge for the prestigious Brandon Hall Group EdTech Excellence Awards programme, which recognises innovative and cutting-edge trailblazers in education technology at all levels, from K-12 and higher education schools, to professional development and other associations, nonprofit organisations, and government agencies. How did you come to be a judge and why is this important to you?

A: I am always looking for opportunities to learn and contribute. I was grateful for the opportunity to serve as a judge for the Brandon Hall Technology Excellence Awards, an annual event that focuses on awarding the most innovative projects in technology. I realised that reading technical journals and newsletters is not sufficient to understand the growing trends, as technology is evolving faster than ever before.

I wanted to see what other people working in technology are doing and learn from them, and also to share the knowledge that I have gained over the past two decades working in various sectors across the globe. The Brandon Hall Technology Excellence Awards programme is the perfect fit for my expectations. I was honoured to be selected for their judging panel after submitting my resume, providing my technical background, and clearing their judging interview.

This helps me in two ways. First, the participants in this event are highly respected and have excellent reputations in the technology industry. They submit innovative products that can truly change the world for the better, and I get an opportunity to understand and learn about those trends and future technologies before they are publicly available.

Second, I have the opportunity to rank and provide feedback on the participants’ innovative products and services, which I hope helps them improve their projects. This indirectly makes me part of their next-generation innovation, which is very satisfying: a volunteering opportunity that enables me to learn and also contribute to society.