Technology Trends in Business

Technology trends for businesses in 2020

by | Sep 12, 2020 | Technology

In the first part of the series, we look at the top three technology trends that enterprises will focus on to grow their businesses at a breakneck pace and navigate the current crisis.

In the times of COVID-19, when enterprise leaders are facing tremendous pressure to keep their businesses agile and profitable, the dependency on technology to overcome some of these challenges has increased.

Amidst the strong emphasis on social distancing to contain the crisis, work from home is the new normal, resulting in rising business complexity. From school and yoga classes to grocery shopping, employee onboarding, medical consultations, and client interactions, everything is being conducted virtually. This is not just unprecedented but also a unique experience for each of us, since few were ready for such a sudden and abrupt shift.

These new habits, primarily enforced by the pandemic, have caused significant losses to the economy and forced established businesses to modernize quickly. There is already a shift in services, and organizations are reinventing their operating models. Efforts are being made to leverage the potential of new-age technologies such as artificial intelligence (AI), digitization, collaboration tools, and risk-management platforms to drive growth and innovation.

In light of the above, let’s look at some of the top technological trends that are expected to redefine the IT of the Future.

Digital transformation in business

The COVID-19 pandemic has turned into a decisive catalyst for digital transformation. Technology leaders are now reasonably convinced about fast-tracking their digital transformation efforts to navigate the current crisis and stay profitable. Traditional brick-and-mortar businesses also realize the importance of creating a robust virtual presence to beat the odds. Enterprises have no option but to accelerate their digital transformation efforts to adjust to the new normal.

Enterprise technology leaders firmly believe that the current crisis has given organizations a growing sense of confidence and visibility into the best ways to tackle any future disruptions. (See: Chandresh Dedhia, Head of Information Technology, Ascent Health)

One of the biggest challenges that many enterprises still face is driving a mindset shift among their employees. The next six to twelve months will witness a strong effort from enterprises of all scales to adopt technology advancements, change their organizational structures, and inculcate new dynamics of virtual behavior within their larger teams. Learning resources and tools that can help in upskilling and reskilling will be in demand.

Updating business continuity plans

COVID-19 is proving to be a litmus test of many organizations' ability to stay resilient and operate without disruption. The disruption caused by the pandemic was a nightmare for many enterprises, as they were not well equipped to manage an upheaval of such magnitude. In the months ahead, organizations will be seen implementing and integrating new and advanced technologies into their Business Continuity Plans (BCP). Applications will be modernized, and tools for employee health checks, emergency response, and data backup will be strengthened and restructured.

The focus will be on deploying technology solutions that not only drive remote working but also help reduce operating expenses and increase business resiliency.

Application of AI in business

There is a growing organizational interest in adopting artificial intelligence (AI) technology to accelerate growth, innovate, and disrupt the market. The next couple of years will see enterprises testing and deploying several AI capabilities to predict human behavior and fortify their market share.

A recent study commissioned by global consulting major EY and trade association body Nasscom says that 60% of Indian executive leaders believe that AI will disrupt their businesses within three years. (See: Enterprises jump on the AI bandwagon but seat belts are few and Covid-19 lessons for accelerating AI usage).

Once offices reopen their physical centers, tech leaders will rely strongly upon AI-based intelligent data processing and contactless technologies to ensure their employees maintain social distancing.

The implementation of chatbots to address customer grievances will be accelerated. The banking sector, for instance, has already taken aggressive steps to deploy innovative AI-based chatbots and tools to provide 24x7 customer support. (See: ICICI Prudential extends coverage of conversational AI Ligo and AI in banking now geared for a takeoff)

AI will drive particularly strong traction in the retail and supply-chain industries. Since a majority of consumers will continue to shop online for an indefinite time, AI-driven technologies will enable businesses to identify consumer purchasing patterns, launch new products, and provide an exceptional experience to their customers.
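
As a rough illustration of how purchasing patterns can be identified, the sketch below clusters customers by a few purchase features using scikit-learn; the feature set, data, and number of segments are assumptions for the example, not taken from the article.

```python
# Minimal sketch: segment customers by purchase behavior (hypothetical data).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row is one customer: [orders_per_month, avg_basket_value, days_since_last_order]
purchases = np.array([
    [12, 45.0, 2],
    [1, 300.0, 40],
    [8, 60.0, 5],
    [2, 250.0, 30],
    [15, 35.0, 1],
])

scaled = StandardScaler().fit_transform(purchases)          # put features on a common scale
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(scaled)

for customer, segment in zip(purchases, model.labels_):
    print(f"customer {customer} -> segment {segment}")       # e.g. frequent vs. occasional buyers
```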

Stay tuned at Better World for the second part of the enterprise technology trends series, which will focus on technologies such as Cloud Computing, Blockchain, and Cyber Security.

MORE FROM BETTER WORLD

Milind Khamkar, Group CIO, Super-MAX

Viewpoint

Milind Khamkar

Senior IT Leader

“Storage versus applications continues to be a chicken-and-egg story.”

Storage versus applications has always been a chicken-and-egg story. What comes first, storage or applications, is an interesting conundrum. Moreover, it is very difficult to predict how much storage is enough. These two things keep the IT situation fluid and the IT teams on their toes. A perfect solution remains elusive, and predictability around storage is hardly ever achieved.

CIOs start with some resources, and then the demand scales and sometimes goes out of scope. So the intelligence around storage requirements always remains a burning issue.

The landscape is constantly transforming. Original equipment manufacturers (OEMs) need to develop strategies to provide some predictability in terms of the applications’ storage requirements.

Also, it is of enormous significance to separate the professional and personal data, mainly in the context of regulation and compliance coming into force.

To my mind, cloud is an integral part of digital transformation. And the adoption of the cloud has been accelerated in this pandemic time. On a positive note, the pandemic has brought in some good changes, accelerated cloud adoption being one of them. Businesses that are embarking on the digital transformation journey cannot ignore the importance of cloud. Hence, cloud is essential in today’s era, especially if you are going for new digital technologies. The kinds of security questions we were grappling with earlier are no longer there. Now, even the regulatory and compliance issues are taken care of to a large extent.

However, with new digital applications, latency is likely to be a key issue that the public cloud may not be able to address adequately. That is where on-prem models become vital again.

Also read Viewpoint by Archie Jackson, Head – IT and Security, Incedo Inc. 

Storage Transformation Viewpoints

The new digital technologies are what we call the wave-2 digital technologies. They are being developed with no precedent, so the predictability of their behavior is extremely low. Plus, they are extremely resource-hogging technologies: they put high demands on processing and storage resources, and the volume of data they generate is phenomenal. Traditional storage technologies, which were not developed for this era, have been tasked with matching the data needs of these technologies.

Going forward, storage elasticity will be extremely important in meeting these needs. On-premise data centers will therefore need to exhibit a cloud-like behavior. In fact, new-generation data centers are already providing storage on demand. That is going to become the new norm.

“Intelligence around storage requirements remains a burning issue. OEMs need to develop strategies to provide some predictability in terms of the applications’ storage requirements.”

Storage Transformation Viewpoints
Greesh Jairath, Senior IT Leader

Viewpoint

Greesh Jairath

Senior IT Leader

“AI has started playing a key role in ensuring SLAs and business availability.”

Storage is the underlying foundation of IT. Everything, including the applications and the structured as well as unstructured data, resides on storage media. However, storage solutions have moved much beyond the hardware layer. Today, the virtualization layer has become the heart and center of all data centers, be it a private data center or a public cloud. Moreover, in the last three years or so, artificial intelligence (AI) has become a critical part from a storage perspective, and has started playing a significant role in ensuring SLAs and business availability.

Whenever an IT issue comes up, there is either a storage problem, a network problem, or an application problem. AI simplifies the task of pinpointing the problem. And if you’re able to solve those issues immediately, it helps.
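
As a toy illustration of the kind of triage being described, the sketch below tags an incident as a storage, network, or application problem based on its symptom text; real AIOps tools use far richer signals, and the keywords and categories here are hypothetical.

```python
# Minimal sketch: keyword-based triage of IT incidents (hypothetical rules).
STORAGE_HINTS = ("disk", "volume", "iops", "capacity", "raid")
NETWORK_HINTS = ("packet loss", "dns", "timeout", "unreachable", "bandwidth")
APP_HINTS = ("exception", "500 error", "memory leak", "deadlock", "crash")

def triage(symptom: str) -> str:
    """Return the most likely problem domain for an incident description."""
    text = symptom.lower()
    if any(hint in text for hint in STORAGE_HINTS):
        return "storage"
    if any(hint in text for hint in NETWORK_HINTS):
        return "network"
    if any(hint in text for hint in APP_HINTS):
        return "application"
    return "unknown"

print(triage("Backups failing: volume at 98% capacity"))  # -> storage
print(triage("Users report DNS timeout on the portal"))   # -> network
```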

That’s point number one. Point number two is definitely scalability. Today, data has been growing from gigabytes to terabytes and exabytes, and the kind of scalability available within the controller set is enormous. It enables people running on-prem data centers to scale almost on demand, which shows how far intelligent storage in the data center has come. Third is agility, along with the security that needs to be factored into the storage component.

Also read Viewpoint by Archie Jackson, Head – IT and Security, Incedo Inc. 

Storage Transformation Viewpoints

The industry is witnessing a massive amount of transformation, and that is impacting storage as well. Storage transformation is already underway, though there are relative challenges on the ground.

Earlier, data used to be about read and write, but now it’s mostly about write and read. Plus, we have big data, where there is a lot of unstructured data.

Whenever we plan for storage or its replacement or scalability, we always look at it from a hybrid perspective. While some of the data will be available on-prem, some of it will be available in the cloud. And if there are multiple clouds, then we have a provision to move data from one cloud to another. The entire scope or design of storage has been taken to a different level altogether, wherein you provide best-in-class security to fulfill the needs of compliance, security, and agility.

Today, data centers can very well be managed through automation to ensure that they keep running when errors occur due to known issues. Some alerts can go to the system admin or the backup admin for the respective measures. So I think the intelligent data center is developing and progressing well. It’s not fully developed yet, but things are moving in the right direction.
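
A minimal sketch of that kind of automation, assuming a hypothetical table of known issues and admin addresses, might route alerts like this:

```python
# Minimal sketch: route known-issue alerts to the right admin (hypothetical mapping).
from dataclasses import dataclass

# Known issues and who handles them; a real data center would pull this from a CMDB.
KNOWN_ISSUES = {
    "backup_job_failed": "backup-admin@example.com",
    "raid_degraded": "storage-admin@example.com",
    "switch_port_flap": "network-admin@example.com",
}

@dataclass
class Alert:
    code: str
    detail: str

def route(alert: Alert) -> str:
    """Return the admin responsible for a known issue, or escalate if unknown."""
    return KNOWN_ISSUES.get(alert.code, "sysadmin-oncall@example.com")

print(route(Alert("backup_job_failed", "Nightly backup exceeded its window")))
print(route(Alert("unknown_panic", "Kernel panic on node-17")))  # escalated to on-call
```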

So, typically, when you look at the front cache, or the cache available, and the indexing on the storage, these are driven by algorithms. They understand how to address structured data versus unstructured data. Also, with AI, provisions are available, either through an open stack or through our existing vendors, to ensure that those are being looked at differently.
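
For readers unfamiliar with the caching algorithms being referred to, here is a generic sketch of a least-recently-used (LRU) read cache of the kind storage front ends commonly use; it is an illustration, not any particular vendor’s implementation.

```python
# Minimal sketch: a least-recently-used (LRU) read cache for storage blocks.
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self._blocks = OrderedDict()  # block_id -> data, ordered by recency of use

    def get(self, block_id):
        if block_id not in self._blocks:
            return None  # cache miss: caller reads from backing storage
        self._blocks.move_to_end(block_id)  # mark as most recently used
        return self._blocks[block_id]

    def put(self, block_id, data):
        self._blocks[block_id] = data
        self._blocks.move_to_end(block_id)
        if len(self._blocks) > self.capacity:
            self._blocks.popitem(last=False)  # evict the least recently used block

cache = LRUCache(capacity=2)
cache.put("blk-1", b"...")
cache.put("blk-2", b"...")
cache.get("blk-1")          # touching blk-1 makes blk-2 the eviction candidate
cache.put("blk-3", b"...")  # evicts blk-2
print(cache.get("blk-2"))   # None (miss)
```
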
Compliance is a key issue that one needs to factor in. Particularly when GDPR aspects are involved, data retention can be a key challenge. It is important to differentiate between personally identifiable information (PII) and normal data. In terms of data, we have been ensuring that all storage is encrypted. A key question that CIOs must answer is: in case of an attack or a security threat, what data has been moved out? This could be of great importance because most organizations don’t even understand what information has been lost during an attack.
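
As a minimal sketch of encrypting records at rest while keeping track of which fields are PII, the example below uses the open-source cryptography library’s Fernet recipe; the record layout and PII-tagging scheme are hypothetical.

```python
# Minimal sketch: encrypt records at rest and tag PII separately (hypothetical schema).
import json
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()   # in practice, load this from a key-management service
fernet = Fernet(key)

record = {"customer_id": 42, "email": "user@example.com", "order_total": 1999}
PII_FIELDS = {"email"}  # fields subject to GDPR-style retention and deletion rules

# Encrypt the whole record, but keep a separate note of which fields are PII
# so retention and right-to-erasure policies can be applied later.
ciphertext = fernet.encrypt(json.dumps(record).encode())
metadata = {"pii_fields": sorted(PII_FIELDS), "encrypted": True}

print(metadata)
print(fernet.decrypt(ciphertext).decode())  # only holders of the key can read the data
```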

These are very grave concerns for organizations. While we try to protect data right from the endpoint to the perimeter, in case an event happens, often one doesn’t even realize that the event has occurred.

Going forward, among other things, blockchain-based mechanisms are likely to evolve such that data may be protected in a far better way.

“Whenever there is an IT issue, there is either a storage problem, a network problem, or an application problem. AI simplifies the task of pinpointing the problem.”

Storage Transformation Viewpoints
Charu Bhargava, Vice President – IT, Sheela Foam

Viewpoint

Charu Bhargava

Vice President – IT, Sheela Foam

“One must maintain an equilibrium between convenience and compliance.”

Storage is becoming everyone’s necessity, and the size of storage is increasing phenomenally. In the current scenario, where virtualization plays a very important role, storage solutions should be able to provide an expandable, or rather ever-increasing, input–output ratio, because when everything and anything has to be stored and retrieved, you don’t know where the volumes are going. So storage has started playing a very important role in day-to-day operations, and it is ever-growing. It, therefore, has to be agile and scalable, right from the design stage.

Earlier, organizations used to struggle with files. Today, everyone is working with electronic data, as digitalization has become the buzzword. So, organizations want to digitalize and store everything, even raw data. The goal is to have zero paper but lots of electronic data. One needs ample space and storage to keep everything. Structured as well as unstructured data are growing exponentially, and before you process that and extract useful data, you first need to store it. The storage space needs a modular approach, because you need to decide what comes first and how to store the data such that you optimize the resources to the best extent possible. That is where the trend is moving.

Also read Viewpoint by Archie Jackson, Head – IT and Security, Incedo Inc. 

Storage Transformation Viewpoints

You also need to maintain an equilibrium between convenience and compliance. It is never one or the other; you have to take both into account, because compliance has to go with convenience. Second, one also needs to consider the data type and how long it is to be stored. You need to identify data that is not useful, or an absolute waste of space, and consider how to get rid of it in a way that also takes care of your security and compliance obligations. As a data in-charge or data custodian, you have to be very mindful of these things.

In fact, this is a struggle that everyone today faces, because the volume keeps increasing exponentially. And it is not just structured data, but also unstructured data that is coming in from everywhere, be it text, images, or videos. Everything is getting into your data center. We have 7,000 showrooms and we use visual merchandising, so a phenomenal volume of images is flowing in each day. With AI, ML, and IoT, we work on these data sets. The data sets become so huge that someday you actually need to segregate them and throw things out of your data center, because after a period of time they are of no use.

As an organization, we are following a hybrid approach. We have our own data center where all our core applications reside. To hedge the risk, we have our DR on the cloud. For all non-core applications, we use the cloud. Security risk is still there on the cloud because, being open, the cloud is vulnerable. On the other hand, a private cloud in an enterprise space, or dedicated to an enterprise, is more secure. As a philosophy, we have been using our own core applications, developed and designed by our own IT team. From a safety, security, and compliance perspective, we have far more control over them. We are working in this kind of hybrid environment, and the cloud is actually being used for R&D-oriented applications, where you need expandability.

“We have our own data center where all our core applications are residing. To hedge the risk, we have our DR on cloud.”

Storage Transformation Viewpoints
Archie Jackson, Head – IT and Security, Incedo Inc.

Viewpoint

Archie Jackson

Head – IT and Security, Incedo Inc.

“Modern storage solutions will require massive reimagining.”

At this point in time, enterprises are racing towards an anywhere and everywhere work environment. The pandemic has made it imperative for organizations to transform themselves to meet the core needs of their employees who are scattered across geographies and sites. As a result, organizations are moving away from the erstwhile centralization mindset and going for decentralized architectures.

At the same time, there is a rapid evolution of cloud in the works. Several new technologies, such as analytics and business intelligence, are responsible for the evolution of the cloud in terms of scalability and agility. This evolution has also become a key catalyst for storage transformation. 

Storage Transformation Viewpoints

Today, we operate in a multi-cloud hybrid environment. It’s rare to find an organization working fully on-premises or being fully dependent on a single cloud, thanks to the multitude of applications we work with and the kinds of architectures we use. Organizations are using different clouds and are essentially operating in a hybrid environment. All of this is often supported by multiple technology partners.

Identifying the most optimal storage solution involves designing something that is highly scalable, agile, and available, as well as cost-effective and unrestricted, and that can act as a disaster recovery (DR) option to ensure business continuity. It should also integrate new technologies such as artificial intelligence.

Considering all these factors together is extremely important. This leads us more towards soft storage.

Today, application development is happening in a DevOps environment, which is increasingly distributed as well. Individuals may be working in small agile pods, with some storage, some activities, some Git repositories, and so on. Now, when designing a solution, it is important to join all these dots and create a complete architecture, and consequently a solution, at the very foundation. Storage should enable such a foundation.

To sum up, today we are operating in a dynamically changing environment. So storage solutions should be in an agile format and also move away from a centralized architecture towards a decentralized one.

Also read: New Dropbox features could make pro remote workers more sticky

“Storage solutions should be highly scalable, agile, available, and cost-effective, and also meet DR needs, while integrating new technologies such as artificial intelligence.”

Storage Transformation Viewpoints
Time to get ‘responsible’ with AI systems

Humans have built very complex robotic systems, such as convoys and airplanes, and even neural networks to communicate with each other, but we’re only starting to scratch the surface of what artificial intelligence (AI) can do. It’s also about time we started paying more attention to ‘responsible AI.’

A future with artificial intelligence would be very mixed. It could not only eliminate many of today’s human jobs but also allow us to solve complex problems much faster than we could using the human brain alone.

As technology gets closer to achieving full intelligence, we will start seeing artificial intelligence (AI) systems that are fully self-aware and can think, reason, and act like a human would. This may raise some concerns, because some people fear that as artificially intelligent computers become more advanced, they might become intelligent enough to surpass humans. The concern is not if, but when, it might happen.

In the future, we will have ‘teams’ of artificially intelligent robots that can do all the menial tasks we traditionally assign to humans, such as vacuuming, picking up items, cooking, shopping, and more. All such jobs will eventually be done by artificially intelligent robotic machines. Even without this new development, all work will still be based on traditional methods such as task assignment, task resolution, and reward and punishment systems.

Today, we are beginning to see the first AI machine prototypes at work, and many exciting projects are in the works. One such project is a robotic dog that can recognize objects, humans, and other dogs. Other projects include self-driving cars, self-piloted planes, artificially intelligent robots, and new weather systems.

The future of artificially intelligent robotic androids is exciting but also scary due to the autonomous capabilities of these machines. These robotic androids may be made up of two different types of artificial intelligence: a human-like non-conscious neural network (NCL) and a fully conscious human mind with all its own memory, thoughts, and feelings. Some NCL robots may have both systems in one, or may have only one. Many experts believe a full AI will come closer to human intelligence than any current technology ever can.

Such concerns and apprehensions around AI have triggered the need for AI developments and implementations to be humanly, ethically, and legally more responsible.

Microsoft recognizes six principles that it believes should guide AI development and use (see link). These are fairness; reliability and safety; privacy and security; inclusiveness; transparency; and accountability.

PwC has created a ‘Responsible AI Toolkit,’ which is a suite of customizable frameworks, tools, and processes designed to help organizations “harness the power of AI in an ethical and responsible manner, from strategy through execution.”

The field of ‘Responsible AI’ is generating more and more interest from various stakeholders, including governments, developers, human-resource experts, and user organizations, among others.

Distributed cloud is the new enterprise IT frontier

A titanic struggle for control of the cloud has begun in earnest with the emergence of various distributed cloud architectures. The shift is being driven by the need for enterprises to move away from traditional infrastructure management to ‘utility cloud’ models, which can be far more sustainable as long-term strategies.

Amazon Web Services, IBM, Google, and Microsoft are the giants whose bets on the development of such virtualization technologies have won them large shares of the cloud market. Several other companies are also active in this arena, and a closer examination of the market reveals a number of smaller players too.


Multiple drivers are fueling growth

The star attractions of distributed clouds include (1) low latency due to proximity to user organizations (e.g., on-premises delivery or edge delivery); (2) better adherence to compliance and data-residency requirements; and (3) support for the rapidly growing number of IoT devices, utility drones, etc.

With distributed cloud services, the service providers are moving closer to the users. These cloud services are offered not just as public-cloud-hosted solutions but also at the edge or in the on-premises data center. This approach of having a SaaS model with an on-premises application has its own advantages, such as ease of provisioning new services, ease of management, and cost reductions in the form of greater operational efficiency brought about by streamlined infrastructure management.

Cloud service providers have a deep understanding of both the needs of enterprises and their unique business requirements. They use their expertise to develop solutions that meet these objectives. They are also well known for providing easy accessibility to their services from the internet. This enables fast and convenient access for end-users.

Enterprises may think that by switching over to a distributed cloud computing service they will lose control of their data. However, the cloud service providers enable excellent security and monitoring solutions. They also ensure that users are given the highest level of access to their data. By migrating on-premises software to a cloud service provider, enterprises do not stand to lose the expertise that their employees have built up during their time in the organization.

Google Anthos: A first-mover advantage

Google formally introduced Anthos as an open platform that lets enterprises run an app anywhere—simply, flexibly, and securely. In a blog post dated 9 April 2019, Google noted that, embracing open standards, Anthos let enterprises run applications, unmodified, on existing on-prem hardware investments or in the public cloud, and was based on the Cloud Services Platform announced earlier.

The announcement said that Anthos’ hybrid functionality was made generally available both on Google Cloud Platform (GCP) with Google Kubernetes Engine (GKE), and in the enterprise data center with GKE On-Prem.

Consistency, another post said, was the greatest common denominator, with Anthos making multi-cloud easy owing to its foundation of Kubernetes—specifically the Kubernetes-style API. “Using the latest upstream version as a starting point, Anthos can see, orchestrate and manage any workload that talks to the Kubernetes API—the lingua franca of modern application development, and an interface that supports more and more traditional workloads,” the blog post added.
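
To illustrate what “talking to the Kubernetes API” looks like in practice, the sketch below uses the official Python Kubernetes client to list workloads across two clusters; the kubeconfig context names are hypothetical placeholders, and this is a generic Kubernetes example rather than Anthos-specific code.

```python
# Minimal sketch: enumerate workloads in several clusters via the Kubernetes API.
from kubernetes import client, config

CONTEXTS = ["on-prem-cluster", "gke-cluster"]  # hypothetical kubeconfig contexts

for context in CONTEXTS:
    config.load_kube_config(context=context)   # authenticate against this cluster
    apps = client.AppsV1Api()
    deployments = apps.list_deployment_for_all_namespaces()
    print(f"--- {context} ---")
    for dep in deployments.items:
        ready = dep.status.ready_replicas or 0
        print(f"{dep.metadata.namespace}/{dep.metadata.name}: {ready} replicas ready")
```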

AWS Outposts: Defending its cloud turf

Amazon Web Services (AWS) has been among the first movers. On 3 December 2019, the cloud services major announced the general availability of AWS Outposts: fully managed and configurable compute and storage racks built with AWS-designed hardware that allow customers to run compute and storage on-premises while seamlessly connecting to AWS’s broad array of services in the cloud. A pre-announcement for Outposts had come on 28 November 2018 at re:Invent 2018.

“When we started thinking about offering a truly consistent hybrid experience, what we heard is that customers really wanted it to be the same—the same APIs, the same control plane, the same tools, the same hardware, and the same functionality. It turns out this is hard to do, and that’s the reason why existing options for on-premises solutions haven’t gotten much traction today,” said Matt Garman, Vice President, Compute Services, at AWS. “With AWS Outposts, customers can enjoy a truly consistent cloud environment using the native AWS services or VMware Cloud on AWS to operate a single enterprise IT environment across their on-premises locations and the cloud.”

IBM Cloud Satellite: Late but not left out

IBM has been a bit late to the distributed cloud party. It was only on 1 March 2021 that IBM announced that hybrid cloud services were now generally available in any environment—on any cloud, on premises or at the edge—via IBM Cloud Satellite. The partnership with Lumen Technologies, coupled with IBM’s long-standing deep presence in on-premise enterprise systems, could turn out to be a key differentiator. An IBM press release noted that Lumen Technologies and IBM have integrated IBM Cloud Satellite with the Lumen edge platform to enable clients to harness hybrid cloud services in near real-time and build innovative solutions at the edge.

“IBM is working with clients to leverage advanced technologies like edge computing and AI, enabling them to digitally transform with hybrid cloud while keeping data security at the forefront,” said Howard Boville, Head of IBM Hybrid Cloud Platform. “With IBM Cloud Satellite, clients can securely gain the benefits of cloud services anywhere, from the core of the data center to the farthest reaches of the network.”

“With the Lumen platform’s broad reach, we are giving our enterprise customers access to IBM Cloud Satellite to help them drive innovation more rapidly at the edge,” said Paul Savill, SVP Enterprise Product Management and Services at Lumen. “Our enterprise customers can now extend IBM Cloud services across Lumen’s robust global network, enabling them to deploy data-heavy edge applications that demand high security and ultra-low latency. By bringing secure and open hybrid cloud capabilities to the edge, our customers can propel their businesses forward and take advantage of the emerging applications of the 4th Industrial Revolution.”

Microsoft Azure Arc: General availability awaited

Julia White, Corporate Vice President, Microsoft Azure, in a blog post dated 4 November 2019, announced Azure Arc as a set of technologies that unlocks new hybrid scenarios for customers by bringing Azure services and management to any infrastructure. “Azure Arc is available in preview starting today,” she said.

However, the general availability of Azure Arc was not to be announced anytime soon. Six months after the ‘preview’ announcement, Jeremy Winter, Partner Director, Azure Management, published a blog post on 20 May 2020, noting that the company was delivering ‘Azure Arc enabled Kubernetes’ in preview to its customers. “With this, anyone can use Azure Arc to connect and configure any Kubernetes cluster across customer datacenters, edge locations, and multi-cloud,” he said.

“In addition, we are also announcing our first set of Azure Arc integration partners, including Red Hat OpenShift, Canonical Kubernetes, and Rancher Labs to ensure Azure Arc works great for all the key platforms our customers are using today,” the post added.

The announcement followed the Azure Stack launch two years earlier, which enabled a consistent cloud model deployable on-premises. Meanwhile, Azure was extended to provide DevOps for any environment and any cloud. Microsoft also enabled cloud-powered security threat protection for any infrastructure and unlocked the ability to run Microsoft Azure Cognitive Services AI models anywhere. Azure Arc was a significant leap forward in enabling customers to move beyond hybrid cloud and truly deliver innovation anywhere with Azure, the post added.

Looking ahead

A distributed cloud presents an incredible opportunity for businesses that are looking to improve their bottom line while also increasing their agility and versatility.

A distributed cloud is essentially a distributed version of public cloud computing that offers the capability to manage nearly everything, from a single computer to thousands of computers. The cloud promises the benefits of a global network without having to worry about hardware, software, management, and monitoring issues. The distributed cloud goes a step further and also brings assurances on fronts such as latency, compliance, and on-premises application modernization.
