Responsible AI

Time to get ‘responsible’ with AI systems

by | Aug 8, 2021 | Artificial Intelligence

Humans have built very complex robotic systems, from autonomous convoys to airplanes, and even neural networks that communicate with each other, but we’re only starting to scratch the surface of what artificial intelligence (AI) can do. It’s also about time we started paying more attention to ‘responsible AI.’

A future with artificial intelligence will be a mixed one. AI could eliminate many of today’s human jobs, but it could also let us solve complex problems much faster than the human brain alone ever could.

As technology gets closer to achieving full intelligence, we will start seeing AI systems that are fully self-aware and can think, reason, and act as a human would. This raises concerns, because some people fear that as artificially intelligent computers become more advanced, they might eventually become more intelligent than humans. For them, the concern is not if, but when, it might happen.

In the future, we will have ‘teams’ of artificially intelligent robots that can do all the menial tasks we traditionally assign to humans, such as vacuuming, picking up items, cooking, and shopping. All such jobs will eventually be done by artificially intelligent robotic machines. Even so, work will still be organized around traditional methods such as task assignment, task resolution, and reward-and-punishment systems.

Today, we are beginning to see the first AI machine prototypes at work, and many exciting projects are underway. One such project is a robotic dog that can recognize objects, humans, and other dogs. Others include self-driving cars, self-piloted planes, artificially intelligent robots, and AI-driven weather-forecasting systems.

The future of artificially intelligent robotic androids is exciting but also scary, given the autonomous capabilities of these machines. Such androids may combine two different types of artificial intelligence: a human-like non-conscious neural network (NCL) and a fully conscious human-like mind with its own memory, thoughts, and feelings. Some robots may have both systems; others may have only one. Many experts believe a full AI would come closer to human intelligence than anything current technology can produce.

Such concerns and apprehensions around AI have triggered the need for AI developments and implementations to be humanly, ethically, and legally more responsible.

Microsoft recognizes six principles that it believes should guide AI development and use: fairness; reliability and safety; privacy and security; inclusiveness; transparency; and accountability.

PwC has created a ‘Responsible AI Toolkit,’ a suite of customizable frameworks, tools, and processes designed to help organizations “harness the power of AI in an ethical and responsible manner, from strategy through execution.”

The field of ‘Responsible AI’ is generating more and more interest from various stakeholders, including governments, developers, human-resource experts, and user organizations, among others.

MORE FROM BETTER WORLD

Here’s why the ‘seth’s’ wealth will never become a ‘chavanni’

Let me make it clear at the outset that the purpose of this analysis is not to delve into the research merit of the Hindenburg report on the Adani Group of companies. That is because there is hardly anything in the report that was not already known to the media or to investors. The report has merely succeeded in amalgamating all the available ammunition in one place, in an explosive manner.

The purpose of this analysis is also not to defend the Adani Group in any manner whatsoever. This analyst does not hold any recent positions in any Adani Group stock, for that matter.

The focus here is on the long-term impact that the report may have on the Adani Group as well as on the Indian economy in the aftermath of its publication.

Hindenburg’s intent

It is important to look at the core intent of Hindenburg in ‘revealing’ the open secrets of Adani Group to the world.

Let it be very clear that if the Adani Group is not an epitome of business ethics, Hindenburg is no charitable organization either. It is, well, just another shortseller, whose singular aim is to maximize profits to the hilt.

The timing of the report’s publication simply confirms that. Why, otherwise, did Hindenburg not publish it at least a couple of weeks or months earlier, when, by its own account, it had been researching the Adani Group for two long years?

Very clearly, Hindenburg was waiting for the Adani Group’s shares to reach a high and for the Adani Enterprises FPO to be underway. Hindenburg knows better than most that investor sentiment is easiest to manipulate at such times. The sole purpose of the report, then, was to maximize Hindenburg’s profits. In doing so, Hindenburg was also withholding ‘precious’ information from other investors, which was unethical, to say the least.

Hindenburg’s past trophies

Let’s pick three such trophies, namely Nikola, Clover Health, and Jinhua An Kao, for the purpose of this analysis. Nikola Corporation is a US manufacturer of electric vehicles (EVs) and energy solutions that had not delivered a single EV to the market when Hindenburg published its report accusing Nikola of “fraud” in September 2020. It would roll out its first two EV trucks only in December 2021. The report caused Nikola’s Nasdaq-listed shares to drop to USD12 from an earlier high of USD65.

Clover Health, which was founded in New Jersey, USA, in 2012, began selling Medicare Advantage plans in 2013. It was said to be one of the fastest-growing Medicare Advantage insurers in the USA. Interestingly, Clover’s board members included former first daughter Chelsea Clinton, while its investors included Sequoia and Alphabet’s GV. When Hindenburg published its exposé on Clover in February 2021, the company’s shares were trading on Nasdaq at USD12.23 apiece. In the subsequent three months, the share price dropped to USD6.59. Quite significantly, however, in September 2021 the price briefly touched a record high of more than USD28, and it was not until November 2021 that it fell below USD7 again. At the time of writing this article, the share was trading around USD1.27.

Jinhua An Kao (now Kandi Technologies) too is an EV maker, with China as its primary revenue market. Its shares dropped on Nasdaq from USD14.44 to USD7.88 in about a month. Kandi’s shares now trade slightly above USD2.

Impact on Adani Group

First and foremost, it is important to realize that the Adani Group is not just a Nikola, a Clover Health, or a Jinhua, each of which was mostly focused on one or two businesses. Moreover, these were yet to become mainstream businesses generating large revenue streams.

It would be too naïve to assume that the Hindenburg report could impact the Adani Group on a scale similar to Nikola, Clover, or Jinhua. That is simply because, unlike these companies, the Adani Group’s businesses are far from vulnerable. Most of them have revenue streams that are unlikely to be affected by share prices. Take the ports or airports, for example: will ships stop docking at the Mundra port, or passengers stop boarding flights at the airports, because Adani Group shares have fallen?

In fact, even the Adani Enterprises FPO ‘managed’ to get fully subscribed amidst all the Hindenburg hoopla in the media and the simultaneous bloodbath on the bourses.

It is just a matter of time before the Adani Group shares, and any other shares that got dragged along, find their previous levels. In fact, it won’t be surprising if that happens within months rather than years. Signs of a recovery are already visible: some of the group’s shares edged up, even if briefly, on the day of writing this article.

That a shortseller’s report can turn an Adani share into a penny stock is, at best, wishful thinking. The ground realities, namely the group’s assets and cashflows, are far too big to be dwarfed.

Milind Khamkar, Group CIO, Super-MAX

Viewpoint

Milind Khamkar

Senior IT Leader

“Storage versus applications continues to be a chicken-and-egg story.”

Storage versus applications has always been a chicken-and-egg story. What comes first, storage or applications, is an interesting conundrum. Moreover, it is very difficult to predict how much storage is enough. These two things keep the IT situation fluid and IT teams on their toes. A perfect solution remains elusive, and predictability around storage is rarely achieved.

CIOs start with some resources; then the demand scales and sometimes goes out of scope. So intelligence around storage requirements always remains a burning issue.

The landscape is constantly transforming. Original equipment manufacturers (OEMs) need to develop strategies to provide some predictability in terms of the applications’ storage requirements.

Also, it is enormously important to separate professional and personal data, especially in the context of the regulation and compliance requirements coming into force.

To my mind, cloud is an integral part of digital transformation, and its adoption has accelerated during the pandemic. On a positive note, the pandemic has brought in some good changes, accelerated cloud adoption being one of them. Businesses embarking on the digital transformation journey cannot ignore the importance of cloud. Hence, cloud is essential in today’s era, especially if you are adopting new digital technologies. The kind of security questions we were grappling with before are no longer there, and even the regulatory and compliance issues are now taken care of to a large extent.

However, with new digital applications, latency is likely to be a key issue that public cloud may not be able to address adequately. That is where the significance of on-prem models becomes vital again.

The new digital technologies are what we call wave-2 digital technologies. They are being developed without precedent, so the predictability of their behavior is extremely low. They are also extremely resource-hungry: they put high demands on processing and storage resources, and the volume of data they generate is phenomenal. Traditional storage technologies, which were not developed for this era, have been tasked with matching the data needs of these technologies.

Going forward, storage elasticity will be extremely important in meeting these needs. On-premise data centers will therefore need to exhibit cloud-like behavior. In fact, new-generation data centers are already providing storage on demand. That is going to become the new norm.

“Intelligence around storage requirements remains a burning issue. OEMs need to develop strategies to provide some predictability in terms of the applications’ storage requirements.”

Greesh Jairath, Senior IT Leader

Viewpoint

Greesh Jairath

Senior IT Leader

“AI has started playing a key role in ensuring SLAs and business availability.”

Storage is the underlying foundation of IT. Everything, including applications and structured as well as unstructured data, resides on storage media. However, storage solutions have moved well beyond the hardware layer. Today, the virtualization layer has become the heart and center of all data centers, be it a private data center or a public cloud. Moreover, in the last three years or so, artificial intelligence (AI) has become a critical part of the storage stack and has started playing a significant role in ensuring SLAs and business availability.

Whenever an IT issue comes up, there is either a storage problem, a network problem, or an application problem. AI simplifies the task of pinpointing the problem. And if you are able to solve those issues immediately, it helps.
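
To make that concrete, below is a deliberately simplistic, rule-based triage sketch in Python. It is illustrative only: real AIOps tools learn such patterns from telemetry rather than hard-coding thresholds, and every metric name and threshold here is hypothetical.

```python
# Illustrative only: a toy version of the storage/network/application
# triage that AIOps tooling automates. All metric names and thresholds
# are hypothetical.

def triage(metrics: dict) -> str:
    """Return the most likely fault domain for an incident."""
    if metrics.get("disk_latency_ms", 0) > 50 or metrics.get("iops_drop_pct", 0) > 40:
        return "storage"
    if metrics.get("packet_loss_pct", 0) > 1 or metrics.get("rtt_ms", 0) > 200:
        return "network"
    if metrics.get("error_rate_pct", 0) > 5 or metrics.get("gc_pause_ms", 0) > 500:
        return "application"
    return "unknown"

print(triage({"disk_latency_ms": 80}))   # -> storage
print(triage({"packet_loss_pct": 3.2}))  # -> network
```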

That’s point number one. Point number two is scalability. Today, data is growing from terabytes to petabytes and exabytes, and the kind of scalability available within the controller set is enormous. It enables people running on-prem data centers to scale almost on demand, which shows how far intelligent storage in the data center has come. Third are agility and security, which need to be factored into the storage component.

The industry is witnessing a massive amount of transformation, and that is impacting storage as well. Storage transformation is already underway, though there are challenges on the ground.

Earlier, data used to be about read and write, but now it’s mostly about write and read. Plus, we have big data, where there is a lot of unstructured data.

Whenever we plan for storage, its replacement, or scalability, we always look at it from a hybrid perspective. While some of the data will be available on prem, some of it will be available in the cloud. And if there are multiple clouds, we have a provision to move data from one cloud to another, as sketched below. The entire scope and design of storage has been taken to a different level altogether, wherein you provide best-in-class security to fulfill the needs of compliance, security, and agility.
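
As a minimal sketch of what such a provision might look like in practice, the snippet below copies one object from AWS S3 to Google Cloud Storage. Bucket and object names are placeholders; it assumes the boto3 and google-cloud-storage packages and valid credentials for both clouds.

```python
# Hypothetical cross-cloud copy: stage an S3 object locally, then
# upload it to a GCS bucket. All names below are placeholders.

import boto3
from google.cloud import storage

s3 = boto3.client("s3")
s3.download_file("my-aws-bucket", "reports/q1.parquet", "/tmp/q1.parquet")

gcs = storage.Client()
bucket = gcs.bucket("my-gcp-bucket")
bucket.blob("reports/q1.parquet").upload_from_filename("/tmp/q1.parquet")
```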

Today, data centers can largely be managed through automation to ensure that they keep running when errors occur due to known issues. Alerts can go to the system admin or the backup admin for the respective measures. So I think the intelligent data center is developing and progressing well. It is not fully developed yet, but things are moving in the right direction.

Typically, when you look at the front cache and the indexing on the storage, these are driven by algorithms that understand how to address structured versus unstructured data. Also, with AI, provisions are available, either through an open stack or through existing vendors, to ensure that the two are handled differently.

Compliance is another key issue to factor in. Particularly where GDPR is involved, data retention can be a key challenge, and it is important to differentiate between personally identifiable information (PII) and normal data. On the data side, we have been ensuring that all storage is encrypted. A key question that CIOs must answer is: in case of an attack or a security threat, what data has been moved out? This is of great importance because most organizations do not even understand what information has been lost during an attack.
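
As a minimal sketch of encryption at rest under stated assumptions, the snippet below encrypts a record before it reaches the storage layer, using the open-source cryptography package. Key management is deliberately simplified; in practice the key would live in a KMS or an HSM, not alongside the data.

```python
# Minimal encrypt-before-store sketch using the 'cryptography' package
# (pip install cryptography). Key handling is simplified for
# illustration; production keys belong in a KMS/HSM.

from cryptography.fernet import Fernet

key = Fernet.generate_key()              # in production: fetch from a KMS
fernet = Fernet(key)

record = b'{"customer_id": 42, "email": "user@example.com"}'  # PII-bearing record
ciphertext = fernet.encrypt(record)      # this is what lands on disk

# Authorized read path
assert fernet.decrypt(ciphertext) == record
```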

These are very grave concerns for organizations. We try to protect data right from the endpoint to the perimeter, but when an event happens, often one does not even realize that it has occurred.

Going forward, among other things, blockchain-based mechanisms are likely to evolve such that data can be protected in a far better way.

“Whenever there is an IT issue, there is either a storage problem, a network problem, or an application problem. AI simplifies the task of pinpointing the problem.”

Charu Bhargava, Vice President – IT, Sheela Foam

Viewpoint

Charu Bhargava

Vice President – IT, Sheela Foam

“One must maintain an equilibrium between convenience and compliance.”

Storage is becoming everyone’s necessity, and the size of storage is increasing phenomenally. In the current scenario, where virtualization plays a very important role, storage solutions should provide an expandable, or rather ever-increasing, input–output capacity, because when anything and everything has to be stored and retrieved, you don’t know where the volumes are going. So storage has started playing a very important role in day-to-day operations, and it is ever-growing. It therefore has to be agile and scalable right from the design stage.

Earlier, organizations used to struggle with files. Today, everyone is working with electronic data, as digitalization has become the buzzword. Organizations want to digitalize and store everything that is raw; the goal is zero paper but lots of electronic data, and that requires ample space and storage. Structured as well as unstructured data are growing exponentially, and before you process them and extract useful data, you first need to store them. Storage needs a modular approach, because you need to decide what comes first and how to store the data so that you optimize resources to the best extent possible. That is where the trend is moving.

You also need to maintain an equilibrium between convenience and compliance. It is never this way or that way; you have to take both into account because compliance has to go with convenience. Second, one also needs to consider the data type and how long it is to be stored. You need to identify data that is not useful, or an absolute waste of space, and consider how to get rid of it in a way that also meets your security and compliance obligations. As a data in-charge or data custodian, you have to be very mindful of these things.

In fact, this is a struggle everyone faces today, because the volume keeps increasing exponentially. And it is not just structured data, but also unstructured data coming in from everywhere, be it text, images, or videos. Everything is getting into your data center. We have 7,000 showrooms and we use visual merchandising, so a phenomenal volume of images flows in each day. With AI, ML, and IoT, we work on these data sets. They become so huge that at some point you actually need to segregate them and throw things out of your data center, because after a period of time they are of no use.

As an organization, we follow a hybrid approach. We have our own data center, where all our core applications reside; to hedge the risk, we have our DR on the cloud. For all non-core applications, we use the cloud. Security risk is still there on the cloud, because the public cloud, being open, is vulnerable; a private cloud in an enterprise space, or dedicated to an enterprise, is more secure. As a philosophy, we use core applications developed and designed by our own IT team, over which we have far more control from a safety, security, and compliance perspective. We work in this kind of hybrid environment, with the cloud actually being used for R&D-oriented applications, where you need expandability.

“We have our own data center where all our core applications are residing. To hedge the risk, we have our DR on cloud.”

Archie Jackson, Head – IT and Security, Incedo Inc.

Viewpoint

Archie Jackson

Head – IT and Security, Incedo Inc.

“Modern storage solutions will require massive reimagining.”

At this point in time, enterprises are racing towards an anywhere and everywhere work environment. The pandemic has made it imperative for organizations to transform themselves to meet the core needs of their employees who are scattered across geographies and sites. As a result, organizations are moving away from the erstwhile centralization mindset and going for decentralized architectures.

At the same time, there is a rapid evolution of cloud in the works. Several new technologies, such as analytics and business intelligence, are responsible for the evolution of the cloud in terms of scalability and agility. This evolution has also become a key catalyst for storage transformation. 

Today, we operate in a multi-cloud hybrid environment. It is rare to find an organization working fully on-premise or fully dependent on a single cloud, thanks to the multitude of applications we work with and the kinds of architectures we use. Organizations are using different clouds and are essentially running a hybrid environment, often supported by multiple technology partners.

Identifying the most optimal storage solution involves designing something that is highly scalable, agile, and available, as well as cost-effective and unrestricted, and that can act as a disaster recovery (DR) option to ensure business continuity. It should also integrate new technologies such as artificial intelligence.

Considering all these factors together is extremely important, and it leads us more towards software-defined storage.

Today, application development happens in a DevOps environment, which is increasingly distributed as well. Individuals may be working in small agile pods, with some storage, some activities, some Git repositories, and so on. When designing a solution, it is important to join all these dots and create a complete architecture, and consequently a solution, at the very foundation. Storage should enable such a foundation.

To sum up, today we are operating in a dynamically changing environment. So storage solutions should be in an agile format and also move away from a centralized architecture towards a decentralized one.

“Storage solutions should be highly scalable, agile, available, and cost-effective, and also meet DR needs, while integrating new technologies such as artificial intelligence.”

Distributed cloud is the new enterprise IT frontier

A titanic struggle for control of the cloud has begun in earnest with the emergence of various distributed cloud architectures. The shift is being driven by enterprises’ need to move away from traditional infrastructure management to ‘utility cloud’ models, which can be far more sustainable as long-term strategies.

Amazon Web Services, IBM, Google, and Microsoft are the giants whose bets on such virtualization technologies have won them large shares of the cloud market. Several other companies are also active in this arena, and a closer examination of the market reveals a number of smaller players too.

Multiple drivers are fueling growth

The star attractions of distributed clouds include (1) low latency due to proximity to user organizations (e.g., on-premises or edge delivery); (2) better adherence to compliance and data-residency requirements; and (3) support for the rapidly growing number of IoT devices, utility drones, and the like.

With distributed cloud services, the service providers are moving closer to the users. These cloud services are offered not just as public-cloud-hosted solutions but also at the edge or in the on-premise data center. This approach of having a SaaS model with an on-premise application has its own advantages, such as ease of provisioning new services, ease of management, and cost reductions in the form of greater operational efficiency brought about by streamlined infrastructure management.

Cloud service providers have a deep understanding of both the needs of enterprises and their unique business requirements. They use their expertise to develop solutions that meet these objectives. They are also well known for providing easy accessibility to their services from the internet. This enables fast and convenient access for end-users.

Enterprises may think that by switching over to a distributed cloud computing service they will lose control of their data. However, the cloud service providers enable excellent security and monitoring solutions. They also ensure that users are given the highest level of access to their data. By migrating on-premises software to a cloud service provider, enterprises do not stand to lose the expertise that their employees have built up during their time in the organization.

Google Anthos: A first-mover advantage

Google formally introduced Anthos as an open platform that lets enterprises run an app anywhere—simply, flexibly, and securely. In a blog post dated 9 April 2019, Google noted that, embracing open standards, Anthos let enterprises run applications, unmodified, on existing on-prem hardware investments or in the public cloud, and was based on the Cloud Services Platform announced earlier.

The announcement said that Anthos’ hybrid functionality was made generally available both on Google Cloud Platform (GCP) with Google Kubernetes Engine (GKE), and in the enterprise data center with GKE On-Prem.

Consistency, another post said, was the greatest common denominator, with Anthos making multi-cloud easy owing to its foundation of Kubernetes—specifically the Kubernetes-style API. “Using the latest upstream version as a starting point, Anthos can see, orchestrate and manage any workload that talks to the Kubernetes API—the lingua franca of modern application development, and an interface that supports more and more traditional workloads,” the blog post added.
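
As a minimal illustration of that lingua-franca idea, the sketch below uses the official Kubernetes Python client to list deployments through the same API call on two clusters, one in the cloud and one on-prem. The context names are hypothetical, and it assumes a kubeconfig that already contains both contexts.

```python
# One API, many clusters: enumerate workloads via the Kubernetes API
# regardless of where each cluster runs. Context names are hypothetical;
# requires the 'kubernetes' package (pip install kubernetes).

from kubernetes import client, config

for context in ["gke-cloud-cluster", "on-prem-cluster"]:
    api = client.AppsV1Api(
        api_client=config.new_client_from_config(context=context)
    )
    for dep in api.list_deployment_for_all_namespaces().items:
        print(context, dep.metadata.namespace, dep.metadata.name)
```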

AWS Outposts: Defending its cloud turf

Amazon Web Services (AWS) has been among the first movers. On 3 December 2019, the cloud services major announced the general availability of AWS Outposts: fully managed and configurable compute and storage racks, built with AWS-designed hardware, that allow customers to run compute and storage on-premises while seamlessly connecting to AWS’s broad array of cloud services. A pre-announcement for Outposts had come on 28 November 2018 at re:Invent 2018.

“When we started thinking about offering a truly consistent hybrid experience, what we heard is that customers really wanted it to be the same—the same APIs, the same control plane, the same tools, the same hardware, and the same functionality. It turns out this is hard to do, and that’s the reason why existing options for on-premises solutions haven’t gotten much traction today,” said Matt Garman, Vice President, Compute Services, at AWS. “With AWS Outposts, customers can enjoy a truly consistent cloud environment using the native AWS services or VMware Cloud on AWS to operate a single enterprise IT environment across their on-premises locations and the cloud.”
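
A minimal sketch of that consistency, under stated assumptions: with boto3, the standard EC2 RunInstances call lands an instance on an Outpost simply by targeting a subnet that lives on the Outpost rack. All resource IDs below are placeholders.

```python
# Same API on-premises: list Outposts, then launch an instance into an
# Outpost-backed subnet with the ordinary EC2 call. IDs are placeholders;
# requires boto3 and AWS credentials.

import boto3

outposts = boto3.client("outposts")
for op in outposts.list_outposts()["Outposts"]:
    print(op["OutpostId"], op["SiteId"])

ec2 = boto3.client("ec2")
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",      # placeholder AMI
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",  # a subnet created on the Outpost
)
```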

IBM Cloud Satellite: Late but not left out

IBM has been a bit late to the distributed cloud party. It was only on 1 March 2021 that IBM announced that hybrid cloud services were now generally available in any environment—on any cloud, on premises or at the edge—via IBM Cloud Satellite. The partnership with Lumen Technologies, coupled with IBM’s long-standing deep presence in on-premise enterprise systems, could turn out to be a key differentiator. An IBM press release noted that Lumen Technologies and IBM have integrated IBM Cloud Satellite with the Lumen edge platform to enable clients to harness hybrid cloud services in near real-time and build innovative solutions at the edge.

“IBM is working with clients to leverage advanced technologies like edge computing and AI, enabling them to digitally transform with hybrid cloud while keeping data security at the forefront,” said Howard Boville, Head of IBM Hybrid Cloud Platform. “With IBM Cloud Satellite, clients can securely gain the benefits of cloud services anywhere, from the core of the data center to the farthest reaches of the network.”

“With the Lumen platform’s broad reach, we are giving our enterprise customers access to IBM Cloud Satellite to help them drive innovation more rapidly at the edge,” said Paul Savill, SVP Enterprise Product Management and Services at Lumen. “Our enterprise customers can now extend IBM Cloud services across Lumen’s robust global network, enabling them to deploy data-heavy edge applications that demand high security and ultra-low latency. By bringing secure and open hybrid cloud capabilities to the edge, our customers can propel their businesses forward and take advantage of the emerging applications of the 4th Industrial Revolution.”

Microsoft Azure Arc: General availability awaited

Julia White, Corporate Vice President, Microsoft Azure, announced Azure Arc in a blog post dated 4 November 2019, describing it as a set of technologies that unlocks new hybrid scenarios for customers by bringing Azure services and management to any infrastructure. “Azure Arc is available in preview starting today,” she said.

However, the general availability of Azure Arc was not announced anytime soon. Six months after the ‘preview’ announcement, Jeremy Winter, Partner Director, Azure Management, published a blog post on 20 May 2020 noting that the company was delivering ‘Azure Arc enabled Kubernetes’ in preview to its customers. “With this, anyone can use Azure Arc to connect and configure any Kubernetes cluster across customer datacenters, edge locations, and multi-cloud,” he said.

“In addition, we are also announcing our first set of Azure Arc integration partners, including Red Hat OpenShift, Canonical Kubernetes, and Rancher Labs to ensure Azure Arc works great for all the key platforms our customers are using today,” the post added.

The announcement followed the Azure Stack launch two years earlier, which enabled a consistent cloud model deployable on-premises. Meanwhile, Azure was extended to provide DevOps for any environment and any cloud, Microsoft enabled cloud-powered security threat protection for any infrastructure, and it unlocked the ability to run Microsoft Azure Cognitive Services AI models anywhere. Azure Arc was a significant leap forward in enabling customers to move beyond hybrid cloud and truly deliver innovation anywhere with Azure, the post added.

Looking ahead

A distributed cloud presents an incredible opportunity for businesses that are looking to improve their bottom line while also increasing their agility and versatility.

A distributed cloud is essentially a distributed version of public cloud computing, offering the capability to manage anything from a single computer to thousands of computers. The cloud promises the benefits of a global network without the worry of hardware, software, management, and monitoring issues. The distributed cloud goes a step further and also brings assurance on fronts such as latency, compliance, and on-premise application modernization.
