
Distributed cloud is the new enterprise IT frontier

Jun 16, 2021 | IT Services

Cloud providers vie for pole positions as they gear up to compete in an upcoming hyperscale computing era.

A titanic struggle for control of the cloud has begun in earnest with the emergence of various distributed cloud architectures. The shift is being driven by enterprises' need to move away from traditional infrastructure management to ‘utility cloud’ models, which can be far more sustainable as long-term strategies.

Amazon Web Services, IBM, Google, and Microsoft are the giants whose bets on the development of such virtualization technologies have won them large shares of the cloud market. Several other companies are also active in this arena, and a closer look at the market reveals a number of smaller players too.


Multiple drivers are fueling growth

The star attractions of distributed clouds include (1) low latency due to proximity to user organizations (e.g., on-premises delivery or edge delivery); (2) better adherence to compliance and data-residency requirements; and (3) support for the rapidly growing number of IoT devices, utility drones, and the like.

With distributed cloud services, service providers are moving closer to the users. These cloud services are offered not just as public-cloud-hosted solutions but also at the edge or in the on-premises data center. This approach of pairing a SaaS model with an on-premises application has its own advantages: ease of provisioning new services, ease of management, and cost reductions in the form of greater operational efficiency brought about by streamlined infrastructure management.
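The latency and data-residency drivers can be made concrete with a small sketch. The Python below is a toy site selector, not any provider's placement logic; the site names and latency figures are invented for illustration. It filters candidate delivery locations by a residency constraint first, then picks the lowest-latency site among those that remain.

```python
# Toy delivery-site selector for a distributed cloud workload.
# Sites, regions, and latency figures are illustrative assumptions.

def choose_site(sites, required_region):
    """Return the lowest-latency site that satisfies the residency rule."""
    eligible = [s for s in sites if s["region"] == required_region]
    if not eligible:
        raise ValueError(f"no site satisfies residency region {required_region!r}")
    return min(eligible, key=lambda s: s["latency_ms"])

sites = [
    {"name": "public-cloud-us", "region": "US", "latency_ms": 110},
    {"name": "edge-mumbai",     "region": "IN", "latency_ms": 12},
    {"name": "on-prem-pune",    "region": "IN", "latency_ms": 4},
]

best = choose_site(sites, required_region="IN")
print(best["name"])  # the on-premises site wins on proximity
```

The point of the sketch is the ordering of concerns: compliance narrows the candidate set before latency decides among survivors, which mirrors why on-premises and edge delivery are listed as star attractions.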

Cloud service providers have a deep understanding of both the needs of enterprises and their unique business requirements. They use their expertise to develop solutions that meet these objectives. They are also well known for providing easy accessibility to their services from the internet. This enables fast and convenient access for end-users.

Enterprises may fear that by switching over to a distributed cloud computing service they will lose control of their data. However, cloud service providers offer strong security and monitoring solutions and ensure that users retain the highest level of access to their data. By migrating on-premises software to a cloud service provider, enterprises also do not stand to lose the expertise that their employees have built up during their time in the organization.

Google Anthos: A first-mover advantage

Google formally introduced Anthos as an open platform that lets enterprises run an app anywhere—simply, flexibly, and securely. In a blog post dated 9 April 2019, Google noted that, embracing open standards, Anthos let enterprises run applications, unmodified, on existing on-prem hardware or in the public cloud, and that it was based on the Cloud Services Platform announced earlier.

The announcement said that Anthos’ hybrid functionality was made generally available both on Google Cloud Platform (GCP) with Google Kubernetes Engine (GKE), and in the enterprise data center with GKE On-Prem.

Consistency, another post said, was the greatest common denominator, with Anthos making multi-cloud easy owing to its foundation of Kubernetes—specifically the Kubernetes-style API. “Using the latest upstream version as a starting point, Anthos can see, orchestrate and manage any workload that talks to the Kubernetes API—the lingua franca of modern application development, and an interface that supports more and more traditional workloads,” the blog post added.
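The "lingua franca" claim can be illustrated with a minimal sketch. The Python below (standard library only; the app name and image are placeholder assumptions) builds a Kubernetes-style Deployment manifest as plain data and serializes it; because GKE, GKE On-Prem, and any conformant cluster speak the same Kubernetes API, the identical document could in principle be applied to each.

```python
import json

# A minimal Kubernetes-style Deployment manifest, built as plain data.
# The app name and container image are illustrative placeholders.
manifest = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "demo-app", "labels": {"app": "demo"}},
    "spec": {
        "replicas": 2,
        "selector": {"matchLabels": {"app": "demo"}},
        "template": {
            "metadata": {"labels": {"app": "demo"}},
            "spec": {
                "containers": [
                    {"name": "demo", "image": "registry.example.com/demo:1.0"}
                ]
            },
        },
    },
}

# The same serialized document can be sent to any cluster that speaks the
# Kubernetes API, whether it runs in GCP or in an on-prem data center.
payload = json.dumps(manifest, indent=2)
print(json.loads(payload)["kind"])  # Deployment
```

This portability of a single declarative manifest across environments is the consistency that the post describes as multi-cloud's greatest common denominator.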

AWS Outposts: Defending its cloud turf

Amazon Web Services (AWS) has been among the first movers. On 3 December 2019, the cloud services major announced the general availability of AWS Outposts: fully managed and configurable compute and storage racks built with AWS-designed hardware that allow customers to run compute and storage on-premises while seamlessly connecting to AWS’s broad array of cloud services. A pre-announcement for Outposts had come on 28 November 2018 at re:Invent 2018.

“When we started thinking about offering a truly consistent hybrid experience, what we heard is that customers really wanted it to be the same—the same APIs, the same control plane, the same tools, the same hardware, and the same functionality. It turns out this is hard to do, and that’s the reason why existing options for on-premises solutions haven’t gotten much traction today,” said Matt Garman, Vice President, Compute Services, at AWS. “With AWS Outposts, customers can enjoy a truly consistent cloud environment using the native AWS services or VMware Cloud on AWS to operate a single enterprise IT environment across their on-premises locations and the cloud.”

IBM Cloud Satellite: Late but not left out

IBM has been a bit late to the distributed cloud party. It was only on 1 March 2021 that IBM announced that hybrid cloud services were now generally available in any environment—on any cloud, on premises or at the edge—via IBM Cloud Satellite. The partnership with Lumen Technologies, coupled with IBM’s long-standing deep presence in on-premises enterprise systems, could turn out to be a key differentiator. An IBM press release noted that Lumen Technologies and IBM have integrated IBM Cloud Satellite with the Lumen edge platform to enable clients to harness hybrid cloud services in near real-time and build innovative solutions at the edge.

“IBM is working with clients to leverage advanced technologies like edge computing and AI, enabling them to digitally transform with hybrid cloud while keeping data security at the forefront,” said Howard Boville, Head of IBM Hybrid Cloud Platform. “With IBM Cloud Satellite, clients can securely gain the benefits of cloud services anywhere, from the core of the data center to the farthest reaches of the network.”

“With the Lumen platform’s broad reach, we are giving our enterprise customers access to IBM Cloud Satellite to help them drive innovation more rapidly at the edge,” said Paul Savill, SVP Enterprise Product Management and Services at Lumen. “Our enterprise customers can now extend IBM Cloud services across Lumen’s robust global network, enabling them to deploy data-heavy edge applications that demand high security and ultra-low latency. By bringing secure and open hybrid cloud capabilities to the edge, our customers can propel their businesses forward and take advantage of the emerging applications of the 4th Industrial Revolution.”

Microsoft Azure Arc: General availability awaited

Julia White, Corporate Vice President, Microsoft Azure, announced Azure Arc in a blog post dated 4 November 2019, describing it as a set of technologies that unlocks new hybrid scenarios for customers by bringing Azure services and management to any infrastructure. “Azure Arc is available in preview starting today,” she said.

However, the general availability of Azure Arc was not announced anytime soon. Six months after the ‘preview’ announcement, Jeremy Winter, Partner Director, Azure Management, published a blog post on 20 May 2020, noting that the company was delivering ‘Azure Arc enabled Kubernetes’ in preview to its customers. “With this, anyone can use Azure Arc to connect and configure any Kubernetes cluster across customer datacenters, edge locations, and multi-cloud,” he said.

“In addition, we are also announcing our first set of Azure Arc integration partners, including Red Hat OpenShift, Canonical Kubernetes, and Rancher Labs to ensure Azure Arc works great for all the key platforms our customers are using today,” the post added.

The announcement followed the Azure Stack launch two years earlier, which enabled a consistent cloud model deployable on-premises. Meanwhile, Azure was extended to provide DevOps for any environment and any cloud. Microsoft also enabled cloud-powered security threat protection for any infrastructure and unlocked the ability to run Microsoft Azure Cognitive Services AI models anywhere. Azure Arc, the post added, was a significant leap forward, enabling customers to move beyond hybrid cloud and truly deliver innovation anywhere with Azure.

Looking ahead

A distributed cloud presents an incredible opportunity for businesses that are looking to improve their bottom line while also increasing their agility and versatility.

A distributed cloud is essentially a distributed version of public cloud computing that offers the capability to manage anything from a single computer to thousands of computers. The cloud promises the benefits of a global network without the worry of hardware, software, management, and monitoring issues. The distributed cloud goes a step further and also brings assurance on fronts such as latency, compliance, and on-premises application modernization.

MORE FROM BETTER WORLD

Milind Khamkar, Group CIO, Super-MAX


Viewpoint

Milind Khamkar

Senior IT Leader

“Storage versus applications continues to be a chicken-and-egg story.”

Storage versus applications has always been a chicken-and-egg story. What comes first, storage or applications, is an interesting conundrum. Moreover, it is very difficult to predict how much storage is enough. These two things keep the IT situation perpetually fluid and the IT teams on their toes. A perfect solution remains ever elusive, and predictability around storage is hardly ever achieved.

CIOs start with some resources, and then the demand scales and sometimes goes out of scope. So the intelligence around storage requirements always remains a burning issue.

The landscape is constantly transforming. Original equipment manufacturers (OEMs) need to develop strategies to provide some predictability in terms of the applications’ storage requirements.

Also, it is of enormous significance to separate the professional and personal data, mainly in the context of regulation and compliance coming into force.

To my mind, cloud is an integral part of digital transformation. And the adoption of the cloud has been accelerated in this pandemic time. On a positive note, the pandemic has brought in some good changes, accelerated cloud adoption being one of them. Businesses that are embarking on the digital transformation journey cannot ignore the importance of cloud. Hence, cloud is essential in today’s era, especially if you are going for new digital technologies. The kind of security questions we were grappling with before are no longer there. Now, even the regulatory and compliance issues are taken care of to a large extent.

However, with new digital applications, latency is likely to be a key issue that public cloud may not be able to address adequately. That is where the significance of on-prem models becomes vital again.


Storage Transformation Viewpoints

The new digital technologies are what we call the wave-2 digital technologies. They are being developed without precedent, so the predictability of their behavior is extremely low. They are also extremely resource-hogging technologies: they put high demands on processing and storage resources, and the volume of data they generate is phenomenal. Traditional storage technologies, which were not developed for this era, have been tasked with matching the data needs of these technologies.

Going forward, storage elasticity will be extremely important in meeting these needs. On-premise data centers will therefore need to exhibit a cloud-like behavior. In fact, new-generation data centers are already providing storage on demand. That is going to become the new norm.
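The cloud-like, on-demand behavior described above, capacity that grows as usage grows instead of being sized up front, can be sketched as a toy allocator. The class below is purely illustrative; the thresholds and growth factor are arbitrary assumptions, not any vendor's policy.

```python
class ElasticVolume:
    """Toy volume that grows itself when utilization crosses a threshold."""

    def __init__(self, capacity_gb=100, growth_factor=2, threshold=0.8):
        self.capacity_gb = capacity_gb
        self.used_gb = 0
        self.growth_factor = growth_factor
        self.threshold = threshold

    def write(self, size_gb):
        self.used_gb += size_gb
        # Instead of failing when nearly full, provision more capacity,
        # the way an on-demand storage service would behind the scenes.
        while self.used_gb > self.capacity_gb * self.threshold:
            self.capacity_gb *= self.growth_factor

vol = ElasticVolume()
vol.write(90)           # crosses the 80% utilization mark
print(vol.capacity_gb)  # capacity has doubled automatically
```

The design choice worth noting is that growth is triggered by a utilization threshold rather than by exhaustion, which is what lets such systems absorb unpredictable demand without outages.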

“Intelligence around storage requirements remains a burning issue. OEMs need to develop strategies to provide some predictability in terms of the applications’ storage requirements.”

Greesh Jairath, Senior IT leader


Viewpoint

Greesh Jairath

Senior IT Leader

“AI has started playing a key role in ensuring SLAs and business availability.”

Storage is the underlying foundation of IT. Everything, including the applications and the structured as well as unstructured data, resides on storage media. However, storage solutions have moved well beyond the hardware layer. Today, the virtualization layer has become the heart and center of all data centers, be it a private data center or a public cloud. Moreover, in the last three years or so, artificial intelligence (AI) has become a critical part of the storage stack and has started playing a significant role in ensuring SLAs and business availability.

Whenever an IT issue comes up, there is either a storage problem, a network problem, or an application problem. AI simplifies the task of pinpointing the problem. And if you’re able to solve those issues immediately, it helps.

That’s point number one. Point number two is definitely scalability. Today, data has been growing from terabytes to petabytes and exabytes, and the kind of scalability available within the controller set is enormous. So it enables people running on-prem data centers to scale almost on demand, which shows how far intelligent storage in the data center has come. Third is the agility and the security that need to be factored into the storage component.
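The triage idea in point one, pinpointing whether an incident is a storage, network, or application problem, can be illustrated with a toy classifier. The signal names and thresholds below are invented for this sketch; real AIOps tools learn such correlations from telemetry rather than hard-coding them.

```python
# Toy incident triage: map a few telemetry signals to the likely layer.
# Signal names and thresholds are invented for illustration only.

def triage(signals):
    """Return 'storage', 'network', or 'application' for an incident."""
    if signals.get("disk_latency_ms", 0) > 50 or signals.get("iops_drop_pct", 0) > 30:
        return "storage"
    if signals.get("packet_loss_pct", 0) > 1 or signals.get("rtt_ms", 0) > 200:
        return "network"
    # Nothing points at infrastructure, so suspect the application itself.
    return "application"

print(triage({"disk_latency_ms": 120}))  # storage
print(triage({"packet_loss_pct": 5}))    # network
print(triage({"error_rate_pct": 40}))    # application
```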



The industry is witnessing a massive amount of transformation, and that is impacting storage as well. Storage transformation is already underway, though there are relative challenges on the ground.

Earlier, data used to be about read and write, but now it’s mostly about write and read. Plus, we have big data, where there is a lot of unstructured data.

Whenever we plan for storage or its replacement or scalability, we always look at it from a hybrid perspective. While some of the data will be available on prem, some of it will be available in the cloud. And if there are multiple clouds, then we have a provision available to move data from one cloud to another. The entire scope or design of storage has been taken at a different level altogether, wherein you provide the best-in-class security to fulfill the needs of compliance, security, and agility.

Today, data centers could very well be managed through automation to ensure that they keep running when errors happen due to known issues. Some alerts can go to the system admin or the backup admin for respective measures. So I think the intelligent data center is developing and progressing well. It’s not fully developed yet, but things are moving in the right direction.

So, typically, when you look at the front cache or the cache available and the indexing on the storage, they are algorithms. They understand how to address structured data versus unstructured data. Also, with AI, provisions are available, either through an open stack or through our existing vendors, to ensure that those are looked at differently.

Compliance is a key issue that one needs to factor in. Particularly when GDPR aspects are involved, data retention can be a key challenge. It is important to differentiate between personally identifiable information (PII) and normal data. In terms of data, we have been ensuring that all storage is encrypted. A key question that CIOs must answer is: in case of an attack or a security threat, what data has been moved out? This could be of great importance because most organizations don’t even understand what information has been lost during an attack.

These are very grave concerns for organizations. While we try to protect data right from the endpoint to the perimeter, in case an event happens, often one doesn’t even understand that the event has occurred.
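The PII-versus-normal-data separation mentioned above can be sketched minimally. The field-name heuristics below are illustrative assumptions; a real data-loss-prevention tool is far more sophisticated. The idea is simply to tag records so that PII fields can be routed to encrypted storage while ordinary fields are not.

```python
# Toy PII tagger: decide which fields need the encrypted store.
# The field-name heuristics are illustrative, not a real DLP ruleset.

PII_FIELDS = {"name", "email", "phone", "address", "pan", "aadhaar"}

def split_record(record):
    """Split a record into (pii, normal) parts by field name."""
    pii = {k: v for k, v in record.items() if k.lower() in PII_FIELDS}
    normal = {k: v for k, v in record.items() if k.lower() not in PII_FIELDS}
    return pii, normal

record = {"email": "user@example.com", "order_id": 42, "phone": "999"}
pii, normal = split_record(record)
print(sorted(pii))     # PII fields, destined for encrypted storage
print(sorted(normal))  # ordinary fields
```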

Going forward, among other things, blockchain-based mechanisms are likely to evolve such that data may be protected in a far better way.

“Whenever there is an IT issue, there is either a storage problem, a network problem, or an application problem. AI simplifies the task of pinpointing the problem.”

Charu Bhargava, Vice President – IT, Sheela Foam


Viewpoint

Charu Bhargava

Vice President – IT, Sheela Foam

“One must maintain an equilibrium between convenience and compliance.”

Storage is becoming everyone’s necessity and the size of storage is increasing phenomenally. In the current scenario where virtualization plays a very important role, storage solutions should be able to provide an expandable or rather an ever-increasing input–output ratio because when everything and anything has to be stored and retrieved, you don’t know where the volumes are going. So storage has started playing a very important role in day-to-day operations, and it is ever-growing. It, therefore, has to be agile and scalable, right from the design stage.

Earlier, organizations used to struggle with files. Today, everyone is working with electronic data as digitalization has become the buzzword. So, organizations want to digitalize and store everything that is raw. The goal is to have zero paper but lots of electronic data. One needs that kind of ample space and storage to keep everything. Structured as well as unstructured data are exponentially growing, and before you process that and take out useful data, first you need to store it. The storage space needs to have a modular approach because you need to decide what comes first and how to store the data such that you optimize the resources to the best extent possible. That is where the trend is moving.



You also need to maintain an equilibrium between convenience and compliance. It is never this way or that way; you have to take both things into account because compliance has to go with convenience. Second, one also needs to consider the data type and how long it is to be stored. You need to identify data that is not useful, or an absolute waste of space, and consider how to get rid of it in a way that also takes care of your security and compliance obligations. As the data in-charge or data custodian, you have to be very mindful of these things.

In fact, this is a struggle that everyone today faces because the volume keeps increasing exponentially. And it is not just structured data, but also unstructured data that is coming in from everywhere, be it text, images, or videos. Everything is getting into your data center. We have 7,000 showrooms and we use visual merchandising, so a phenomenal volume of images flows in each day. With AI, ML, and IoT, we work on these data sets. The data sets become so huge that someday you actually need to segregate them and throw things out of your data center, because after a period of time they are of no use.

As an organization we are following a hybrid approach. We have our own data center where all our core applications reside. To hedge the risk, we have our DR on cloud. For all non-core applications, we use cloud. Security risk is still there on the cloud because, being open, the cloud is vulnerable. On the other hand, a private cloud in an enterprise space, or dedicated to an enterprise, is more secure. As a philosophy, we have been using our own core applications, developed and designed by our own IT team. From a safety, security, and compliance perspective, we have far more control over them. We are working in this kind of hybrid environment, and the cloud is actually being used for R&D-oriented applications, where you need expandability.

“We have our own data center where all our core applications are residing. To hedge the risk, we have our DR on cloud.”

Archie Jackson, Head – IT and Security, Incedo Inc.


Viewpoint

Archie Jackson

Head – IT and Security, Incedo Inc.

“Modern storage solutions will require massive reimagining.”

At this point in time, enterprises are racing towards an anywhere and everywhere work environment. The pandemic has made it imperative for organizations to transform themselves to meet the core needs of their employees who are scattered across geographies and sites. As a result, organizations are moving away from the erstwhile centralization mindset and going for decentralized architectures.

At the same time, there is a rapid evolution of cloud in the works. Several new technologies, such as analytics and business intelligence, are responsible for the evolution of the cloud in terms of scalability and agility. This evolution has also become a key catalyst for storage transformation. 


Today, we operate in a multi-cloud hybrid environment. It’s rare to find an organization working either fully on-premises or fully dependent on a single cloud, thanks to the multitude of applications we work with and the kinds of architectures we use. Organizations are using different clouds and are essentially running hybrid environments, often supported by multiple technology partners.

Identifying the most optimal solution around storage involves designing something that would be highly scalable, agile, and available as well as be cost-effective, unrestricted, and act as a disaster recovery (DR) option to ensure business continuity. It should integrate new technologies such as artificial intelligence.

Considering all these factors together is extremely important. This leads us more towards software-defined storage.

Today, application development happens in a DevOps environment, which is increasingly distributed as well. Individuals may be working in small agile pods, with some storage, some activities, some Git repositories, and so on. Now, when designing a solution, it is important to join all these dots and create a complete architecture, and consequently a solution, at the very foundation. Storage should enable such a foundation.

To sum up, today we are operating in a dynamically changing environment. So storage solutions should be in an agile format and also move away from a centralized architecture towards a decentralized one.


“Storage solutions should be highly scalable, agile, available, and cost-effective, and also meet DR needs, while integrating new technologies such as artificial intelligence.”

Time to get ‘responsible’ with AI systems


Humans have built very complex robotic systems, such as convoys and airplanes, and even neural networks that communicate with each other, but we’re only starting to scratch the surface of what artificial intelligence (AI) can do. It’s also about time we started paying more attention to ‘responsible AI.’

A future with artificial intelligence would be very mixed. It could not only eliminate many of today’s human jobs but also allow us to solve complex problems much faster than a human brain ever could.

As technology gets closer to achieving full intelligence, we will start seeing AI systems that are fully self-aware and can think, reason, and act as a human would. This may raise concerns, because some people fear that as artificially intelligent computers become more advanced, they might grow intelligent enough to surpass humans. For them, the concern is not if, but when, it might happen.

In the future, we will have artificially intelligent robotic ‘teams’ that can do all the menial tasks we traditionally assign to humans, such as vacuuming, picking up items, cooking, and shopping. All jobs will eventually be done by artificially intelligent robotic machines. Even then, work will still be based on traditional methods such as task assignment, task resolution, and reward-and-punishment systems.

Today, we are beginning to see the first AI machine prototypes at work and many exciting projects are in the works. One such project is a robotic dog, which can recognize objects, humans and other dogs. Other projects include self-driving cars, self-piloted planes, artificial intelligent robots, and new weather systems.

The future of artificially intelligent robotic androids is exciting but also scary, given the autonomous capabilities of these machines. These robotic androids may be made up of two different types of artificial intelligence: a human-like non-conscious neural network (NCL) and a fully conscious human-like mind with all its own memory, thoughts, and feelings. Some NCL robots may combine both systems in one, or may have only one. Many experts believe a full AI will be closer to human intelligence than any current technology can ever achieve.

Such concerns and apprehensions around AI have triggered the need for AI developments and implementations to be humanly, ethically, and legally more responsible.

Microsoft recognizes six principles that it believes should guide AI development and use (see link). These are fairness; reliability and safety; privacy and security; inclusiveness; transparency; and accountability.

PwC has created a ‘Responsible AI Toolkit,’ a suite of customizable frameworks, tools, and processes designed to help organizations “harness the power of AI in an ethical and responsible manner, from strategy through execution.”

The field of ‘Responsible AI’ is generating more and more interest from various stakeholders, including governments, developers, human-resource experts, and user organizations, among others.

Why it’s time to regulate social media platforms now?


In the last two decades, social media platforms have grown too big and powerful but have mostly shrugged off responsibility. Moreover, the big ones are literally without competition in their respective markets. There is no close direct competitor to a Facebook, Twitter, YouTube, LinkedIn, et al.

In this sense, social media platforms have become analogous to governments that are either free of any opposition or have a very weak opposition to contend with. Isn’t that what we call nonconducive to democracy?

Indeed. Be it Facebook, WhatsApp, or Google, they keep changing privacy policies. Sometimes these changes are to meet the regulatory requirements of the markets they operate in but often these changes are also at their wills (I chose not to use whims here) and fancies. Mostly, these changes are to suit their commercial interests, period.

Arm-twisting users to accept new privacy rules

Take the most recent and glaring instance of WhatsApp. In early 2021, the Facebook-owned social messaging behemoth issued a new privacy-policy diktat to its more than 500 million users in India: take it (the new privacy policy) or leave it (the WhatsApp app). After the government disapproved of the new privacy policy, WhatsApp climbed down from its earlier stand. For now, it has postponed cutting off users who have not accepted the policy.

WhatsApp argues against the government’s new guidelines (see article) on the pretext of serving the ‘privacy interest’ of its users. At the same time, it tries to force on users a privacy policy they do not approve of, blatantly misusing its dominant position in the social messaging market segment. (It may be noted that Telegram is a distant second to WhatsApp globally as well as in India.)

See also: Ironic that WhatsApp breaches privacy but wants govt to practice it.

Sumant Parimal, Chief Analyst at 5Jewels Research and a keen IT industry observer, agrees, “When they (social media companies) want, they impose any kind of term and conditions on users while even compromising privacy of users, but when Indian government asks for something then they are citing privacy as reason for not complying.”

So, what recourse do users have against such misuse of power by these platforms? There is no social-media appellate body that could step in to safeguard the democratic interests of netizens. They are left with no choice but to approach real-world courts and governments, which sometimes do step in and intervene.

Has regulation become a need of the changed times?

There is a thin line between democracy and anarchy, just as there is a thin line between freedom of speech and indecency of speech.

Social media is a platform that espouses the tenets of democracy and freedom of speech, but these cherished values can easily be sucked into the dungeons of anarchy and indecent speech.

Worse, social media, and more so the social messaging platforms, can be misused by criminals and terrorists to further their agendas. Tech media is often replete with news of cybercrimes ranging from digital frauds and cyberstalking to ransomware attacks.

Is government-led regulation of social media platforms needed?

Let’s be fair—the average internet user faces a perennial dilemma whenever the topic crops up. Netizens tend to see government interventions as a double-edged sword, which can cut both ways. There have been numerous instances in the past when netizens have opposed steps taken by governments to regulate the internet.

There are obvious reasons for users to be distrustful of both the government and the internet companies when it comes to protecting their freedom of speech and expression, particularly on social media platforms.

While the average user will quite likely be fine with an intervention that rids social media platforms of obscenity, violence, and disharmony of all kinds, they may not like any intrusive policing and patrolling of their social walls and communities.

Alas, the internet is no longer the global village it was conceived to be!

Nevertheless, with the right regulatory mechanisms in place, it can be made a lot better than what it is today.

Verified accounts are a good way to autoregulate

Anshuman Tiwari, a well-known process transformation professional, podcaster, and YouTuber, has summed it up aptly, “So there is this chaos around the banning of some social media services in India. While we can debate the interest and logic in doing this, there is a huge opportunity to sort this mess. All social media should be ‘verified.’ Verified accounts will behave better. And the trolls will be careful. Essentially, what you can’t say in real life and get away with should also be not said online.”

A lot of people will lose a lot of ‘followers’ though, he quips.

A good thing is that amidst all the recent social-media din and commotion in the wake of the Intermediary Guidelines issued earlier by the Ministry of Electronics and IT (MeitY), there has been some positive development on this front. Most significantly, Twitter has recently said it will enable a system for users to verify their accounts. It noted on its official website, “Starting May 20, 2021, we’ll begin rolling out verification applications to everyone. If you don’t see it in your Account Settings tab right away, don’t worry! Everyone should be able to apply soon.”

It is a well-known fact that getting an account ‘verified’ on Twitter has historically been one of the most arduous tasks for a common Twitter user.

Multi-stakeholder regulation can infuse trust

When it comes to the wider impact of social media, there are multiple stakeholders at play. These include the general users, the government, the opposition, public figures, businesses, academia, judiciary, and the social media platforms themselves, among others.

So, a panel that comprises representations from several of these stakeholder groups should ideally be allowed to monitor, judge, and moderate the social media platforms. Such a measure would help alleviate the apprehensions that the new rules and regulations may be misused by a government in power.

It would also ensure that social media not only wields power but also shoulders the responsibility required of an internet intermediary in today’s context. With up to half of India’s eligible population (those above 13/14 years of age) likely to be on one social media platform or another, there indeed is a need to ensure that these platforms are not used by elements detrimental to society and the nation.

Indeed, when too much power, direct or indirect, gets concentrated in any institution or platform, it is important to put the right set of checks and balances in place.

By issuing the intermediary guidelines, the government has done well to put the necessary checks in place. What it needs to do now is balance it all by constituting a multi-stakeholder mechanism (panel) to monitor any potential breach and recommend corrective measures or punitive actions to the concerned government authorities.

This way, the panel itself works like an intermediary between the government and the social media companies as well as between the users and the government or the social media companies.
