Zoom selects Oracle as Cloud Infrastructure Provider

The COVID-19 crisis has forced countries to implement work-from-home policies and lockdowns. Since the crisis hit, uptake of cloud communication and collaboration solutions has increased dramatically. Video conferencing provider Zoom has emerged as a key player in the market, with its user base growing from 10 million daily active participants in December 2019 to 200 million in March 2020 – a twentyfold increase in the number of users.
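
To put the scale of that jump in perspective, here is a quick back-of-envelope check using only the figures quoted above (a minimal sketch in Python):

```python
# Growth in Zoom's daily active participants, figures as quoted above.
participants_dec_2019 = 10_000_000
participants_mar_2020 = 200_000_000

growth_factor = participants_mar_2020 / participants_dec_2019
percent_growth = (growth_factor - 1) * 100

print(f"{growth_factor:.0f}x increase, i.e. roughly {percent_growth:.0f}% growth")
# -> 20x increase, i.e. roughly 1900% growth
```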

Security Concerns around Zoom

The rapid increase in user base and the surge in traffic has required Zoom to re-evaluate its offerings and capacity. The platform was primarily built for enterprises and now is seeing unprecedented usage in conducting team meetings, webinars, virtual conferences, e-learning, and social events.

The area where Zoom has been impacted most is security. In his report, Cybersecurity Considerations in the COVID-19 Era, Ecosystm Principal Advisor Andrew Milroy says, “The extraordinary growth of Zoom has made it a target for attackers. It has had to work remarkably hard to plug the security gaps, identified by numerous breaches. Many security vulnerabilities have been discovered with Zoom such as a vulnerability to UNC path injection in the client chat feature, which allows hackers to steal Windows credentials, keeping decryption keys in the cloud which can potentially be accessed by hackers and the ability for trolls to ‘Zoombomb’ open and unprotected meetings.”

“Zoom largely responded to these disclosures quickly and transparently, and it has already patched many of the weaknesses highlighted by the security community. But it continues to receive rigorous stress testing by hackers, exposing more vulnerabilities.”

However, Milroy does not think that this issue is unique to Zoom. “Collaboration platforms tend to tread a fine line between performance and security. Too much security can cause performance and usability to be impacted negatively. Too little security, as we have seen, allows hackers to find vulnerabilities. If data privacy is critical for a meeting, then perhaps collaboration platforms should not be used, or organisations should not share critical information on them.”
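
To make the UNC path injection issue mentioned above more concrete, the sketch below shows one hypothetical way a chat client could neutralise UNC-style paths (e.g. \\host\share\file) before rendering them as clickable links. This is purely illustrative and is not Zoom’s actual patch:

```python
import re

# UNC paths look like \\host\share\... ; when a chat client turns them into
# clickable links, a click can make Windows send the user's NTLM credential
# hash to the attacker-controlled host named in the path.
UNC_PATTERN = re.compile(r"\\\\[\w.\-]+\\\S+")

def sanitise_chat_message(message: str) -> str:
    """Return the message with UNC-style paths neutralised so the renderer
    does not turn them into clickable links (illustrative only)."""
    return UNC_PATTERN.sub("[link removed]", message)

print(sanitise_chat_message(r"notes are at \\evil-host\share\agenda.docx"))
# -> notes are at [link removed]
```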

Zoom to Increase Capacity and Scalability

Zoom is aware that it has to increase the capacity and scalability of its offerings if it is to successfully leverage its current market presence beyond the COVID-19 crisis. Last week Zoom announced that it had selected Oracle as its cloud infrastructure provider. One of the reasons cited for the choice is Oracle’s “industry-leading security”. It has been reported that Zoom is transferring more than 7 PB of data through Oracle Cloud Infrastructure servers daily.
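
For a sense of what that volume means as sustained bandwidth, here is a rough conversion (a back-of-envelope sketch assuming the traffic is spread evenly across the day):

```python
# Rough conversion of "7 PB per day" into average sustained throughput.
petabytes_per_day = 7
bytes_per_day = petabytes_per_day * 10**15      # decimal petabytes
seconds_per_day = 24 * 60 * 60

bytes_per_second = bytes_per_day / seconds_per_day
gigabits_per_second = bytes_per_second * 8 / 10**9

print(f"~{bytes_per_second / 10**9:.0f} GB/s, or ~{gigabits_per_second:.0f} Gbit/s on average")
# -> ~81 GB/s, or ~648 Gbit/s on average
```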

In addition to growing its own data centres, Zoom has been using AWS and Microsoft Azure as hosting providers. Milroy says, “It makes sense for Zoom to use another supplier rather than putting ‘all its eggs in one or two baskets’. Zoom has not shared the commercial details, but it is likely that Oracle has offered more predictable pricing. Also, the security offered by the Oracle Cloud Infrastructure deal is likely to have impacted the choice and it is likely that Oracle has also priced its security features very competitively.”

“It must also be borne in mind that Google, Microsoft and Amazon are all competing directly with Zoom. They all offer video collaboration platforms and like Zoom, are seeing huge growth in demand. Zoom may not wish to contribute to the growth of its competitors any more than it needs to.”

Milroy sees another benefit to using Oracle. “Oracle is known to have a presence in the government sector – especially in the US. Working with Oracle might make it easier for Zoom to win large government contracts, to consolidate its market presence.”


Microsoft Set to Acquire Affirmed Networks

In The Top 5 Cloud Trends for 2020, Ecosystm Principal Analyst Claus Mortensen had predicted that in 2020, cloud and IoT would drive edge computing.

“Edge computing has been widely touted as a necessary component of a viable 5G setup, as it offers a more cost-effective and lower latency option than a traditional infrastructure. Also, with IoT being a major part of the business case behind 5G, the number of connected devices and endpoints is set to explode in the coming years, potentially overloading an infrastructure based fully on centralised data centres for processing the data,” says Mortensen.

Although some are positioning the Edge as the ultimate replacement of cloud, Mortensen believes it will be a complementary rather than a competing technology. “The more embedded major cloud providers like AWS and Microsoft can become with 5G providers, the better they can service customers, who want to access cloud resources via the mobile networks. This is especially compelling for customers who need very low latency access.”

Affirmed Networks Brings Microsoft to the 5G Infrastructure Table

Microsoft recently announced that they were in discussions to acquire Affirmed Networks, a provider of network functions virtualisation (NFV) software for telecom operators. The company’s existing enterprise customer base is impressive, with over 100 major telecom customers including big names such as AT&T, Orange and Vodafone. Affirmed Networks’ recently appointed CEO, Anand Krishnamurthy, says that their virtualised cloud-native network, Evolved Packet Core, allows for scale on demand with a range of automation capabilities, at 70% of the cost of traditional networks. The telecom industry has been steadily moving away from proprietary hardware-based infrastructure, opting for open, software-defined networking (SDN). This acquisition will potentially allow Microsoft to leverage their Azure platform for 5G infrastructure and for cloud-based edge computing applications.

Ecosystm Principal Advisor, Shamir Amanullah says, “The telecommunications industry is suffering from a decline in traditional services leading to a concerted effort in reducing costs and introducing new digital services. To do this in preparation for 5G, carriers are working towards transforming their operations and business support systems to a more virtualised and software-defined infrastructure. 5G will be dynamic, more than ever before, for a number of reasons. 5G will operate across a range of frequencies and bands, with significantly more devices and connections, highly software-defined with computing power at the Edge.”

Microsoft is by no means the only tech giant exploring this space. Google recently announced a new solution, Anthos for Telecom, and a new service called the Global Mobile Edge Cloud (GMEC), aimed at giving telecom providers compute power at the Edge. At about the same time, HPE announced a new portfolio of as-a-service offerings to help telecom companies build and deploy open 5G networks. Late last year, AWS launched AWS Wavelength, which promises to bring compute and storage services to the edge of telecom providers’ 5G networks. Microsoft’s acquisition of Affirmed Networks brings them to the 5G infrastructure table.

Microsoft Continues to Focus on 5G Offerings

The acquisition of Affirmed Networks is not the only Microsoft initiative to improve their 5G offerings. Last week also saw Microsoft announce Azure Edge Zones, aimed at reducing latency for both public and private networks. AT&T is a good example of how public carriers will use the Azure Edge Zones. As part of the ongoing partnership with Microsoft, AT&T has already launched an Edge Zone in Dallas, with another planned for Los Angeles later in the year. Microsoft also intends to offer Azure Edge Zones independent of carriers in denser areas. They also launched Azure Private Edge Zones for private enterprise networks, suitable for delivering ultra-low latency performance for IoT devices.

5G will remain a key area of focus for cloud and software giants. Amanullah sees this trend as a challenge to infrastructure providers such as Huawei, Ericsson and Nokia. “History has shown how these larger software providers can be fast, nimble, innovative, and extremely customer-centric. Current infrastructure providers should not take this challenge lightly.”

Google Cloud Continues to Add Breadth and Depth to their Portfolio

Talking about the top 5 global cloud players – Microsoft, AWS, Google, Alibaba and IBM – in the Ecosystm Predicts: The Top 5 Cloud Trends for 2020, Ecosystm Principal Analyst Claus Mortensen had said, “their ability to compete will increasingly come down to expansion of their service capabilities beyond their current offerings. Ecosystm expects these players to further enhance their focus on expanding their services, management and integration capabilities through global and in-country partnerships.” Google Cloud is doing just that. The last week has been busy for Google Cloud, with a few announcements that show that it is ramping up – adding both depth and breadth to its portfolio.

Expanding Data Centre Footprint

This year, Google Cloud is set to expand its locations to 26 countries. Earlier in the year, Google CEO Sundar Pichai had promised to invest more than US$10 billion in expanding the company’s data centre footprint in the USA, and Google has recently opened its Salt Lake City data centre. Last week Google announced four new data centre locations in Doha (Qatar), Toronto (Canada), Melbourne (Australia), and Delhi (India). With Australia, Canada and India, Google appears to be following the same approach it took in Japan – where locations in Osaka and Tokyo give customers the option of an in-country disaster recovery solution. Doha marks Google Cloud’s first foray into the Middle East. While the data centre will primarily cater to global clients, Google has noted substantial interest from customers in the Middle East and Africa.

Mortensen says, “Google’s new data centres can be seen as an organic geographical expansion of their cloud services but there are a few more factors at play. With data privacy laws getting stricter across the globe, the ability to offer localised data storage is becoming more important – and India is a very good example of a market where keeping data within the geographical borders will become a must.”

“The expansion will also help the development of Google’s own edge computing services going forward. As we noted in our Ecosystm Predicts document, we believe that Cloud and IoT will drive edge computing (which is tightly tied to 5G). Edge computing will function in a symbiotic relationship with centralised data centres where low latency is important. The geographical expansion of Google’s data centre presence will thus also help their push towards edge computing services.”

Google offers their cloud infrastructure and digital transformation (DX) solutions to customers in 150 countries. Not only are they expanding their data centre footprint, but they are also creating industry differentiation. They have targeted industry-specific solutions that deliver new digital capabilities in 6 key verticals – financial services; telecommunications media and entertainment; retail; healthcare and life sciences; manufacturing and industrial; and public sector.

Partnering with Telecom Providers

Last week also saw the unveiling of the Global Mobile Edge Cloud (GMEC), aimed at addressing the telecom industry’s need to transform and the challenges it faces. The telecom industry – long considered an enabler of DX in other industries – now stands at a crossroads. It is time for the industry to transform in order to succeed amid a challenging market, newer devices and networking capabilities, and evolving customer requirements – both consumer and enterprise. Talking about the impact of 5G on telecom providers, Ecosystm Principal Advisor Shamir Amanullah says, “5G is an enterprise play and leading tech giants, carriers and the companies in the ecosystem are collaborating and inking partnerships in order to create solutions and monetise 5G opportunities across industries.”

Google Cloud announced a partnership with AT&T, which is meant to leverage AT&T’s 5G network and Google Cloud’s edge compute technologies (AI and machine learning, analytics, Kubernetes and networking) to develop a joint portfolio of 5G edge computing solutions. This is part of Google’s larger strategy of supporting telecom providers in their efforts to monetise 5G as a business services platform. Through the GMEC, Google Cloud will partner with carriers to offer a suite of applications and services at the edge via 5G networks.

The telecom industry is a key focus as Google aims to help operators take 5G to market, by creating solutions and services that can be offered to enterprises. This includes better customer engagement through data-driven experiences, and improvement of operational efficiencies across core telecom systems. Telecom providers such as Vodafone and Wind Tre are leveraging Google to improve customer experience through data-driven insights.

Amanullah says, “Google Cloud already has thousands of edge nodes inside the carrier networks which will be enabled for use by enterprises, providing access to data analytics, AI and machine learning capabilities. Carriers can offer enterprises these data-driven solutions, to transform the customer experience they offer. Google will also create solutions which will enable carriers and enterprises to improve infrastructure and operational efficiencies through modern cloud-based Operations Support Systems (OSS) and Business Support Systems (BSS).”

Mortensen also thinks that the data centre expansion should be seen in the light of Google’s GMEC push. “Both India and the Middle East are big potential markets via the local telecom providers.”

Nvidia and Intel Race for the Future of Machine Learning

Two things happened recently that 99% of the ICT world would normally miss. After all, microprocessor and chip interconnect technology is quite the geek area that we generally don’t venture into. So why would I want to bring this to your attention?

We are excited about the innovation that analytics, machine learning (ML) and all things real-time processing will bring to our lives and the way we run our businesses. The data center, be it on an enterprise’s premises or on a cloud service provider’s infrastructure, is being pressured to provide the compute, memory, input/output (I/O) and storage needed to take advantage of what hardware engineers would call ‘accelerators’. In its most simple form, an accelerator microprocessor does the specialty work for ML and analytics algorithms while the main microprocessor tries to hold everything else together and keep all of the silicon parts in sync. If an ML accelerator is too fast with its answers, it will sit and wait for everyone else as its outcomes are squeezed down a narrow, slow pipe or interconnect – in other words, the servers in the data center are not optimized for these workloads. The connection between the accelerators and the main components becomes the slowest and weakest link – the toy sketch below illustrates the point.
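
As a purely illustrative model (hypothetical numbers, not benchmarks from either announcement), the snippet below compares how long an accelerator spends computing against how long it waits for its results to cross the interconnect; when the link is slow, the accelerator is idle most of the time:

```python
def accelerator_utilisation(compute_seconds: float,
                            result_bytes: float,
                            link_bytes_per_second: float) -> float:
    """Fraction of time the accelerator does useful work, assuming compute
    and data transfer cannot overlap (a deliberately simple model)."""
    transfer_seconds = result_bytes / link_bytes_per_second
    return compute_seconds / (compute_seconds + transfer_seconds)

# Hypothetical numbers: 2 ms of compute producing a 256 MB result,
# moved over a 1 GB/s link versus a 50 GB/s link.
slow_link = accelerator_utilisation(0.002, 256e6, 1e9)    # ~0.8% busy
fast_link = accelerator_utilisation(0.002, 256e6, 50e9)   # ~28% busy
print(f"slow link: {slow_link:.1%} busy, fast link: {fast_link:.1%} busy")
```

So now back to the news of the day.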

A new high-speed CPU-to-device interconnect standard, Compute Express Link (CXL) 1.0, was announced by Intel and a consortium of leading technology companies (Huawei and Cisco in the network infrastructure space, HPE and Dell EMC in the server hardware market, and Alibaba, Facebook, Google and Microsoft among cloud service providers). CXL joins a crowded field of other standards already in the server interconnect market, including CAPI, NVLink, Gen-Z and CCIX. CXL is being positioned to improve the performance of the links between the CPU and FPGAs and GPUs, the accelerators most commonly involved in ML-like workloads.

Of course, there were some names absent from the launch – Arm, AMD, Nvidia, IBM, Amazon and Baidu. Each of them is a member of one or more of the other standards bodies and is probably playing the waiting game.

Now let’s pause for a moment and look at the other announcement that happened at the same time. Nvidia and Mellanox announced that the two companies had reached a definitive agreement under which Nvidia will acquire Mellanox for $6.9 billion. Nvidia framed the rationale for the acquisition as follows: “The data and compute intensity of modern workloads in AI, scientific computing and data analytics is growing exponentially and has put enormous performance demands on hyperscale and enterprise datacenters. While computing demand is surging, CPU performance advances are slowing as Moore’s law has ended. This has led to the adoption of accelerated computing with Nvidia GPUs and Mellanox’s intelligent networking solutions.”

So it seems to me that, despite working on CXL for four years, Intel might have been outbid by Nvidia for Mellanox. Mellanox has been around for 20 years and was the major supplier of InfiniBand, a high-speed interconnect that is common in high-performance workloads and very well accepted by the HPC industry. (Note: Intel was also one of the founders of the InfiniBand Trade Association, IBTA, before it opted to refocus on the PCI bus.) With the growing need for fast links between the accelerators and the microprocessors, it would seem that Mellanox’s persistence has paid off and the market is now coming to it. One can’t help but think that as soon as Intel knew Nvidia was getting Mellanox, it pushed forward with the CXL announcement – a rumor that has had no response from any of the parties.

Advice for Tech Suppliers:

The two announcements are great for any vendor entering the AI and intense-computing world that relies on graphics and floating-point arithmetic functions. We know that more digital-oriented solutions are asking for analytics-based outcomes, so there will be a growing demand for broader, commoditized server platforms to support them. Tech suppliers should avoid backing or picking either CXL or InfiniBand at the moment, until we see how the CXL standard evolves and how Nvidia integrates Mellanox.

Advice for Tech Users:

These two announcements reflect innovation that is generally so far away from the end user that it can go unnoticed. However, think about how USB (Universal Serial Bus) has changed the way we connect devices to our laptops, servers and other mobile devices. The same will be true for this interconnect as more and more data is read, and outcomes generated, by the ‘accelerators’ that drive our cars, digitize our factories, run our hospitals, and search the Internet. Innovation in this space just got a shot in the arm from these two announcements.
