Generative AI is seeing enterprise interest and early adoption, enhancing efficiency, fostering innovation, and pushing the boundaries of possibility. It has the potential to reshape industries – and fast!
However, alongside its immense potential, Generative AI also raises concerns. Ethical considerations surrounding data privacy and security come to the forefront, as powerful AI systems handle vast amounts of sensitive information.
Addressing these concerns through responsible AI development and thoughtful regulation will be crucial to harnessing the full transformative power of Generative AI.
Read on to find out about the key challenges faced in implementing Generative AI and to explore emerging use cases in industries such as Financial Services, Retail, Manufacturing, and Healthcare.
Download ‘Generative AI: Industry Adoption’ as a PDF

The Manufacturing industry is at a crossroads today. It faces challenges such as geopolitical risks, supply chain disruptions, changing regulatory environments, workforce shortages, and changing consumer demands. Overcoming these requires innovation, collaboration, and proactive adaptation.
Fortunately, many of these challenges can be mitigated by technology. The future of Manufacturing will be shaped by advanced technology, automation, and AI. We are seeing early evidence of how smart factories, robotics, and 3D printing are transforming production processes for increased efficiency and customisation.
Manufacturing is all set to become more agile, efficient, and sustainable.
Read on to find out about the changing priorities and key trends in Manufacturing; the World Economic Forum’s Global Lighthouse Network initiative; and what Ecosystm advisor Kaushik Ghatak sees as the Future of Manufacturing.
Click here to download ‘The Future of Manufacturing’ as a PDF

When non-organic (man-made) fabric was introduced into fashion, there were a number of harsh warnings about using polyester and other man-made synthetic fibres, including warnings about their flammability.
In creating non-organic data sets, should we also be creating warnings about their use and flammability? Let’s look at why synthetic data is used in industries such as Financial Services and Automotive, as well as for new product development in Manufacturing.
Synthetic Data Defined
Synthetic data can be defined as data that is artificially generated rather than being produced by actual interactions. It is often created with the help of algorithms and is used for a wide range of activities, including as test data for new products and tools, for model validation, and in AI model training. Generating synthetic data is a form of data augmentation – the creation of new, representative data from existing data.
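To make the idea concrete, here is a minimal sketch of one common approach – fitting a simple statistical model to a real tabular dataset and sampling artificial records from it. The column names, the sample data, and the multivariate-normal assumption are all illustrative assumptions rather than a reference implementation; production tools typically use much richer generative models.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Stand-in for a real customer dataset (columns and values are hypothetical)
real = pd.DataFrame({
    "age": rng.normal(40, 12, 1000),
    "income": rng.normal(65000, 18000, 1000),
    "monthly_spend": rng.normal(1200, 400, 1000),
})

# "Train" a very simple generative model: the empirical mean and covariance
mean = real.mean().to_numpy()
cov = real.cov().to_numpy()

# Sample synthetic records that preserve the overall statistical structure
synthetic = pd.DataFrame(
    rng.multivariate_normal(mean, cov, size=1000),
    columns=real.columns,
)

print(synthetic.head())       # artificial records – no real customer appears here
print(synthetic.describe())   # distributions broadly match the real data
```

Because no row in the synthetic set corresponds to an actual customer, it can be shared or moved to less trusted environments with far less privacy risk – provided the fitted model does not simply memorise rare, identifiable records.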
Why is it used?
The main reasons why synthetic data is used instead of real data are cost, privacy, and testing. Let’s look at more specifics on this:
- Data privacy. When privacy requirements limit data availability or how it can be used. For example, in Financial Services where restrictions around data usage and customer privacy are particularly limiting, companies are starting to use synthetic data to help them identify and eliminate bias in how they treat customers – without contravening data privacy regulations.
- Data availability. When the data needed for testing a product does not exist or is not available to the testers. This is often the case for new releases.
- Data for training. When training data is needed for machine learning algorithms. In many instances, such as in the case of autonomous vehicles, that data is expensive to generate in real life.
- Training across third parties using cloud. When moving private data to cloud infrastructures involves security and compliance risks. Moving synthetic versions of sensitive data to the cloud can enable organisations to share data sets with third parties for training across cloud infrastructures.
- Data cost. Producing synthetic data through a generative model is significantly more cost-effective and efficient than collecting real-world data. With synthetic data, it becomes cheaper and faster to produce new data once the generative model is set up.

Why should it cause concern?
If the real dataset contains biases, data augmented from it will contain those biases too. Identifying an optimal data augmentation strategy is therefore important – a simple first check is to compare how key segments are represented in the real and synthetic sets, as sketched below.
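As a minimal, hypothetical illustration (the function, column, and segment names are assumptions, not drawn from any specific tool), the check below compares the share of each customer segment in the real data with its share in the synthetic data; a large gap suggests the generation process has amplified or erased a bias.

```python
import pandas as pd

def segment_shift(real: pd.DataFrame, synthetic: pd.DataFrame, column: str) -> pd.DataFrame:
    """Compare how a categorical segment is represented in real vs synthetic data."""
    real_share = real[column].value_counts(normalize=True)
    synth_share = synthetic[column].value_counts(normalize=True)
    report = pd.DataFrame({"real": real_share, "synthetic": synth_share}).fillna(0.0)
    report["shift"] = report["synthetic"] - report["real"]
    return report.sort_values("shift")

# Example usage with an assumed 'customer_segment' column:
# print(segment_shift(real_df, synthetic_df, "customer_segment"))
```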
If the synthetic set doesn’t truly represent the original customer data set, it might contain the wrong buying signals regarding what customers are interested in or are inclined to buy.
Synthetic data also requires some form of output/quality control and internal regulation, specifically in highly regulated industries such as Financial Services.
Creating incorrect synthetic data can also get a company into hot water with external regulators. For example, if a company created a product that harmed someone or did not work as advertised, it could lead to substantial financial penalties and, possibly, closer scrutiny in the future.
Conclusion
Synthetic data allows us to continue developing new and innovative products and solutions when the data necessary to do so would not otherwise be present or available due to volume, data sensitivity, or user privacy challenges. Generating synthetic data also provides the flexibility to adjust its nature and environment as and when required, improving model performance and creating opportunities to test for outliers and extreme conditions.
As we return to the office, there is a growing reliance on devices to tell us how safe and secure the environment is for our return. And in specific application areas, such as Healthcare and Manufacturing, IoT data is critical for decision-making. In some sectors, such as Health and Wellness, IoT devices collect personally identifiable information (PII). IoT technology is so critical to our current infrastructures that, if it is compromised, the physical wellbeing of both individuals and organisations can be at risk.
Trust & Data
IoT devices are also vulnerable to breaches if not properly secured. And with a significant increase in cybersecurity events over the last year, the reliance on data from IoT is driving the need for better data integrity. Security features such as data integrity and device authentication can be accomplished through the use of digital certificates, and these features need to be designed into the device prior to manufacturing. Because if you cannot trust the IoT devices or their data, there is no point in collecting the information, running analytics on it, and executing decisions based on it.
We discuss the role of embedding digital certificates into the IoT device at manufacture to enable better security and ongoing management of the device.
Securing IoT Data from the Edge
So much of what is happening on networks in terms of real-time data collection happens at the Edge. But because of the vast array of IoT devices connecting at the Edge, there has not been a way of baking trust into the manufacture of the devices. With a push to get devices to market, many manufacturers have historically bypassed efforts on security. Devices have been added to the network at different times, from different sources.
There is a need to verify the IoT devices and secure them, making sure to have an audit trail on what you are connecting to and communicating with.
So from a product design perspective, this leads us to several questions:
- How do we ensure the integrity of data from devices if we cannot authenticate them?
- How do we ensure that the operational systems being automated are controlled as intended?
- How do we authenticate the device on the network making the data request?
Using a Public Key Infrastructure (PKI) approach maintains the assurance, integrity, and confidentiality of data streams. PKI has become an important way to secure IoT device applications, and it needs to be built into the design of the device. Device authentication is also an important component, in addition to securing data streams. With good design and PKI management that is up to the task, you should be able to proceed with confidence in the data created at the Edge.
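As a rough illustration of what embedding a device identity at manufacture can look like, the sketch below uses the Python cryptography library to generate a key pair for a device and have a hypothetical manufacturer CA sign a certificate for it. This is a simplified sketch under assumed names – it is not the DigiCert ONE or OpenBlue workflow – and in practice the CA key would live in an HSM and the device key in secure hardware.

```python
from datetime import datetime, timedelta

from cryptography import x509
from cryptography.x509.oid import NameOID
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Hypothetical manufacturer CA (key held in software here only for illustration)
ca_key = ec.generate_private_key(ec.SECP256R1())
ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Device CA")])

# 1. Each device gets its own key pair at manufacture
device_key = ec.generate_private_key(ec.SECP256R1())

# 2. The CA signs a certificate binding the device identity to its public key
device_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "device-0001")])
device_cert = (
    x509.CertificateBuilder()
    .subject_name(device_name)
    .issuer_name(ca_name)
    .public_key(device_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(datetime.utcnow())
    .not_valid_after(datetime.utcnow() + timedelta(days=365 * 10))
    .sign(ca_key, hashes.SHA256())
)

# 3. At connection time, the network verifies the certificate chain before
#    trusting any data the device sends.
print(device_cert.subject, device_cert.not_valid_after)
```

The important design point is that the certificate is created before the device ever reaches a customer, so the network can authenticate it – and reject imposters – from its very first connection.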
Johnson Controls and DigiCert have designed a new way of managing PKI certification for IoT devices through their partnership and the integration of the DigiCert ONE™ PKI management platform with the Johnson Controls OpenBlue IoT device platform. Based on an advanced, container-based design, DigiCert ONE allows organisations to implement robust PKI deployment and management in any environment, roll out new services, and manage users and devices across the organisation at any scale, no matter the stage of their lifecycle. This creates an operational synergy within the Operational Technology (OT) and IoT spaces to ensure that hardware, software, and communication remain trusted throughout the lifecycle.

Rationale on the Role of Certification in IoT Management
Digital certificates ensure the integrity of data and device communications through encryption and authentication, ensuring that transmitted data are genuine and have not been altered or tampered with. With government regulations worldwide mandating secure transit (and storage) of PII data, PKI can help ensure compliance with the regulations by securing the communication channel between the device and the gateway.
Connected IoT devices interact with each other through machine to machine (M2M) communication. Each of these billions of interactions will require authentication of device credentials for the endpoints to prove the device’s digital identity. In such scenarios, an identity management approach based on passwords or passcodes is not practical, and PKI digital certificates are by far the best option for IoT credential management today.
Creating lifecycle management for connected devices, including revocation of expired certificates, is another example where PKI can help to secure IoT devices. Having a robust management platform that enables device management, revocation and renewal of certificates is a critical component of a successful PKI. IoT devices will also need regular patches and upgrades to their firmware, with code signing being critical to ensure the integrity of the downloaded firmware – another example of the close linkage between the IoT world and the PKI world.
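To illustrate the code-signing point, here is a minimal sketch of how a device might verify a downloaded firmware image against the vendor’s signing public key before installing it. The file names, key format, and helper function are assumptions for illustration; real devices typically perform this check in a secure bootloader.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

def firmware_is_authentic(firmware_path: str, signature_path: str, vendor_key_path: str) -> bool:
    """Return True only if the firmware image matches the vendor's signature."""
    with open(vendor_key_path, "rb") as f:
        vendor_key = serialization.load_pem_public_key(f.read())  # assumed to be an EC key
    with open(firmware_path, "rb") as f:
        firmware = f.read()
    with open(signature_path, "rb") as f:
        signature = f.read()
    try:
        vendor_key.verify(signature, firmware, ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False

# Only apply the update if the signature checks out
# if firmware_is_authentic("fw.bin", "fw.sig", "vendor_pub.pem"):
#     apply_update("fw.bin")   # hypothetical installer call
```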
Summary
PKI certification benefits both people and processes. PKI enables identity assurance, while digital certificates validate the identity of the connected device. The use of PKI for IoT is a necessary trend for establishing trust in the network and for quality control of device management.
Identifying the IoT device is critical to managing its lifespan and recognising its legitimacy on the network. Building PKI capability into the device at manufacture is critical to enabling it for its lifetime. By recognising a device, information on it can be maintained in an inventory, and its lifecycle and replacement can be better managed. Once a certificate has been distributed and certified, control of the PKI system enables lifecycle management.

In this Insight, our guest author Anupam Verma talks about how the Global Capability Centres (GCCs) in India are poised to become Global Transformation Centres. “In the post-COVID world, industry boundaries are blurring, and business models are being transformed for the digital age. While traditional functions of GCCs will continue to be providing efficiencies, GCCs will be ‘Digital Transformation Centres’ for global businesses.”

India has a lot to offer to the world of technology and transformation. Attracted by the talent pool, enabling policies, digital infrastructure, and competitive cost structure, MNCs have long embraced India as a preferred destination for Global Capability Centres (GCCs). It has been reported that India has more than 1,700 GCCs with an estimated global market share of over 50%.
GCCs employ around 1 million Indian professionals and have an immense impact on the economy, contributing an estimated USD 30 billion. US MNCs have the largest presence in the market, and the dominant industries are BFSI, Engineering & Manufacturing, and Tech & Consulting.
GCC capabilities have always been evolving
The journey began with MNCs setting up captives for cost optimisation & operational excellence. GCCs started handling operations (such as back-office and business support functions), IT support (such as app development and maintenance, remote IT infrastructure, and help desk) and customer service contact centres for the parent organisation.
In the second phase, MNCs started leveraging GCCs as centres of excellence (CoEs). The focus then was on product innovation, Engineering Design, and R&D. BFSI and Professional Services firms started expanding the scope to cover research, underwriting, and consulting. Some global MNCs that have large GCCs in India are Apple, Microsoft, Google, Nissan, Ford, Qualcomm, Cisco, Wells Fargo, Bank of America, Barclays, Standard Chartered, and KPMG.
In the post-COVID world, industry boundaries are blurring, and business models are being transformed for the digital age. While traditional functions of GCCs will continue to be providing efficiencies, GCCs will be “Digital Transformation Centres” for global businesses.
The New Age GCC in the post-COVID world
On one hand, the pandemic broke through cultural barriers that had prevented remote operations and work. The world became remote everything! On the other hand, it accelerated digital adoption in organisations. Businesses are re-imagining customer experiences and fast-tracking digital transformation enabled by technology (Figure 1). High digital adoption and rising customer expectations will also be a big catalyst for change.

In the last few years, India has seen a surge in its talent pool in emerging technologies such as data analytics, experience design, AI/ML, robotic process automation, IoT, cloud, blockchain, and cybersecurity. GCCs in India will leverage this talent pool and play a pivotal role in enabling digital transformation at a global scale. GCCs will have a direct and significant impact on global business performance and top-line growth, creating long-term stakeholder value – and will not be only about cost optimisation.
GCCs in India will also play an important role in digitisation and automation of existing processes, risk management and fraud prevention using data analytics and managing new risks like cybersecurity.
More and more MNCs in traditional businesses will add GCCs in India over the next decade, and the existing 1,700-plus GCCs will grow in scale and scope, focussing on innovation. The shift of supply chains to India will also be supported by Engineering R&D Centres. GCCs passed the pandemic test with flying colours when an exceptionally large workforce transitioned to the Work from Home model. In a matter of weeks, the resilience, continuity, and efficiency of GCCs returned to pre-pandemic levels with a distributed and remote workforce.
A Final Take
Having said that, I believe the growth spurt in GCCs in India will come from new-age businesses. Consumer-facing platforms (eCommerce marketplaces, Healthtechs, Edtechs, and Fintechs) are creating digital native businesses. As of June 2021, there are more than 700 unicorns trying to solve different problems using technology and data. Currently, very few unicorns have GCCs in India (notable names being Uber, Grab, Gojek). However, this segment will be one of the biggest growth drivers.
Currently, only 10% of the GCCs in India are from Asia Pacific organisations, with prominent names including Hitachi, Rakuten, Panasonic, Samsung, LG, and Foxconn. Asian MNCs have an opportunity to move fast and stay relevant. This segment is also expected to grow disproportionately.
New age GCCs in India have the potential to be the crown jewel for global MNCs. For India, this has a huge potential for job creation and development of Smart City ecosystems. In this decade, growth of GCCs will be one of the core pillars of India’s journey to a USD 5 trillion economy.
The views and opinions mentioned in the article are personal.
Anupam Verma is part of the Senior Leadership team at ICICI Bank and his responsibilities have included leading the Bank’s strategy in South East Asia to play a significant role in capturing Investment, NRI remittance, and trade flows between SEA and India.

In 2020, much of the focus for organisations was on business continuity and on empowering their employees to work remotely. Their primary focus in managing customer experience was on re-inventing product and service delivery as regular modes were disrupted. As they emerge from the crisis, organisations will realise that it is not only their customer experience delivery models that have changed – customer expectations have also evolved in the last few months. Customers are more open to digital interactions, and in many cases the concept of brand loyalty has been diluted. This will change everything for organisations’ customer strategies. And digital technology will play a significant role as they continue to pivot to succeed in 2021 – across regions, industries, and organisations.
Ecosystm Advisors Audrey William, Niloy Mukherjee and Tim Sheedy present the top 5 Ecosystm predictions for Customer Experience in 2021. This is a summary of the predictions – the full report (including the implications) is available to download for free on the Ecosystm platform.
The Top 5 Customer Experience Trends for 2021
- Customer Experience Will Go Truly Digital
COVID-19 made the few businesses that did not have an online presence acutely aware that they need one – yesterday! We have seen at least 4 years of digital growth squeezed into six months of 2020. And this is only the beginning. While in 2020, the focus was primarily on eCommerce and digital payments, there will now be a huge demand for new platforms to be able to interact digitally with the customer, not just to be able to sell something online.
Digital customer interactions with brands and products – through social media, online influencers, interactive AI-driven apps, online marketplaces and the like will accelerate dramatically in 2021. The organisations that will be successful will be the ones that are able to interact with their customers and connect with them at multiple touchpoints across the customer journey. Companies unable to do that will struggle.
- Digital Engagement Will Expand Beyond the Traditional Customer-focused Industries
One of the biggest changes in 2020 has been the increase in digital engagement by industries that have not traditionally had a strong eye on CX. This trend is likely to accelerate and be further enhanced in 2021.
Healthcare has traditionally been focused on improving clinical outcomes – and patient experience has been a byproduct of that focus. Many remote care initiatives have the core objective of keeping patients out of the already over-crowded healthcare provider organisations. These initiatives will now have a strong CX element to them. The need to disseminate information to citizens has also heightened expectations on how people want their healthcare organisations and Public Health to interact with them. The public sector will dramatically increase digital interactions with citizens, having been forced to look at digital solutions during the pandemic.
Other industries that have not had a traditional focus on CX will not be far behind. The Primary & Resources industries are showing an interest in Digital CX almost for the first time. Most of these businesses are looking to transform how they manage their supply chains from mine/farm to the end customer. Energy and Utilities and Manufacturing industries will also begin to benefit from a customer focus – primarily looking at technology – including 3D printing – to customise their products and services for better CX and a larger share of the market.
- Brands that Establish a Trusted Relationship Can Start Having Fun Again
Building trust was at the core of most businesses’ CX strategies in 2020 as they attempted to provide certainty in a world generally devoid of it. But in the struggle to build a trusted experience and brand, most businesses lost the “fun”. In fact, for many businesses, fun was off the agenda entirely. Soft drink brands, travel providers, clothing retailers, and many other brands typically known for their fun or cheeky experiences moved the needle to “trust” and dialled it up to 11. But with a number of vaccines on the horizon, many CX professionals will look to return to pre-pandemic experiences that look to delight and sometimes even surprise customers.
However, many companies will get this wrong. Customers will not be looking for just fun or just great experiences. Trust still needs to be at the core of the experience. Customers will not return to pre-pandemic thinking – not immediately anyway. You can create a fun experience only if you have earned their trust first. And trust is earned by not only providing easy and effective experiences, but by being authentic.
- Customer Data Platforms Will See Increased Adoption
Enterprises continue to struggle to build a single view of the customer. There is immense interest in making better sense of data across every touchpoint – from mobile apps, websites, social media, and in-store interactions to calls to the contact centre – to be able to create deeper customer profiles. CRM systems have been the traditional repositories of customer data, helping build a sales pipeline and providing Marketing teams with the information they need for lead generation and marketing campaigns. However, CRM systems have an incomplete view of the customer journey. They often collect and store the same data from limited touchpoints – getting richer insights and targeted action recommendations from the same datasets is not possible in today’s world. And organisations struggled to pivot their customer strategies during COVID-19; data residing in silos was an obstacle to driving better customer experience.
We are living in an age where customer journeys and preferences are becoming increasingly complex to decipher. An API-based CDP can ingest data from any channel of interaction across multiple journeys and create unique and detailed customer profiles. A complete overhaul of how data is segmented – building a more accurate and targeted profile of the customer from multiple sources – will be the way forward to drive more proactive CX engagement.
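As a simplified sketch of that ingestion idea (the channel names, event fields, and function are assumptions, not any specific CDP’s API), the code below folds events arriving from several channels into a single profile per customer, keyed on a shared identifier.

```python
from collections import defaultdict
from typing import Dict, List

# Events as they might arrive from different channels (illustrative fields only)
events: List[Dict] = [
    {"customer_id": "c-100", "channel": "mobile_app", "action": "viewed_product", "item": "sku-1"},
    {"customer_id": "c-100", "channel": "contact_centre", "action": "raised_complaint"},
    {"customer_id": "c-200", "channel": "web", "action": "purchased", "item": "sku-7"},
]

def build_profiles(events: List[Dict]) -> Dict[str, Dict]:
    """Merge multi-channel events into one profile per customer."""
    profiles: Dict[str, Dict] = defaultdict(lambda: {"channels": set(), "actions": []})
    for event in events:
        profile = profiles[event["customer_id"]]
        profile["channels"].add(event["channel"])
        profile["actions"].append({k: v for k, v in event.items() if k != "customer_id"})
    return dict(profiles)

profiles = build_profiles(events)
print(profiles["c-100"]["channels"])   # {'mobile_app', 'contact_centre'}
```

A real CDP layers identity resolution (matching emails, device IDs, and loyalty numbers to the same person), consent handling, and real-time APIs on top of this basic merge.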
- Voice of the Customer Programs Will be Transformed
Designing surveys and Voice of Customer programs can be time-consuming and many organisations that have a routine of running these surveys use a fixed pattern for the data they collect and analyse. However, some organisations understand that just analysing results from a survey or CSAT score does not say much about what customers’ next plan of action will be. While it may give an idea of whether particular interactions were satisfactory, it gives no indication of whether they are likely to move to another brand; if they needed more assistance; if there was an opportunity to upsell or cross sell; or even what new products and services need to be introduced. Some customers will just tick the box as a way of closing off a feedback form or survey. Leading organisations realise that this may not be a good enough indication of a brand’s health.
Organisations will look beyond CSAT to other parameters and attributes. It is time to pay greater attention to the Voice of the Customer – and old methods alone will not suffice. Organisations want a 360-degree view of their customers’ opinions.

As the search for a COVID-19 vaccine intensifies, there is a global focus on the Life Sciences industry. The industry has been hit hard this year – having to deliver overtime through a disrupted supply chain, unexpected demand spikes, and a reduction in revenues from their regular streams. Life Sciences organisations are already challenged by the breadth of their focus – across R&D and clinical discovery; Manufacturing & Distribution; and Sales & Marketing. Increasingly, many pharmaceutical and medtech organisations choose to outsource some of these functions, which brings to the fore the need for a robust compliance framework. In the Ecosystm Digital Priorities in the New Normal Study, two-thirds of Life Sciences organisations mention that they have either been forced to start, accelerate, or refocus their Digital Transformation initiatives – the remaining one-third have put their Digital Transformation on hold. The industry is clearly at an inflection point.
Challenges of the Life Sciences Industry
Continued Focus on R&D. Life sciences companies operate in an extremely competitive global market where they have to work on new products against a backdrop of competition from generics and a global concern over rising healthcare expenditure. Apart from regulatory challenges, they also face immense competition from local manufacturers as they enter each new market.
Re-thinking their Distribution Strategy. Sales and distribution for many pharma and medtech organisations have been traditional – using agents, distributors, clinicians, and healthcare providers. But now they need to change their go-to-market strategies, target patients and consumers directly and package their product offerings into value-added services. This will require them to incorporate customer experience enhancers in their R&D, going beyond drug discovery and product innovation.
Tracking Global Regulations. Governments across the world are trying to manage their healthcare budgets. They are also more focused on chronic disease management. The focus has shifted to value-based medicine in general, and pharma and medtech products are increasingly being held accountable for health outcomes. Governments are increasingly implementing drug reforms around what clinicians can prescribe. Global Life Sciences organisations have to constantly monitor the regulations in the multiple countries where they operate and sell. They are also accountable for their entire supply chain, especially in ensuring high product quality and fraud prevention.
The global Ecosystm AI study reveals the top priorities for Life Sciences organisations, focused on adopting emerging technologies (Figure 1). They appear to be investing in emerging technology especially in their R&D and clinical discovery and Manufacturing functions.

Technology as an Enabler of Life Sciences Transformation
Discovery and Development
With the evolution of technology, Life Sciences organisations are able to automate many of the mundane tasks around drug discovery and apply AI and machine learning to transform their drug discovery and development process. They are increasingly leveraging their ecosystem of smaller pharma and medtech companies, research laboratories, academic institutions, and technology providers to make the process more time- and cost-efficient.
Using an AI algorithm, the researchers at the Massachusetts Institute of Technology have discovered an antibiotic compound that can kill many species of antibiotic-resistant bacteria. MIT’s algorithm screens millions of chemical compounds and chooses the antibiotics which have the potential to eliminate bacteria resistant to existing drugs. Harvard’s Wyss Institute for Biologically Inspired Engineering is manufacturing 3D printed organ-on-a-chip to give insights on cell, tissue, and organ biology to help the pharma sector with drug development, disease modelling and finally in the development of personalised medicine.
Life Sciences organisations are also engaging more with technology partners – whether emerging start-ups or established players. Pfizer and Saama are working together on AI clinical data mining. The companies are developing and deploying an AI-based analytical tool, with Pfizer providing clinical data and domain knowledge to train models on the Saama Life Science Analytics Cloud (LSAC). Saama was identified as a partner at a hackathon. Sanofi and Google have established a new virtual Innovation Lab to develop scientific and commercial solutions, using multiple Google capabilities from cloud computing to AI.
Tech providers also keep evolving their capabilities in the Life Sciences industry for more efficient drug discovery and better treatment protocols. Microsoft’s Project Hanover uses machine learning to develop a personalised drug protocol to manage acute myeloid leukaemia. Similarly, Apple’s ResearchKit – an open-source framework – is meant to help researchers and developers create iOS-based applications in the field of medical research.
Manufacturing and Logistics
The industry also faces the challenges faced by any Manufacturing organisation and needs to deploy manufacturing analytics and advanced supply chain technology for better process optimisation and agility. There is also the need for complete visibility over the supply chain and inventory for traceability, safety, and fraud prevention. Emerging technologies such as Blockchain will become increasingly relevant for real-time track-and-trace capability.
The MediLedger Network was established as an open network for the entire pharma supply chain. The project brings together a consortium of some of the world’s largest pharmaceutical companies and logistics providers to improve drug supply chain management.
Since the data on the distributed ledger is encrypted, it creates a far more secure system. This helps eliminate counterfeit products, ultimately ensuring the quality of pharma products and promoting increased patient safety. To foster security and improve the supply chain, the United States Food and Drug Administration (USFDA) successfully completed a pilot with a group including IBM, KPMG, Merck, and Walmart to support the U.S. Drug Supply Chain Security Act (DSCSA) by tracing vaccines and prescription medicines throughout the country.
Diagnostics and Personalised Healthcare
As more devices (consumer and enterprise) and applications enter the market, people will take ownership and interest in their own health outcomes. This is seeing a continued growth in online communities and comparison sites (on physicians, hospitals, and pharmaceutical products). Increasingly, insurance providers will use data from wearable devices for a more personalised approach; promoting and rewarding good health practices.
Beyond the use of wearables and health and wellness apps, we will also see an exponential increase in home-based healthcare products and services – whether for primary care and chronic disease management, or long-term and palliative care. As patients become more engaged with their care, the Life Sciences industry is beginning to serve them through a personalised approach – personalised medicines, the right diagnosis, and advanced medical devices and products.
An online tool developed by the University of Virginia Health Systems helps identify patients that have a high risk of getting a stroke and helps them reduce that risk. This tool calculates the patient’s probability of suffering a stroke by measuring the severity of their metabolic syndrome – taking into account a number of conditions that include high blood pressure, abnormal cholesterol levels and excess body fat. Life Sciences organisations are increasingly having to invest in customer-focused solutions such as these.
Wearables with special smart software to monitor health parameters, gauge drug compatibility, and monitor complications are being implemented by Life Sciences organisations. The US FDA approved a pill called Abilify MyCite, fitted with a tiny ingestible sensor that communicates with a patch worn by the patient to transmit data to a smartphone. Medtech companies continue to develop FDA-approved health devices that can monitor chronic conditions. Smart continuous glucose monitoring (CGM) devices and insulin pens send blood glucose level data to smartphone applications, allowing the wearer to easily check their information and detect trends.
Technologies such as AR/VR are also enabling Life Sciences companies with their diagnostics. Regeneron Pharmaceuticals has created an AR/VR app called “In My Eyes” to better diagnose vision impairment in patients.
What is interesting about these personalised products is that not only do they improve clinical outcomes, they also give Life Sciences companies access to rich data that can be used for further product development and improvement.
The Life Sciences industry will continue to operate in an unpredictable and competitive market. This is evident from the several mergers and acquisitions that we witness in the industry. As they continue to use cutting-edge technology in their R&D practices, Life Sciences organisations will leverage technology to transform other functions as well.

We continue to receive responses from the tech buyer community on the impact of COVID-19 on Digital Transformation initiatives, and the early business and technology measures that were implemented to combat the crisis. As the months go by, it is becoming apparent that organisations have implemented the early measures and are now looking ahead to their journey to recovery.
IT teams realised that even if they had the right technology solutions, they did not have the scale or capacity to extend these offerings to handle the sudden and enormous changes required to manage the crisis. Their cloud business applications, cybersecurity, and collaboration solutions were simply not sufficient to meet the needs of the remote workforce. As organisations become more conscious of business continuity planning (BCP) for future eventualities, they will boost their technology capabilities over the next 12 months.
Another area the study aims to explore is how optimistic the business outlook is when it comes to expecting a return to normalcy. Only 3% of organisations are expecting a New Normal that is very different from where things were at the beginning of the year. About a third of organisations are expecting a return to normalcy by the end of the year, while the majority expect to recover by the middle of 2021. Some industries are also more optimistic of a recovery than others. As an example, 35% of Healthcare organisations expect a return to normalcy by the end of the year. This is a positive indicator, given that the industry has been at the forefront of the crisis for nearly six months now.

More insights on the impact of the COVID-19 pandemic and technology areas that will see continued investments, as organisations get into the recovery phase, can be found in the Digital Priorities in the New Normal Study.
In recent times, there appears to be a shift in the motive for cyber-attacks – alongside common data theft, there is a proliferation of attacks aimed at business interruption and the physical incapacitation of business operations. We have witnessed an alarming increase in high-profile attacks on manufacturing businesses and critical infrastructure providers globally.
This appears to be a global phenomenon. Honda manufacturing plants went offline in June after a cyber-attack compromised some of the Japanese automaker’s facilities. The same pattern emerged in a separate attack at the same time targeting Edesur S.A., a company belonging to the Enel Group, which confirmed that its internal IT network was disrupted by a ransomware attack caught by antivirus software before the malware could spread. Both companies had machines with Internet-accessible remote desktop servers, which is a favourite infection method among attackers nowadays. One of Australia’s largest brewers, Lion, also faced a ransomware outbreak last month. In Israel, it was reported that a cyber-attack very nearly poisoned the water supply, with the attackers attempting to overload the water system with chlorine. And in recent days, a fire and explosion at an Iranian nuclear plant are suspected of having been caused by a cyber-attack.
These attacks highlight the need for appropriate investments in cybersecurity by companies and municipalities that own or operate critical infrastructure, properties (including places of public congregation, retailers and others) that are rapidly deploying a suite of operational technologies, and businesses in the manufacturing sector.
Operational Technology (OT) is the backbone of modern industrial operations – a network of multiple computing systems that perform operations including production line management, operations control, and industrial monitoring. OT can further include specific computing systems such as industrial control systems (ICS), which are collections of control systems used to operate and/or automate industrial processes. There are several types of ICS, the most common of which are Supervisory Control and Data Acquisition (SCADA) systems and Distributed Control Systems (DCS). With such industrial systems and smart end-user products connected by a common network, several vulnerabilities may appear.
In OT security, the focus is much less on information and more on the industrial process that the technology controls. Hence, availability and integrity are often more important than confidentiality. Any organisation employing OT should carry out continual risk-based assessments of its cybersecurity posture to prioritise and tailor recommended guidelines and solutions to fit specific security, business, and operational requirements.
Why is OT More Vulnerable?
OT systems are versatile and can be found in all kinds of industrial settings and infrastructures like smart buildings, oil and gas, energy generation/distribution, mining, wastewater treatment/distribution, manufacturing, food production, consumer devices and transport. In fact, almost every business in 2020 has an element of IoT within their operations.
A big issue with OT is that a lot of the technology in place is over 20 years old and therefore was not designed to provide the security capabilities required to face the cyber threats of 2020. Legacy technology often requires legacy hardware and software to support it – much of which is end-of-life and unsupported by the vendors (for example, consider SCADA systems still reliant on Windows NT or older Unix-based systems, which have not been supported by their vendors for many years).
OT systems have also been damaged as unintended side effects of problems that started in corporate networks and took advantage of increasing connectivity. This clearly shows that the standard PCs that now form part of a typical organisation’s IT environment – and are in turn used to manage OT systems – have become a major vector for such cyber-attacks.
When it comes to OT, safety and reliability are the primary concerns, as attackers aim to disrupt the critical services that industry and its customers rely upon. Given the increasing propensity to connect OT systems with corporate networks for ease of management, and the growing use of IoT systems, the likelihood of such systems being affected by vulnerabilities exploitable over the network is increasing exponentially.
For almost every business – not just critical infrastructure providers – most of the technologies we deploy include connectivity to the internet. Not knowing what systems, and what external access to those systems, your business is introducing through its everyday technology investments creates significant risks to broader business operations.

Manufacturing businesses and critical infrastructure providers realise that there is a need to re-evaluate their cybersecurity measures in the wake of the COVID-19 crisis, according to the findings of Ecosystm’s ongoing “Digital Priorities in the New Normal” study (Figure 1).
But these measures may not be sufficient, as indicated by the slew of cyber-attacks on these organisations.
Why are these attacks successful?
There are several reasons why OT attacks are successful:
- Unauthorised access to internet-facing systems (e.g. deploying an IoT with the default username and password)
- Introduction of a compromised device (e.g. USB stick) to the environment that infects the network (often employee action)
- Exploitation of zero-day vulnerabilities in control devices and software
- Propagated malware infections within isolated computer networks (i.e. The attacker can place a receiving device to make contact over a channel that can propagate across the isolated network)
- SQL injection via exploitation of web application vulnerabilities
- Network scanning and probing
- Lateral movement (i.e. inadequate segmentation which results in attackers being able to move between systems, groups of systems, network zones and even geographical locations.)
How can they be prevented?
The mitigation cannot rely solely on the organisation building security around the deployment, nor can it be a reactive approach of fixing vulnerabilities in production as they are identified. It begins with the OT vendors building security in; however, as with most IT systems and applications, this will evolve over time. For example, there is an initiative in Australia – driven by the IoT Alliance Australia (IOTAA) – to introduce a ‘Trust Mark’ for IoT devices that pass a certification process for security and privacy in product development. This is targeted to launch in September 2020 but could take many years to gain real traction. Thus, for the foreseeable future, the best operational outcomes must be planned and managed by the consumers of the technologies.
Here are the best practices to reduce exploitable IoT weaknesses and attacks occurring in your business:
- Maintain an accurate inventory of Operational Systems and eliminate any exposure of these systems to external networks (a minimal audit sketch follows this list)
- Establish clear roles and responsibilities for your organisation and your vendors, to ensure cybersecurity risk is being addressed and managed throughout the OT lifecycle
- Implement network segmentation and apply firewalls between critical networks and systems
- Use secure remote access methods
- Establish Role-Based Access Controls (RBAC) and implement system logging
- Use only strong passwords, change default passwords, and consider other access controls (especially for any elevated privileges) such as multi-factor authentication, privileged access management solutions, etc.
- Establish threat intelligence feeds from your OT vendors and security vendors to ensure you remain abreast of new vulnerabilities, software/firmware patches and threats targeting systems you employ
- Develop and enforce policies on mobile devices, including strict device controls for any device connecting to OT systems or network zones
- Implement an employee cybersecurity training program
- Establish and maintain rigorous testing and patching program including vulnerability assessment and penetration testing
- Implement measures for detecting compromises and develop a cybersecurity incident response plan with a specific focus on responding to a disruptive attack on your OT environment
- Maintain an up-to-date Business Continuity Plan that can be deployed rapidly in response to a significant disruption
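Tying several of these practices together – an accurate inventory, elimination of default credentials, and patch tracking – here is a minimal, hypothetical audit sketch. The Device structure, zone labels, and known-default list are assumptions for illustration only; a real programme would draw this data from asset management and vulnerability scanning tools rather than storing credentials in a script.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical factory-default credentials to flag (illustrative only)
KNOWN_DEFAULT_CREDENTIALS = {("admin", "admin"), ("root", "root"), ("admin", "1234")}

@dataclass
class Device:
    name: str
    network_zone: str        # e.g. "OT", "IT", "DMZ"
    internet_exposed: bool
    username: str
    password: str
    firmware_patched: bool

def audit(inventory: List[Device]) -> List[str]:
    """Flag devices that violate the basic practices listed above."""
    findings = []
    for d in inventory:
        if d.internet_exposed and d.network_zone == "OT":
            findings.append(f"{d.name}: OT device exposed to external networks")
        if (d.username, d.password) in KNOWN_DEFAULT_CREDENTIALS:
            findings.append(f"{d.name}: default credentials still in use")
        if not d.firmware_patched:
            findings.append(f"{d.name}: firmware missing vendor patches")
    return findings

inventory = [
    Device("plc-line-1", "OT", False, "engineer", "S3gm3nt3d!", True),
    Device("hmi-packaging", "OT", True, "admin", "admin", False),
]
for finding in audit(inventory):
    print(finding)
```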
