Coding Evolved: How AI Tools Boost Efficiency and Quality

AI tools have become a game-changer for the technology industry, enhancing developer productivity and software quality. Leveraging advanced machine learning models and natural language processing, these tools offer a wide range of capabilities, from code completion to generating entire blocks of code, significantly reducing the cognitive load on developers. AI-powered tools not only accelerate the coding process but also ensure higher code quality and consistency, aligning seamlessly with modern development practices. Organisations are reaping the benefits of these tools, which have transformed the software development lifecycle. 

Ecosystm research indicates that nearly half of Asia Pacific organisations already leverage AI tools for code generation, with an additional 32% actively evaluating similar GenAI tools.

Impact on Developer Productivity 

AI tools are becoming an indispensable part of software development owing to their: 

  • Speed and Efficiency. AI-powered tools provide real-time code suggestions, which dramatically reduces the time developers spend writing boilerplate code and debugging. For example, Tabnine can suggest complete blocks of code based on a comment or a partial code snippet, which accelerates the development process. 
  • Quality and Accuracy. By analysing vast datasets of code, AI tools can offer not only syntactically correct but also contextually appropriate code suggestions. This capability reduces bugs and improves the overall quality of the software. 
  • Learning and Collaboration. AI tools also serve as learning aids for developers by exposing them to new or better coding practices and patterns. Novice developers, in particular, can benefit from real-time feedback and examples, accelerating their professional growth. These tools can also help maintain consistency in coding standards across teams, fostering better collaboration. 
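
To make the comment-to-code idea above concrete, here is a hedged sketch of the kind of boilerplate a completion tool might generate from a short docstring. The function name and behaviour are illustrative assumptions, not output captured from Tabnine or any specific tool.

```python
# Illustrative only: the sort of routine an AI assistant might complete
# from the docstring alone, sparing the developer the boilerplate.

def parse_csv_line(line: str, delimiter: str = ",") -> list[str]:
    """Split a CSV line into trimmed fields, dropping empty trailing fields."""
    fields = [field.strip() for field in line.split(delimiter)]
    while fields and fields[-1] == "":
        fields.pop()
    return fields

print(parse_csv_line("alice, 42 , nz,,"))  # ['alice', '42', 'nz']
```

The value is not that the code is hard to write, but that the developer never has to context-switch away from the problem they are actually solving.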

Advantages of Using AI Tools in Development 

  • Reduced Time to Market. Faster coding and debugging directly contribute to shorter development cycles, enabling organisations to launch products faster. This reduction in time to market is crucial in today’s competitive business environment where speed often translates to a significant market advantage. 
  • Cost Efficiency. While there is an upfront cost in integrating these AI tools, the overall return on investment (ROI) is enhanced through the reduced need for extensive manual code reviews, decreased dependency on large development teams, and lower maintenance costs due to improved code quality. 
  • Scalability and Adaptability. AI tools learn and adapt over time, becoming more efficient and aligned with specific team or project needs. This adaptability ensures that the tools remain effective as the complexity of projects increases or as new technologies emerge. 
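
The ROI argument above can be made tangible with a back-of-envelope calculation. The figures below are assumptions chosen purely for illustration, not vendor pricing or benchmark data.

```python
# Illustrative only: a rough ROI comparison for adopting an AI coding tool.
# Every number here is an assumption for the sake of the sketch.

def simple_roi(annual_benefit: float, annual_cost: float) -> float:
    """Return ROI as a percentage of annual cost."""
    return (annual_benefit - annual_cost) / annual_cost * 100

# Assumed inputs: 10 developers, 4 hours saved per week each, USD 60/hour,
# 46 working weeks, licences at an assumed USD 350 per seat per year.
benefit = 10 * 4 * 60 * 46   # value of developer time saved per year
cost = 10 * 350              # annual licence cost
print(f"ROI: {simple_roi(benefit, cost):.0f}%")
```

Even with far more conservative assumptions, the arithmetic shows why licence cost is rarely the deciding factor; the sensitivity is almost entirely in the hours genuinely saved.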

Deployment Models 

The choice between SaaS and on-premises deployment models involves a trade-off between control, cost, and flexibility. Organisations need to consider their specific requirements, including the level of control desired over the infrastructure, sensitivity of the data, compliance needs, and available IT resources. A thorough assessment will guide the decision, ensuring that the deployment model chosen aligns with the organisation’s operational objectives and strategic goals. 

SaaS vs. On-Premises: A Guide to Choosing the Right Deployment Model

Technology teams must consider challenges such as the reliability of generated code, the potential for generating biased or insecure code, and the dependency on external APIs or services. Proper oversight, regular evaluations, and a balanced integration of AI tools with human oversight are recommended to mitigate these risks. 

A Roadmap for AI Integration 

The strategic integration of AI tools in software development offers a significant opportunity for companies to achieve a competitive edge. By starting with pilot projects, organisations can assess the impact and utility of AI within specific teams. Encouraging continuous training in AI advancements empowers developers to leverage these tools effectively.  Regular audits ensure that AI-generated code adheres to security standards and company policies, while feedback mechanisms facilitate the refinement of tool usage and address any emerging issues. 

Technology teams have the opportunity to not only boost operational efficiency but also cultivate a culture of innovation and continuous improvement in their software development practices. As AI technology matures, even more sophisticated tools are expected to emerge, further propelling developer capabilities and software development to new heights. 

Databases Demystified. Cloud-Based Databases

In my previous Ecosystm Insights, I covered how to choose the right database for the success of any application or project. Organisations often select cloud-based databases for their scalability, flexibility, and cost-effectiveness.

Here’s a look at some prominent cloud-based databases and guidance on the right cloud-based database for your organisational needs.


Click here to download ‘Databases Demystified. Cloud-Based Databases’ as a PDF.

Amazon RDS (Relational Database Service)

Strengths:

  • Managed Service. Automates database setup, maintenance, and scaling, allowing you to focus on application development. 
  • Scalability. Easily scales a database’s compute and storage resources with minimal downtime. 
  • Variety of DB Engines. Supports multiple database engines, including MySQL, PostgreSQL, MariaDB, Oracle, and SQL Server. 

Challenges:

  • Cost. Can be expensive for larger databases or high-throughput applications. 
  • Complex Pricing. The pricing model can be complex to understand, with separate costs for storage, I/O, and data transfer. 

Google Cloud SQL

Strengths:

  • Fully Managed. Takes care of database management tasks like replication, patch management, and backups. 
  • Integration. Seamlessly integrates with other GCP services, enhancing data analytics and machine learning capabilities. 
  • Security. Offers robust security features, including data encryption at rest and in transit. 

Challenges:

  • Limited Customisation. Compared to managing your own database, there are limitations on configurations and fine-tuning. 
  • Egress Costs. Data transfer costs (especially egress) can add up if you have high data movement needs. 

Azure SQL Database

Strengths:

  • Highly Scalable. Offers a scalable service that can dynamically adapt to your application’s needs. 
  • Advanced Features. Includes advanced security features, AI-based performance optimisation, and automated updates. 
  • Integration. Deep integration with other Azure services and Microsoft products. 

Challenges:

  • Learning Curve. The wide array of options and settings might be overwhelming for new users. 
  • Cost for High Performance. Higher-tier performance levels can become costly. 

MongoDB Atlas

Strengths:

  • Flexibility. Offers a flexible document database that is ideal for unstructured data. 
  • Global Clusters. Supports global clusters to improve access speeds for distributed applications. 
  • Fully Managed. Provides a fully managed service, including automated backups, patches, and security. 

Challenges:

  • Cost at Scale. While it offers a free tier, costs can grow significantly with larger deployments and higher performance requirements. 
  • Indexing Limitations. Efficient querying requires proper indexing, which can become complex as your dataset grows. 
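
The indexing caveat applies to any document store, MongoDB included: without an index a query must scan every document, while an index turns the lookup into direct access. The sketch below illustrates the principle with a plain Python dict standing in for an index; the collection and field names are made up for the example.

```python
# A minimal sketch of why indexing matters: full scan vs. indexed lookup.
# Data and field names are invented for illustration.

documents = [{"_id": i, "email": f"user{i}@example.com"} for i in range(100_000)]

# Unindexed query: scans the whole collection, O(n).
def find_by_email_scan(email):
    return next((d for d in documents if d["email"] == email), None)

# An "index" on the email field: built once, then O(1) lookups.
email_index = {d["email"]: d for d in documents}

def find_by_email_indexed(email):
    return email_index.get(email)

target = "user99999@example.com"
assert find_by_email_scan(target) == find_by_email_indexed(target)
```

The trade-off is the same one Atlas surfaces in its billing: indexes cost storage and slow down writes, so they pay off only for fields you actually query on.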

Amazon DynamoDB

Strengths:

  • Serverless. Offers a serverless NoSQL database that scales automatically with your application’s demands. 
  • Performance. Delivers single-digit millisecond performance at any scale. 
  • Durability and Availability. Provides built-in security, backup, restore, and in-memory caching for internet-scale applications. 

Challenges:

  • Pricing Model. Pricing can be complex and expensive, especially for read/write throughput and storage. 
  • Learning Curve. Different from traditional SQL databases, requiring time to learn best practices for data modelling and querying. 
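
That learning curve is mostly about key-based data modelling: items are addressed by a partition key and sort key rather than by ad-hoc SQL predicates. The sketch below mimics the pattern with a plain dict; the key scheme and item layout are assumptions for illustration, not a DynamoDB API.

```python
# A hedged sketch of DynamoDB-style single-table modelling using a dict:
# every access path must be designed into the (partition key, sort key) pair.

table = {}

def put_item(pk: str, sk: str, item: dict) -> None:
    table[(pk, sk)] = item

def query(pk: str, sk_prefix: str = "") -> list[dict]:
    """Fetch items in one partition whose sort key starts with a prefix."""
    return [item for (p, s), item in sorted(table.items())
            if p == pk and s.startswith(sk_prefix)]

# One partition per customer; the sort key encodes entity type and date.
put_item("CUST#42", "ORDER#2024-01-10", {"total": 120})
put_item("CUST#42", "ORDER#2024-02-03", {"total": 80})
put_item("CUST#42", "PROFILE", {"name": "Mei"})

print(query("CUST#42", "ORDER#"))  # just the orders, in date order
```

Designing the keys around queries, instead of normalising tables and joining later, is the mental shift that takes SQL-trained teams the longest.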

Selection Considerations 

  • Data Model Compatibility. Ensure the database supports the data model you plan to use (relational, document, key-value, etc.). 
  • Scalability and Performance Needs. Assess whether the database can meet your application’s scalability and performance requirements. 
  • Cost. Understand the pricing model and estimate monthly costs based on your expected usage. 
  • Security and Compliance. Check for security features and compliance with regulations relevant to your industry. 
  • Integration with Existing Tools. Consider how well the database integrates with your current application ecosystem and development tools. 
  • Vendor Lock-in. Be aware of the potential for vendor lock-in and consider the ease of migrating data to other services if needed. 

Choosing the right cloud-based database involves balancing these factors to find the best fit for your application’s requirements and your organisation’s budget and skills. 
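
One way to put the cost consideration into practice is a simple side-by-side estimate. The sketch below compares rough monthly costs for a fixed workload; the rates are invented placeholders, and real estimates should come from each provider's own pricing calculator.

```python
# Illustrative only: comparing rough monthly cost estimates across
# candidate managed databases. All rates are invented placeholders.

def monthly_cost(storage_gb, storage_rate, compute_hours, compute_rate,
                 egress_gb, egress_rate):
    return (storage_gb * storage_rate
            + compute_hours * compute_rate
            + egress_gb * egress_rate)

workload = {"storage_gb": 500, "compute_hours": 730, "egress_gb": 200}

candidates = {
    "option_a": {"storage_rate": 0.115, "compute_rate": 0.072, "egress_rate": 0.09},
    "option_b": {"storage_rate": 0.10, "compute_rate": 0.068, "egress_rate": 0.12},
}

for name, rates in candidates.items():
    estimate = monthly_cost(workload["storage_gb"], rates["storage_rate"],
                            workload["compute_hours"], rates["compute_rate"],
                            workload["egress_gb"], rates["egress_rate"])
    print(name, round(estimate, 2))
```

Even a crude model like this surfaces the point made above about egress: with heavy data movement, the cheapest storage rate is not necessarily the cheapest database.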

Evolving Landscape: AI Startups Take Centre Stage in 2024

The tech industry tends to move in waves, driven by the significant, disruptive changes in technology, such as cloud and smartphones. Sometimes, it is driven by external events that bring tech buyers into sync – such as Y2K and the more recent pandemic. Some tech providers, such as SAP and Microsoft, are big enough to create their own industry waves. The two primary factors shaping the current tech landscape are AI and the consequential layoffs triggered by AI advancements. 

While many AI startups have been around for over five years, this will be the year they emerge as legitimate solutions providers to organisations. Amidst the acceleration of AI-driven layoffs, individuals from these startups will go on to start new companies, creating the next round of startups that will add value to businesses in the future. 

Tech Sourcing Strategies Need to Change 

The increase in startups implies a change in the way businesses manage and source their tech solutions. Many organisations are trying to reduce tech debt, typically by consolidating the number of providers and tech platforms. However, leveraging the numerous AI capabilities on offer may mean looking beyond current providers towards some of the many AI startups emerging in the region and globally. 

The ripple effect of these decisions is significant. If organisations opt to increase the complexity of their technology architecture and the number of vendors under management, the business case must be watertight. There will be less of the trial-and-error approach to AI seen in 2023, with a heightened emphasis on clear and measurable value. 

AI Startups Worth Monitoring 

Here is a selection of AI startups that are already starting to make waves across Asia Pacific and the globe. 

  • ADVANCE.AI provides digital transformation, fraud prevention, and process automation solutions for enterprise clients. The company offers services in security and compliance, digital identity verification, and biometric solutions. They partner with over 1,000 enterprise clients across Southeast Asia and India across sectors, such as Banking, Fintech, Retail, and eCommerce. 
  • Megvii is a technology company based in China that specialises in AI, particularly deep learning. The company offers full-stack solutions integrating algorithms, software, hardware, and AI-empowered IoT devices. Products include facial recognition software, image recognition, and deep learning technology for applications such as consumer IoT, city IoT, and supply chain IoT. 
  • I’mCloud is based in South Korea and specialises in AI, big data, and cloud storage solutions. The company has become a significant player in the AI and big data industry in South Korea. They offer high-quality AI-powered chatbots, including for call centres and interactive educational services. 
  • H2O.ai provides an AI platform, the H2O AI Cloud, to help businesses, government entities, non-profits, and academic institutions create, deploy, monitor, and share data models or AI applications for various use cases. The platform offers automated machine learning capabilities powered by H2O-3, H2O Hydrogen Torch, and Driverless AI, and is designed to help organisations work more efficiently on their AI projects. 
  • Frame AI provides an AI-powered customer intelligence platform. The software analyses human interactions and uses AI to understand the driving factors of business outcomes within customer service. It aims to assist executives in making real-time decisions about the customer experience by combining data about customer interactions across various platforms, such as helpdesks, contact centres, and CRM transcripts. 
  • Uizard offers a rapid, AI-powered UI design tool for designing wireframes, mockups, and prototypes in minutes. The company’s mission is to democratise design and empower non-designers to build digital, interactive products. Uizard’s AI features allow users to generate UI designs from text prompts, convert hand-drawn sketches into wireframes, and transform screenshots into editable designs. 
  • Moveworks provides an AI platform that is designed to automate employee support. The platform helps employees to automate tasks, find information, query data, receive notifications, and create content across multiple business applications. 
  • Tome develops a storytelling tool designed to reduce the time required for creating slides. The company’s online platform creates or emphasises points with narration or adds interactive embeds with live data or content from anywhere on the web, 3D renderings, and prototypes. 
  • Jasper is an AI writing tool designed to assist in generating marketing copy, such as blog posts, product descriptions, company bios, ad copy, and social media captions. It offers features such as text and image AI generation, integration with Grammarly and other Chrome extensions, revision history, auto-save, document sharing, multi-user login, and a plagiarism checker. 
  • Eightfold AI provides an AI-powered Talent Intelligence Platform to help organisations recruit, retain, and grow a diverse global workforce. The platform uses AI to match the right people to the right projects, based on their skills, potential, and learning ability, enabling organisations to make informed talent decisions. They also offer solutions for diversity, equity, and inclusion (DEI), skills intelligence, and governance, among others. 
  • Arthur provides a centralised platform for model monitoring. The company’s platform is model and platform agnostic, and monitors machine learning models to ensure they deliver accurate, transparent, and fair results. They also offer services for explainability and bias mitigation. 
  • DNSFilter is a cloud-based, AI-driven content filtering and threat protection service, that can be deployed and configured within minutes, requiring no software installation. 
  • Spot AI specialises in building a modern AI Camera System to create safer workplaces and smarter operations for every organisation. The company’s AI Camera System combines cloud and edge computing to make video footage actionable, allowing customers to instantly surface and resolve problems. They offer intelligent video recorders, IP cameras, cloud dashboards, and advanced AI alerts to proactively deliver insights without the need to manually review video footage. 
  • Another startup to watch offers an AI-powered revenue intelligence platform that helps customers win more revenue by providing sales, RevOps, marketing, enablement, and customer success teams with valuable insights. The platform is designed to speed up complex enterprise sales cycles by engaging the right people in the right accounts, ultimately helping teams to sell more and faster with the same headcount.  

These examples highlight a few startups worth considering, but the landscape is rich with innovative options for organisations to explore. Similar to other emerging tech sectors, the AI startup market will undergo consolidation over time, and incumbent providers will continue to improve and innovate their own AI capabilities. Till then, these startups will continue to influence enterprise technology adoption and challenge established providers in the market.

Transformative Integration: HPE’s Acquisition of Juniper Networks

Hewlett Packard Enterprise (HPE) has entered into a definitive agreement to acquire Juniper Networks for USD 40 per share, an equity value of about USD 14 billion. The move aims to enhance HPE’s portfolio by focusing on higher-growth solutions and reinforcing their high-margin networking business. HPE expects to double their networking business, positioning the combined entity as a leader in networking solutions. With the growing demand for secure, unified technology driven by AI and hybrid cloud trends, HPE aims to offer comprehensive, disruptive solutions that connect, protect, and analyse data from edge to cloud.

This would also be the organisation’s largest deal since becoming an independent company in 2015. The acquisition is expected to be completed by late 2024 or early 2025.

Ecosystm analysts Darian Bird and Richard Wilkins provide their insights on the HPE acquisition and its implications for the tech market.

Converging Networking and Security

One of the big drawcards for HPE is Juniper’s Mist AI. The networking vendors have been racing to catch up – both in capabilities and in marketing. The acquisition though will give HPE a leadership position in network visibility and manageability. With GreenLake and soon Mist AI, HPE will have a solid AIOps story across the entire infrastructure.

HPE has been working steadily towards becoming a player in the converged networking-security space. They integrated Silver Peak well to make a name for themselves in SD-WAN, and last year’s acquisition of Axis Security gave them the Zero Trust Network Access (ZTNA), Secure Web Gateway (SWG), and Cloud Access Security Broker (CASB) modules in the Secure Service Edge (SSE) stack. Bringing all of this to market with Juniper’s networking prowess positions HPE as a formidable player, especially as the Secure Access Service Edge (SASE) market gains momentum.

As the market shifts towards converged SASE, there will only be more interest in the SD-WAN and SSE vendors. In just over one year, Cato Networks and Netskope have raised funds, Check Point acquired Perimeter 81, and Versa Networks has made noises about an IPO. The networking and security players are all figuring out how they can deliver a single-vendor SASE.

Although HPE’s strategic initiatives signal a robust market position, potential challenges arise from the overlap between Aruba and Juniper. However, their distinct focus on the edge and the data centre, respectively, may help alleviate these concerns. The acquisition also marks HPE’s foray into the telecom space, leveraging its earlier acquisition of Athonet and establishing a significant presence among service providers. This expansion enhances HPE’s overall market influence, posing a challenge to the long-standing dominance of Cisco.

The strategic acquisition of Juniper Networks by HPE can drive a transformative leap in AIOps and Software-Defined Networking (SDN), with the potential to establish a new benchmark in IT management.

AI in IT Operations Transformation

The integration of Mist’s AI-driven wireless solutions and HPE’s SDN is a paradigm shift in IT operations management and will help organisations transition from a reactive to a predictive and proactive model. Mist’s predictive analytics, coupled with HPE’s SDN capabilities, empower networks to dynamically adjust to user demands and environmental changes, ensuring optimal performance and user experience. Marvis, Mist’s Virtual Network Assistant (VNA), adds conversational troubleshooting capabilities, enhancing HPE’s network solutions. The integration envisions an IT ecosystem where Juniper’s AI augments HPE’s InfoSight, providing deeper insights into network behaviour, preemptive security measures, and more autonomous IT operations.

Transforming Cloud and Edge Computing

The incorporation of Juniper’s AI into HPE’s cloud and edge computing solutions promises a significant improvement in data processing and management. AI-driven load balancing and resource allocation mechanisms will significantly enhance multi-cloud environment efficiency, ensuring robust and seamless cloud services, particularly vital in IoT applications where real-time data processing is critical. This integration not only optimises cloud operations but also has the potential to align with HPE’s commitment to sustainability, showcasing how AI advancements can contribute to energy conservation.

In summary, HPE’s acquisition of Juniper Networks, and specifically the integration of the Mist AI platform, is a pivotal step towards an AI-driven, efficient, and predictive IT infrastructure. This can redefine the standards in AIOps and SDN, creating a future where IT systems are not only reactive but also intuitively adaptive to the evolving demands of the digital landscape.


Sustainability is About MUCH More than Green Credentials

As an industry, the tech sector tends to jump on keywords and terms – and sometimes reshapes their meaning and intention. “Sustainable” is one of those terms. Technology vendors are selling (allegedly!) “sustainable software/hardware/services/solutions” – in fact, the focus on “green” or “zero carbon” or “recycled” or “circular economy” is increasing exponentially at the moment. And that is good news – as I mentioned in my previous post, we need to significantly reduce greenhouse gas emissions if we want a future for our kids. But there is a significant disconnect between the way tech vendors use the word “sustainable” and the way it is used in boardrooms and senior management teams of their clients.

Defining Sustainability

For organisations, Sustainability is a broad business goal – in fact for many, it is the over-arching goal. A sustainable organisation operates in a way that balances economic, social, and environmental (ESG) considerations. Rather than focusing solely on profits, a sustainable organisation aims to meet the needs of the present without compromising the ability of future generations to meet their own needs.

This is what building a “Sustainable Organisation” typically involves:

Economic Sustainability. The organisation must be financially stable and operate in a manner that ensures long-term economic viability. It doesn’t just focus on short-term profits but invests in long-term growth and resilience.

Social Sustainability. This involves the organisation’s responsibility to its employees, stakeholders, and the wider community. A sustainable organisation will promote fair labour practices, invest in employee well-being, foster diversity and inclusion, and engage in ethical decision-making. It often involves community engagement and initiatives that support societal growth and well-being.

Environmental Sustainability. This facet includes the responsible use of natural resources and minimising negative impacts on the environment. A sustainable organisation seeks to reduce its carbon footprint, minimise waste, enhance energy efficiency, and often supports or initiates activities that promote environmental conservation.

Governance and Ethical Considerations. Sustainable organisations tend to have transparent and responsible governance. They follow ethical business practices, comply with laws and regulations, and foster a culture of integrity and accountability.

Security and Resilience. Sustainable organisations have the ability to thwart bad actors and, when breached, to recover quickly and safely, continuing to operate with the least possible impact.

Long-Term Focus. Sustainability often requires a long-term perspective. By looking beyond immediate gains and considering the long-term impact of decisions, a sustainable organisation can better align its strategies with broader societal goals.

Stakeholder Engagement. Understanding and addressing the needs and concerns of different stakeholders (including employees, customers, suppliers, communities, and shareholders) is key to sustainability. This includes open communication and collaboration with these groups to foster relationships based on trust and mutual benefit.

Adaptation and Innovation. The organisation is not static and recognises the need for continual improvement and adaptation. This might include innovation in products, services, or processes to meet evolving sustainability standards and societal expectations.

Alignment with the United Nations’ Sustainable Development Goals (UNSDGs). Many sustainable organisations align their strategies and operations with the UNSDGs which provide a global framework for addressing sustainability challenges.

Organisations Appreciate Precise Messaging

A sustainable organisation is one that integrates economic, social, and environmental considerations into all aspects of its operations. It goes beyond mere compliance with laws to actively pursue positive impacts on people and the planet, maintaining a balance that ensures long-term success and resilience.

These factors are all top of mind when business leaders, boards, and government agencies use the word “sustainable”. Helping organisations meet their emission reduction targets is a good starting point – but it is a long way from all that businesses need to do to become sustainable organisations.

Tech providers need to reconsider their use of the term “sustainable” – unless their solution or service is helping organisations meet all of the features outlined above. Most customers would favour specific language: telling them how the solution will help them reduce greenhouse gas emissions, meet compliance requirements for CO2 and/or waste reduction, and save money on electricity and/or management costs. These messages are all likely to get the sale over the line faster than broad “sustainability” messaging will.

Ecosystm Predicts: The Top Cities of the Future Trends for 2021

Public sector organisations are looking at 2021 as the year where they either hobble back to normalcy or implement their successful pilots (that were honed under tremendous pressure). Ecosystm research finds that 60% of government agencies are looking at 2021 as the year they make a recovery to normal – or the normal that finally emerges. The path to recovery will be technology-driven, and this time they will look at scalability and data-driven intelligence.

Ecosystm Advisors Alan Hesketh, Mike Zamora and Sash Mukherjee present the top 5 Ecosystm predictions for Cities of the Future in 2021. This is a summary of our Cities of the Future predictions – the full report (including the implications) is available to download for free on the Ecosystm platform here.

The Top 5 Cities of the Future Trends for 2021

#1 Cities Will Re-start Their Transformation Journey by Taking Stock

In 2021 the first thing that cities will do is introspect and reassess. There have been a lot of abrupt policy shifts, people changes, and technology deployments. Most have been ad-hoc, without the benefit of strategy planning, but many of the services that cities provide have been transformed completely. Government agencies in cities have seen rapid tech adoption, changes in their business processes, and a shift in the mindset of how their employees – many of whom were at the frontline of the crisis – provide citizen services. 

Technology investments, in most cases, took on an unexpected trajectory and agencies will find that they have digressed from their technology and transformation roadmap. This also provides an opportunity, as many solutions would have gone through an initial ‘proof-of-concept’ without the formal rigours and protocols. Many of these will be adopted for longer term applications. In 2021, cities will retain the same technology priorities as 2020, but consolidate and strengthen their spend.  

#2 Cities Will be Instrumented Using Intelligent Edge Devices

The capabilities of edge devices continue to increase dramatically, while costs decline. This reduces the barriers to entry for cities to collect and analyse significantly more data about the city and its people. Edge devices move computational power and data storage as close to the point of usage as possible to provide good performance. Devices range from battery powered IoT devices for data collection through to devices such as smart CCTV cameras with embedded pattern recognition software.

Cities will develop many use cases for intelligent edge devices. These uses will range from enhancing old assets using newer approaches to data collection – through to accelerating the speed and quality of the build of a new asset. The move to data-driven maintenance and decision-making will improve outcomes. 
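
The edge pattern described above can be sketched in a few lines: the device filters and aggregates readings locally, transmitting only summaries or anomalies rather than every raw sample. The threshold and fields below are assumptions for illustration, not a reference to any particular device.

```python
# A minimal sketch of edge processing: keep raw data on the device,
# send back only a compact summary plus any anomalous readings.

def process_at_edge(readings, alert_threshold=80.0):
    """Return (summary, alerts) so only compact data leaves the device."""
    alerts = [r for r in readings if r > alert_threshold]
    summary = {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "max": max(readings),
    }
    return summary, alerts

summary, alerts = process_at_edge([21.5, 22.0, 85.3, 23.1])
print(summary["count"], len(alerts))  # 4 1
```

This is why edge capability changes the economics for cities: bandwidth and central storage grow with the number of summaries, not with the number of sensors times their sampling rate.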

#3 COVID-19 Will Impact City Design

The world has received a powerful reminder of the vulnerability of densely populated cities, and the importance of planning and regulating public health. COVID-19 will continue to have an impact on city design in 2021.  

A critical activity in controlling the pandemic in this environment is the test-and-trace capabilities of the local public health authorities. Technology to provide automated, accurate, contact tracing to replace manual efforts is now available. Scanning of QR codes at locations visited is proving to be the most widely adopted approach. The willingness of citizens to track their travels will be a crucial aid in managing the spread of COVID-19.  
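
The QR-code check-in approach reduces, at its core, to a simple data model: each scan records a (person, venue, time) tuple, and tracing asks who overlapped with a confirmed case at the same venue within a time window. The sketch below is a hedged illustration of that model, not any health authority's actual system.

```python
# A minimal sketch of QR-code contact tracing: check-ins are recorded
# per scan, and contacts are found by venue/time overlap with a case.
from datetime import datetime, timedelta

checkins = []  # list of (person, venue, timestamp)

def scan(person: str, venue: str, when: datetime) -> None:
    checkins.append((person, venue, when))

def trace_contacts(case: str, window: timedelta) -> set[str]:
    """People who checked in at the same venue as the case within the window."""
    contacts = set()
    case_visits = [(v, t) for p, v, t in checkins if p == case]
    for venue, t in case_visits:
        for p, v, t2 in checkins:
            if p != case and v == venue and abs(t2 - t) <= window:
                contacts.add(p)
    return contacts

scan("case1", "cafe", datetime(2021, 1, 5, 9, 0))
scan("alice", "cafe", datetime(2021, 1, 5, 9, 30))
scan("bob", "gym", datetime(2021, 1, 5, 9, 30))
print(trace_contacts("case1", timedelta(hours=2)))  # {'alice'}
```

The hard problems are not in the lookup but around it: citizen willingness to scan, data retention limits, and restricting access to public health use only.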

Early detection of new disease outbreaks, or other high-risk environmental events, is essential to minimise harm. Intelligent edge devices that detect the presence of viruses will become crucial tools in a city’s defence.

Intelligent edge devices will also play a role in managing building ventilation. Well-ventilated spaces are an important factor in controlling virus transmission. But a limited number of buildings have ventilation systems that are capable of meeting those requirements. Property owners will begin to refit their facilities to provide better air movement.  

#4 Technology Vendors Will Emerge as the Conductors of Cities of the Future

The built environment comprises not only the physical buildings, but also the space around them and building operations. The real estate developer or investor owns the building; the urban fabric – the relationship of buildings to each other, the common space, and the common services provided to the city – is owned by the City. The question is who will coordinate the players: business, citizens, government, and the built environment. Ideally the government should be the conductor. However, they may not have sufficient experience or knowledge to properly fill this role, which means a capable and knowledgeable neutral consultant will, at least initially, step in. There is an opportunity for a technology vendor to fill that consulting role and shape the city fabric. This enhanced city environment will be requested by citizens, driven by cities, and guided by technology vendors. 2021 will see leading technology vendors working very closely with cities.

#5 Compliance Will be at the Core of Citizen Engagement Initiatives

Many Smart Cities have long focused on online services – over the last couple of years mobile apps have further improved citizen services. In 2020, the pandemic challenged government agencies to continue to provide services to citizens who were housebound and had become digitally savvy almost overnight. And many cities were able to scale up to fulfil citizen expectations.

However, in 2021 there will be a need to re-evaluate measures that were implemented this year – and one area that will be top priority for public sector agencies is compliance, security and privacy.

The key drivers for this renewed focus on security and privacy are:

  • The need to temper the mindset of ‘service delivery at any cost’ and remind agencies and employees that the use of government data must comply with security and privacy standards.
  • The rise of cyberattacks that target not only essential infrastructure, but also individual citizens and small and medium enterprises (SMEs).
  • The rise of app adoption by city agencies – many developed by third parties. It will become essential to evaluate their compliance with security and privacy requirements.

A Leaner IBM – What Lies Ahead


IBM announced its intention to spin off its infrastructure services business as a separate public company, allowing Big Blue to focus on hybrid cloud and AI. The newly formed entity, temporarily named NewCo, will offer project and outsourcing services that currently fall under its GTS business unit. NewCo will have a staff of around 90,000 employees and is expected to earn revenue of about $19B. While GTS has experienced declining revenue for some time now, IBM believes that the split will unlock growth and put it on a path to recovery.

Once the Red Hat acquisition closed last year and the tag team of Jim Whitehurst and Arvind Krishna were announced, it became clear that IBM was gearing up to become a leaner, more agile leader in the hybrid cloud space. One of two possible courses seemed apparent – either wither away for years until IBM was small enough to become nimble, or take bold action. IBM has opted for the latter and is likely to be rewarded for it. The new IBM will have revenue of around $59B, well short of its peak at over $100B, but sacrificing turnover for margin and growth gives it a more positive long-term outlook.

Stripping IBM back to become smaller, faster growing, and more profitable will help solve many of its greatest challenges. Significant investment into growth segments will become more palatable without the financial burden of the declining infrastructure services unit. The much-needed cultural change and drive to think like a start-up will become more practical in the new IBM.

NewCo to Build New Cloud Partnerships

IBM’s infrastructure services unit has had some great success in larger, complex, hybrid cloud deals recently – but at the lower end of the market there have been many headwinds. Public cloud providers have eroded what was once a lucrative compute and storage services market. At the same time, application service providers like Accenture, TCS, and HCL have been pivoting towards infrastructure. Untethering infrastructure services makes a turnaround story more likely, giving NewCo the greater flexibility and speed that clients have been crying out for.

The greatest benefit to NewCo will be the ability to freely partner with other cloud providers, like AWS, Microsoft, and Google. Although IBM has made noises about being willing to embrace its competitors, this was not necessarily implemented on the ground nor was it reciprocated.

It is no secret that GTS and GBS have had a rocky relationship since day one. The split will reassure clients that each of them is agnostic and relieve any internal pressure to partner unless it is best for the client. While elements of this decision look like the unfolding of a long-term strategy that began under Ginni Rometty, it does, however, leave open the question of why GTS and GBS were more closely integrated over the last few years. This also means IBM is moving in the opposite direction to its competitors, who are shifting towards offerings that cover the full stack of services from infrastructure up to applications.

What Lies Ahead for IBM

One detail that is not immediately certain is the fate of IBM security services, which could be integrated with security software at IBM, spun out with the rest of infrastructure services, or even split into consulting and delivery. An important differentiator for IBM has been its ability to build in security at the beginning of transformation projects, making final placement a difficult decision.

It might be tempting to predict that IBM’s next move would be to couple its Systems unit and Support Services to be spun off or sold, although Mr. Krishna has ruled that out. Over the long term, both are financially underperforming units, but there is an advantage to building the core infrastructure that critical workloads run on.

Each new IBM CEO has had a make-or-break moment, and Mr. Krishna has decided that his will come early. For the company to thrive for another 100 years it needed to place a big bet, and it could not have come soon enough.


Telefonica & AWS Partnership – The End of the Beginning for Private Cloud


Earlier this month, Telefónica Germany announced it will use Amazon Web Services (AWS) to virtualise its 5G core for a proof of concept (POC) in an industrial use case, with plans to move to commercial deployment in 2021. Part of the POC process is to ensure compliance with all applicable data protection guidelines and certification according to relevant industry standards.

“With the virtualisation of our 5G core network, we are laying the foundation for the digital transformation of the German economy. This collaboration with AWS is an important part of our strategy for building industrial 5G networks”, said Markus Haas, CEO of Telefónica Germany.

Sentiment about cloud – especially public cloud – has been on a roller-coaster ride since it emerged back in the “noughties”: from initial reluctance, to reluctant acceptance, to customer-driven enthusiasm, to scaling back and migrating data and apps back to on-premises data centres or private clouds, to the more recent acceptance that most enterprise resources may work best in a hybrid or public cloud environment.

Still, many hold the view that core resources for the most part belong on-premises – especially if they are essential for the running of the business or involve sensitive data.

It is in this light that the Telefónica Germany announcement is interesting. On the face of it, it may appear that this is a possible major validation of public cloud as a platform for core systems and sensitive data. Although the core network components will remain on a different platform delivered by Ericsson, there is clearly an element of that.

Perception on Public Cloud

Many organisations remain sceptical with regards to public cloud. Ecosystm data shows that almost 40% have private cloud as their primary cloud deployment model (Figure 1); roughly a third have gone for a hybrid model and only around one quarter have chosen a public cloud model.

Primary Cloud Deployment Model

Most cloud deployment strategies ultimately come down to an evaluation of cost vs. risk and this evaluation is clearly demonstrated in Ecosystm data. Close to 80% of those choosing an on-premises private cloud model mention security and compliance as a main reason whereas cost considerations are the main reason for those opting for a public cloud model (Figure 2). What our data also shows is that public cloud providers are not necessarily winning the argument of cost savings among users.

Reasons for Choosing Deployment Model

For many organisations today, security and compliance concerns are still a valid point against public cloud as a primary deployment model. However, as we see more and more initiatives like Telefónica Germany, this argument diminishes – and it will become harder for IT organisations to convince senior management that this is still the way to go.

The Edge Complements the Cloud

The other noteworthy take-away from the Telefónica Germany initiative is how cloud-enabled edge computing is being embraced by the network design to ensure lower latencies for those who need it. The company states, “If companies use 5G network functions based on the cloud-based 5G core network of Telefónica Germany / O2 in the future, they will no longer need a physical core network infrastructure at their logistics and production sites, for example, but only a 5G radio network (RAN) with corresponding antennas.”

As I’m sure you are an avid reader of Ecosystm Predicts every year, this should not come as a surprise – we wrote about something similar in the Top 5 Cloud Trends for 2020. Although some are touting Edge computing as the ultimate replacement for Cloud, we believed then – and still do – that it will be a complementary rather than a competing technology. Cloud-based setups can benefit from pushing compute-heavy workloads to the Edge in much the same way as IoT, and the cloud provides a great platform for managing Edge computing endpoints.

But to go back to the private cloud bit – while private cloud is not going away in the foreseeable future, we may be starting to see its demise in the more distant future.

To paraphrase a famous Brit: Now this is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning for private cloud.

Identifying emerging cloud computing trends can help you drive digital business decision making, vendor and technology platform selection, and investment strategies. Gain access to more insights from the Ecosystm Cloud Study.


Zoom selects Oracle as Cloud Infrastructure Provider

The COVID-19 crisis has forced countries to implement work-from-home policies and lockdowns. Since the crisis hit, uptake of cloud communication and collaboration solutions has seen a dramatic increase. Video conferencing provider Zoom has emerged as a key player in the market, with its user base growing rapidly from 10 million daily meeting participants in December 2019 to 200 million in March 2020 – a twenty-fold increase in the number of users.

Security Concerns around Zoom

The rapid increase in user base and the surge in traffic has required Zoom to re-evaluate its offerings and capacity. The platform was primarily built for enterprises and now is seeing unprecedented usage in conducting team meetings, webinars, virtual conferences, e-learning, and social events.

The area where it has been impacted most is security. In his report, Cybersecurity Considerations in the COVID-19 Era, Ecosystm Principal Advisor Andrew Milroy says, “The extraordinary growth of Zoom has made it a target for attackers. It has had to work remarkably hard to plug the security gaps, identified by numerous breaches. Many security vulnerabilities have been discovered with Zoom such as, a vulnerability to UNC path injection in the client chat feature, which allows hackers to steal Windows credentials, keeping decryption keys in the cloud which can potentially be accessed by hackers and the ability for trolls to ‘Zoombomb’ open and unprotected meetings.”

“Zoom largely responded to these disclosures quickly and transparently, and it has already patched many of the weaknesses highlighted by the security community. But it continues to receive rigorous stress testing by hackers, exposing more vulnerabilities.”

However, Milroy does not think that this issue is unique to Zoom. “Collaboration platforms tend to tread a fine line between performance and security. Too much security can cause performance and usability to be impacted negatively. Too little security, as we have seen, allows hackers to find vulnerabilities. If data privacy is critical for a meeting, then perhaps collaboration platforms should not be used, or organisations should not share critical information on them.”

Zoom to increase Capacity and Scalability

Zoom is aware that it must increase the capacity and scalability of its offerings if it is to successfully leverage its current market presence beyond the COVID-19 crisis. Last week Zoom announced that it had selected Oracle as its cloud infrastructure provider. One of the reasons cited for the choice is Oracle’s “industry-leading security”. It has been reported that Zoom is transferring more than 7 PB of data through Oracle Cloud Infrastructure servers daily.

In addition to growing its own data centres, Zoom has been using AWS and Microsoft Azure as its hosting providers. Milroy says, “It makes sense for Zoom to use another supplier rather than putting ‘all its eggs in one or two baskets’. Zoom has not shared the commercial details, but it is likely that Oracle has offered more predictable pricing. Also, the security offered by the Oracle Cloud Infrastructure deal is likely to have impacted the choice and it is likely that Oracle has also priced its security features very competitively.”

“It must also be borne in mind that Google, Microsoft and Amazon are all competing directly with Zoom. They all offer video collaboration platforms and like Zoom, are seeing huge growth in demand. Zoom may not wish to contribute to the growth of its competitors any more than it needs to.”

Milroy sees another benefit to using Oracle. “Oracle is known to have a presence in the government sector – especially in the US. Working with Oracle might make it easier for Zoom to win large government contracts, to consolidate its market presence.”

Gain access to more insights from the Ecosystm Cloud Study
