At a recently held Ecosystm roundtable, in partnership with Qlik and 121Connects, Ecosystm Principal Advisor Manoj Chugh moderated a conversation where Indian tech and data leaders discussed building trust in data strategies. They explored ways to automate data pipelines and improve governance to drive better decisions and business outcomes. Here are the key takeaways from the session.
Data isn’t just a byproduct anymore; it’s the lifeblood of modern businesses, fuelling informed decisions and strategic growth. But with vast amounts of data, the challenge isn’t just managing it; it’s building trust. AI, once a beacon of hope, is now at risk without a reliable data foundation. Ecosystm research reveals that a staggering 66% of Indian tech leaders doubt their organisation’s data quality, and the problem of data silos is exacerbating this trust crisis.
At the Leaders Roundtable in Mumbai, I had the opportunity to moderate a discussion among data and digital leaders on the critical components of building trust in data and leveraging it to drive business value. The consensus was that building trust requires a comprehensive strategy that addresses the complexities of data management and positions the organisation for future success. Here are the key strategies that are essential for achieving these goals.
1. Adopting a Unified Data Approach
Organisations are facing a growing wave of complex workloads and business initiatives. To manage this expansion, IT teams are turning to multi-cloud, SaaS, and hybrid environments. However, this diverse landscape introduces new challenges, such as data silos, security vulnerabilities, and difficulties in ensuring interoperability between systems.
A unified data strategy is crucial to overcome these challenges. By ensuring platform consistency, robust security, and seamless data integration, organisations can simplify data management, enhance security, and align with business goals – driving informed decisions, innovation, and long-term success.
Real-time data integration is essential for timely data availability, enabling organisations to make data-driven decisions quickly and effectively. By integrating data from various sources in real-time, businesses can gain valuable insights into their operations, identify trends, and respond to changing market conditions.
Organisations that integrate their IT and operational technology (OT) systems see their data accuracy improve. By combining IT’s digital data management expertise with OT’s real-time operational insights, organisations can ensure more accurate, timely, and actionable data. This integration enables continuous monitoring and analysis of operational data, leading to faster identification of errors, more precise decision-making, and optimised processes.
2. Enhancing Data Quality with Automation and Collaboration
As the volume and complexity of data continue to grow, ensuring high data quality is essential for organisations to make accurate decisions and to drive trust in data-driven solutions. Automated data quality tools are useful for cleansing and standardising data to eliminate errors and inconsistencies.
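As an illustration of what such automated cleansing and standardisation can look like, here is a minimal Python sketch. The field names, validation rules, and sample records are invented for the example and are not drawn from any particular tool:

```python
import re

# Minimal illustration of automated data cleansing: trim whitespace,
# normalise case, validate formats, and flag records that fail a check.
# Field names, rules, and sample records are invented for the example.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$")

def cleanse(record: dict) -> dict:
    """Return a standardised copy of a customer record plus quality flags."""
    clean = {
        "name": record.get("name", "").strip().title(),
        "email": record.get("email", "").strip().lower(),
        "country": record.get("country", "").strip().upper(),
    }
    issues = []
    if not clean["name"]:
        issues.append("missing name")
    if not EMAIL_RE.match(clean["email"]):
        issues.append("invalid email")
    clean["quality_issues"] = issues
    return clean

records = [
    {"name": "  asha rao ", "email": "Asha.Rao@Example.COM", "country": "in"},
    {"name": "", "email": "not-an-email", "country": "sg"},
]
cleaned = [cleanse(r) for r in records]
```

Records with a non-empty `quality_issues` list can then be quarantined for review rather than silently propagating errors downstream.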
As mentioned earlier, integrating IT and OT systems can help organisations improve operational efficiency and resilience. By leveraging data-driven insights, businesses can identify bottlenecks, optimise workflows, and proactively address potential issues before they escalate. This can lead to cost savings, increased productivity, and improved customer satisfaction.
However, while automation technologies can help, organisations must also invest in training employees in data management, data visualisation, and data governance.
3. Modernising Data Infrastructure for Agility and Innovation
In today’s fast-paced business landscape, agility is paramount. Modernising data infrastructure is essential to remain competitive – the right digital infrastructure focuses on optimising costs, boosting capacity and agility, and maximising data leverage, all while safeguarding the organisation from cyber threats. This involves migrating data lakes and warehouses to cloud platforms and adopting advanced analytics tools. However, modernisation efforts must be aligned with specific business goals, such as enhancing customer experiences, optimising operations, or driving innovation. A well-modernised data environment not only improves agility but also lays the foundation for future innovations.
Technology leaders must assess whether their data architecture supports the organisation’s evolving data requirements, considering factors such as data flows, necessary management systems, processing operations, and AI applications. The ideal data architecture should be tailored to the organisation’s specific needs, considering current and future data demands, available skills, costs, and scalability.
4. Strengthening Data Governance with a Structured Approach
Data governance is crucial for establishing trust in data, providing a framework to manage its quality, integrity, and security throughout its lifecycle. By setting clear policies and processes, organisations can build confidence in their data, support informed decision-making, and foster stakeholder trust.
A key component of data governance is data lineage – the ability to trace the history and transformation of data from its source to its final use. Understanding this journey helps organisations verify data accuracy and integrity, ensure compliance with regulatory requirements and internal policies, improve data quality by proactively addressing issues, and enhance decision-making through context and transparency.
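A minimal Python sketch of the idea (the dataset, field names, and pipeline steps below are hypothetical, not a lineage standard) shows how each transformation can be appended to an auditable trail:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative lineage record: every transformation applied to a dataset
# is appended to its history, so the journey from source to final use can
# be audited later. Field names here are assumptions, not a standard.
@dataclass
class Dataset:
    name: str
    source: str
    lineage: list = field(default_factory=list)

    def record_step(self, operation: str, actor: str) -> None:
        self.lineage.append({
            "operation": operation,
            "actor": actor,
            "at": datetime.now(timezone.utc).isoformat(),
        })

sales = Dataset(name="monthly_sales", source="erp.sales_orders")
sales.record_step("deduplicated on order_id", actor="pipeline/cleanse")
sales.record_step("joined with currency rates", actor="pipeline/enrich")

# The full journey of the data is now traceable:
trail = [step["operation"] for step in sales.lineage]
```

With a trail like this, a compliance reviewer can answer "where did this figure come from?" without reverse-engineering the pipeline.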
A tiered data governance structure, with strategic oversight at the executive level and operational tasks managed by dedicated data governance councils, ensures that data governance aligns with broader organisational goals and is implemented effectively.
Are You Ready for the Future of AI?
The ultimate goal of your data management and discovery mechanisms is to ensure that you are advancing at pace with the industry. The analytics landscape is undergoing a profound transformation, promising to revolutionise how organisations interact with data. A key innovation, the data fabric, is enabling organisations to analyse unstructured data, where the true value often lies, resulting in cleaner and more reliable data models.
GenAI has emerged as another game-changer, empowering employees across the organisation to become citizen data scientists. This democratisation of data analytics allows for a broader range of insights and fosters a more data-driven culture. Organisations can leverage GenAI to automate tasks, generate new ideas, and uncover hidden patterns in their data.
The shift from traditional dashboards to real-time conversational tools is also reshaping how data insights are delivered and acted upon. These tools enable users to ask questions in natural language, receiving immediate and relevant answers based on the underlying data. This conversational approach makes data more accessible and actionable, empowering employees to make data-driven decisions at all levels of the organisation.
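As a deliberately simplistic toy (real conversational tools rely on language models, not keyword matching), the pattern can be sketched as mapping a natural-language question to a computation over the underlying data; all names and figures below are invented:

```python
# Toy illustration of conversational analytics: a natural-language question
# is routed to a computation over underlying data. Real tools use language
# models rather than keywords; the data and rules here are invented.
sales_by_region = {"North": 120, "South": 95, "East": 143, "West": 88}

def answer(question: str) -> str:
    q = question.lower()
    if "total" in q:
        return f"Total sales: {sum(sales_by_region.values())}"
    if "best" in q or "highest" in q:
        region = max(sales_by_region, key=sales_by_region.get)
        return f"Highest sales: {region} ({sales_by_region[region]})"
    return "Sorry, I can't answer that yet."

reply = answer("Which region had the highest sales?")
```

The point is the interaction model, not the matching logic: the user asks in plain language and receives an answer grounded in the data, with no dashboard in between.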
To fully capitalise on these advancements, organisations need to reassess their AI/ML strategies. By ensuring that their tech initiatives align with their broader business objectives and deliver tangible returns on investment, organisations can unlock the full potential of data-driven insights and gain a competitive edge. It is equally important to build trust in AI initiatives, through a strong data foundation. This involves ensuring data quality, accuracy, and consistency, as well as implementing robust data governance practices. A solid data foundation provides the necessary groundwork for AI and GenAI models to deliver reliable and valuable insights.
As AI evolves, the supporting infrastructure has become a crucial consideration for organisations and technology companies alike. AI demands massive processing power and efficient data handling, making high-performance computing clusters and advanced data management systems essential. Scalability, efficiency, security, and reliability are key to ensuring AI systems handle increasing demands and sensitive data responsibly.
Data centres must evolve to meet the increasing demands of AI and growing data requirements.
Equinix recently hosted technology analysts at their offices and data centre facilities in Singapore and Sydney to showcase how they are evolving to maintain their leadership in the colocation and interconnection space.
Equinix is expanding in Latin America, Africa, the Middle East, and Asia Pacific. In Asia Pacific, they recently opened data centres in Kuala Lumpur and Johor Bahru, with capacity additions in Mumbai, Sydney, Melbourne, Tokyo, and Seoul. Plans for the next 12 months include expanding in existing cities and entering new ones, such as Chennai and Jakarta.
Ecosystm analysts comment on Equinix’s growth potential and opportunities in Asia Pacific.
Small Details, Big Impact
TIM SHEEDY. The tour of the new Equinix data centre in Sydney revealed the complexity of modern facilities. For instance, the liquid cooling system, essential for new Nvidia chipsets, includes backup cold water tanks for redundancy. Every system and process is designed with built-in redundancy.
As power needs grow, so do operational and capital costs. The diesel generators at the data centre, comparable to a small power plant, are supported by multiple fuel suppliers from several regions in Sydney to ensure reliability during disasters.
Security is critical, with some areas enclosed by floor-to-ceiling concrete walls and access restricted even for Equinix staff.
By focusing on these details, Equinix enables customers to quickly set up and manage their environments through a self-service portal, delivering a cloud-like experience for on-premises solutions.
Equinix’s Commitment to the Environment
ACHIM GRANZEN. Compute-intensive AI applications challenge data centres’ “100% green energy” pledges, prompting providers to seek additional green measures. Equinix addresses this through sustainable design and green energy investments, including liquid cooling and improved traditional cooling. In Singapore, one of Equinix’s top 3 hubs, the company partnered with the government and Sembcorp to procure solar power from panels on public buildings. This improves Equinix’s power mix and supports Singapore’s renewable energy sector.
TIM SHEEDY. Building and operating data centres sustainably is challenging. While the basics – real estate, cooling, and communications – remain, adding proximity to clients, affordability, and 100% renewable energy complicates matters. In Australia, reliant on a mixed-energy grid, Equinix has secured 151 MW of renewable energy from Victoria’s Golden Plains Wind Farm, aiming for 100% renewable by 2029.
Equinix leads with AIA-rated data centres that operate in warmer conditions, reducing cooling needs and boosting energy efficiency. Focusing on efficient buildings, sustainable water management, and a circular economy, Equinix aims for climate neutrality by 2030, demonstrating strong environmental responsibility.
Equinix’s Private AI Value Proposition
ACHIM GRANZEN. Most AI efforts, especially GenAI, have occurred in the public cloud, but there’s rising demand for Private AI due to concerns about data availability, privacy, governance, cost, and location. Technology providers in a position to offer alternative AI stacks to the hyperscalers (usually built on top of a GPU-as-a-service model) are attracting strong interest. Equinix, in partnership with providers such as Nvidia, offers Private AI solutions on a global turnkey AI infrastructure. These solutions are ideal for industries with large-scale operations and connectivity challenges, such as Manufacturing, or those slow to adopt public cloud.
SASH MUKHERJEE. Equinix’s Private AI value proposition will appeal to many organisations, especially as discussions on AI cost efficiency and ROI evolve. AI unites IT and business teams, and Equinix understands the need for conversations at multiple levels. Infrastructure leaders focus on data strategy and capacity planning; CISOs on networking and security; business lines on application performance; and the C-suite on revenue, risk, and cost considerations. Each has a stake in the AI strategy. For success, Equinix must reshape its go-to-market message to be industry-specific (that is how AI conversations are shaping up) and reskill its salesforce for broader conversations beyond infrastructure.
Equinix’s Growth Potential
ACHIM GRANZEN. In Southeast Asia, Malaysia and Indonesia provide growth opportunities for Equinix. Indonesia holds massive potential as a digitally savvy G20 country. In Malaysia, the company’s data centres can play a vital part in the ongoing MyDIGITAL initiative, having established a presence in the country before the hyperscalers. The proximity of the Johor Bahru data centre to Singapore also opens additional business opportunities.
TIM SHEEDY. Equinix is evolving beyond being just a data centre real estate provider. By developing their own platforms and services, along with partner-provided solutions, they enable customers to optimise application placement, manage smaller points of presence, enhance cloud interconnectivity, move data closer to hyperscalers for backup and performance, and provide multi-cloud networking. Composable services – such as cloud routers, load balancers, internet access, bare metal, virtual machines, and virtual routing and forwarding – allow seamless integration with partner solutions.
Equinix’s focus over the last 12 months on automating and simplifying data centre management and interconnection services is certainly paying dividends, and revenue is expected to grow above tech market growth rates.
AI tools have become a game-changer for the technology industry, enhancing developer productivity and software quality. Leveraging advanced machine learning models and natural language processing, these tools offer a wide range of capabilities, from code completion to generating entire blocks of code, significantly reducing the cognitive load on developers. AI-powered tools not only accelerate the coding process but also ensure higher code quality and consistency, aligning seamlessly with modern development practices. Organisations are reaping the benefits of these tools, which have transformed the software development lifecycle.
Impact on Developer Productivity
AI tools are becoming an indispensable part of software development owing to their:
- Speed and Efficiency. AI-powered tools provide real-time code suggestions, which dramatically reduces the time developers spend writing boilerplate code and debugging. For example, Tabnine can suggest complete blocks of code based on the comments or a partial code snippet, which accelerates the development process.
- Quality and Accuracy. By analysing vast datasets of code, AI tools can offer not only syntactically correct but also contextually appropriate code suggestions. This capability reduces bugs and improves the overall quality of the software.
- Learning and Collaboration. AI tools also serve as learning aids for developers by exposing them to new or better coding practices and patterns. Novice developers, in particular, can benefit from real-time feedback and examples, accelerating their professional growth. These tools can also help maintain consistency in coding standards across teams, fostering better collaboration.
Advantages of Using AI Tools in Development
- Reduced Time to Market. Faster coding and debugging directly contribute to shorter development cycles, enabling organisations to launch products faster. This reduction in time to market is crucial in today’s competitive business environment where speed often translates to a significant market advantage.
- Cost Efficiency. While there is an upfront cost in integrating these AI tools, the overall return on investment (ROI) is enhanced through the reduced need for extensive manual code reviews, decreased dependency on large development teams, and lower maintenance costs due to improved code quality.
- Scalability and Adaptability. AI tools learn and adapt over time, becoming more efficient and aligned with specific team or project needs. This adaptability ensures that the tools remain effective as the complexity of projects increases or as new technologies emerge.
Deployment Models
The choice between SaaS and on-premises deployment models involves a trade-off between control, cost, and flexibility. Organisations need to consider their specific requirements, including the level of control desired over the infrastructure, sensitivity of the data, compliance needs, and available IT resources. A thorough assessment will guide the decision, ensuring that the deployment model chosen aligns with the organisation’s operational objectives and strategic goals.
Technology teams must consider challenges such as the reliability of generated code, the potential for generating biased or insecure code, and the dependency on external APIs or services. Proper oversight, regular evaluations, and a balanced integration of AI tools with human oversight are recommended to mitigate these risks.
A Roadmap for AI Integration
The strategic integration of AI tools in software development offers a significant opportunity for companies to achieve a competitive edge. By starting with pilot projects, organisations can assess the impact and utility of AI within specific teams. Encouraging continuous training in AI advancements empowers developers to leverage these tools effectively. Regular audits ensure that AI-generated code adheres to security standards and company policies, while feedback mechanisms facilitate the refinement of tool usage and address any emerging issues.
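As a minimal sketch of what such an automated audit gate might look like (the screening rules below are invented examples, not a security standard), AI-generated snippets could be scanned for disallowed patterns before human review:

```python
import re

# Invented example rules for screening AI-generated Python before review.
# A real audit gate would combine linters, SAST tools, and policy checks.
FORBIDDEN_PATTERNS = {
    "use of eval/exec": re.compile(r"\b(eval|exec)\s*\("),
    "hardcoded secret": re.compile(r"(password|api_key)\s*=\s*['\"]"),
}

def audit(code: str) -> list:
    """Return the names of the rules the snippet violates."""
    return [name for name, pat in FORBIDDEN_PATTERNS.items() if pat.search(code)]

snippet = "api_key = 'sk-123'\nresult = eval(user_input)"
violations = audit(snippet)
```

A gate like this can run in CI, blocking merges until flagged snippets are reworked or explicitly approved.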
Technology teams have the opportunity to not only boost operational efficiency but also cultivate a culture of innovation and continuous improvement in their software development practices. As AI technology matures, even more sophisticated tools are expected to emerge, further propelling developer capabilities and software development to new heights.
In my previous Ecosystm Insights, I covered how to choose the right database for the success of any application or project. Organisations often select cloud-based databases for their scalability, flexibility, and cost-effectiveness.
Here’s a look at some prominent cloud-based databases and guidance on the right cloud-based database for your organisational needs.
Amazon RDS (Relational Database Service)
Pros.
Managed Service. Automates database setup, maintenance, and scaling, allowing you to focus on application development.
Scalability. Easily scales a database’s compute and storage resources with minimal downtime.
Variety of DB Engines. Supports multiple database engines, including MySQL, PostgreSQL, MariaDB, Oracle, and SQL Server.
Cons.
Cost. Can be expensive for larger databases or high-throughput applications.
Complex Pricing. The pricing model can be complex to understand, with costs for storage, I/O, and data transfer.
Google Cloud SQL
Pros.
Fully Managed. Takes care of database management tasks like replication, patch management, and backups.
Integration. Seamlessly integrates with other GCP services, enhancing data analytics and machine learning capabilities.
Security. Offers robust security features, including data encryption at rest and in transit.
Cons.
Limited Customisation. Compared to managing your own database, there are limitations on configurations and fine-tuning.
Egress Costs. Data transfer costs (especially egress) can add up if you have high data movement needs.
Azure SQL Database
Pros.
Highly Scalable. Offers a scalable service that can dynamically adapt to your application’s needs.
Advanced Features. Includes advanced security features, AI-based performance optimisation, and automated updates.
Integration. Deep integration with other Azure services and Microsoft products.
Cons.
Learning Curve. The wide array of options and settings might be overwhelming for new users.
Cost for High Performance. Higher-tier performance levels can become costly.
MongoDB Atlas
Pros.
Flexibility. Offers a flexible document database that is ideal for unstructured data.
Global Clusters. Supports global clusters to improve access speeds for distributed applications.
Fully Managed. Provides a fully managed service, including automated backups, patches, and security.
Cons.
Cost at Scale. While it offers a free tier, costs can grow significantly with larger deployments and higher performance requirements.
Indexing Limitations. Efficient querying requires proper indexing, which can become complex as your dataset grows.
Amazon DynamoDB
Pros.
Serverless. Offers a serverless NoSQL database that scales automatically with your application’s demands.
Performance. Delivers single-digit millisecond performance at any scale.
Durability and Availability. Provides built-in security, backup, restore, and in-memory caching for internet-scale applications.
Cons.
Pricing Model. Pricing can be complex and expensive, especially for read/write throughput and storage.
Learning Curve. Different from traditional SQL databases, requiring time to learn best practices for data modelling and querying.
Selection Considerations
Data Model Compatibility. Ensure the database supports the data model you plan to use (relational, document, key-value, etc.).
Scalability and Performance Needs. Assess whether the database can meet your application’s scalability and performance requirements.
Cost. Understand the pricing model and estimate monthly costs based on your expected usage.
Security and Compliance. Check for security features and compliance with regulations relevant to your industry.
Integration with Existing Tools. Consider how well the database integrates with your current application ecosystem and development tools.
Vendor Lock-in. Be aware of the potential for vendor lock-in and consider the ease of migrating data to other services if needed.
Choosing the right cloud-based database involves balancing these factors to find the best fit for your application’s requirements and your organisation’s budget and skills.
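One way to make that balancing act concrete is a simple weighted scoring matrix over the selection considerations above. The weights and 1–5 scores below are illustrative placeholders to be replaced with your own assessment:

```python
# Illustrative weighted scoring of candidate databases against the
# selection considerations above. Weights and 1-5 scores are placeholders.
criteria_weights = {
    "data_model_fit": 0.25,
    "scalability": 0.20,
    "cost": 0.20,
    "security_compliance": 0.15,
    "integration": 0.10,
    "lock_in_risk": 0.10,
}

candidate_scores = {
    "Amazon RDS":    {"data_model_fit": 5, "scalability": 4, "cost": 3,
                      "security_compliance": 4, "integration": 4, "lock_in_risk": 3},
    "MongoDB Atlas": {"data_model_fit": 4, "scalability": 4, "cost": 3,
                      "security_compliance": 4, "integration": 3, "lock_in_risk": 4},
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores into a single weighted total."""
    return round(sum(criteria_weights[c] * s for c, s in scores.items()), 2)

ranking = sorted(candidate_scores,
                 key=lambda db: weighted_score(candidate_scores[db]),
                 reverse=True)
```

The output is only as good as the inputs, but the exercise forces teams to state their priorities explicitly rather than defaulting to the most familiar vendor.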
The tech industry tends to move in waves, driven by the significant, disruptive changes in technology, such as cloud and smartphones. Sometimes, it is driven by external events that bring tech buyers into sync – such as Y2K and the more recent pandemic. Some tech providers, such as SAP and Microsoft, are big enough to create their own industry waves. The two primary factors shaping the current tech landscape are AI and the consequential layoffs triggered by AI advancements.
While many of the AI startups have been around for over five years, this will be the year they emerge as legitimate solutions providers to organisations. Amidst the acceleration of AI-driven layoffs, individuals from these startups will go on to start new companies, creating the next round of startups that will add value to businesses in the future.
Tech Sourcing Strategies Need to Change
The increase in startups implies a change in the way businesses manage and source their tech solutions. Many organisations are trying to reduce tech debt, typically by consolidating the number of providers and tech platforms. However, leveraging the numerous AI capabilities may mean looking beyond current providers towards some of the many AI startups that are emerging in the region and globally.
The ripple effect of these decisions is significant. If organisations opt to increase the complexity of their technology architecture and the number of vendors under management, the business case must be watertight. The trial-and-error approach to AI seen in 2023 will give way to a heightened emphasis on clear and measurable value.
AI Startups Worth Monitoring
Here is a selection of AI startups that are already starting to make waves across Asia Pacific and the globe.
- ADVANCE.AI provides digital transformation, fraud prevention, and process automation solutions for enterprise clients. The company offers services in security and compliance, digital identity verification, and biometric solutions. They partner with over 1,000 enterprise clients across Southeast Asia and India, in sectors such as Banking, Fintech, Retail, and eCommerce.
- Megvii is a technology company based in China that specialises in AI, particularly deep learning. The company offers full-stack solutions integrating algorithms, software, hardware, and AI-empowered IoT devices. Products include facial recognition software, image recognition, and deep learning technology for applications such as consumer IoT, city IoT, and supply chain IoT.
- I’mCloud is based in South Korea and specialises in AI, big data, and cloud storage solutions. The company has become a significant player in the AI and big data industry in South Korea. They offer high-quality AI-powered chatbots, including for call centres and interactive educational services.
- H2O.ai provides an AI platform, the H2O AI Cloud, to help businesses, government entities, non-profits, and academic institutions create, deploy, monitor, and share data models or AI applications for various use cases. The platform offers automated machine learning capabilities powered by H2O-3, H2O Hydrogen Torch, and Driverless AI, and is designed to help organisations work more efficiently on their AI projects.
- Frame AI provides an AI-powered customer intelligence platform. The software analyses human interactions and uses AI to understand the driving factors of business outcomes within customer service. It aims to assist executives in making real-time decisions about the customer experience by combining data about customer interactions across various platforms, such as helpdesks, contact centres, and CRM transcripts.
- Uizard offers a rapid, AI-powered UI design tool for designing wireframes, mockups, and prototypes in minutes. The company’s mission is to democratise design and empower non-designers to build digital, interactive products. Uizard’s AI features allow users to generate UI designs from text prompts, convert hand-drawn sketches into wireframes, and transform screenshots into editable designs.
- Moveworks provides an AI platform that is designed to automate employee support. The platform helps employees to automate tasks, find information, query data, receive notifications, and create content across multiple business applications.
- Tome develops a storytelling tool designed to reduce the time required for creating slides. The company’s online platform lets users create or emphasise points with narration, and add interactive embeds with live data, content from anywhere on the web, 3D renderings, and prototypes.
- Jasper is an AI writing tool designed to assist in generating marketing copy, such as blog posts, product descriptions, company bios, ad copy, and social media captions. It offers features such as text and image AI generation, integration with Grammarly and other Chrome extensions, revision history, auto-save, document sharing, multi-user login, and a plagiarism checker.
- Eightfold AI provides an AI-powered Talent Intelligence Platform to help organisations recruit, retain, and grow a diverse global workforce. The platform uses AI to match the right people to the right projects, based on their skills, potential, and learning ability, enabling organisations to make informed talent decisions. They also offer solutions for diversity, equity, and inclusion (DEI), skills intelligence, and governance, among others.
- Arthur provides a centralised platform for model monitoring. The company’s platform is model and platform agnostic, and monitors machine learning models to ensure they deliver accurate, transparent, and fair results. They also offer services for explainability and bias mitigation.
- DNSFilter is a cloud-based, AI-driven content filtering and threat protection service, that can be deployed and configured within minutes, requiring no software installation.
- Spot AI specialises in building a modern AI Camera System to create safer workplaces and smarter operations for every organisation. The company’s AI Camera System combines cloud and edge computing to make video footage actionable, allowing customers to instantly surface and resolve problems. They offer intelligent video recorders, IP cameras, cloud dashboards, and advanced AI alerts to proactively deliver insights without the need to manually review video footage.
- People.ai is an AI-powered revenue intelligence platform that helps customers win more revenue by providing sales, RevOps, marketing, enablement, and customer success teams with valuable insights. The company’s platform is designed to speed up complex enterprise sales cycles by engaging the right people in the right accounts, ultimately helping teams to sell more and faster with the same headcount.
These examples highlight a few startups worth considering, but the landscape is rich with innovative options for organisations to explore. Similar to other emerging tech sectors, the AI startup market will undergo consolidation over time, and incumbent providers will continue to improve and innovate their own AI capabilities. Till then, these startups will continue to influence enterprise technology adoption and challenge established providers in the market.
Hewlett Packard Enterprise (HPE) has entered into a definitive agreement to acquire Juniper Networks for USD 40 per share, an equity value of about USD 14 billion. The strategic move is aimed at enhancing HPE’s portfolio by focusing on higher-growth solutions and reinforcing their high-margin networking business. HPE expects to double their networking business, positioning the combined entity as a leader in networking solutions. With the growing demand for secure, unified technology driven by AI and hybrid cloud trends, HPE aims to offer comprehensive, disruptive solutions that connect, protect, and analyse data from edge to cloud.
This would also be the organisation’s largest deal since becoming an independent company in 2015. The acquisition is expected to be completed by late 2024 or early 2025.
Ecosystm analysts Darian Bird and Richard Wilkins provide their insights on the HPE acquisition and its implications for the tech market.
Converging Networking and Security
One of the big drawcards for HPE is Juniper’s Mist AI. The networking vendors have been racing to catch up – both in capabilities and in marketing. The acquisition, though, will give HPE a leadership position in network visibility and manageability. With GreenLake and soon Mist AI, HPE will have a solid AIOps story across the entire infrastructure.
HPE has been working steadily towards becoming a player in the converged networking-security space. They integrated Silver Peak well to make a name for themselves in SD-WAN, and last year’s acquisition of Axis Security gave them the Zero Trust Network Access (ZTNA), Secure Web Gateway (SWG), and Cloud Access Security Broker (CASB) modules in the Security Service Edge (SSE) stack. Bringing all of this to market with Juniper’s networking prowess positions HPE as a formidable player, especially as the Secure Access Service Edge (SASE) market gains momentum.
As the market shifts towards converged SASE, there will only be more interest in the SD-WAN and SSE vendors. In just over one year, Cato Networks and Netskope have raised funds, Check Point acquired Perimeter 81, and Versa Networks has made noises about an IPO. The networking and security players are all figuring out how they can deliver a single-vendor SASE.
Although HPE’s strategic initiatives signal a robust market position, potential challenges arise from the overlap between Aruba and Juniper. However, their distinct focus on the edge and the data centre, respectively, may help alleviate these concerns. The acquisition also marks HPE’s foray into the telecom space, leveraging its earlier acquisition of Athonet and establishing a significant presence among service providers. This expansion enhances HPE’s overall market influence, posing a challenge to the long-standing dominance of Cisco.
The strategic acquisition of Juniper Networks by HPE could mark a transformative leap in AIOps and Software-Defined Networking (SDN), with the potential to establish a new benchmark in IT management.
AI in IT Operations Transformation
The integration of Mist’s AI-driven wireless solutions and HPE’s SDN is a paradigm shift in IT operations management and will help organisations transition from a reactive to a predictive and proactive model. Mist’s predictive analytics, coupled with HPE’s SDN capabilities, empower networks to dynamically adjust to user demands and environmental changes, ensuring optimal performance and user experience. Marvis, Mist’s Virtual Network Assistant (VNA), adds conversational troubleshooting capabilities, enhancing HPE’s network solutions. The integration envisions an IT ecosystem where Juniper’s AI augments HPE’s InfoSight, providing deeper insights into network behaviour, preemptive security measures, and more autonomous IT operations.
Transforming Cloud and Edge Computing
The incorporation of Juniper’s AI into HPE’s cloud and edge computing solutions promises a significant improvement in data processing and management. AI-driven load balancing and resource allocation mechanisms will significantly enhance multi-cloud environment efficiency, ensuring robust and seamless cloud services, particularly vital in IoT applications where real-time data processing is critical. This integration not only optimises cloud operations but also has the potential to align with HPE’s commitment to sustainability, showcasing how AI advancements can contribute to energy conservation.
In summary, HPE’s acquisition of Juniper Networks, and specifically the integration of the Mist AI platform, is a pivotal step towards an AI-driven, efficient, and predictive IT infrastructure. This can redefine the standards in AIOps and SDN, creating a future where IT systems are not only reactive but also intuitively adaptive to the evolving demands of the digital landscape.
As an industry, the tech sector tends to jump on keywords and terms – and sometimes reshapes their meaning and intention. “Sustainable” is one of those terms. Technology vendors are selling (allegedly!) “sustainable software/hardware/services/solutions” – in fact, the focus on “green” or “zero carbon” or “recycled” or “circular economy” is increasing exponentially at the moment. And that is good news – as I mentioned in my previous post, we need to significantly reduce greenhouse gas emissions if we want a future for our kids. But there is a significant disconnect between the way tech vendors use the word “sustainable” and the way it is used in boardrooms and senior management teams of their clients.
Defining Sustainability
For organisations, Sustainability is a broad business goal – in fact for many, it is the over-arching goal. A sustainable organisation operates in a way that balances economic, social, and environmental (ESG) considerations. Rather than focusing solely on profits, a sustainable organisation aims to meet the needs of the present without compromising the ability of future generations to meet their own needs.
This is what building a “Sustainable Organisation” typically involves:
Economic Sustainability. The organisation must be financially stable and operate in a manner that ensures long-term economic viability. It doesn’t just focus on short-term profits but invests in long-term growth and resilience.
Social Sustainability. This involves the organisation’s responsibility to its employees, stakeholders, and the wider community. A sustainable organisation will promote fair labour practices, invest in employee well-being, foster diversity and inclusion, and engage in ethical decision-making. It often involves community engagement and initiatives that support societal growth and well-being.
Environmental Sustainability. This facet includes the responsible use of natural resources and minimising negative impacts on the environment. A sustainable organisation seeks to reduce its carbon footprint, minimise waste, enhance energy efficiency, and often supports or initiates activities that promote environmental conservation.
Governance and Ethical Considerations. Sustainable organisations tend to have transparent and responsible governance. They follow ethical business practices, comply with laws and regulations, and foster a culture of integrity and accountability.
Security and Resilience. Sustainable organisations have the ability to thwart bad actors – and when they are breached, to recover quickly and safely, continuing to operate with the least possible impact.
Long-Term Focus. Sustainability often requires a long-term perspective. By looking beyond immediate gains and considering the long-term impact of decisions, a sustainable organisation can better align its strategies with broader societal goals.
Stakeholder Engagement. Understanding and addressing the needs and concerns of different stakeholders (including employees, customers, suppliers, communities, and shareholders) is key to sustainability. This includes open communication and collaboration with these groups to foster relationships based on trust and mutual benefit.
Adaptation and Innovation. The organisation is not static and recognises the need for continual improvement and adaptation. This might include innovation in products, services, or processes to meet evolving sustainability standards and societal expectations.
Alignment with the United Nations’ Sustainable Development Goals (UNSDGs). Many sustainable organisations align their strategies and operations with the UNSDGs which provide a global framework for addressing sustainability challenges.
Organisations Appreciate Precise Messaging
A sustainable organisation is one that integrates economic, social, and environmental considerations into all aspects of its operations. It goes beyond mere compliance with laws to actively pursue positive impacts on people and the planet, maintaining a balance that ensures long-term success and resilience.
These factors are all top of mind when business leaders, boards, and government agencies use the word “sustainable”. Helping organisations meet their emission reduction targets is a good starting point – but it is a long way from all that businesses need to become sustainable organisations.
Tech providers need to reconsider their use of the term “sustainable” – unless their solution or service is helping organisations deliver on all of the dimensions outlined above. Most customers would favour specific language – telling them how the solution will help them reduce greenhouse gas emissions, meet compliance requirements for CO2 and/or waste reduction, and save money on electricity and/or management costs. These specifics are likely to get the sale over the line faster than broad “sustainability” messaging will.
Public sector organisations are looking at 2021 as the year where they either hobble back to normalcy or implement their successful pilots (that were honed under tremendous pressure). Ecosystm research finds that 60% of government agencies are looking at 2021 as the year they make a recovery to normal – or the normal that finally emerges. The path to recovery will be technology-driven, and this time they will look at scalability and data-driven intelligence.
Ecosystm Advisors Alan Hesketh, Mike Zamora and Sash Mukherjee present the top 5 Ecosystm predictions for Cities of the Future in 2021. This is a summary of our Cities of the Future predictions – the full report (including the implications) is available to download for free on the Ecosystm platform here.
The Top 5 Cities of the Future Trends for 2021
#1 Cities Will Re-start Their Transformation Journey by Taking Stock
In 2021 the first thing that cities will do is introspect and reassess. There have been a lot of abrupt policy shifts, people changes, and technology deployments. Most have been ad-hoc, without the benefit of strategy planning, but many of the services that cities provide have been transformed completely. Government agencies in cities have seen rapid tech adoption, changes in their business processes, and a shift in the mindset of how their employees – many of whom were at the frontline of the crisis – provide citizen services.
Technology investments, in most cases, took on an unexpected trajectory, and agencies will find that they have digressed from their technology and transformation roadmap. This also provides an opportunity, as many solutions would have gone through an initial ‘proof-of-concept’ without the formal rigours and protocols. Many of these will be adopted for longer-term applications. In 2021, agencies will retain the same technology priorities as 2020, but consolidate and strengthen their spend.
#2 Cities Will be Instrumented Using Intelligent Edge Devices
The capabilities of edge devices continue to increase dramatically, while costs decline. This reduces the barriers to entry for cities to collect and analyse significantly more data about the city and its people. Edge devices move computational power and data storage as close to the point of usage as possible to provide good performance. Devices range from battery powered IoT devices for data collection through to devices such as smart CCTV cameras with embedded pattern recognition software.
Cities will develop many use cases for intelligent edge devices. These uses will range from enhancing old assets using newer approaches to data collection – through to accelerating the speed and quality of the build of a new asset. The move to data-driven maintenance and decision-making will improve outcomes.
#3 COVID-19 Will Impact City Design
The world has received a powerful reminder of the vulnerability of densely populated cities, and the importance of planning and regulating public health. COVID-19 will continue to have an impact on city design in 2021.
A critical activity in controlling the pandemic in this environment is the test-and-trace capabilities of the local public health authorities. Technology to provide automated, accurate, contact tracing to replace manual efforts is now available. Scanning of QR codes at locations visited is proving to be the most widely adopted approach. The willingness of citizens to track their travels will be a crucial aid in managing the spread of COVID-19.
Early detection of new disease outbreaks, or other high-risk environmental events, is essential to minimise harm. Intelligent edge devices that detect the presence of viruses will become crucial tools in a city’s defence.
Intelligent edge devices will also play a role in managing building ventilation. Well-ventilated spaces are an important factor in controlling virus transmission. But a limited number of buildings have ventilation systems that are capable of meeting those requirements. Property owners will begin to refit their facilities to provide better air movement.
#4 Technology Vendors Will Emerge as the Conductors of Cities of the Future
The built environment comprises not only the physical building, but also the space around the buildings and building operations. The real estate developer/investor owns the building – the urban fabric, the relationship of buildings to each other, the common space, and the common services provided to the city are owned by the City. The question is who will coordinate the players – business, citizens, government, and the built environment. Ideally the government should be the conductor. However, they may not have sufficient experience or knowledge to properly fulfil this role, which means a capable and knowledgeable neutral consultant will, at least initially, fill it. There is an opportunity for a technology vendor to take on that consulting role and shape the city fabric. This enhanced city environment will be requested by the Citizen, driven by the City, and guided by Technology Vendors. 2021 will see leading technology vendors working very closely with cities.
#5 Compliance Will be at the Core of Citizen Engagement Initiatives
Many Smart Cities have long focused on online services – over the last couple of years mobile apps have further improved citizen services. In 2020, the pandemic challenged government agencies to continue to provide services to citizens who were housebound and had become digitally savvy almost overnight. And many cities were able to scale up to fulfil citizen expectations.
However, in 2021 there will be a need to re-evaluate measures that were implemented this year – and one area that will be top priority for public sector agencies is compliance, security and privacy.
The key drivers for this renewed focus on security and privacy are:
- The need to temper the ‘service delivery at any cost’ mindset and remind agencies and employees that the use of government data must comply with security and privacy standards.
- The rise of cyberattacks that target not only essential infrastructure, but also individual citizens and small and medium enterprises (SMEs).
- The rise of app adoption by city agencies – many developed by third parties. It will become essential to evaluate their compliance with security and privacy requirements.
IBM announced its intention to spin off its infrastructure services business as a separate public company, allowing Big Blue to focus on hybrid cloud and AI. The newly formed entity, temporarily named NewCo, will offer project and outsourcing services that currently fall under its GTS business unit. NewCo will have a staff of around 90,000 employees and is expected to earn revenue of about $19B. While GTS has experienced declining revenue for some time now, IBM believes that the split will unlock growth and put it on a path to recovery.
Once the Red Hat acquisition closed last year and the tag team of Jim Whitehurst and Arvind Krishna was announced, it became clear that IBM was gearing up to become a leaner, more agile leader in the hybrid cloud space. One of two possible courses seemed apparent – either wither away for years until IBM was small enough to become nimble, or take bold action. IBM has opted for the latter and is likely to be rewarded for it. The new IBM will have revenue of around $59B, well short of its peak at over $100B, but sacrificing turnover for margin and growth gives it a more positive long-term outlook.
Stripping back IBM to become smaller, faster growing, and more profitable will help solve many of its greatest challenges. Significant investment into growth segments will become more palatable without the financial burden of the declining infrastructure services unit. The much-needed cultural change and drive to think like a start-up will become more practical in the new IBM.
NewCo to Build New Cloud Partnerships
IBM’s infrastructure services unit has had some great success in larger, complex, hybrid cloud deals recently – but at the lower end of the market there have been many headwinds. Public cloud providers have eroded what was once a lucrative compute and storage services market. At the same time, application service providers like Accenture, TCS, and HCL have been pivoting towards infrastructure. Untethering infrastructure services makes a turnaround story more likely, giving NewCo the greater flexibility and speed that clients have been crying out for.
The greatest benefit to NewCo will be the ability to freely partner with other cloud providers, like AWS, Microsoft, and Google. Although IBM has made noises about being willing to embrace its competitors, this was not necessarily implemented on the ground nor was it reciprocated.
It is no secret that GTS and GBS have had a rocky relationship since day one. The split will reassure clients that each of them is agnostic and relieve any internal pressure to partner unless it is best for the client. While elements of this decision look like the unfolding of a long-term strategy that began under Ginni Rometty, it does, however, leave open the question of why GTS and GBS were more closely integrated over the last few years. This also means IBM is moving in the opposite direction to its competitors, who are shifting towards offerings that cover the full stack of services from infrastructure up to applications.
What Lies Ahead for IBM
One detail that is not immediately certain is the fate of IBM’s security services, which could be integrated with security software at IBM, spun out with the rest of infrastructure services, or even split into consulting and delivery. An important differentiator for IBM has been its ability to build in security at the beginning of transformation projects, making the final placement a difficult decision.
It might be tempting to predict that IBM would next couple its Systems unit and Support Services to be spun off or sold, although Mr. Krishna has ruled that out. Over the long term, both are financially underperforming units, but there is an advantage to building the core infrastructure that critical workloads run on.
Each new IBM CEO has had a make-or-break moment, and Mr. Krishna has decided that his will come early. For the company to thrive for another 100 years it needed to place a big bet – and it could not have come soon enough.