In our previous Ecosystm Insights, Ecosystm Principal Advisor, Gerald Mackenzie, highlighted the key drivers for boosting ESG maturity and the need to transition from standalone ESG projects to integrating ESG goals into organisational strategy and operations.
This shift can be difficult, requiring an alignment of ESG objectives with broader strategic aims and using organisational capabilities effectively. The solution involves prioritising essential goals, knitting them into overall business strategy, quantifying success metrics, and establishing incentives and governance for effective execution.
The benefits are proven and significant: stronger Customer and Employee Value Propositions, a better bottom line, an improved risk profile, and more attractive enterprise valuations for investors and lenders.
According to Gerald, here are five things to keep in mind when starting on an ESG journey.
A 2020 McKinsey report revealed that executives and investors value companies with robust ESG performance around 10% higher in valuations than laggards. Equally pivotal, workplace diversity is now recognised as a strategic advantage; a study in the Harvard Business Review finds that companies with above-average total diversity had both 19% higher innovation revenues and 9% higher EBIT margins, on average. Against this backdrop, organisations must recognise that embracing ESG principles is not merely an ethical gesture but a strategic imperative that safeguards resilience, reputation, and enduring financial prosperity.
We used data from the ongoing Ecosystm State of ESG Adoption study to evaluate the status and maturity of organisations’ ESG strategies and implementation progress. Diverse representation across industries such as Financial Services, Manufacturing, and Retail & eCommerce, and across roles within the organisation, has given us insight into where organisations stand in the maturity of their ESG strategy and implementation efforts.
A Tailored Approach to Improve ESG Maturity
Ecosystm assists clients in driving greater impact through their ESG adoption. Our tools evaluate an organisation’s aspirations and roadmaps using a maturity model, along with a series of practical drivers that enhance ESG response maturity. The maturity of an organisation’s approach to ESG tends to progress from a reactive, risk/compliance-based focus, to a customer- or opportunity-driven approach, to a purpose-led approach focused on embedding ESG into the core culture of the organisation. Our advisory, research, and consulting offerings are customised to the transitions organisations are seeking to make in their maturity levels over time.
Within the maturity framework outlined above, Ecosystm has identified the key organisational drivers that improve maturity and adoption. The Ecosystm ESG Consulting offerings are configured to support both the development of ESG strategy and the delivery of, and ‘storytelling’ around, ESG programs, based on the customer’s goals (maturity aspiration) and the gaps they need to close to achieve that aspiration.
Key Findings of the Ecosystm State of ESG Study
89% of respondents self-reported that their organisation had an ESG strategy; however, a notable 60% also identified that a lack of alignment of sustainability goals to enterprise strategy was a key issue in implementation. This reflects many of the client discussions we’ve had, where customers share that ESG goals that have not been fully tested against other organisational priorities can create tensions and make it difficult to solve trade-offs across the organisation during implementation.
People & Leadership/Execution & Governance
Capabilities are still emerging. 40% of respondents mentioned the lack of a governance framework for ESG as a barrier to adoption, and 56% said that immature metrics and reporting slowed adoption. 64% of respondents also cited a lack of specialised resources as a key barrier to ESG adoption.
In our discussions with customers, we find that there is good support for ESG across organisations, but they need a simple narrative compelling them to act on a few clearly articulated priorities, a clear mandate from senior leadership, and credible resourcing and governance to ensure follow-through.
Data and Technology Enablement
There is a strong opportunity for improvement. “We can’t manage what we cannot measure” has been the common refrain from the clients we have spoken to and the survey reflected this. Only 47% of respondents say that preparing data, analytics, reporting, and metrics for internal consumption is a priority for their tech teams.
ESG is rapidly emerging as a key priority for customers, investors, talent, and other stakeholders who seek a comprehensive and genuine commitment from the organisations they interact with. Successfully determining the right priorities and effectively mobilising your organisation and external collaborators for implementation are pivotal. It’s crucial to acknowledge the intricacy and extent of effort needed for this endeavour.
With our timely research findings complementing our ESG maturity and implementation frameworks, analyst insights and consulting support, Ecosystm is well-positioned to help you to navigate your journey to ESG maturity.
As AI evolves, it raises new ethical and legal questions. AI works by analysing the data fed into it and drawing conclusions based on what it has learned or been trained to do. Although it has many benefits, it can pose risks to people, to data privacy, and through the outcomes of the decisions it informs. To reduce the chances of such outcomes, organisations and policymakers are crafting recommendations to ensure the responsible and ethical use of AI. Governments are going a step further, developing principles and drafting laws and regulations. Tech developers are also trying to self-regulate their AI capabilities.
The goal of the councils is to work at a global level on new technology policy guidance, best policy practices, and strategic guidelines, and to help regulate technology across six domains – AI, precision medicine, autonomous driving, mobility, IoT, and blockchain. Over 200 industry leaders from organisations such as Microsoft, Qualcomm, Uber, Dana-Farber, the European Union, the Chinese Academy of Medical Sciences, and the World Bank participate, addressing concerns around the absence of clear, unified guidelines.
Similarly, the Organization for Economic Co-operation and Development (OECD) created a global reference point for AI adoption, with principles and recommendations for governments across the world. The OECD AI Principles are called “values-based principles” and are clearly intended to endorse AI “that is innovative and trustworthy and that respects human rights and democratic values.”
Likewise, in April, the European Union published a set of guidelines on how companies and governments should develop ethical applications of AI to address the issues that might affect society as we integrate AI into sectors like healthcare, education, and consumer technology.
“Before an organisation embarks on the project, it is vital for a regulation to be in place right from the beginning. This enables the vendor and the organisation to reach a common goal and understanding of what is ethical and right. With such practices in place, bias, breach of confidentiality, and ethical lapses can be avoided,” says Ecosystm Analyst, Audrey William. “Apart from working with the AI vendor and a service provider or systems integrator, it is highly recommended that the organisation consult a specialist such as the Foundation for Responsible Robotics, Data & Society, or AI Ethics Lab, which can help examine the parameters of ethics and bias before project deployment.”
Another challenge arises from a data protection perspective because AI models are fed with data sets for their training and learning. This data is often obtained from usage history and data tracking that may compromise an individual’s identity. The use of this information may lead to a breach of user rights and privacy which may leave an organisation facing consequences around legal prosecutions, governance, and ethics.
Another area that receives too little attention is racial and gender bias. Phone manufacturers have been criticised in the past on matters of racial and gender bias, when facial identification proved most accurate for light-skinned males. This opened conversations about how the technology performs on people of different races and genders.
San Francisco recently banned the use of facial recognition by the police and other agencies, arguing that the technology may pose a serious threat to civil liberties. “Implementing AI technologies such as facial recognition solutions means organisations have to ensure that there are no racial bias and discrimination issues. Any inaccuracy or glitches in the data may tend to make the machines untrustworthy,” says William.
Given what we know about existing AI systems, we should be very concerned that breaches of humanitarian law by this technology are more likely than not.
Could strong governance restrict the development and implementation of AI?
The disruptive potential of AI poses looming risks around ethics, transparency, and security, hence the need for greater governance. AI will be used safely only once governance frameworks and policies have been established to govern its use.
William thinks that, “AI deployments have positive implications for creating better applications in health, autonomous driving, smart cities, and eventually a better society. Worrying too much about regulations will impede the development of AI. A fine line has to be drawn between the development of AI and ensuring that development does not cross the boundaries of ethics, transparency, and fairness.”
While AI as a technology has a way to go before it matures, at the moment it is the responsibility of both organisations and governments to strike a balance between technology development and use, and regulations and frameworks in the best interest of citizens and civil liberties.