Phil Burr, Director at Lumai, on how 3D optical processing is a breakthrough for sustainable, high-performance AI hardware.

A few months ago, Nvidia’s CEO Jensen Huang outlined a growing datacentre problem. Speaking to CNBC, he revealed that not only will the company’s next-generation chip architecture – the Blackwell GPU – cost $30,000 to $40,000 per unit, but that Nvidia itself spent an incredible $10 billion developing the platform. 

These figures reflect the considerable cost of trying to draw out more performance from current AI accelerator products. Why are costs this high?

Essentially, the performance demand created by the surge in AI development is growing much faster than the capabilities of the underlying technology in today’s datacentre AI processors. The industry’s current solution is to add more silicon area, more power and, of course, more cost. But this is an approach of diminishing returns. 

Throw in the sizeable infrastructure bill that comes from activities such as cooling and power delivery, not to mention the substantial environmental impact of datacentres, and the sector urgently needs a new way of building its AI accelerators. This new way, as it turns out, is already being developed. 

Optical processing is an innovative and cost-efficient means of providing the necessary jump in AI performance. Not only can the technology accomplish this, it will simultaneously enhance the sector’s energy efficiency. The technique in question is 3D, or “free space”, optics. 

Making the jump to 3D 

3D optical compute is a perfect match for the maths that makes AI tick. If it can be harnessed effectively, it has the potential to generate immense performance and efficiency gains. 

3D optics is one of two optics solutions available in the tech landscape – the other is integrated photonics. 

Integrated photonics is ideally suited to interconnect and switching, where it holds huge potential. However, trials using integrated photonics for AI processing have shown that the technology can’t meet the performance demands of AI processing: it isn’t easily scalable and it lacks compute precision. 

3D optics, on the other hand, surpasses the restrictions of both integrated photonics and electronic-only AI solutions. Using just 10% of the power of a GPU, the technology provides the necessary leap in performance by using light rather than electrons to compute, performing highly parallel computation. 

For datacentres, a 3D optical AI accelerator brings many of the benefits seen in the optical communications we use daily, from rapid clock speeds to negligible energy use. These accelerators also offer far greater scalability than their ‘2D’ chip counterparts because they perform computations in all three spatial dimensions. 

The process behind the processor

Copying, multiplying and adding. These are the three fundamental operations of matrix multiplication, the maths at the heart of AI processing. The optical accelerator carries out these steps by manoeuvring millions of individual beams of light. In just one clock cycle, millions of parallel operations occur, with very little energy consumed. What’s more, the platform becomes more power-efficient as performance grows, because the available parallelism scales quadratically. 
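As an illustration of the arithmetic being parallelised, a matrix-vector product can be written purely in terms of those three steps. This is a plain-software sketch of the maths, not Lumai’s implementation; the optical analogies in the comments are indicative only.

```python
def matvec(matrix, vector):
    """Multiply an N x N matrix by a length-N vector using only
    the three operations named above: copy, multiply, add."""
    result = []
    for row in matrix:
        # Copy: the input vector is fanned out to every row
        # (optically, beam splitters replicate the light).
        copies = list(vector)
        # Multiply: element-wise products of weights and inputs
        # (optically, modulators attenuate each beam by its weight).
        products = [w * x for w, x in zip(row, copies)]
        # Add: sum the products into one output element
        # (optically, a lens focuses the beams onto one detector).
        result.append(sum(products))
    return result

# An N x N matrix-vector product performs N * N multiply-adds in a
# single pass, which is why the parallelism on offer grows
# quadratically with the number of beams per dimension.
print(matvec([[1, 2], [3, 4]], [10, 20]))  # [50, 110]
```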

Memory bandwidth can also impact an accelerator’s effectiveness. Optical processing enables a greater bandwidth without needing a costly memory chip, as it can disperse the memory across the vector width. 

Certain components found in optical processors already have evidence of successful use in datacentres. Google’s Optical Circuit Switch has used such devices for years, proving that employing similar technology is effective and reliable. 

Powering the AI revolution sustainably

Google’s news at the start of July illustrated the extent to which AI has triggered an increase in global emissions. It highlights just how much work the industry has to do to reverse this trend, and key to creating this shift will be a desire from companies to embrace new methods and tools. 

It’s worth remembering that between 2015 and 2019, datacentre power demand remained relatively stable even as workloads almost trebled. For the sector, this illustrates what’s possible. We need to come together to introduce inventive strategies that can sustain AI development without consuming endless energy. 

For every watt of power consumed, more energy and cooling are needed and more emissions are generated. If AI accelerators require less power, datacentres can therefore last longer, and there is less need for new buildings. 

A sustainable approach also aligns with a cost-efficient one. Rather than use expensive new silicon technology or memory, 3D optical processors can leverage optical and electronic hardware currently used in datacentres. Combine these cost savings with reduced power consumption and less cooling, and the total cost of ownership is a fraction of that of a GPU. 

An optical approach

Spiralling costs and rocketing AI performance demand mean current processors are running out of steam. Finding new tools and processes that can create the necessary leap in performance is crucial to the industry getting on top of these costs and improving its carbon footprint. 

3D optics can be the answer to AI’s hardware and sustainability problems, significantly increasing performance while consuming a fraction of the energy of a GPU. While broader changes such as green energy and sustainable manufacturing have a crucial part to play in the sector’s development, 3D optics delivers an immediate hardware solution capable of powering AI’s growth. 

  • Data & AI
  • Sustainability Technology

Matt Watts, Chief Technology Evangelist at NetApp UK&I, explores the relationship between skyrocketing demand for storage and the growing carbon cost associated with modern data storage.

Artificial Intelligence (AI) has found its way onto the product roadmap of most companies, particularly over the past two years. Behind the scenes, this has created a parallel boom in the demand for data, and the infrastructure to store it, as we train and deploy AI models. But it has also created soaring levels of data waste, and a carbon footprint we cannot afford to ignore. 

In some ways, this isn’t surprising. The environmental impact of physical waste is easy to see and understand – landfills, polluted rivers and so on. But when it comes to data, the environmental impact is only now emerging. In turn, as we embrace AI we must also embrace new approaches to manage the carbon footprint of the training data we use. 

In the UK, NetApp’s research classes 41% of data as “unused or unwanted”. Poor data storage practices cost the private sector up to £3.7 billion each year. Rather than informing decisions that can help business leaders make their organisations more efficient and sustainable, this data simply takes up vast amounts of space across data centres in the UK, and worldwide. 

Uncovering the hidden footprint of data storage waste

To demonstrate the scale of the issue, it is estimated that by 2026, 211 zettabytes of data will have been pumped into the global datasphere, already costing businesses up to one third of their IT budgets to store and manage. At the same time, nearly 68% of the world’s data is never accessed or used after its creation. This not only creates unnecessary emissions, but also means businesses are spending budget, and generating emissions, on storage and energy they simply don’t need. That budget could instead be invested more effectively in developing innovative new products or hiring the best talent. 

Admittedly, this conundrum isn’t entirely new, as over 50% of IT providers acknowledge that this level of spending on data storage is unsustainable. And the sheer scale of the “data waste” problem is part of what makes it so daunting, as IT leaders are unsure where to begin. 

Better data management for a greener planet

To tackle these problems confidently, IT teams need digital tools that can help them manage increasing volumes of data. Organisations must have the right infrastructure in place for CTOs and CIOs to confidently implement the data management practices that reduce waste. Additionally, IT leaders need visibility of all their data to ensure they comply with evolving data regulation standards. If they don’t, they could face fines and reputational damage. After all, who can trust a business that can’t locate, retrieve, or validate the data it holds – especially if it is its customers’ data?

This is why intelligent data management is a crucial starting point. Businesses spend an average of £213,000 per year storing and maintaining their data. This number will likely rise considerably as businesses collect more and more data for operational, employee and customer analytics. So by developing a strategy and a framework to manage the visibility, storage, and retention of data, businesses can begin chipping away at the data waste issue before it becomes even more unwieldy. 

From there, organisations can implement processes to classify data, and remove duplications. At the same time, conducting regular audits can ensure that departments are adhering to the framework in place. And as a result, businesses will be able to operate more efficiently, profitably, and sustainably. 

  • Infrastructure & Cloud
  • Sustainability Technology

Experts from IBM, Rackspace, Trend Micro, and more share their predictions for the impact AI is poised to have on their verticals in 2025.

Despite what can only be described as a herculean effort on the part of the technology vendors who have already poured trillions of dollars into the technology, the miraculous end goal of an Artificial General Intelligence (AGI) failed to materialise this year. What we did get was a slew of enterprise tools that sort of work, mounting cultural resistance (including strikes and legal action from more quarters of the arts and entertainment industries), and vocal criticism levelled at AI’s environmental impact. 

It’s not to say that generative artificial intelligence hasn’t generated revenue, or that many executives aren’t excited about the technology’s ability to automate away jobs – uh, I mean, increase productivity (by automating away jobs) – but, as blockchain writer and researcher Molly White pointed out in April, there’s “a yawning gap” between the reality that “AI tools can be handy for some things” and the narrative that AI companies are presenting (and, she notes, that the media is uncritically reprinting). She adds: “When it comes to the massively harmful ways in which large language models (LLMs) are being developed and trained, the feeble argument that ‘well, they can sometimes be handy…’ doesn’t offer much of a justification.” 

Two years of generative AI and what do we have to show for it?

Blood in the Machine author Brian Merchant pointed out in a recent piece for the AI Now Institute that the “frenzy to locate and craft a viable business model” for AI by OpenAI and other companies driving the hype train around the technology has created a mixture of ongoing and “highly unresolved issues”. These include disputes over copyright, which Merchant argues threaten the very foundation of the industry.

“If content currently used in AI training models is found to be subject to copyright claims, top VCs investing in AI like Marc Andreessen say it could destroy the nascent industry,” he says. Also, “governments, citizens, and civil society advocates have had little time to prepare adequate policies for mitigating misinformation, AI biases, and economic disruptions caused by AI. Furthermore, the haphazard nature of the AI industry’s rise means that by all appearances, another tech bubble is being rapidly inflated.” Essentially, there has been so much investment so quickly, all based on the reputations of the companies throwing themselves into generative AI — Microsoft, Google, Nvidia, and OpenAI — that Merchant notes: “a crash could prove highly disruptive, and have a ripple effect far beyond Silicon Valley.” 

What does 2025 have in store for AI?

Whether or not that’s what 2025 has in store for us — especially given the fact that an incoming Trump presidency and Elon Musk’s self-insertion into the highest levels of government aren’t likely to result in more guardrails and legislation affecting the tech industry — is unclear. 

Speaking less broadly, we’re likely to see more adoption of generative AI tools in the enterprise sector. As the CIO of a professional services firm told me yesterday, “the vendors are really pushing it and, well, it’s free isn’t it?” We’re also going to see AI impact the security sector, drive regulatory change, and start to stir up some of the same sanctimonious virtue signalling that was provoked by changing attitudes to sustainability almost a decade ago. 

To get a picture of what AI might have in store for the enterprise sector this year, we spoke to six executives across several verticals to find out what they think 2025 will bring. 

CISOs get ready for Shadow AI 

Nataraj Nagaratnam, CTO, IBM Cloud Security

“Over the past few years, enterprises have dealt with Shadow IT – the use of non-approved Cloud infrastructure and SaaS applications without the consent of IT teams, which opens the door to potential data breaches or noncompliance. 

“Now enterprises are facing a new challenge on the horizon: Shadow AI. Shadow AI has the potential to be an even bigger risk than Shadow IT because it not only impacts security, but also safety. 

“The democratisation of AI technology with ChatGPT and OpenAI has widened the pool of employees who could put sensitive information into a public AI tool. In 2025, it is essential that enterprises act strategically to gain visibility of, and retain control over, their employees’ usage of AI. With policies around AI usage and the right hybrid infrastructure in place, enterprises can put themselves in a better position to manage sensitive data and application usage.” 

AI drives a move away from traditional SaaS  

Paul Gaskell, Chief Technology Officer at Avantia Law

“In the next 12 months, we will start to see a fundamental shift away from the traditional SaaS model, as businesses’ expectations of what new technologies should do evolve. This is down to two key factors – user experience and quality of output.

“People now expect to be able to ask technology a question and get a response pulled from different sources. This isn’t new; we’ve been doing it with voice assistants for years – AI has just made it much smarter. With the rise of Gen AI, chat interfaces have become increasingly popular versus traditional web applications. This expectation for user experience will mean SaaS providers need to rapidly evolve, or get left behind.  

“The current SaaS models on the market can only tackle the lowest-common-denominator problem felt by a broad customer group, and you need to proactively interact with them to get them to work. Even then, they can only do 10% of a workflow. The future will see businesses using a combination of proprietary, open-source, and bought-in models – all feeding a Gen AI-powered interface that allows their teams to run end-to-end processes across multiple workstreams and toolsets.”

AI governance will surge in 2025

Luke Dash, CEO of ISMS.online

“New standards drive ethical, transparent, and accountable AI practices: In 2025, businesses will face escalating demands for AI governance and compliance, with frameworks like the EU AI Act setting the pace for global standards. Compliance with emerging benchmarks such as ISO 42001 will become crucial as organisations are tasked with managing AI risks, eliminating bias, and upholding public trust. 

“This shift will require companies to adopt rigorous frameworks for AI risk management, ensuring transparency and accountability in AI-driven decision-making. Regulatory pressures, particularly in high-stakes sectors, will introduce penalties for non-compliance, compelling firms to showcase robust, ethical, and secure AI practices.”

This is the year of “responsible AI” 

Mahesh Desai, Head of EMEA public cloud, Rackspace Technology

“This year has seen the adoption of AI skyrocket, with businesses spending an average of $2.5 million on the technology. However, legislation such as the EU AI Act has led to heightened scrutiny into how exactly we are using AI, and as a result, we expect 2025 to become the year of Responsible AI.

“While we wait for further insight on regulatory implementation, many business leaders will be looking for a way to stay ahead of the curve when it comes to AI adoption – and the answer lies in establishing comprehensive AI Operating Models: a set of guidelines for responsible and ethical AI adoption. These frameworks are not just about mitigating risks, but about creating a symbiotic relationship with AI through policies, guardrails, training and governance.

“This not only prepares organisations for future domestic and international AI regulations but also positions AI as a co-worker that can empower teams rather than replace them. As AI technology continues to evolve, success belongs to organisations that adapt to the technology as it advances and view AI as the perfect co-worker, albeit one that requires thoughtful, responsible integration.”

AI breaches will fuel cyber threats in 2025 

Lewis Duke, SecOps Risk & Threat Intelligence Lead at Trend Micro  

“In 2025, don’t expect the all-too-familiar issues of skills gaps, budget constraints or compliance to be sidestepped by security teams. Securing local large language models (LLMs) will emerge as a greater concern, however, as more industries and organisations turn to AI to improve operational efficiency. A major breach or vulnerability that’s traced back to AI in the next six to twelve months could be the straw that breaks the camel’s back. 

“I’m also expecting to see a large increase in the use of cyber security platforms and, subsequently, integration of AI within those platforms to improve detection rates and improve analyst experience. There will hopefully be a continued investment in zero-trust methodologies as more organisations adopt a risk-based approach and continue to improve their resilience against cyber-attacks. I also expect we will see an increase in organisations adopting 3rd party security resources such as managed SOC/SIEM/XDR/IR services as they look to augment current capabilities. 

“Heading into the new year, security teams should maintain a focus on cyber security culture and awareness. It needs to be driven from the top down and reach far across the organisation. For example, in addition to raising base security awareness, Incident Response planning and testing should be essential steps for organisations to stay prepared for cyber incidents in 2025. The key to success will be for security to keep focusing on the basic concepts and foundations of securing an organisation. Asset management, MFA, network segmentation and well-documented processes will go further towards protecting an organisation than the latest “sexy” AI tooling.” 

AI will change the banking game in 2025 

Alan Jacobson, Chief Data and Analytics Officer at Alteryx 

“2024 saw financial services organisations harness AI-powered processes in their decision-making, from using machine learning algorithms to analyse structured data to employing regression techniques to forecast. Next year, I expect that firms will continue to fine-tune these use cases, but also really ramp up their use of unstructured data and advanced LLM technology. 

“This will go well beyond building a chatbot to respond to free-form customer enquiries, and instead they’ll be turning to AI to translate unstructured data into structured data. An example here is using LLMs to scan the web for competitive pricing on loans or interest rates and converting this back into structured data tables that can be easily incorporated into existing processes and strategies.  
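To make that unstructured-to-structured idea concrete, here is a minimal sketch of turning free-form pricing text into a structured table. In practice an LLM would handle messy real-world copy; a regular expression stands in here so the example stays self-contained, and the lenders and rates are invented for illustration.

```python
import re

# Hypothetical free-form pricing text, as a web scrape or LLM
# summary might return it (lenders and rates are invented).
raw_lines = [
    "Acme Bank offers personal loans from 6.9% APR.",
    "Example Lender: fixed rate 5.4% APR on loans over £10,000.",
]

def to_table(lines):
    """Convert unstructured pricing sentences into structured rows."""
    table = []
    for line in lines:
        # Pull out the advertised rate, e.g. "6.9% APR".
        match = re.search(r"(\d+(?:\.\d+)?)\s*%\s*APR", line)
        if match:
            # Each row is now data an existing pricing process
            # can consume directly.
            table.append({"source_text": line,
                          "apr_percent": float(match.group(1))})
    return table

for row in to_table(raw_lines):
    print(row["apr_percent"])
```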

“This is just one of the use cases that will have a profound impact on financial services organisations. But only if they prepare. To unlock the full potential of AI and analytics in 2025, the sector must make education a priority. Employees need to understand how AI works, when to use it, how to critique it and where its limitations lie for the technology to genuinely support business aspirations. 

“I would advise firms to focus on exploring use cases that are low risk and high reward, and which can be supported by external data. Summarising large quantities of information from public sources into automated alerts, for example, plays perfectly to the strengths of genAI and doesn’t rely on flawless internal data. Businesses that focus on use cases where data imperfections won’t impede progress will achieve early wins faster, and gain buy-in from employees, setting them up for success as they scale genAI applications.” 

  • Cybersecurity
  • Data & AI
  • Sustainability Technology

Oliver Findlow, Business Development Manager at Ipsotek, an Eviden business, explores what it will take to realise the smart city future we were promised.

The world stands at the precipice of a major shift. By 2050, it is estimated that over 6.7 billion people – a staggering 68% of the global population – will call urban areas home. These burgeoning cities are the engines of our global economy, generating over 80% of global GDP. 

Bigger problems, smarter cities 

However, this rapid urbanisation comes with its own set of specific challenges. How can we ensure that these cities remain not only efficient and sustainable, but also offer an improved quality of life for all residents?

The answer lies in the concept of ‘smart cities.’ These are not simply cities adorned with the latest technology, but rather complex ecosystems where various elements work in tandem. Imagine a city’s transportation network, its critical infrastructure including power grids, its essential utilities such as water and sanitation, all intertwined with healthcare, education and other vital social services.

This integrated system forms the foundation of a smart city; complex ecosystems reliant on data-driven solutions including AI Computer Vision, 5G, secure wireless networks and IoT devices.

Achieving the smart city vision

But how do we actually achieve the vision of a truly connected urban environment and ensure that smart cities thrive? Well, there are four key pillars that underpin the successful development of smart cities.

The first is technology integration, where we see electronic and digital technologies woven into the fabric of everyday city life. The second is ICT (information and communication technologies) transformation, whereby ICT is used to transform both how people live and work within these cities. 

Third is government integration. It is only by embedding ICT into government systems that we will achieve the necessary improvements in service delivery and transparency. Then finally, we need to see territorialisation of practices. In other words, bringing people and technology together to foster increased innovation and better knowledge sharing, creating a collaborative space for progress.

ICT underpinning smart cities 

When it comes to the role of ICT and emerging technologies for building successful smart city environments, one of the most powerful tools is of course AI, and this includes the field of computer vision. This technology acts as a ‘digital eye’, enabling smart cities to gather real-time data and gain valuable insights into various, everyday aspects of urban life 24 hours a day, 7 days a week.

Imagine a city that can keep goods and people flowing efficiently by detecting things such as congestion, illegal parking and erratic driving behaviours, then implementing the necessary changes to ensure smooth traffic flow. 

Then think about the benefits of being able to enhance public safety by identifying unusual or threatening activities such as accidents, crimes and unauthorised access in restricted areas, in order to create a safer environment for all.

Armed with the knowledge of how people and vehicles move within a city, think about how authorities would be able to plan for the future by identifying popular routes and optimising public transportation systems accordingly. 

Then consider the benefits of being able to respond to emergency incidents more effectively with the capability to deliver real-time, situational awareness during crises, allowing for faster and more coordinated response efforts.

Visibility and resilience 

Finally, what about the positive impact of being able to plan for and manage events with ease? Imagine the capability to analyse crowd behaviour and optimise event logistics to ensure the safety and enjoyment of everyone involved. This would include areas such as optimising parking by monitoring parking space occupancy in real time, guiding drivers to available spaces and reducing congestion accordingly. 

All of these capabilities share one thing in common – data. 

Data, data, data 

The key to unlocking the full potential of smart cities lies in data, and it is by leveraging computer vision and other technologies that cities can gather and analyse it. 

Armed with this, they can make the most informed decisions about infrastructure investment, resource allocation, and service delivery. Such a data-driven approach also allows for continuous optimisation, ensuring that cities operate efficiently and effectively.

However, it is also crucial to remember that a smart city is not an island. It thrives within a larger network of interconnected systems, including transportation links, critical infrastructure, and social services. It is only through collaborative efforts and a shared vision that we can truly unlock the potential of data-driven solutions and build sustainable, thriving urban spaces that offer a better future for all.

Furthermore, this is only going to become more critical as the impacts of climate change continue to put increased pressure on countries and consequently cities to plan sustainably for the future. Indeed, the International Institute for Management Development recently released the fifth edition of its Smart Cities Index, charting the progress of over 140 cities around the world on their technological capabilities. 

The top 20 heavily features cities in Europe and Asia, with none from North America or Africa present. Only time will tell if cities in these continents catch up with their European and Asian counterparts moving forward, but for now the likes of Abu Dhabi, London and Singapore continue to be held up as examples of cities that are truly ‘smart’. 

  • Data & AI
  • Infrastructure & Cloud
  • Sustainability Technology

Gaurav Bansal, Senior Transformation Leader at Stellarmann, explores the steps organisations can take towards better Scope 3 reporting.

Everyone has a responsibility to help meet Net Zero targets. For businesses that means adhering to emerging reporting regulations around their Environmental, Social and Governance (ESG) obligations.

In the UK, for example, Streamlined Energy and Carbon Reporting (SECR) already requires large organisations to disclose their energy use, greenhouse gas (GHG) emissions and carbon footprint as part of their annual financial reporting. Many more businesses will also need to adhere to the Corporate Sustainability Reporting Directive (CSRD) and the Sustainability Disclosure Requirements (SDR) – which aims to tackle issues such as ‘greenwashing’. 

Pressure to be more transparent is coming from multiple areas – from international governments to shareholders and consumers. And, even if there isn’t a regulatory requirement for your organisation currently, if you’re in the supply chain of businesses that do have to report, you will increasingly be asked for your Scope 1 data as part of pitches and due diligence. Essentially, your Scope 1 data is someone else’s Scope 3. 

The consequences of not reporting effectively could be significant – both financially and in terms of brand reputation. Put simply, it’s not worth the risk.

Rather than fear these changes, however, companies should see this as an opportunity to gain visibility and clarity over their supply chains, identify areas where positive changes can be made, and become more sustainable, ethical, and competitive. 

People, processes and building a reporting platform

Compliance relies on gathering data from across the business and the wider supply chain, which can be challenging for organisations. This information will need to be pulled from disparate sources – especially when it comes to data around Scope 3 emissions. 

You also need to know who owns the data, and the frequency and cadence with which it is refreshed. A certain level of knowledge is required to understand units of measurement and how robustly suppliers are undertaking their own measurement.

All of this means building a dedicated ESG reporting team that understands what data needs to be reported on and where that data resides. 

This raises the question of where ESG should sit within the organisation, and who will lead it. Successful reporting relies on putting the right people and processes in place, and deciding which elements of an ESG reporting platform an organisation wants to build in-house and what it outsources.

There are seven simple steps that companies can follow when building the foundations:

Outline clear objectives

Set clear objectives for calculating carbon emissions. These should cover specific regulatory requirements to ensure compliance, as well as commercial considerations. It is essential to take a high-level approach to effectively monitor and reduce emissions.

Detail requirements and scope

Identify the data required to calculate Scope 1, 2 and 3 emissions. This includes emissions from data centres, property and power consumption, for example – as well as company travel and vehicles, and supply chain and financed emissions.

Define an overarching operating model and governance structure

Define an ongoing process for calculating and reporting on emissions, including tracking the progress of remedial actions. Set up an overarching governance structure and agree on roles and responsibilities across different divisions of the business.  

Appoint staff to roles identified in the operating model 

Make sure you have the right staff in place – and ensure that they have received sufficient training. This shouldn’t be tacked on to the day job, but resourced properly with people who are motivated by ESG issues. 

Identify skills or capability gaps 

ESG reporting teams need to evaluate the skills they possess in-house and where they need to bring in specialist consulting or technology partners, to build additional capabilities.

Don’t try to solve everything at once 

Focus on making incremental improvements and taking an iterative approach to ESG reporting. It’s essential to take time to understand obligations and timelines. This is necessary to ensure project deliverables are aligned to meeting the minimum requirements for critical targets.

Connect with industry peers 

Share knowledge with other organisations that are going through the process. ESG reporting teams should be encouraged to connect with their peers and exchange experiences and ideas to learn and improve. There are more and more opportunities to do this, through groups such as CFO Network, the Environmental Business Network or ESG Peers, for example.
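The calculation that the data gathered in step two ultimately feeds is simple in principle: activity data multiplied by an emission factor, summed per scope. A minimal sketch, using invented placeholder factors rather than official conversion factors:

```python
# Sketch of the calculation behind Scope 1, 2 and 3 reporting:
# emissions = activity data x emission factor, totalled per scope.
# The activities and kgCO2e factors below are placeholders for
# illustration, not official conversion factors.

ACTIVITY = [
    # (scope, activity, quantity, unit, kgCO2e per unit)
    (1, "natural gas burned", 12_000, "kWh", 0.18),
    (2, "grid electricity", 50_000, "kWh", 0.21),
    (3, "business flights", 90_000, "km", 0.15),
    (3, "purchased goods", 400_000, "GBP", 0.05),
]

def emissions_by_scope(activities):
    """Total kgCO2e per scope from (scope, name, qty, unit, factor) rows."""
    totals = {}
    for scope, _name, qty, _unit, factor in activities:
        totals[scope] = totals.get(scope, 0.0) + qty * factor
    return totals

totals = emissions_by_scope(ACTIVITY)
for scope in sorted(totals):
    # Report in tonnes of CO2e, the unit disclosures typically use.
    print(f"Scope {scope}: {totals[scope] / 1000:.1f} tCO2e")
```

In a real reporting platform, the hard part is the data pipeline feeding that table – ownership, refresh cadence and units, as discussed above – rather than the arithmetic itself.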

The path to better reporting

ESG reporting will become an imperative for businesses as we aim for Net Zero. Companies need to see it as a priority, and they should be preparing now. 

There are challenges, limitations and pain points that need addressing before companies can build their own ESG reporting model, however. Without standardisation, it’s important to establish what ‘good’ looks like for your individual business over time. 

Whichever route you choose, cross-departmental support will be critical, as ESG reporting has the potential to impact – and benefit – every part of the organisation. Those who lead it need the training and resources to do the job to the best of their ability. And, if the appropriate skills are not available in-house, companies should look to partner with specialists that can provide the expertise they need.

Ultimately, leaders and decision-makers must recognise that ESG reporting is not a burden or a threat, but a huge opportunity to reassess in-house processes and those of their partners. It could lead to positive changes that benefit the business, its customers and suppliers and, ultimately, the planet. 

For further guidance on preparing your organisation for the next chapter of sustainability reporting, click here to read Stellarmann’s most recent white paper.

  • Sustainability Technology

Martin Hartley, Group CCO at international IT and business consultancy emagine, on making complex, daunting sustainability goals more achievable.

‘Sustainability’ is not just a buzzword on business agendas, it is an urgent call to action for the corporate world. Incorporating more sustainable business practices is essential for the sake of people and planet, but also for corporate survival. 

Requirements around reporting emissions and meeting other sustainability criteria are far from uniform. Nevertheless, businesses that fail to work in a more environmentally and socially responsible way will get left behind by competitors, risking non-compliance as the regulatory landscape becomes more complex. 

Neither will the journey end, as goalposts move and official requirements, such as through the Corporate Sustainability Reporting Directive, increase over time.  

International companies in particular face complex challenges, but there are ways to break these down on the road to greater sustainability. 

Size matters to sustainability

The challenges and existing requirements vary greatly depending on the size, type and location of a business. 

Faced with making changes to company policies, practices and suppliers, small-to-medium-sized businesses have greater agility to pivot and adapt how they operate and whom they work with. They may only have a local market and legislation to consider. On the other hand, these firms have fewer financial resources to allocate, and becoming a more responsible business can initially come with higher costs, such as switching to more responsible suppliers that may be less cost-effective.

Whilst a larger business may have a deeper funding pot and more people to support the sustainability journey, these organisations face a complex task where operations span multiple international markets with respective local legislation and supply chains to manage. Businesses that are actively growing and acquiring other companies must quickly bring these operations in line with their ESG policies to ensure uninterrupted accountability. 

The importance of buy-in  

As in any project, setting clear goals and earning buy-in from all stakeholders are crucial steps. The board, senior leadership teams and employees at all levels across the business need to be involved and invested, or else new initiatives will fail. 

Organisations can overcome the initial reluctance to invest the time and effort it takes to build solid ESG values by educating teams on the value of more sustainable business. As well as the environmental and social benefits, there is no shortage of research into the advantages more ethical businesses enjoy in hiring and retaining talent, and in their growing appeal to potential clients – both of which ultimately impact operating profits.

Once you have buy-in, people need focus. ‘Sustainability’ is a broad term, and it is important to break it down into what it means for your business and set clear targets. Working with a reputable sustainability platform such as EcoVadis, for example, will provide structure, help manage ESG risk and compliance, support corporate sustainability goals, and guide overall sustainability performance.

Creating a tangible plan and building a project with milestones that involve everyone in the organisation will help to future-proof new policies. People are generally more eager to participate if there is an end goal to reach, such as achieving a particular sustainability rating.

What action to take? 

ESG efforts can focus on enhancing employees’ wellbeing and improving policies, actions and training, such as in relation to human rights, health and safety, diversity, equity, and inclusion. Refurbishment and recycling of IT equipment are also among potential measures.  

At emagine, as well as the above, over the last year we have put greater emphasis on our commitment to disclosing firmwide data and reducing CO2 emissions by signing up to the SBTi (Science Based Targets initiative) and using more green energy.

We have also signed a sustainability-linked loan with our bank, linking loans to ESG goals. The firm must live up to certain targets relating to ESG performance in order to get a discount on its fixed interest rates. This of course carries risk and demonstrates the firm’s commitment. 

Navigating the green maze of regulations and standards 

ESG is booming, maturing and changing every day. To embrace sustainable business, regularly analysing the ESG landscape, attending webinars, reading articles and leaning on professional networks is time well spent.

Some movements in the ESG space are not set in stone and can therefore be open to interpretation, and the number of new standards and trends that are constantly emerging can be overwhelming. This reinforces the importance of staying informed, so businesses can prioritise what matters to their organisation.  

Managing new acquisitions 

In our experience, when acquiring smaller companies, they are usually less advanced in their ESG initiatives. We can use our experience of adopting more sustainable practices to bring them in line with our existing operation, including achieving internal buy-in, relatively quickly. Businesses can greatly help this process by only exploring merger and acquisition opportunities with companies that have similar values from the outset. 

Every business is on a sustainability journey, whether voluntarily or not, as official requirements and consumer expectations around responsible business grow. An increasing number of organisations are voluntarily taking steps, such as disclosing emissions data through frameworks such as the Science Based Targets initiative (SBTi). To remain competitive and survive long-term, being proactive will be essential as well as the right thing to do.

  • Digital Strategy
  • Sustainability Technology

Despite pledging to conserve water at its data centres, AWS is leaving thirsty power plants out of its calculations.

While much of the conversation around the data centre industry’s environmental impact tends to focus on its (operational and embedded) carbon footprint, there’s another critical resource that data centres consume in addition to electricity: water.

Data centres consume a lot of water. Hyperscale data centres in particular, like those used to host cloud workloads (and, increasingly, generative AI applications) consume twice as much water as the average enterprise data centre.  

Server farming is thirsty work 

Data from Dgtl Infra suggests that, while the average retail colocation data centre consumes around 18,000 gallons of water per day (about the same as 51 households), a hyperscale facility, like those operated by Google, Meta, Microsoft, and Amazon Web Services (AWS), consumes an average of 550,000 gallons of water every day.
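The scale of that gap is easy to sanity-check against the figures above: if 18,000 gallons a day is roughly 51 households’ worth, then a 550,000-gallon-a-day hyperscale facility is drinking like a small town:

```python
colo_gallons_per_day = 18_000        # average retail colocation facility
colo_household_equiv = 51            # households with equivalent daily usage
hyperscale_gallons_per_day = 550_000

gallons_per_household = colo_gallons_per_day / colo_household_equiv
hyperscale_households = hyperscale_gallons_per_day / gallons_per_household
# works out to well over 1,500 households' worth of daily water use
```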

This means that clusters of hyperscale data centres — in addition to placing remarkable strain on local power grids — drink up as much water as entire towns. In parts of the world where the climate crisis is making water increasingly scarce, local municipalities are being forced to choose between having enough water to fuel the local hyperscale facility and providing clean drinking water to their residents. In many poorer parts of the world, tech giants with deep pockets are winning out over the basic human rights of locals. And, as more and more cap-ex is thrown at generative AI (despite the fact the technology might not actually be very, uh, good), these facilities are consuming more energy and more water all the time, placing ever more stress on local water supplies.

A report by the Financial Times in August found that water consumption across dozens of data centres in Virginia had risen by close to two-thirds since 2019. Facilities in the world’s largest data centre market consumed at least 1.85 billion gallons of water last year, according to records obtained by the Financial Times via freedom of information requests. Another study found that data centres operated by Microsoft, Google, and Meta draw twice as much water from rivers and aquifers as the entire country of Denmark. 

AWS pledges water positivity in Santiago 

Earlier in 2024, AWS announced plans to build two new data centre facilities in Santiago, Chile, a city that has emerged in the past decade as the leading hub for the country’s tech industry. The facilities will be AWS’ first in Latin America. 

The announcement faced widespread protests from local residents and climate experts critical of AWS’ plans to build highly water-intensive facilities in one of the most water-stressed regions in the world. Chile’s reservoirs — suffering from over a decade of climate-crisis-related drought — are drying up. The addition of more massive, thirsty data centres at a time when the country desperately needs all the water it can get has been widely protested. Shortly afterwards, AWS made a second announcement. This, on the face of it, was an answer to the question: where will Chile get the water to power these new facilities?

Amazon said it will invest in water conservation along the Maipo River — the main source of water for Santiago and the surrounding area. The company says it will partner with a water technology startup that helps farmers along the river install drip irrigation systems on 165 acres of farmland. If successful, the plan will conserve enough water to supply around 300 homes per year. It’s part of AWS’ campaign, announced in 2022, to become “water positive” by 2030. 

Being “water positive” means conserving or replenishing more water than a company and its facilities use. AWS isn’t the only hyperscaler to make such pledges; Microsoft made a similar one following local resistance to its facilities in the Netherlands, and Meta isn’t far behind.

However, much like pledges to become “net zero” when it comes to carbon emissions, water positivity pledges are more complicated than hyperscalers’ websites would have you believe. 

“Water positive” — a convenient omission 

While it’s true that AWS and other hyperscalers have taken significant steps towards reducing the amount of water consumed at their facilities, the power plants providing electricity for these data centres are still consuming huge amounts of water. Many hyperscalers conveniently leave this detail out of their water usage calculations. 

“Without a larger commitment to mitigating Amazon’s underlying stress on electricity grids, conservation efforts by the company and its fellow tech giants will only tackle part of the problem,” argued a recent article published in Grist. As energy consumption continues to rise, the uncomfortable knock-on consumption effects will also rise, as even relatively water-sustainable operations like AWS continue to push local energy infrastructure to consume more water to keep up with demand.

AWS may be funding dozens of conservation projects in the areas where it builds facilities, but despite claiming to be 41% of the way to being “water positive”, the company is still not anywhere near accounting for the water consumed in the generation of electricity used to power its facilities. Even setting aside this glaring omission, AWS still only conserves 4 gallons of water for every 10 gallons it consumes.    

  • Infrastructure & Cloud
  • Sustainability Technology

Jason Murphy, Managing Director of Global Retail at IMS EVOLVE, explores a new approach to supermarket sustainability.

Supermarkets are at the heart of our communities. As a result, they are the frontlines of the battle against climate change. As major players in the retail sector, supermarkets’ role in the UK’s clean energy transition is pivotal. 

Leading the charge by setting ambitious sustainability goals are top food retailers like Tesco, Morrisons, and Asda. Tesco and Morrisons aim for net zero operational emissions by 2035, and Asda has committed to net zero by 2040, with a 20% reduction in food waste by 2025.

Achieving these targets isn’t just about meeting regulations—it’s about redefining what it means to be a sustainable business.

While addressing scope 3 emissions across the entire value chain is crucial, supermarkets have a unique opportunity to make a tangible impact within their own operations too. Energy usage from high-consuming assets and food waste are just some of the sustainability challenges retailers face, and although they are significant, they are also surmountable. 

Digital solutions are revolutionising store operations, from cutting-edge energy management systems that optimise consumption to advanced analytics that drive efficient and effective maintenance, as well as minimising food waste. These technologies are not just tools; they are catalysts for change, enabling retailers to achieve their sustainability goals while enhancing efficiency and reducing costs.

In this new era, sustainability is not a burden, but rather an opportunity to lead and innovate. By embracing digital transformation, food retailers can pioneer a greener future, setting new standards for the industry and making a lasting positive impact on the planet.

Curbing Consumption

Reducing energy consumption through digitalisation is a game-changer for supermarkets. By optimising machines, such as refrigeration equipment and HVAC systems, retailers can achieve significant energy savings. Deploying solutions that are controls-agnostic means that seamless integration of any device, regardless of its manufacturer or age, into a modern digital system can be achieved at speed and scale. This approach transforms existing environments, allowing retailers to harness the power of Internet of Things (IoT) technology without the traditional need for costly machine upgrades. 

The result is a revolutionised operation that maximises efficiency while minimising costs and consumption.

Once integrated, these IoT solutions mine millions of raw, real-time data points from the retailer’s infrastructure. Everything from machine health and performance to energy consumption and set points is collected from thousands of machines across a retail estate, enabling visibility and control like never before. Advanced IoT solutions can then analyse the data to identify inefficiencies in machine performance. Beyond just detection, these systems automatically enact adjustments to ensure optimal output, protecting the integrity of assets, extending their life cycle, and reducing unnecessary energy consumption.

Furthermore, through clever contextualisation with other systems and data sets, IoT solutions can leverage the visibility and control they have over machines to automate more effective schedules, further reducing and optimising energy consumption. For example, stores can set lighting and HVAC systems to adjust and maintain themselves automatically based on store opening hours, slashing out-of-hours energy consumption and reducing costs.
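Reduced to its core, that scheduling logic is a matter of selecting a setpoint based on whether the store is trading. A sketch of the idea – the opening hours and temperature setpoints here are invented for illustration, not vendor defaults:

```python
from datetime import time

OPEN, CLOSE = time(7, 0), time(22, 0)  # illustrative trading hours
OCCUPIED_SETPOINT_C = 21.0             # HVAC target while the store is open
UNOCCUPIED_SETPOINT_C = 16.0           # relaxed target out of hours

def hvac_setpoint(now: time) -> float:
    """Return the HVAC target temperature for the current time of day."""
    return OCCUPIED_SETPOINT_C if OPEN <= now < CLOSE else UNOCCUPIED_SETPOINT_C
```

A production system would layer in merchandising data, weather forecasts and half-hourly tariffs, but the energy saving comes from exactly this kind of automated, occupancy-aware adjustment.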

Modernised Maintenance 

This unprecedented access to real-time performance and efficiency data is transforming maintenance, shifting it from reactive to predictive. IoT solutions continuously monitor assets for incremental changes and can identify early when an asset’s performance deviates from ideal conditions and is showing warning signs of a fault or failure. Advanced solutions can enact immediate and automatic changes to keep the asset within its peak operational efficiency. If these changes are unsuccessful in correcting the problem, the solution automatically creates an alert to notify a relevant engineer.
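The monitoring loop described above follows a simple pattern: compare each reading against the asset’s expected operating band, attempt an automatic correction, and escalate to an engineer only if readings stay out of band. A hedged sketch of that pattern – the thresholds and readings are hypothetical:

```python
def check_asset(readings, low, high, max_corrections=2):
    """Classify a run of sensor readings against an expected operating band.

    Returns "ok" (always in band), "corrected" (deviation handled within the
    allowed automatic adjustments), or "alert" (an engineer is needed).
    """
    corrections = 0
    for value in readings:
        if low <= value <= high:
            continue
        corrections += 1  # stand-in for an automatic setpoint adjustment
        if corrections > max_corrections:
            return "alert"
    return "corrected" if corrections else "ok"
```

In a real deployment the "correction" would be an actual command back to the asset, and the alert would carry the diagnostic context engineers need for a first-time fix.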

With access to this technology, engineers can often attempt remote fixes or accurately diagnose the issue before even arriving on-site. When a physical visit is necessary, engineers are equipped with detailed insights into the problem, ensuring that the right person, with the right tools and parts, is dispatched. This approach significantly increases the first-time fix rate, reducing both the manpower and the number of truck rolls required to resolve the issue.

Early fault detection and swift resolution are crucial in preventing catastrophic machine breakdowns, which can lead to excessive energy consumption or, in the case of refrigeration, the loss of valuable stock. By addressing issues before they escalate, retailers can maintain operational efficiency and minimise risks to their business.

Reducing Food Waste

With an estimated one-third of all the food produced in the world going to waste, tackling this complex problem is a critical sustainability challenge. Food retailers are at the forefront of this effort, using digital technology to improve food safety, quality and shelf life, significantly reducing waste levels.

IoT technology offers the granular monitoring and management of refrigeration to ensure immediate action and intervention is possible to protect perishable goods. Traditionally, the complexity of the supply chain has led to retailers chilling all food to the lowest temperature required by the most sensitive items, such as meat. However, with the integration of IoT technology and third-party data like merchandising systems, retailers can now automatically set, monitor, and maintain refrigeration temperatures tailored to the specific contents. As a result, not only does IoT hugely reduce energy consumption, but it also enhances food quality and minimises food wastage.

In response to extreme temperatures, such as the heatwaves in the summer of 2022, retailers are more focused than ever on maintaining optimal conditions for fresh produce and protecting against the heat. Digital technology supports this by implementing load-shedding strategies, shifting energy from less critical units (for example, those containing fizzy drinks) to support the most critical units, which require the most energy and the lowest temperatures (e.g. those containing fresh produce). This ensures product safety and freshness, reducing unnecessary food waste.

A Real-World Impact

Digital technology is revolutionising the food retail industry. Control-agnostic IoT solutions, real-time data collection, and automated action are helping retailers improve energy management, optimise machine maintenance, and reduce food waste. 

Going forward, food retailers must continue embracing digital innovation to stay flexible and responsive to new challenges, such as rising temperatures and increasing heatwaves. This commitment to technology will drive continued progress in sustainability, ensuring a greener future for the industry and the planet.

  • Digital Strategy
  • Sustainability Technology

Shelley Salomon, VP of Global Business at Amazon Business, discusses her company’s commitment to fostering gender diversity in procurement…

Procurement’s gender imbalance isn’t new.

Traditionally, the function was regarded as a male-dominated profession. But change is afoot, in more ways than one. While a digital transformation amidst technological innovation is well-publicised, another evolution is underway within the workforce.

Gender diversity has become an important component of many company strategies globally. While progress to encourage more women into procurement has already started, an imbalance remains, particularly in leadership positions. With current statistics suggesting around one in four leadership positions are held by women, there is still room for improvement.

So, is progress happening quickly enough? Shelley Salomon, VP of Global Business at Amazon Business, discusses her organisation’s commitment to fostering gender diversity and how women can reach parity in procurement. 

In your opinion, where is procurement today in terms of women’s representation in 2024?

Shelley Salomon: “Women’s representation in procurement has seen progress these past few years, but there remains room for further improvement. Gartner’s data shows that women comprise 41% of the supply chain workforce. It’s encouraging to see greater gender diversity within the industry.

“While these statistics are encouraging, they also highlight ongoing challenges. Particularly at the leadership level. Only 25% of leadership roles are held by women. This disparity underscores the need for sustained efforts to promote gender diversity and support women’s ascension to senior positions within procurement.

“My perspective on this trend is one of cautious optimism. The progress we see is promising, reflecting a growing recognition of women’s unique contributions to procurement roles. Diverse perspectives and gender equity are vital for effective decision-making and problem-solving. Additionally, multiple credible studies show that companies with the greatest gender balance in the C-suite are likelier to achieve above average financial results. However, much work must be done to ensure these advancements translate into lasting change.”

While progress to encourage more women into the workforce seems to be underway, there is still a major disparity in the number of women leaders in procurement. What is the best way to go about rectifying this? 

Shelley Salomon: “I believe there’s a significant opportunity to welcome more women into procurement leadership roles. By establishing robust mentorship and sponsorship programmes, organisations can provide invaluable guidance, support, and networking opportunities. Thus empowering women to thrive in their careers and gain visibility within the organisation. Investing in inclusive leadership development programmes is essential. These initiatives focus on building inclusive skills and readiness for leadership roles, continuing to foster a more inclusive and dynamic workforce.

“In my opinion, implementing inclusive hiring practices that actively promote gender diversity, such as using diverse hiring panels and conducting blind recruitment processes, is essential to minimising biases. 

“Lastly, setting clear, measurable goals for increasing the number of women procurement leaders and regularly reporting on progress to hold leadership teams accountable can drive meaningful change. By taking these proactive steps, organisations can create a more equitable environment that supports the advancement of women into leadership roles within procurement.”

Read the full story here!


  • AI in Procurement
  • Data in Procurement
  • Digital Procurement
  • Procurement Strategy
  • Sustainability Technology

Mav Turner, Chief Product and Strategy Officer at Tricentis, explores the relationship between software testing and sustainability.

According to the 2024 Gartner CEO and Senior Business Executive Survey, over two-thirds (69%) of global CEOs consider sustainability a significant growth opportunity for their business. Discussing the findings, Gartner highlighted that sustainability is one of the main factors that will “frame competition” and surpass both productivity and efficiency in terms of business priorities for 2024.

However, other research suggests that a significant gap exists between views around sustainability at board level and the actual implementation of enterprise-wide strategies and the tools needed to deploy them. Capgemini’s 2021 Sustainable IT report found that just 43% of executives are aware of their organisation’s IT footprint, while nearly half (49%) lack the tools required to adopt and deploy solutions that will deliver their sustainability goals.

The role of Quality Assurance 

One key aspect that is too frequently overlooked, yet could significantly impact sustainability when firms develop their products and services, is quality assurance (QA). It makes perfect sense if you stop and think about it: inefficient software developed without proper application testing processes will have an environmental cost.

Software testing has the potential to significantly improve resource optimisation and energy consumption. The testing process verifies that applications behave as they should and meet specified requirements, and identifies errors and defects to ensure that software operates at the highest level of efficiency.

For example, by simulating legitimate use cases and real-world scenarios, the testing process enables developers and QA teams to proactively identify inefficiencies that could increase energy consumption before applications go into production.

Removing inefficient code 

Identifying the reasons for and impact of inefficient code is another key benefit provided by implementing rigorous software testing. Poor resource management, inadequate memory usage and redundant computations are some of the most common factors.

Such inefficiencies have far-reaching implications, particularly in today’s enterprise environments dominated by cloud computing and distributed systems. Slow execution times, excessive memory consumption, and increased energy usage all lead to an increase in operational costs, present scalability challenges, and negatively impact end-user experiences.

Identifying these issues early on in the application lifecycle and optimising code efficiency will allow developers to minimise resource wastage while also enhancing performance and the overall sustainability of their applications and codebases.

By incorporating test-driven development in this way, emphasising test creation before writing code, developers will have a much clearer understanding of their code’s functionality and expected behaviour from the outset.

Ultimately, this approach creates green code because consistently running tests throughout the development process helps identify and prevent defects early, resulting in cleaner code that is less prone to bugs and easier to maintain over time. 
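As a toy illustration of that test-first flow (the function and its name are invented for this example), the test is written against the intended behaviour before the implementation exists, and the implementation is then written to satisfy it – here with an efficient set-based lookup rather than a wasteful nested scan:

```python
def test_find_common_ids():
    # Written first: pins down expected behaviour before any code exists.
    assert find_common_ids([1, 2, 3], [2, 3, 4]) == {2, 3}
    assert find_common_ids([], [1]) == set()

def find_common_ids(a, b):
    """Intersection via sets: O(n + m) rather than a nested O(n * m) scan."""
    return set(a) & set(b)

test_find_common_ids()
```

The efficiency point is the sustainability point: the same test would pass with a nested-loop implementation, but running the suite continuously makes it cheap to swap in the greener version without fear of breaking behaviour.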

In a practical sense, automating the testing process not only requires less processing power and fewer resources than the traditional, time-intensive manual approach, but also frees up time – a scarce and valuable resource – for IT teams to dedicate to more critical tasks.

Out with old data

Another critical element to consider is the impact of old, legacy data, which can cause a number of sustainability-related challenges. Too often, and sometimes unknowingly, enterprises hold onto huge volumes of poor-quality, old data, which negatively impacts both application performance and the time taken to produce business-critical reports. 

This also directly impacts energy consumption by increasing the amount of energy consumed by devices and machines running the applications. This is where data integrity testing can play a vital role: evaluating legacy data quality to pinpoint any data redundancies and ultimately reduce an organisation’s carbon footprint. 
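Reduced to a sketch, that kind of data integrity check is a scan for exact duplicates and stale records that could be archived or purged. The field names and cutoff below are hypothetical:

```python
def find_redundant(records, cutoff_year=2015):
    """Return (duplicates, stale) record lists from a legacy dataset."""
    seen, duplicates, stale = set(), [], []
    for rec in records:
        key = (rec["customer_id"], rec["year"])
        if key in seen:
            duplicates.append(rec)   # exact repeat of an earlier record
        else:
            seen.add(key)
        if rec["year"] < cutoff_year:
            stale.append(rec)        # candidate for archiving or purging
    return duplicates, stale

records = [
    {"customer_id": 1, "year": 2012},
    {"customer_id": 1, "year": 2012},   # duplicate row
    {"customer_id": 2, "year": 2023},
]
dupes, stale = find_redundant(records)
```

Every redundant record removed is storage that no longer needs to be powered, backed up and scanned by every report that touches the dataset.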

If sustainability is to truly deliver the impact business leaders predict and demand, then meeting sustainable goals must start with an in-depth look at IT operations through a quality assurance lens. There is a direct link between software testing and successfully delivering on sustainability pledges, which can no longer be overlooked or ignored. 

  • Digital Strategy
  • Sustainability Technology

Clare Walsh at the Institute of Analytics explores how, while your chatbot may look like your online search browser, there are some dramatic differences between the two technologies, with serious implications for organisational sustainability.

In the early days of growing environmental awareness, the ‘paperless office’ was hailed as a release from the burden of deforestation, then the most urgent concern. The machines that replaced filing cabinets came with other, less visible, environmental costs. The latest generation of machines are the dirtiest we have ever produced, and we need to factor their carbon impact into our environmental planning. 

When mandatory ESG reporting was introduced in the UK, the technology sector was not among the first sectors required to comply. Part of the reason the tech sector draws less attention to itself is that we don’t have clear, headline-busting statistics to rely on. For example, according to Google, one internet search produces approximately 0.2g of CO2. If your website gets around 10,000 views per year, that’s around 2kg of CO2 per year. Add chatbot functionality to that website and you jump into a whole different league.

The hidden costs of new algorithms

Chatbots are based on Large Language Model algorithms, which have very little in common with the search browsers we’re more familiar with, even if their interfaces look familiar. Every time you run a query in a service like Bard, Llama or Copilot, the machine has to traverse every data point in its network. We don’t know for certain how big that network is, but estimates that ChatGPT-4, for example, runs on around 4 x 1.7 trillion bytes are plausible.

We aren’t yet able to measure how much CO2 that produces with every query. Estimates range from 15 to 100 times more carbon produced by one sophisticated chatbot request compared to a regular search query, depending on how you factor in the trillions and trillions of times the machine had to run over that data set during the ‘training’ phase, before it was even released. And many of us are entering queries in a casual, back-and-forth conversational style, as if we’re chatting to a friend.

Given that these machines are now responding daily to trivial and minor requests across organisational networks, the CO2 production will quickly add up. It is time to look at the environmental bottom line of these technologies.

Solutions on the horizon

Atmospheric carbon may come under some control soon. In the heart of Silicon Valley, the California Resources Corporation saw their plans for carbon capture and storage reach the draft permission stage earlier this month. There are another 200 applications for similar projects waiting in line. Under such schemes, carbon is returned to the earth in ‘TerraVaults’. The idea is to remove it from the atmosphere by injecting it deep into depleted oil reserves left behind after fossil fuel extraction. It’s the kind of solution that is popular, because it takes the onus of lifestyle change away from the public. However, it’s a controversial technology that divides environmental experts. 

Only half an answer to a complicated problem

It also only addresses half the problem. These supercomputers burn through carbon at a shocking rate when they power up. They also need electricity to cool down. In fact, it is estimated that 43% of data centre electricity could go on cooling alone. Regional water stress is a major part of the climate problem, too. Data centres guzzle water to run their cooling systems at a rate of millions of litres per year. This is nothing, however, compared to the volume of water needed to run the steam turbines that generate the electricity. It’s a vicious cycle of depletion.

It is an irony that the supercomputers that threaten the environment are also needed to save it. Without the kind of climate modelling that a supercomputer can provide, it will be harder to respond to climate challenges. Supercomputers are also improving their own efficiency: manufacturers today use processors that constantly try to operate at maximum efficiency, since a faster result means less energy consumption. These top-end dilemmas over whether to use these machines are similar to those faced at an organisational level. At what point does it become worthwhile? 

What you can do

We need to develop a culture of transparency around the true cost of these sophisticated technologies. Transparency supports accountability and it benefits those who are doing the right thing. There are data centres that use 100% renewable energy today. Some, like Digital Realty, have even achieved carbon net neutrality in their operations in France. As more of us ask uncomfortable questions about where our chatbots are powered, we’ll start to get better answers.

In the meantime, the solution lies mostly in sensible deployment of these technologies. If your organisation is committed to the drive to net neutrality, it is worth considering where and how you apply these advanced technologies to meet the commitments your organisation has made. A customer-facing chatbot may not be the optimal solution for your business or environmental needs.

  • Data & AI
  • Sustainability Technology

Businesses have been forced to navigate and adapt to these challenges to ensure continuity, limit interruption and reduce risk

From Brexit to the pandemic and the current geopolitical conflict, the supply chain industry has faced a flood of challenges in recent years, causing widespread disruption to supply chains. Businesses have been forced to navigate and adapt to these challenges to ensure continuity, limit interruption and reduce risk. 

Alice Strevens, Director Human Rights and Social Impact, Mazars 

As part of this, it’s increasingly important for businesses to ensure they have robust human rights due diligence processes in place. These processes support companies in their decision-making during crises, and help them identify risks in their supply chains. This ultimately protects them in both stable and unstable times. 

Human rights and environmental due diligence provides a basis on which to address environmental, social and governance issues that impact supply chain resilience. Companies that respond to crises with an approach based on due diligence are more likely to protect their relationships with suppliers and to mitigate the impact on workers in their value chain. The Covid-19 pandemic is a case in point: many suppliers saw buyers abruptly cancel orders, request full refunds and pause orders for months. With many suppliers facing reduced sales at the time, questions arose as to whether businesses were working alongside suppliers or taking advantage of the circumstances to cut costs. 

It’s important to learn from these lessons to build strong, sustainable supply chain strategies. This will help businesses remain resilient both in stable times and in the face of significant events. There isn’t a perfect formula. However, the concept of double materiality (i.e. considering sustainability matters both from the perspective of the impact on people and the environment, and from the perspective of the financial risks and opportunities to the business) is helping businesses to assess sustainability-related risk strategically.  

Supplier engagement will ensure long-term success 

Building a sustainable supply chain for the long-term requires engagement and collaboration with supply chain partners. Long-term relationships can provide a basis to share challenging risks and impacts transparently. Human rights and environmental due diligence foregrounds the importance of engagement and collaboration to mitigate identified risks and build resilience. 

The responsible supply chain strategy should be integrated into the overarching sourcing strategy and supplier engagement approach. Delivery against the strategy should be built into performance targets and incentives, and regular reviews of impacts, targets and KPIs should be conducted at board level. Companies should also make use of the latest technological developments, both assessing them for social and environmental risk and using them to measure and track performance. This will help companies stay ahead and be prepared in their processes. 

An evolving regulatory landscape calls for preparedness 

Another important point to keep in mind is the legislative landscape. This is especially pertinent in the EU, where new rules will make previously voluntary standards mandatory and will impact large companies, as well as those in their supply chains, including in the UK. 

Companies should therefore look to base their strategies on the authoritative voluntary frameworks for conducting human rights and environmental due diligence, primarily the UN Guiding Principles on Business and Human Rights and the OECD Guidelines for Multinational Enterprises on Responsible Business Conduct. This will set them up to meet legislative requirements down the line. For example, Mazars and Shift co-wrote the UNGP Reporting Framework, which provides a framework for companies to adopt responsible practices and manage human rights risks. 

The future of supply chain is now 

Ultimately, companies and suppliers should work together to ensure collaboration and a robust strategy that takes all parties into consideration. Listening to feedback and promoting good communication between stakeholders will ensure smooth sailing during business-as-usual times and the more tumultuous periods alike. 

Implementing long-lasting strategies and creating resilience to risks will increase businesses’ market access and promote their financial value, ensuring that they deliver quality goods and gain loyalty among suppliers. 

Read the full issue of SCS here!

  • Sustainability Technology
  • Sustainable Procurement

Soqui Calderon, Regional Director of Sustainability for Grupo Modelo and the Middle Americas Zone, reveals how the beverage giant is tackling sustainability from a procurement and supply chain perspective

Our exclusive cover story this month is with Soqui Calderon, Regional Director of Sustainability for Grupo Modelo and the Middle Americas Zone. She reveals how the beverage giant is tackling sustainability from a procurement and supply chain perspective. 

Grupo Modelo is a leader in the production, distribution and sale of beer in Mexico. Part of AB InBev’s Middle Americas Zone, it boasts 17 national brands, among them Corona Extra, the most valuable brand in Latin America. Its other brands include Modelo Especial, Victoria, Pacífico and Negra Modelo. The company also exports eight brands, has a presence in more than 180 countries and operates 11 brewing plants in Mexico. 

Through more than nine decades, Grupo Modelo has invested and grown within – and with – Mexico. It has also generated more than 30,000 direct jobs in its breweries and vertical operations throughout the country. 

Grupo Modelo, like many forward-thinking companies, is currently focused on a drive towards establishing a truly sustainable business. This endeavour is best exemplified in the Middle Americas Zone (MAZ), where sustainability efforts have been led for the past five years by Soqui Calderon Aranibar, Regional Sustainability and ESG Director. Ambitious targets have been established for the region, and some remarkable achievements have already been made. As Calderon says: “For our team, sustainability is not just part of our business, it IS our business.” 

Sustainability in the MAZ 

The Middle Americas Zone is made up of several countries: Mexico, Colombia, Perú, Ecuador, Honduras, El Salvador, the Dominican Republic, Panama, Guatemala and a number of Caribbean islands. Each territory is home to its own brands that are household names in their respective countries, while Grupo Modelo’s Corona beer, manufactured in Mexico, is one of the top five best-selling beers globally.

Calderon’s regional role means she travels extensively throughout the territories, engaging with all their businesses and collaborating closely with their partners and suppliers. Her job? To effectively outline their sustainability goals…

Read the full story here!

Elsewhere we have some incredible names imparting expert insights from companies such as Amazon Business, Source Day, DHL and Marriott International and lots, lots more! 

Read the full issue here!

Enjoy! 

  • AI in Procurement
  • Data in Procurement
  • Sustainability Technology
  • Sustainable Procurement

Rolf Bienert, Managing and Technical Director at OpenADR Alliance, a global industry alliance, discusses the potential for virtual power plants as an untapped resource.

Balancing supply and demand is critical to maintain a reliable electricity grid. Virtual Power Plants (VPPs) present an innovative and alternative solution, enabling local grid operators to use energy flexibility to ensure a more stable supply, improved energy efficiency and enhanced grid capacity.

The potential for virtual power plants

With the energy sector focusing more on renewable forms of energy and distributed energy resources (DER), VPPs are attracting more attention, delivering value to customers and offering potentially huge benefits to DER installers, grid operators and utilities.

Because they draw on the capacity of a range of energy sources, such as wind turbines, solar panels and electric vehicles, together with battery storage and other assets, VPPs can cost far less to implement than traditional power plants. Controlled by grid operators or third-party aggregators, these energy resources can be monitored and optimised, with bi-directional communication between components, for a more efficient and resilient power grid.

Looking to a less carbon-intensive energy future, VPPs could play a key role in providing resource adequacy and other grid services at a negative net cost to the utility.

The global market for VPPs was expected to grow to $2.36 billion in 2023 at a compound annual growth rate (CAGR) of 22.5%, according to the Virtual Power Plant Global Market Report 2023. Despite geopolitical issues, rising commodity prices and supply chain disruptions, the market is expected to reach $5.04 billion by 2027 at a CAGR of 20.9%.
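
Those projections are plain compound-growth arithmetic, so they are easy to sanity-check against the figures quoted from the report:

```python
def project_cagr(start_value, cagr, years):
    """Compound a starting value forward at a constant annual growth rate."""
    return start_value * (1 + cagr) ** years

# Figures quoted above: a $2.36bn market in 2023, compounding at 20.9% a year
# over the four years from 2023 to 2027.
projected_2027 = project_cagr(2.36, 0.209, 4)
print(f"Projected 2027 VPP market: ${projected_2027:.2f}bn")  # ≈ $5.04bn
```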

As a member-led industry alliance, we can see momentum shifting already, with major players in the VPP sector focusing on the adoption of advanced technologies and open standards, which is helping to drive growth. Partnerships will be key to this growth, as utilities and energy providers collaborate with technology companies and device manufacturers to turn homes, workplaces, and communities into virtual power plants.

Two companies, Swell Energy and SunPower, are playing their part in this transformative shift, having established VPPs that offer new value to utilities and their customers.

Making waves in Hawaii

Swell Energy creates VPPs by linking utilities, customers and third-party service providers together, and by aggregating and co-optimising DER through its software platform. The VPPs provide a variety of grid service capabilities through projects in Hawaii, California, and New York, so utilities can deliver cleaner energy to customers and reduce dependence on fossil fuels.

The project in Hawaii, where Swell is working with Hawaiian Electric, represents a major advance in aggregated battery storage management technology. It will co-optimise batteries in 6,000 homes to create a decentralised power plant for the local utility across three islands. The programme will deliver 80 megawatt-hours of grid services using OpenADR-based integration, including capacity reduction, capacity build and fast frequency response to the three island grids, while also reducing bills and providing financial incentives for participating customers.

The VPP tackles several challenges, driven by Hawaiian Electric’s need for energy storage and renewable generation through DER, along with capacity and ancillary services to ensure adequate supply and system reliability across its service territory.

 Futureproofing energy supplies 

VPPs are also futureproofing energy supplies. Hawaii became the first US state to commit to generating 100% of its electricity from renewables by 2045, which means replacing fossil-fuelled plants with sustainable alternatives. While Hawaii has plentiful sunshine, grids can become saturated with solar production at midday, requiring batteries to store the surplus and make it available after the sun goes down.

Swell Energy will supplement Hawaiian Electric’s energy supply by relieving the grids of excess renewable energy as production spikes and absorbing excess energy when needed, reducing peak demand and providing 24/7 response to balance the grids. The renewable energy storage systems collectively respond to grid needs dynamically.

The model is a win-win. It provides homeowners with backup power and savings on their energy bills. At the same time, battery capacity is available to the utility to deal with the challenges of transitioning to a much cleaner energy source. This requires balancing grid needs while ensuring that customers are backed up and compensated. 

Rewarding customers in California

Global solar energy company SunPower’s VPP platform interfaces with utility DERMS platforms to ensure its customers’ SunVault storage systems are charging and discharging in line with the needs of the utility grid. The goal is to enroll customers in the program, dispatch according to the utility’s schedule, handle customer opt-outs and report performance data to the utility. As SunPower is a national installer, it must be able to communicate with dozens of utilities across the country.

The company also announced a partnership with OhmConnect to provide a new VPP offering for SunPower customers in California. Homeowners in selected locations with solar and SunVault battery storage can now connect with OhmConnect directly through the mySunPower app to earn rewards for managing their electricity use during times of peak demand. The idea is to make it as simple as possible for customers, putting them in full control of their energy use. 

The future potential of virtual power plants 

VPP programs like these demonstrate how to balance energy supply and demand on the network by adjusting or controlling the load during periods of peak demand, supporting the health of the grid, absorbing excess renewable energy, and much more.

Companies are already showcasing the potential capabilities of an advanced, distributed, and dispatchable energy future. But there are a relatively small number of initiatives globally. With the technology and communications standards to support it available, we need more opportunities like this to drive greater adoption and participant enrolment.

The timing has never been more important as we look ever more closely at an energy future that relies less on fossil fuels. With growing demands on the grid, especially in densely populated cities, and with increasingly extreme weather events, VPPs offer an attractive solution. 

But it’s up to utilities, energy companies and partners to work together and embrace change, with governments supporting and driving change through regulation.

  • Infrastructure & Cloud
  • Sustainability Technology

Simon Yeoman, CEO at Fasthosts, discusses how businesses can ensure their cloud storage is more sustainable in an age of rising demand for data and AI.

With over half of all corporate data held in the cloud as of 2022, demand for cloud storage has never been higher. This has triggered extreme energy consumption throughout the data centre industry, leading to hefty greenhouse gas (GHG) emissions.

Worryingly, the European Commission now estimates that by 2030, EU data centre energy use will increase from 2.7% to 3.2% of the Union’s total demand. This would put the industry’s emissions almost on par with pollution from the EU’s international aviation.

Despite this, it must be remembered that cloud storage is still far more sustainable than the alternatives. 

Why should we consider cloud storage to be sustainable?

It’s important to put the energy used by cloud storage into context and consider the savings it can make elsewhere. Thanks to file storage and sharing services, teams can collaborate and work wherever they are, removing the need for large offices and everyday commuting.

As a result, businesses can downsize their workspaces as well as reduce the environmental impact caused by employees travelling. In fact, it’s estimated that working from home four days a week can reduce nitrogen dioxide emissions by around 10%. 

In addition, cloud storage reduces reliance on physical, on-premises servers. For small and medium-sized businesses (SMBs), having on-site servers or their own data centres can be expensive, whilst running and cooling the equipment requires a lot of energy, which means more CO2 emissions. 

Cloud servers, on the other hand, offer a more efficient alternative. Unlike on-premises servers that might only be used to a fraction of their capacity, cloud servers in data centres can be used much more effectively. They often operate at much higher capacities, thanks to virtualisation technology that allows a single physical server to act as multiple virtual ones. 

Each virtual server can be used by a different business, meaning fewer physical units are needed overall. Less energy is then required to power and cool them, leading to a reduction in overall emissions.

Furthermore, on-premises servers often have higher storage and computing capacity than needed just to handle occasional spikes in demand, which is an inefficient use of resources. Cloud data centres, by contrast, pool large amounts of equipment to manage these spikes more efficiently. 

In 2022, the average power usage effectiveness of data centres improved. This indicates that cloud providers are using energy more efficiently and helping companies reduce their carbon footprint with cloud storage.
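
Power usage effectiveness (PUE) is the metric behind that observation: the ratio of total facility energy to the energy that actually reaches the IT equipment, where 1.0 is the theoretical ideal. A minimal sketch, with hypothetical sample figures chosen only to show the calculation:

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: total facility energy / IT energy (ideal = 1.0)."""
    return total_facility_kwh / it_equipment_kwh

def overhead_share(pue_value):
    """Fraction of total energy spent on cooling, power delivery and other overhead."""
    return (pue_value - 1) / pue_value

# Hypothetical facility: 1.58 kWh drawn for every 1 kWh reaching the servers.
p = pue(1.58, 1.0)
print(f"PUE: {p:.2f}, overhead share: {overhead_share(p):.0%}")
```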

A sustainable transition: three steps to create green cloud storage

Importantly, there are ways to further improve the sustainability of services like cloud storage, which could translate to energy savings of 30-50% through greening strategies. So, how can ordinary cloud storage be turned into green cloud storage? We believe there are three fundamental steps.

Firstly, businesses should carefully consider location. This means choosing a cloud storage provider that’s close to a power facility. This is because distance matters. If electricity travels a long way between generation and use, a proportion is lost. In addition, data centres located in cooler climates or underwater environments can cut down on the energy required for cooling.

Next, businesses should quiz green providers about what they’re doing to reduce their environmental impact. For example, powering their operations with wind, solar or biofuels minimises reliance on fossil fuels and so lowers GHG emissions. Some facilities house large battery banks to store renewable energy and ensure a continuous, eco-friendly power supply.

Last but certainly not least, technology offers powerful ways to enhance the energy efficiency of cloud storage. Some providers have been investing in algorithms, software and hardware designed to optimise energy use. For example, introducing frequency scaling or AI and machine learning algorithms can significantly improve how data centres manage power consumption and cooling. 

For instance, Google’s use of its DeepMind AI has reduced its data centre cooling bill by 40% – a prime example of how intelligent systems can work towards greater sustainability. 

At a time when the world is warming up at an accelerating rate, selecting a cloud storage provider that demonstrates a clear commitment to sustainability can have a significant impact. In fact, major cloud providers like Google, Microsoft and Amazon have already taken steps to make their cloud services greener, such as by pledging to move to 100% renewable sources of energy.  

Cloud storage without the climate cost

The cloud’s impact on businesses is undeniable, but our digital growth risks an unsustainable future with serious environmental consequences. However, businesses shouldn’t have to choose between innovation and the planet.

The answer lies in green cloud storage. By embracing providers powered by renewable energy, efficient data centres, and innovative technologies, businesses can reap the cloud’s benefits without triggering a devastating energy tax. 

The time to act is now. Businesses have a responsibility to choose green cloud storage and be part of the solution, not the problem. By making the switch today, we can ensure the cloud remains a convenient sanctuary, not a climate change culprit.

  • Infrastructure & Cloud
  • Sustainability Technology

ESG data can not only measure and help improve organisations’ sustainability performance, but also improve organisational efficiency and cut costs.

Caitlin Keam, Director of Product Management (ESG), at IFS explores how organisations can take the next step in committing to data transparency, prioritising accessibility, and embedding sustainability into the heart of their operations.

With a heightened global focus on sustainability practices, there is a real opportunity for data to help improve the operational efficiency of businesses across manufacturing, utilities, construction, and engineering. 

A December 2020 survey from NAVEX Global found that about two-thirds of privately owned companies have ESG initiatives in place. This is the latest in a series of signs that organisations are beginning to take more meaningful action around ESG. 

ESG data is more than just box-ticking

What may not be as well understood is that, in addition to being a green tick against regulatory compliance, ESG data can also provide an edge in a competitive marketplace. 

However, this can often be hindered by mandatory reporting requirements and a box-ticking attitude. Metrics often aren’t fully understood, which holds organisations back from unlocking the real potential of their ESG data.

This data can go far beyond simply measuring carbon emissions, which is where the process begins and ends for many. It encompasses all business functions, from supply chain labour transparency to social metrics. It is also everywhere: in Excel spreadsheets, in siloed, disconnected systems and at disparate sites across the world, which makes it difficult for organisations to access their data in a timely way. Without a full understanding of the value of that data and the many sources it comes from, businesses cannot unlock its real potential.

This is the true test for organisations. It isn’t about having a huge volume and range of ESG data to track and collect. Rather, it’s about rendering the information accessible, intelligible, and ultimately, impactful. 

ESG reporting, when fuelled by clear, comprehensive, and accurate datasets can have multi-faceted impacts, potentially influencing everything from day-to-day operational decisions to long-term strategic planning and global market trends.

Manufacturers, for instance, might leverage ESG data to understand where they can make changes within their processes to reduce energy use, while retailers might monitor their supply chains for opportunities for more ethical sourcing. 

The challenge, however, transcends mere data collection; it lies in transforming this data into actionable insights. Achieving this means promoting transparency and equipping all sectors of a business with an understandable and actionable sustainability framework, embedding ESG considerations into every decision-making process.

The data dilemma in ESG reporting

The journey to lucid ESG reporting is convoluted due to diverse methodologies and conflicting data sources. 

Companies grapple with simplifying data capture and enhancing accessibility. Complicating matters further is the issue of data transparency and trust. Some organisations are hesitant to share their ESG data for fear of exposing potential flaws in their reporting, while others may turn to consultants because they lack the knowledge or confidence to gather accurate data on their own. Companies clearly need to own their reporting process, avoiding reliance on estimated data and instead focusing on accurate, holistic information that leads to precise environmental impact assessments and strategic decisions.

Additionally, contemporary consumers are increasingly value-driven, often seeking detailed insights into a company’s ESG practices before deciding where to spend. 

Companies, therefore, must prioritise transparency in areas like carbon emissions and sustainable practices to meet these evolving expectations. For example, a software company that discloses accurate emissions data pertaining to product usage can foster deeper trust and satisfy the modern customer’s desire for transparency. 

Leveraging technology to navigate ESG complexity

So what’s the solution to all this? The emergence of cloud-based platforms that provide centralised repositories for ESG data is making it simpler for businesses across industries, including manufacturing, engineering and construction, to access and manage data from different departments, while ensuring that it is consistent, accurate and up to date.

Technology is also making it easier to connect and embed ESG data from various sources, including internal systems, external databases, and third-party vendors. This data harmonisation ensures that data is standardised and compatible, enabling meaningful analysis and reporting. This is key as it enables ESG data to be used to efficiently drive the core operations of the business.

Rather than having a dedicated sustainability team capture data separately, organisations need to embed sustainability measurements into existing processes. 

For instance, having finance teams capture consumption data at the source, alongside financial data, not only saves time but also increases efficiency. 

Capturing that data and sharing it more widely helps those responsible for setting and meeting ESG targets to create greater alignment with the broader organisational strategy, eliminating the isolation of data within a single team.

 Advanced analytics and ESG footprints 

By harnessing advanced analytics, businesses can gain a deeper understanding of their ESG footprint, helping them to not only meet regulatory standards but also drive innovative solutions that enhance sustainability while optimising costs. As the role of ESG continues to expand in business operations, AI will inevitably become an increasingly critical tool for gathering more accurate, consistent data.

Herein lies the opportunity for businesses: by using ESG data insights, they can enhance their operations by identifying opportunities to upgrade existing infrastructure and tools to lower emission alternatives, reducing costs and creating better outcomes. 

For example, by reducing the energy consumption of machines in the manufacturing space, or reducing waste production, a manufacturer can, in turn, lower its overall operational costs and improve its profit margins, while also having a more positive impact on the environment. The trigger point for these reductions often lies in the operational data.

Final thoughts

While the ESG landscape is undeniably complex, it offers fertile ground for innovation and leadership in sustainability. 

However, this is only possible by steering the narrative from mere reporting towards empowering businesses to take charge of their ESG data and use it to drive their business forward. 

By working with partners and industry experts, organisations can take the next step in committing to data transparency, prioritising accessibility, and embedding sustainability into the heart of their operations. With mandated reporting standards such as CSRD and GRI coming down the line, businesses can use this as an opportunity to unlock the transformative power of their ESG data, paving the way for a more sustainable and profitable future.

  • Sustainability Technology

The major infrastructure project points to a new direction for Microsoft’s data centre ambitions in Africa.

Microsoft is partnering with UAE-based AI firm G42 on a major new data centre project in Kenya. The companies announced this week that they have committed to investing $1 billion in Kenya to support the nation’s digital economy and digital infrastructure. Building a “state of the art green data centre” is part of that package. Microsoft identified the project as “one of the Kenyan investment priorities” in a press release.

The US tech giant has said the data centre will support the expansion of its cloud computing platform in East Africa. Microsoft invested $1.5 billion in the Abu Dhabi-based G42 in April in order to support the firm’s efforts to train an open-source large language AI model in both Swahili and English.

Pivoting to Kenya (and away from Nigeria) 

So far, Microsoft’s data centre footprint in Africa has been restricted to two sites in Cape Town and Johannesburg, South Africa. The country will likely account for the majority of the $5 billion investment expected to enter the Africa data centre market by 2026. It already hosts the majority of the region’s data centre capacity. 

Looking to expand northwards into Sub-Saharan Africa, Microsoft initially looked as though it was gearing up to use Nigeria as its base in the region. However, last month, the company announced plans to shut down its Africa development centre located in Lagos, putting 200 people out of work.

Now, it appears Microsoft is pivoting from Nigeria towards Kenya. The new facility, built by G42, will serve as the hub for Microsoft Azure in a new East Africa Cloud Region. Microsoft has announced plans for the site to come online in the next two years.

Additionally, Microsoft has pledged to bring last-mile wireless internet access to 20 million people in Kenya, and 50 million people across East Africa, by the end of 2025. The opening of an East Africa Innovation Lab in Kenya was also announced, and will presumably replace the one recently closed in Nigeria. 

Geothermal power in East Africa 

Beyond a statement that “Organisational and workforce adjustments are a necessary and regular part of managing our business,” Microsoft made little by way of explanation as to why it was shuttering its Nigerian business shortly before expanding in Kenya. However, one of the most likely reasons is the company’s ongoing struggle to reconcile its green ambitions with the growing demand for AI infrastructure. 

In Microsoft’s 2024 sustainability report, President Brad Smith and Chief Sustainability Officer Melanie Nakagawa highlighted the challenges the company faced due to the building of more data centres and the associated embodied carbon in building materials, as well as hardware components such as semiconductors, servers, and racks. 

With AI infrastructure threatening Microsoft’s ambition to become carbon negative by 2030, the company may be looking for ways to cut the emissions in its infrastructure by building as green as possible. 

Nigeria, which has a power mix dominated by natural gas and biofuels, is nowhere near as renewables-focused as Kenya, which sources up to 91% of its energy from renewables: 47% geothermal, 30% hydro, 12% wind and 2% solar. The country hopes to transition fully to renewables by the end of the decade, largely thanks to geothermal, which reportedly has the potential to reach as much as 10,000MW of capacity, far exceeding Kenya’s current peak demand of around 2,000MW.
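
The 91% headline figure is simply the sum of the quoted shares, which is easy to verify:

```python
# Renewable shares of Kenya's power mix, as quoted above (percent).
kenya_mix = {"geothermal": 47, "hydro": 30, "wind": 12, "solar": 2}

renewable_share = sum(kenya_mix.values())
print(f"Renewables: {renewable_share}%, "
      f"non-renewable remainder: {100 - renewable_share}%")
```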

Abundant geothermal power undoubtedly played a role in Microsoft’s decision to refocus its East African ambitions on Kenya. Microsoft claims the new data centre campus in Olkaria, Kenya, will run entirely on renewable geothermal energy. It will also be designed with state-of-the-art water conservation technology—another area where the company admitted it was struggling to meet sustainability targets in its report. 


Rising data centre demand as a result of AI adoption has driven Microsoft’s carbon emissions up by almost 30% since 2020.

Ahead of the company’s 2024 sustainability report, Brad Smith, Vice Chair and President; and Melanie Nakagawa, Chief Sustainability Officer at Microsoft, highlighted some of the ways in which the company is on track to achieve its sustainability commitments. However, they also flagged a troubling spike in the company’s aggregate emissions. 

Despite cutting Scope 1 and 2 emissions by 6.3% in 2023 (compared to a 2020 baseline), the company’s Scope 3 emissions ballooned. Microsoft’s indirect emissions increased by 30.9% between 2020 and last year. As a result, the company’s emissions in aggregate rose by over 29% during the same period. It’s a potentially sour note for a company that prides itself on leading the pack in sustainable tech. 
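A quick back-of-the-envelope sketch (a sketch, not Microsoft’s own accounting) shows why the aggregate figure rose so sharply: the reported percentages are only mutually consistent if Scope 3 dominates the baseline.

```python
# Back-of-the-envelope check on the reported figures, assuming all
# percentages are measured against the same 2020 baseline.
scope12_change = -0.063   # Scope 1 and 2: down 6.3%
scope3_change = 0.309     # Scope 3: up 30.9%
aggregate_change = 0.29   # aggregate: up ~29%

# Weighted average: (1 - f) * scope12_change + f * scope3_change = aggregate_change,
# where f is the Scope 3 share of baseline emissions. Solving for f:
f = (aggregate_change - scope12_change) / (scope3_change - scope12_change)
print(f"Implied Scope 3 share of the 2020 baseline: {f:.1%}")  # ~94.9%
```

In other words, the numbers imply that indirect supply-chain emissions made up roughly 95% of Microsoft’s 2020 baseline, so even a modest rise in Scope 3 swamps the progress made on Scopes 1 and 2.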

Four years ago, Microsoft committed to becoming carbon negative, water positive, zero waste, and protecting more land than the company uses by 2030. 

Smith and Nakagawa stress that, despite radical, industry-disrupting changes, Microsoft remains “resolute in our commitment to meet our climate goals and to empower others with the technology needed to build a more sustainable future.” They highlighted the progress made by Microsoft over the past four years, particularly in light of the “sobering” results of the Dubai COP28. “During the past four years, we have overcome multiple bottlenecks and have accelerated progress in meaningful ways.” 

However, despite being “on track in several areas” to meet the company’s 2030 commitments, Microsoft is also falling behind elsewhere. Specifically, Smith and Nakagawa draw attention to the need for Microsoft to reduce Scope 3 emissions in its supply chain, as well as cut down on water usage in its data centres. 

Carbon reduction and Scope 3 emissions 

Carbon reduction, especially related to Scope 3 emissions, is a major area of concern for Microsoft’s sustainability goals. 

Microsoft’s report attributes the rise in its Scope 3 emissions to the building of more datacenters and the associated embodied carbon in building materials, as well as hardware components such as semiconductors, servers, and racks. 

AI is undermining Microsoft’s ESG targets 

Mass adoption of generative artificial intelligence (AI) tools is fuelling a data centre boom to rival that of the cloud revolution. Growth in AI and machine learning investment is expected (somewhat conservatively) to drive a more than 300% increase in global data centre capacity over the next decade. Already this year, OpenAI and Microsoft were rumoured to be planning a 5GW, $100 billion data centre—the largest in history—to support the next generation of AI. 

In response to the need to continue growing its data centre footprint while also developing greener concrete, steel, fuels, and chips, Microsoft has launched “a company-wide initiative to identify and develop the added measures we’ll need to reduce our Scope 3 emissions.” 

Smith and Nakagawa add that: “Leaders in every area of the company have stepped up to sponsor and drive this work. This led to the development of more than 80 discrete and significant measures that will help us reduce these emissions – including a new requirement for select scale, high-volume suppliers to use 100% carbon-free electricity for Microsoft delivered goods and services by 2030.”

How Microsoft plans to get back on track

The five pillars of Microsoft’s initiative will be: 

  1. Improving measurement by harnessing the power of digital technology to garner better insight and action
  2. Increasing efficiency by applying datacenter innovations that improve efficiency as quickly as possible
  3. Forging partnerships to accelerate technology breakthroughs through our investments and AI capabilities, including for greener steel, concrete, and fuels
  4. Building markets by using our purchasing power to accelerate market demand for these types of breakthroughs
  5. Advocating for public policy changes that will accelerate climate advances

Despite AI being largely responsible for the growth in its data centre infrastructure, Microsoft is confident that the technology will have a role to play in reducing emissions as well as increasing them. “New technologies, including generative AI, hold promise for new innovations that can help address the climate crisis,” write Smith and Nakagawa.


New advancements in Enhanced Geothermal Systems are turning the technology into a viable addition to wind and solar for renewable energy.

Geothermal energy has long been among the most niche forms of renewable energy generation. Wind and sunlight affect (almost) every part of the globe. Geothermal energy, by contrast, has historically only been viable in volcanic regions like Iceland, where boiling water rises through the earth to the surface. 

Until now, that is. New technologies and techniques developed over the past decade are transcending the traditional limitation of geothermal power generation. A new clutch of companies and government projects are making the next generation of Enhanced Geothermal Systems (EGS) look like a viable source of renewable energy at a time when the green transition is in need of new ways to cut down on fossil fuels.  

A rocky road for EGS

For nearly 50 years, EGS projects have been working on ways to convert hot, low-permeability rock formations into economically viable geothermal reservoirs. Governments in the US and Japan, among others, have invested significantly in EGS projects. However, most have had mixed results. 

Some projects failed to produce significantly higher energy yields. Others caused bigger problems. In 2017, an EGS plant in South Korea had to close down after likely causing a 5.5 magnitude earthquake as a result of fracking too close to a tectonic fault.

The most successful EGS projects have depended on expanding and stimulating large preexisting faults in the rock. This approach is not scalable, explains Mark McClure, founder of ResFrac, because “it relies on finding large faults in the subsurface.” While we are far from exploiting every usable fault, the number of fractures available is finite, and finding them isn’t always easy. Although companies in Germany like Herrenknecht are developing novel solutions, such as “thumper trucks” that could drive around urban areas looking for geothermal faults, most experts agree the real solution is finding ways to create new fault lines in the earth to access water warmed by the Earth’s core.  

Fervo’s Project Red and the next steps for geothermal  

In late April, Turboden, a company that makes advanced turbines for capturing geothermal energy, announced a new partnership with Fervo Energy.

For Fervo, the partnership is part of the natural progression of a project that came online in November of 2023, but has been in the works for years. 

Located in the heart of the Nevada desert, Project Red is a new kind of geothermal power plant, one which uses a new approach to dramatically increase the amount of hot water and steam it can access in an area without naturally-occurring hot springs from volcanic activity.  

Fervo’s Project Red site in Nevada has seen remarkable success by harnessing techniques borrowed from the oil sector. By adding a 3,000 ft lateral (sideways) extension at the bottom of its wells, Fervo achieved by far the highest circulation rates ever recorded between EGS wells.

Drilling and fracking methods have grown increasingly sophisticated since the 2010s, thanks to the boom in oil and gas extraction from shale. The EGS sector has embraced these methods, and as a result, “the techniques that are central to EGS were perfected and brought down significantly in cost,” Wilson Ricks, an energy systems researcher at Princeton University told Knowable Magazine.

Nevertheless, Project Red is a relatively small demonstration of EGS’ potential. The station draws enough steam up from the earth to generate 3.5 megawatts of power. That’s enough to supply more than 2,500 homes, and more than any other EGS plant produces today. Still, it’s significantly less than nuclear or coal power plants can generate, and quite a bit less than solar, wind, and traditional geothermal sources. 
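The homes figure is easy to sanity-check. Assuming an average US household consumes roughly 10,500 kWh a year (about 1.2 kW of continuous draw; an assumption for illustration, not a figure from Fervo):

```python
# Sanity check: how many average homes can 3.5 MW supply?
# Assumption: a typical US household uses ~10,500 kWh per year.
plant_output_kw = 3_500
avg_home_draw_kw = 10_500 / (365 * 24)   # ≈ 1.2 kW continuous draw

homes_supplied = plant_output_kw / avg_home_draw_kw
print(f"Approximate homes supplied: {homes_supplied:.0f}")  # ~2,900
```

That lands comfortably above the 2,500-home claim, with headroom for regional variation in household demand.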

Now, however, Fervo plans to partner with companies like Turboden to rapidly scale up its technology. 

Scaling up Project Red

Situated in southwest Utah, Cape Station is positioned to redefine geothermal energy production with an anticipated total project capacity of approximately 400 MW. If successful, Fervo has claimed the project will represent a “transformative leap towards carbon-free energy solutions.”

The project will begin with an initial 90 MW phase. This includes the installation of three generators with six Organic Rankine Cycle (ORC) turbines manufactured by Turboden.

“The success of Cape Station will not only validate the efficacy of EGS technology but also unlock vast potential for future geothermal power projects across the United States,” said the company in a statement.

One 2019 report projected that advances in EGS could result in geothermal power providing about 60 gigawatts of installed capacity to the US grid by 2050. That would account for 8.5% of the country’s electricity. Not only would this be more than 20 times the geothermal capacity of the US today, but the ability to seal up geothermal reservoirs and extract energy when needed could complement more sizable but capricious wind and solar power. It’s just one more piece of the green transition puzzle.


Larger drones than ever are being cleared to operate outside the field of vision of human overseers in autonomous swarms.

Swarms of autonomous drones could be changing the face of agriculture in the US and beyond. In a landmark ruling, the American Federal Aviation Administration (FAA) recently made an exemption to existing drone operation rules. The exemption allows a single pilot at Texan drone manufacturer Hylio to simultaneously fly up to three 165-pound AG-230 drones. Hylio’s pilots have clearance to fly multiple heavy drones beyond line of sight, and can do so at night. The decision has been heralded as a major step forward in industrial drone deployment.  

Industry experts believe this ruling could be a pivotal step in paving the way for “drone swarm farming”. While the ruling currently just applies to Hylio, it could soon be extended to the rest of the agri-drone industry. If so, it could make the technology competitive with traditional spraying and planting methods. 

While the FAA’s permission currently extends solely to Hylio pilots, the FAA is expected to generalise its approval through a “summary grant.”

“It’s definitely going to increase adoption of drones because you can’t just write drones off as cool for spot-spray,” says Arthur Erickson, Hylio CEO. “Now they’re a mainstay for farmers, even large row crop farmers.”

Hylio’s exemption from the FAA could be a pivotal step towards industrial-scale “drone swarm farming”

Drone demand soars in the agricultural sector

The agricultural drone market was worth about $1.85 billion in 2022. While drones used primarily to spray crops with pesticide and fertiliser have been met with some enthusiasm, FAA regulations have placed significant limitations on the scale and degree of autonomy with which drones can work in agriculture. 

Weight restrictions have, until now, limited drones flying beyond visual line of sight (BVLOS) to 55 lbs (24.9 kg). Also, the ratio of drones to pilots has been limited to 1:1. Technological limitations and regulatory guidelines have, therefore, allowed traditional agricultural methods to remain more effective. This could all be about to change, however. 

Erickson, in a recent interview, stressed the transformative impact of the FAA’s ruling on autonomous agriculture. “Swarming drones over 55 pounds has long been the desperately sought Holy Grail in the agricultural industry,” he explained.  

Growth in drone services-related revenue will likely stem from the rising adoption of drones in agriculture. In addition to the drones themselves, this will also necessitate a mixture of services. These could include drone operation, data analysis, customisation, and regulatory compliance assistance. 

The hardware segment dominated the market with a revenue share of about 51%. While hardware is expected to grow significantly over the coming decade, the software and especially services portions of the market are expected to grow even faster.

Most farmers lack the necessary expertise to fully harness drone technology’s potential, which will boost demand for specialised services to enable effective drone utilisation and data interpretation. By 2030, the market for agricultural drones is predicted to exceed $10 billion. 

Ag-drones are paving the way for autonomous swarms in other sectors

If successful, Erickson argues that autonomous drone swarms in the agricultural sector could pave the way for their adoption in other industries. The agricultural sector is a relatively low-risk environment, with relatively little scope for injury in the event of an accident or error. As a result, Erickson argues that it makes an ideal testing ground for refining sensitive avoidance systems essential for the safe operation of autonomous drones. 

This could then pave the way for the broader adoption of drone swarms in other industrial sectors, assuming they are proven safe and effective in a controlled environment like agriculture. 

However, manned crop spraying organisation, the National Agriculture Aviation Association, has raised concerns over the FAA’s ruling. The NAAA published an open letter raising safety concerns for manned crop-duster pilots. “UAS [unmanned aerial systems] performing the same mission in the same airspace present a significant hazard [to manned aeroplanes], particularly during seasonally busy application windows,” they warn.


The UPS systems supporting data centres could be used to add resilience to local power grids during the transition to renewable energy.

When it comes to the worldwide green energy transition, data centres are certainly part of the problem. However, they could also be a part of the solution.

Part of the problem

Data centres have attracted their share of controversy and negative attention for their power consumption. 

Large data centres place enormous pressure on regional power grids. This has already driven some regional and national governments to freeze or outright ban construction. For example, the Irish government’s ban on connecting new data centres to Dublin’s electricity grid won’t end until 2028. Singapore and the Netherlands have also legislated to pause data centre construction. Both cited concerns over sustainability and the toll that multi-megawatt facilities take on their power grids. 

Data centres were early adopters of green energy, and have been drivers of sustainable engineering practices for over a decade. The “green” data centre market was worth $49.2 billion in 2020 and is expected to reach $140.3 billion by 2026. However, the overall consumption of the industry is still rising. It’s also expected to rise a great deal more, thanks to artificial intelligence (AI). 

The International Energy Agency (IEA) reported that data centres, which consumed 460TWh in 2022, could use more than 1,000TWh by 2026. Responsibility for this explosion of demand can be largely laid at the feet of the ongoing AI boom. 

High intensity workloads like artificial intelligence are accelerating the growth of data centre power demand, and the world may not be able to keep up. This is especially problematic as the global drive towards a green energy transition picks up steam.

“We have many grids around the world that cannot handle these AI [driven] workloads,” Hiral Patel, head of sustainable and thematic research at Barclays, said in an interview with the Financial Times. Going forward, she added that “data centre operators and tech companies will have to play a more active role in the grid.”

Power grids in crisis

One of the main problems faced by governments trying to restructure their energy mix is intermittent power generation. Wind and solar power can create abundant, cheap electricity. Not only that, but manufacturing wind and solar infrastructure is getting quicker and cheaper. As a result, large-scale engineering projects are increasingly putting more wind and solar energy into energy grids. 

However, there’s a problem with these methods of electricity generation. Essentially, when the wind doesn’t blow and the sun goes down (or behind a cloud), the power turns off. Battery technology also hasn’t evolved to a point where it’s practical (or possible, really) to store enough energy to tide the grid over when solar and wind fall short.

Currently, natural gas, coal, and other fossil fuels are used as a stopgap. These fuels are used to support energy grids when demand outstrips what renewables can supply. Nuclear is increasingly recognised as the best, cleanest source of consistent complementary power to support intermittent renewables. However, nuclear infrastructure takes a long time to build. Not only this, but regulation moves slowly. Most debilitatingly, nuclear power is still lumbered with an image problem—something the fossil fuel industry has worked hard to stoke over the past several decades. 

Add an unsteady energy transition to the fact that the power grids in many developed and developing nations are ageing, poorly maintained, and overloaded, and you have a grid crisis in the making. 

In the meantime, data centres could offer part of the solution to power grids that lack resilience. 

Data centres must take on “a more active role in the grid”

All data centres have an uninterruptible power supply (UPS) of some sort. All critical infrastructure does, from hospitals to government buildings. It’s a fancy term for a backup power system: typically banks of batteries that take over the instant grid power fails. 

If the grid fails, the UPS kicks in and can keep the lights (and servers) on until service is resumed. Data centre UPS systems are of special interest here because of the sheer volume of energy they can provide. 

These facilities are equipped with a very large array of either lead-acid or lithium-ion batteries. This array will be sized to the IT load of the data centre, meaning a 500 MW facility is equipped with enough batteries to power your average town—for a while at least. Most data centres aren’t that big, but there are a lot of them. 
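As an illustration of the scale involved (the 500 MW load and the 10-minute runtime here are hypothetical assumptions; real UPS sizing varies by facility):

```python
# Illustrative UPS storage maths for a large facility.
# Assumptions: a 500 MW IT load, with batteries sized to carry the
# full load for 10 minutes while backup generators spin up.
it_load_mw = 500
runtime_hours = 10 / 60

stored_energy_mwh = it_load_mw * runtime_hours
print(f"Energy held in the UPS array: {stored_energy_mwh:.1f} MWh")  # ~83 MWh
```

Even a fraction of that capacity, fed back to the grid during a demand peak, is meaningful at utility scale.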

Some experts argue that there are enough data centres (especially big ones), with enough power constantly stored in their battery arrays, that they have the capacity to return power to the grid, sharing the load when the system as a whole comes under strain. This substantial energy storage capacity is often underutilised. 

“As the transition to renewable energy accelerates, maintaining a stable grid is paramount. Data centre operators can have a crucial role to play in grid balancing,” argues Michael Sagar of lead-acid battery manufacturer EnerSys. By feeding power back into the grid to support it in moments of overwhelming demand, he explains that “data centres can contribute to grid stability and potentially generate additional revenue.”


The e-waste crisis continues to be the biggest obstacle to the development of a circular economy, and tackling the problem requires multiple different approaches.

Click here to read Part One of our series on e-waste, where we contextualise the growing crisis. You can also click here for Part Two, where we examine the environmental, human, and economic cost of e-waste. 

The global e-waste crisis generates millions of tonnes of discarded smartphones, TVs, servers, and other electronics every year. Just 17.4% of the e-waste stream is collected and properly recycled. The resulting consequences for humanity, our planet, and the economy are becoming increasingly severe. 

In the previous parts of this series, we examined why e-waste is such a large and thorny problem. We looked at its environmental impact, and the ways in which it harms both people and our planet. We also examined the economic waste that stems from improperly disposing of broken electronics. The practice, the WEO finds, results in the loss of billions of dollars in unreclaimed rare earth metals every year. 

In many ways, e-waste is another symptom of capitalistic overconsumption. Electronics are increasingly designed and sold as cheap and disposable. This, combined with a failure on the part of governments to regulate corporate activity, is driving huge growth in e-waste. The problem is not insurmountable. However, much like climate change, there is no single solution to the e-waste crisis. 

E-waste: the big, obvious solutions 

Regulation of manufacturers to ensure longer product lifespans, more rigorous requirements for recycling programs, and efforts to tackle corruption and human rights abuses in the supply chain are all vital steps towards overcoming this problem. 

Barbara Metz, Executive Director of Environmental Action Germany, argues that “Producers of electronics must bear more responsibility for the environmental problems caused by their products.” She adds that the “financial burden” faced by electronics producers generating e-waste is minimal: because so little e-waste ever ends up back in a recycling plant, manufacturers rarely pay to have their products processed. “This must end now,” she says. Electronics manufacturers should be required to participate in e-waste return and recycling networks, collection and reuse targets must become legally binding, and “producers offering short-lived and poorly repairable equipment should also bear higher costs.” 

Regulators in the EU are making progress. Legislation like the right to repair is lengthening the lifespans of devices. However, governments and companies outside the EU must also act to effect lasting e-waste reduction. This is especially true in low-income areas of the world where a large amount of e-waste ends up.

There are also some interesting technological developments that could work alongside regulatory and consumption-reduction efforts to bring the e-waste stream under control. 

Changing our approach to technology ownership could be a valuable first step, argue Guy Ryder and Zhao Houlin of the UN and ITU, respectively. Device-as-a-service business models present one potential avenue. “This is an extension of current leasing models, in which consumers can access the latest technology without high up-front costs,” they write. “With new ownership models, the manufacturer has an incentive to ensure that all resources are used optimally over a device’s lifecycle.” 

Technology as part of the solution 

Solving the e-waste problem in an increasingly technology-dependent world is a complex and challenging prospect. 

However, while structural and regulatory changes to electronics design, consumption, and disposal are at the core of the solution, there are some interesting technological developments that also promise to help tackle the crisis. These discoveries and developments range from the exceedingly high-tech to the shockingly simple. 

Biodegradable circuits 

One way to reduce the amount of electronics in landfills is to eliminate the need to recycle them. One reason why e-waste is such a pernicious issue is the complexity in separating useful and useless parts of each device. Each device contains some materials that are banal but unrecyclable. Others are actively toxic, and some can only be recycled once separated from their surrounding components. Lastly, there are others that are immensely valuable but very hard to recover. It’s a complicated, tangled mess.

One team of researchers at North Carolina State University is taking a novel approach to the problem. The team is tackling the issue of e-waste by developing devices that are “simultaneously recyclable and biodegradable.” 

The result of their experiments is an electronic patch made largely of a renewable biomaterial that can be composted. The electrical circuits are made from silver nanowires that can be reclaimed as the rest of the patch biodegrades. 

Yong Zhu, a professor and one of the device’s developers, said in an interview that “The electronic patch can, after further development, be used for a wide range of applications, such as human health monitoring, electronic textiles, sports performance monitoring, soft robotics, prosthetics, and human–machine interfaces.”

Robotic recycling 

One of the most detrimental effects of e-waste is on the health of people exposed to recycling operations. 

E-waste typically contains multiple hazardous materials, including lead, mercury, and cadmium. These are not only environmentally devastating, but have grave health consequences for those responsible for recycling them. Improper e-waste “recycling” practices have been linked to serious health issues. These issues affect both those who work in these facilities and unborn generations. 

Robots are used throughout just about every industry to increase productivity and safety. As such, they are a natural fit for the e-waste recycling process. Intelligent robotics are increasingly able to separate types of material, isolate hazardous waste, and sort through e-waste without risking a human’s health. The issue is that it remains legal and cheaper to ship e-waste from producer regions like Europe and North America to places like Ghana (which also produce significant amounts of e-waste domestically) than to build the necessary infrastructure. 

The countries that are home to the world’s biggest electronics manufacturers need to take responsibility for modernising their e-waste recycling infrastructure. Not only this, but they have a moral and pragmatic imperative to support the modernisation of e-waste recycling capabilities in countries that both bear the brunt of the e-waste crisis and represent some electronics manufacturers’ fastest-growing markets. 

Vegemite? 

In Australia, a recent discovery could provide a refreshingly affordable, low-tech solution to plastic pollution and e-waste. A team of scientists has started using yeast for its properties as a “biosorbent”. Specifically, they are using spent brewer’s yeast—also a key ingredient in Vegemite and Marmite.

“In order to achieve a selective metal recovery, we investigated spent brewer’s yeast as an effective and environmentally friendly biosorbent,” said Dr Klemens Kremser who helped conduct the study. The scientists used spent brewing yeast to separate aluminium, copper and zinc by means of adsorption. The yeast was also reused up to five times, separating additional types of metal from samples of e-waste. 

Using yeast to more effectively separate trace rare earth metals from e-waste, Kremser adds, “reduces the need of primary resources, thus helping to reduce the environmental impact of improper e-waste recycling.” 


The e-waste crisis continues to be the biggest obstacle to the development of a circular economy, and we are only just beginning to feel the cost of inaction.

CONTINUED FROM PART ONE, in which we contextualise the scope of the growing e-waste crisis.

Electronic waste (e-waste) is the world’s fastest-growing hazardous waste stream. Over 61 million tonnes of e-waste were generated worldwide last year. By the end of the decade, continued consumption is projected to generate more than 74 million tonnes.

This growing avalanche of discarded smartphones, computers, televisions, and other electronics presents a very real environmental, human, and economic danger if not handled correctly. Right now, the vast majority of e-waste is disposed of improperly, and the consequences are multifaceted and devastating. The release of toxic pollutants into the environment and human populations has harrowing consequences. Improper e-waste disposal also causes the loss of billions of dollars in unreclaimed rare earth metals. 

As part of our ongoing series on e-waste, we’ll explore the environmental, human, and economic cost of the crisis. 

The environmental cost of the e-waste crisis

According to a WHO study, just 17.4% of the 53.6 million tonnes of e-waste generated in 2019 was collected with the intention of being recycled. Due to corruption, lax regulations, and other factors, even less ever made it to the recycling plant. 

Even the materials that are classified as having been recycled may end up contributing more to emissions than those headed straight for the landfill. Notorious “recycling” operations like Agbogbloshie in Ghana incinerate e-waste in open fires to recover raw materials. These open-air fires are used to strip insulation from copper wire to recover the “recycled” metal, and to burn away the rubber on tyres to recover trace quantities of steel. 

Burning or otherwise improperly disposing of e-waste can release “as many as 1000 different chemical substances” into the environment, according to the WHO. In 2020, e-waste disposal was responsible for the release of 580 million tonnes of carbon emissions. The problem is more involved than simple carbon emissions, however. 

Increasing consumption of electronic devices and increasing generation of e-waste depletes valuable resources (such as lithium, palladium, and copper), escalates energy demand, and inflicts environmental harm during raw material extraction. Raw earth materials that are improperly recycled can emit greenhouse gases and pollutants, but those that make their way to landfills also add heavy metals to the ground and water supply over time. 

The human cost of e-waste 

The long-term contributions of e-waste to our environmental collapse are clear. More importantly, however, there is also an unconscionable human price being paid right now. 

The improper disposal of e-waste is in and of itself a multi-billion dollar industry. This industry predominantly employs people in the developing world. The WHO has identified some of these practices as including dumping e-waste on land or in water bodies, landfilling it along with regular waste, open burning it, breaking down devices in acid baths, stripping and shredding plastic coatings, and the manual disassembly of equipment. 

All of these activities obviously harm the environment. However, the WHO also highlights them as hazardous to the people who perform them. “They release toxic pollutants, contaminating the air, soil, dust, and water at recycling sites and in neighbouring communities,” notes the WHO report. Pregnant women and children are particularly vulnerable, due to their unique pathways of exposure to toxins and heavy metals like lead, and their developmental status. 

E-waste exposure has been linked to adverse neonatal outcomes, as well as neurodevelopment, learning and behavioural issues. In later life, prolonged exposure to e-waste recycling and disposal has been tied to reduced lung and respiratory function. Incidences of asthma also increase across the board with prolonged exposure.

The economic cost of e-waste 

Humanity suffers incalculable losses every year due to the holistic damage of the climate crisis. It is a price our species will likely continue to pay for decades, if not centuries. The damage caused by the e-waste crisis is not solely dealt to the environment or population health, however. The data suggests that our current damaging and unsustainable approach isn’t even cost effective. 

Most electronics contain scarce, valuable metals. It’s true that each individual smartphone, server, hearing aid, or smart toaster might only contain a few milligrams of gold, silver, and platinum. However, in aggregate, the problem really starts to take shape. 

It’s estimated that the global e-waste stream contains $62.5 billion worth of recoverable materials annually. The majority of these materials are improperly disposed of, unsustainably recovered, or simply buried in a landfill. For context, this is three times more value than is generated each year by all the world’s silver mines. 

As argued by Guy Ryder, Under-Secretary-General for Policy at the UN, and Zhao Houlin, Secretary-General of the International Telecommunication Union, the e-waste crisis represents “a golden opportunity.” 

“More than 120 countries have an annual GDP lower than the value of our growing pile of global e-waste. By harvesting this valuable resource, we will generate substantially less CO2 emissions when compared to mining the earth’s crust for fresh minerals. It makes sense too – there is 100 times more gold in a tonne of mobile phones than in a tonne of gold ore,” they note.

CONTINUES IN PART THREE. 

Solving the e-waste crisis in an increasingly technology-dependent world is a complex and challenging problem with no easy answers.

Electronic waste (or e-waste) is the fastest growing solid waste stream in the world. 

In 2019, the World Health Organisation estimated that more than 53 million tonnes of e-waste were produced globally. 

Worldwide lockdowns in response to the COVID-19 pandemic drove work, education, and social interaction online in 2020, further intensifying the problem. The electronic devices bought to bridge the gaps created by lockdowns sparked a jump in e-waste. Dubbed “a ticking time bomb” by researchers at UCL, this huge growth in devices is poised to further accelerate the crisis. 

Solving this problem in an increasingly technology-dependent world is a complex and challenging prospect. There are no easy answers. 

E-waste out of control

E-waste is a bigger problem now than it has ever been. Around the world, just 17.4% of discarded electronics are collected and properly recycled each year. 

“Our electronics consumption keeps increasing without any consideration for our planet’s capacity. E-waste is piling up – not being reused, not being repaired,” says Fanny Rateau, Programme Manager at the European Environmental Bureau’s (EEB) Environmental Coalition on Standards. According to data gathered by the EEB, the European Union (which has some of the strictest regulatory frameworks for e-waste in the world) fails to collect more than half the e-waste it produces each year. 

Fynn Hauschke, an EEB policy officer for circular economy and waste, adds that “Almost every EU Member State fails to reach e-waste collection targets,” which causes “considerable environmental impacts and lost opportunities for reuse and recycling.” 

In the UK, a 2020 report by the Environmental Audit Committee (EAC) found that the average household had 20 unused electronic items at home. It also highlighted the growth of “disposable” electronics like single-use vapes as a “huge and growing stream of hard-to-recycle waste.” 

E-waste is not an individual issue

Many e-waste reports tend to frame the issue in terms of the individual. In an article for the Journal of Cleaner Production, for example, authors Md Tasbirul Islam et al argue as much. “E-waste often ends up in landfill due to improper disposal of e-waste with household waste by consumers,” they observe. 

It’s true that improper disposal of electronics, as well as consumer buying habits focused on owning current generation devices are, on the face of it, responsible for the creation of e-waste.  However, the rhetoric bears the hauntological stamp of carbon footprint marketing campaigns championed by the oil and gas industry in the 2000s. Corporate (and government) interests have a proven track record of shifting culpability onto the consumer to conceal their role in unsustainable practices. 

For example, reports often frame the amounts of e-waste generated in terms of kilos per capita. In Europe last year, the EEB calculated that the biggest consumers of electrical and electronic equipment per inhabitant were the Netherlands (35.1 kg), Germany (31.3 kg), Denmark (30.7 kg), France (30.5 kg) and Belgium (29.2 kg). At first glance, it might seem as though the e-waste crisis is a product of consumer culture, even negligence.  

The big three e-waste drivers 

Dig a little deeper, however, and a more troubling picture begins to emerge. It quickly becomes apparent that consumers “improperly disposing” of their electronics are caught between much larger forces. 

These forces include governments failing to set up reliable, effective systems for recycling, repairing, and safely disposing of electronics; the increasingly short lifespans of consumer goods; and a worldwide campaign by corporations against the repairability of their products. 

Government inaction undercuts recycling efforts 

In the UK, the authors of the recent e-waste audit said they were “disappointed” that, while the UK government accepted 22 of their 27 recommendations, “the measures on which the government is currently consulting do not appear to implement any of them.” 

Hauschke argues that “there is an urgent need for more consumer-friendly separate collection systems.” The EEB report advises the European Commission to “rapidly overhaul” the current directives governing e-waste disposal. Current gaps in data collection point to e-waste still being illegally disposed of as residual waste or illegally exported. 

Planned and negligent obsolescence

Another reason that e-waste is growing is that devices today are manufactured more cheaply with less emphasis on longevity. Multiple studies have identified that the lifespans of electronic devices are getting shorter. In 2015, a report commissioned by the German environment agency found that the proportion of all units sold to replace a defective appliance grew from 3.5% in 2004 to 8.3% in 2012. The report’s authors deemed this a “remarkable” increase. 

In 2020, a European Environment Agency report found something similar. The smartphones, televisions, washing machines, and vacuum cleaners it studied had shorter lifespans than expected, on average working for at least 2.3 years less than their “designed or desired lifetimes.”

“Producers of electronics must bear more responsibility for the environmental problems caused by their products,” argues Barbara Metz, Executive Director of Environmental Action Germany. “Producers offering short-lived and poorly repairable equipment should also bear higher costs.” 

Fight for your right to repair 

The issue of repairability dovetails with obsolescence. Last year, landmark legislation in the EU granted consumers the “right to repair” their consumer goods.

“Discarded products are often viable goods that can be repaired but are often tossed prematurely,” notes the EU commission. The commission adds that the premature disposal of goods like washing machines and televisions results in massive amounts of waste, causing 261 million tonnes of greenhouse gas emissions in the EU every year.

Right to repair faced strong opposition. The idea was resisted by manufacturers, many of whom maintained a monopoly over even simple replacement parts and frequently allowed only their authorised service technicians to repair equipment. 

“We are no longer able to fix the things we buy,” Gay Gordon-Byrne, director of The Repair Association, said. While right to repair may have progressed in Europe, the movement faces fierce opposition elsewhere. For now, in many parts of the world, corporate interests driven by the desire to sell more cheaply made, disposable products are the ones who decide who can repair their products and even what counts as irreparable. 

CLICK HERE to read Part Two of our e-waste series, which explores the human, environmental, and economic cost of e-waste. CLICK HERE for Part Three, investigating some of the more interesting solutions to the problem. 

Deep decarbonisation and a holistic approach to sustainability are necessary for the creation of truly green data centres.

Click HERE to read part one of this two-part series on the need for green data centres and the obstacles standing between the industry and true decarbonisation. 

The “green data centre” doesn’t go far enough. Until now, access to renewable energy, water and power efficiency, and use of techniques like free cooling have been enough to qualify a data centre as sustainable. Low PUE and net-positive water usage have allowed data centre operators to advertise their ESG bona fides. 
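For context, PUE (Power Usage Effectiveness) is a simple ratio: the total energy a facility draws divided by the energy its IT equipment alone consumes, so an ideal facility scores 1.0. A minimal sketch of the calculation (the figures below are illustrative, not taken from any real facility):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    A score of 1.0 would mean every watt goes to computing; overhead such as
    cooling and power distribution pushes the ratio above 1.0.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical example: a site drawing 12 GWh in total while its IT load
# consumes 10 GWh has a PUE of 1.2.
print(pue(12_000_000, 10_000_000))  # → 1.2
```

The lower the PUE, the less energy is spent on overhead relative to useful computing, which is why the metric became the industry's headline efficiency figure.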

However, some industry experts argue that these metrics are out of step with the industry’s very real and present need for more meaningful emissions reductions. “There is no truly green data centre until we achieve deep decarbonisation,” says Helen Munro, Head of Environment & Sustainability at Pulsant. “It’s important to recognise that this is a journey right now, and not a reality.” 

What makes a green data centre? 

“It’s often assumed that the greenest data centres are new builds. A new build can be designed to be as efficient and self-sufficient as possible, reducing the reliance on external power sources and promoting energy efficiency,” Munro explains. 

However, there’s a problem with building new infrastructure. No matter how energy efficient your data centre is, construction is an inescapably carbon-intensive activity. “We need to balance the efficiency gains of a new building with that impact and consider the improvements that we can make to existing assets,” she argues. 

There’s already some effort in the industry to lengthen upgrade cycles. Google (along with other hyperscalers) has started using its servers and IT equipment for significantly longer amounts of time. Between 2021 and early 2023, the company extended the lifespan of hardware like servers from four to six years. The move, in addition to saving Google as much as $3.4 billion per year, reduces the amount of e-waste considerably.   

The site selection question 

Where you put a data centre also meaningfully shapes its environmental impact. A facility is powered by local energy sources, plugged into a local grid, drawing water from the local supply, and built using local materials, codes and techniques. Regional power grids often have very different carbon intensities. “Data centres in areas of the Nordics, for example, are benefitting significantly from high availability of renewable power, as well as cooler climates which facilitate lower infrastructure power consumption,” Munro adds. “A well-sited data centre can also feed into local district heat networks, thereby avoiding emissions.” 

Optimising data centre location requires compromise on climate

However, the problem is that we can’t put all our data centres in Norway. Increasingly, as artificial intelligence, IoT and 5G drive localised computing, there’s more demand for lower-latency connections. “It can be important for clients to have access to a data centre which is local to them,” says Munro. She adds that “data centres with a local focus should fall back on buying power in the greenest way they can; recognising that approaches such as physical PPAs can give stronger renewable power additionality than a 100% renewable tariff and be ready to engage with opportunities such as heat networks when they become locally feasible.”  

In short, there are many factors that affect a data centre’s sustainability that lie outside the direct control of the company that builds it. These factors include “not only the local power grid and climate, but the impacts of the upstream supply chain for infrastructure, hardware and services,” Munro explains. She emphasises how critical it is that organisations committed to building and operating greener data centres “develop a robust plan to maximise their positive influence and recognise that no site can be entirely sustainable unless the wider ecosystem is, too.”

Specifically, she adds that “Organisations should also ensure their concept of ‘sustainability’ includes the impacts beyond their site boundaries, and goes beyond only the carbon footprint.” Munro points out that while hydrotreated vegetable oil (HVO) fuel can result in emissions reductions, its production has also been linked to an increased risk of deforestation. “Focusing on environmental sustainability beyond reducing carbon emissions and involving initiatives to protect local ecosystems and wildlife, will help organisations reinforce their focus on becoming greener,” she notes.

Where do we go from here? 

Is the green data centre a myth, then? Can data centre companies—even those taking a holistic view of their entire environmental impact—actually build facilities that have a light enough environmental footprint to avoid contributing to the climate crisis? 

Munro argues that “We need to consider efficiency in relation to the value of the computing workloads” that data centres host. She argues that “organisations generate and store huge amounts of data every day that is of little-to-no-value to them; so even using the greenest of data centres, this is a waste of power and hardware resources.”  

It’s a complex issue with no easy solution. However, the first step is changing the conversation around what constitutes a green data centre. Operational efficiency is no longer enough to call a data centre green. The entire project, including its impact up and down the supply chain, needs to be considered holistically if meaningful steps are to be taken to reduce environmental impact. 

A recent report by the World Bank argues that “Addressing the climate footprint of data centres requires a holistic approach, including design, manufacturing, procurement, operations, reuse, recycling, and e-waste disposal. Beyond increasing energy efficiency and reducing carbon emissions, these steps can reduce e-waste and limit the data centre’s environmental footprint throughout the data centre lifecycle.”

There will be no “truly green” data centres until industry achieves “deep decarbonisation,” according to Pulsant’s Head of Environment & Sustainability.

The data centre’s role in the fight against climate change is an uncertain one. Globally, data centres consume between 1% and 1.5% of all electricity, according to data collected by the International Energy Agency (IEA). In 2022, data centres used approximately 460TWh of electricity. They also accounted for 3.5% of global greenhouse gas emissions, more than the global aviation industry. 

This may sound bad, but the industry’s emissions figures and electricity consumption are actually something of a miracle. As the IEA notes, emissions from data centres have “grown only modestly despite rapidly growing demand for digital services” since 2010. The agency credits energy efficiency improvements, renewable energy purchases, and the broader decarbonisation of electricity grids around the world. 

However, curtailing emissions growth through increased efficiency and renewable power purchase agreements is insufficient. Not only is demand for data centre capacity set to more than double by 2026 (exceeding 1000TWh) thanks to AI, but the IEA suggests that “to get on track with the Net Zero Scenario, [data centre] emissions must drop by half by 2030.” 

Data centre freezes are insufficient

With trillions of dollars in economic impact at stake, it’s highly unlikely data centre growth will be curtailed. 

Some countries, including Singapore, Ireland, and the Netherlands, have stepped in to regulate growth and even freeze their data centre industries, as the burden of massive hyperscale facilities on their nations’ power and water supplies begins to outweigh their economic benefits. One fifth of Ireland’s electricity was consumed by data centres in 2022, a share projected to reach one third of the country’s energy consumption by 2026. In response, in May of 2022, Ireland’s national electricity utility effectively banned new data centre construction in Dublin.

However, regulating on a national or regional level like this only pushes demand elsewhere. The Dublin moratorium is a demonstration of this problem in miniature. Just six months after the moratorium took effect, there were 21 new facilities in the works outside the Dublin area. Many of them “as close to the capital as possible, roughly 80 km away at sites in Louth, Meath, Kildare, Kilkenny, and Wicklow.” 

The same process is being replicated at different scales around the world. Data centre capacity in aggregate isn’t going anywhere but up. The problem is holistic, and therefore the solution should be too. 

The fight for green digital infrastructure

The need for data centre infrastructure that consumes less power, less water, and has a reduced impact on both the local area and global emissions is gaining real traction. 

A report by the World Bank notes that while “Reliable, secure data hosting solutions are becoming increasingly important to support everyday functions across societies,” in order to “ensure sustainable digital transformation, efforts are needed to green digital infrastructure.” 

However, building data centres that are more energy efficient is only half the battle. 

“There is no truly green data centre until we achieve deep decarbonisation,” says Helen Munro, Head of Environment & Sustainability at Pulsant. Only when the industry “embeds respect for nature and resources” at the site level, throughout the hardware supply chain, the infrastructure and construction process, and throughout supporting power utilities will there ever be such a thing as a “green data centre”. 

CONTINUES IN PART TWO. 

Janina Bauer, Global Head of Sustainability at Celonis provides some expert insight into the reality of implementing sustainable practices…

Please tell us more about yourself and your role at Celonis.

I have been involved in sustainability long before it became mainstream. I did a Master’s in Business Administration with a focus on sustainability, and also worked at the U.N. analysing and researching the implementation of its Sustainable Development Goals. I bring that passion for exploring the big ideas around technology and sustainability to my role at Celonis, where I am Global Head of Sustainability. I took over Celonis’ sustainability programme in 2020, overseeing both our progress internally, and our external work where we help our customers using the Celonis Platform, enabling them to operationalise sustainability in all of their business processes.

Why is it essential businesses embed sustainability into business objectives, strategy and decision-making?

Going forward, there is no longer a separation between a business’ bottom line and their sustainability ‘green line’. To be a high-performing organisation in today’s business world, and tomorrow’s, companies need to be both profitable and sustainable. Future-proofed organisations are on top of both aspects, and ensure that sustainability and profitability are embedded into every single decision. There’s no contradiction between being a profitable business and not harming the environment you operate in. In fact, actions that boost sustainability also boost profitability by cutting waste. A key thing to remember is that everyone has a part to play in an organisation’s sustainability journey. It’s not something simply for people with ‘sustainability’ in their job title, but for leaders and workers in every part of the business.

Why are organisations struggling to implement their sustainability goals?

There are several barriers holding businesses back, including the inaccurate perception that sustainability is simply a cost, whereas it often goes hand in hand with profitability. Organisations which are struggling to implement their sustainability goals are often dealing with a people problem. If sustainability is seen as being the business of sustainability specialists only, rather than being integral to an organisation’s DNA, it can be hard to secure alignment and buy-in across the whole business.

Education, in the form of courses and clear communication within an organisation, can help to address fears and reluctance around sustainability, and the perceived cost of embracing change. The other key issue is siloed, unconnected systems which make it harder for business leaders to truly understand their carbon footprint. Many organisations have more than 300 IT systems, and the average business process runs across 10 different systems, with data buried in separate systems from transactional data in ERP (Enterprise Resource Planning) software to Excel Spreadsheets. Process mining can help business leaders unravel this and identify where efficiencies can be made. This in turn helps to support sustainability objectives and reduce costs and waste.

When it comes to sustainability objectives, do you think companies should place a greater emphasis on reducing Scope 3 emissions?

For most organisations, Scope 3 emissions, which include those from all of their downstream and upstream activities, are the Holy Grail when it comes to having an impact on emissions. That’s where the biggest carbon footprint lies and that’s where the most exciting opportunities for rapid progress are.

The data required for a real-time view of Scope 3 emissions is already at businesses’ fingertips – it’s just that it’s buried inside siloed carbon-accounting tools and other software. The first step is to extract this data using process intelligence technologies, and then organisations can make real progress.

How important is technology in helping companies make sustainability gains?

The first step towards having genuine impact in sustainability is to measure the environmental impact of the organisation’s current operations. This is where technology plays a crucial role. Technologies such as process intelligence enable business leaders to make informed decisions around sustainability, working like an ‘X-Ray’ on existing data and highlighting value opportunities, as well as allowing business leaders to understand the full journey of the goods they sell, and all the emissions associated with this. This data allows IT leaders to measure and drive sustainability, comparing performance against best-practice models and collaborating with partners and suppliers to reduce emissions.

What actions do you hope to see from COP28 that will result in tangible outcomes for the corporate world in 2024?

COP28 has offered a unique opportunity to embrace real and meaningful action to combat climate change, as well as a crucial dialogue between politicians, business leaders and decision makers. What I hope to see is more organisations embracing technology and innovation to make strides towards real sustainable change, with a particular focus on decarbonising supply chains and different industry sectors. Many pledges have been made at COP28 with good intentions, but now it falls on leaders to back up those pledges with real, measurable outcomes, using technology to turn their words into actions. Of course, process mining is not a ‘silver bullet’ which can tackle climate change on its own, but it does offer a crucial way for business leaders to find hidden value opportunities and make real advancements in sustainability.

Nigel Greatorex, Global Industry Manager at ABB, on how digital technologies can support decarbonisation and net zero goals

Nigel Greatorex is the Global Industry Manager for Carbon Capture and Storage (CCS) at ABB Energy Industries. He explains how digital technologies can play a critical role in the transition to a low carbon world by enabling global emissions reductions. Furthermore, he highlights the role of CCS and how challenges can be overcome through digitalisation.

Meeting our global decarbonisation goals is arguably the most pressing challenge facing humanity. Moreover, solving this requires concerted global action. However, there is no silver bullet to the global warming crisis. The solution requires a mix of investment, legislation and, importantly, innovative digital technologies.

Decarbonisation digital technologies

It’s widely recognised that decarbonisation is essential to achieving net zero emissions by 2050. Decarbonisation technology is becoming an increasingly important, rapidly growing market. It is especially relevant for heavy industries – such as chemicals, cement and steel – which account for 70 percent of industrial CO2 emissions, equal to approximately six billion tons annually.

CCS digital technologies are increasingly seen as key to helping industries decarbonise their operations. Reaching our net zero targets requires industry uptake of CCS to grow 120-fold by 2050, according to analysis from McKinsey & Company. Indeed, if successful, it could be responsible for reducing CO2 emissions from the industrial sector by 45 percent.

A Digital Twin solution

ABB and Pace CCS joined forces to deliver a digital twin solution that reduces the cost of integrating CCS into new and existing industrial operations. By simulating the design stage and testing scenarios to deliver proof of concept, it gives customers peace of mind that system designs are fit for purpose and demonstrates a smooth transition into CCS operations. The digital twin models the full value chain of a CCS system.

Read the full story here