TRENDS REPORT
2025
Business Insight
1
Contents
Welcome
Spatial Computing
Private 5G
Software Rationalisation
Hybrid Cloud
Security of AI
AI Evolution
Conclusion
CONTENTS
X
Business is moving at a rapid pace, and organisations face choices every day: balancing efficiency, operational excellence and innovation initiatives, evolving client and employee experiences, and diversifying into new technological advancements and revenue streams. No business can afford to stand still at a time of tremendous change. As a leading Solutions Integrator, Insight supports its clients at every stage of their technology decision-making journey, from business technology strategy through innovation, solutions development and management as well as the right hardware, software and services required to make those solutions successful. Home to a team of experts who are highly esteemed in their own particular areas of expertise, and a deep network of industry partners, Insight is ideally fitted to help future-proof organisations’ tech capabilities. Insight's 2025 EMEA Trends Report gives a unique viewpoint as to where businesses should focus their attention throughout the year ahead. It also offers a roadmap for CIOs and technology managers looking to get ahead of these trends.
2
The industry is buzzing with developers advancing AI technology at a high pace and organisations striving to capitalise on these advancements. Traditionally, technology supported business strategy, but since the advent of public cloud in 2006 and the evolution of AI, technology is now impacting business strategies across almost all organisations. Senior leaders realise that AI affects every corner of their business, from operational efficiency and quality to digital experience and creativity. Building the capabilities for constant, fast-paced ideation, testing, and solution commercialisation is a new challenge for many organisations. IT leaders should look to their partner ecosystems for help in accelerating the development of their innovation engine and technical capabilities.
There are many positives to this trend. More and more organisations are raising the profile of IT to board level, increasing the pace of innovation across all IT technology towers. The best IT innovation engines are democratised, populated by diverse teams from across the business, but centred around the CIO/CTO Office, ensuring information sharing, targeted funding, and a common, integrated technology roadmap.
Phil Hawkshaw EMEA CTO
Spacial Computing
4
Spatial computing is one of the hottest of hot topics right now.
A glimpse into the future
Read on
While it may not be something everyone is immediately familiar with, spatial computing is a subject that many of the biggest IT businesses are turning their attention to. And when companies such as Apple and Meta get involved, we know that we’re looking at a technology with a huge amount of potential. Coined by MIT researcher Simon Greenwold in 2003, he defined it as: “Human interaction with a machine in which the machine retains and manipulates referents to real objects and spaces.” It is now generally used as a synonym for virtual or augmented reality, which, although widely recognised, don’t really describe the full richness of the technology. Historically, spatial computing has mainly been seen as a way to attract gamers and for use in entertainment. Virtual reality (VR), augmented reality (AR), and mixed reality (MR) were considered somewhat niche technologies.
That’s all set to change. Research organisation Markets and Markets predicts that worldwide spatial computing is set to triple in five years, rising from $99 billion to $280.5 billion by 2028. According to research from Omdia, nine out of ten companies can see a business case for spatial computing within their own organisations, but only 20% have actually made the relevant investment. So why the disconnect? According to Camille Mendler, report author, it’s not because of a reluctance by employees to don their headsets or because of the cost of the equipment, but because there’s a lack of management will to invest in the technology. As with all Innovations, investment is required to properly identify use cases, readiness of platforms for integration and even development of bespoke applications. However, where spatial is concerned, the cost of this innovation is often tied directly to measurable business process improvement, and the returns can be substantial for the right use cases.
3
Perhaps what’s key to adoption is the educational element. Spatial computing is a new approach that businesses often struggle to find useful. The signs are there, however, that this is an exciting technology whose time has come. The question is: how to make this a reality? There are certainly many areas where spatial computing could be deployed effectively. We’ve already mentioned gaming, but there are other industrial sectors where this technology could make its mark. Spatial computing has many applications in health: as a means of training doctors or for remote diagnosis. There have also been several examples of it being used in therapeutic settings with mental health patients. Retail is
Retail is already bringing in spatial computing; customers can become virtual shoppers and examine different styles of clothes remotely
5
already bringing in spatial computing; customers can become virtual shoppers and examine different styles of clothes remotely, and it can be a part of manufacturing too. It could serve as training tool, but also help processes, for example, by monitoring remote devices in real time. This is an area where digital twins can play a part. This concept, where digital replicas of products are introduced so that companies can simulate their physical counterparts, is set to become more widely used. The introduction of this technology will mean cheaper and quicker ways of working in an industrial setting. According to a survey from Markets and Markets, 75% of executives said that further digitisation such as this would be a major part of new product development. We’re in early stages of really realising the possibilities with this technology, but there are undoubtedly new and innovative ways in which it will be used, particularly when combined with AI and tied in with the growing number of IoT devices.
We’re seeing a speed of innovation in this space from hardware and software vendors alike, and there is a lot of cross-over into IoT, AI, big data, and robotics where, when combined, they become more than the sum of its parts. VR quality and immersion is increasing, and we’re seeing it used for military training, in car showrooms, and in immersive learning environments for all manner of things. We’ve seen surgeons practising very delicate operations in VR with 3D haptic feedback robotics which increase the immersion and also the learning retention by engaging additional senses. AR is also progressing as the hardware adds faster processing to handle real-time 3D mapping of spaces and objects --- we’ve entered an era of AI augmented reality. People with visual impairment can be supported by lightweight glasses that constantly scan and identify things in front of them and allow them to ask questions in common language to understand more about what they’re “seeing” in front of them.
6
Companies are continuing to embrace and innovate in this space. It’s not just about the headsets themselves either. Vendors are looking at different input and interaction approaches, whether that’s with eye tracking, wrist-based haptics or AI-powered voice commands. There will be issues to address. As with AI, there are going to be some concerns about the way that the technology is applied and whether it can be used safely. There have been no government plans to regulate spatial computing so far, but that may change. There is a notable push to make headsets less intrusive and more comfortable. Additionally, cost remains a challenge. These devices are not inexpensive, but as usage increases, prices are expected to decrease. Ultimately business owners will have to think of ways in which spatial computing can be deployed comfortably within their organisations. It’s a technology whose time has come – the question is whether companies are able and willing to make the most of it.
7
Experienced in supporting clients with digital transformation and technology strategy for over a decade to improve employee experiences, accelerate business value and improve business processes. Workplace & Collaboration The Workplace & Collaboration practice aims to help empower organisations and improve end user experiences by helping design, implement and transform technology in the areas of productivity, collaboration, mobility and end user compute.
Anthony TaylorTechnology Lead
As spatial computing merges augmented reality (AR), virtual reality (VR), and mixed reality (MR), it is reshaping operational paradigms. To capitalise on this transformative technology, organisations can adopt the following key practices: 1. Invest in technology, processes, and people: Align technology with processes and personnel to optimise technology utilisation and enhance user experience. 2. Promote innovation: Cultivate an environment of innovation by supporting exploration of new spatial computing concepts through training and workshops. 3. Customise applications: Develop customised spatial computing applications in collaboration with experienced developers to address specific business requirements effectively.
8
Get more Insight
Connect
Articles
Events
Webinars
Case Studies
4. Integrate with existing systems: Ensure seamless compatibility of spatial computing solutions with current IT infrastructure to facilitate efficient data exchange. 5. Focus on user experience: Prioritise intuitive interface design and user feedback for continuous enhancement, ensuring the effectiveness and user-friendliness of applications. Insight stands ready to assist organisations in leveraging the transformative potential of spatial computing to drive innovation and operational efficiency.
Virtual reality for wellbeing: Apax helps to lead the way with Insight
Department for Transport En Route to an Immersive Future
10
The rise in private 5G connectivity
The rise in Internet of Things (IoT) looks almost unstoppable.
According to IoT Analytics, there will be 18.8 billion connected devices by the end of 2024. But we’re going to see a lot further growth. Research from Exploding Topics predicts that the number of IoT devices will rise astronomically to nearly 30 billion by 2030. The reason for this is simple: there’s a much greater need for (and greater awareness of) accurate data and, in many cases, real-time data. This could be about anything: security devices, RFID tags for stock control, information about how moving parts are performing meteorological sensors. At one level, this has led to various jokes about smart fridges warning that you’re about to run out of milk but, more seriously, has led to greater operational efficiencies, as companies with a vast range of remote devices can now monitor their state continuously. All this data can be
combined with bigger compute resources that are currently available and when integrated with AI software, businesses have a large arsenal of information. The debate has opened about how those devices are connected, and there are several choices. Within a campus area, Wi-Fi has been widely used and there have been an ever-increasing number of options. Currently, Wi-Fi6 connectivity is becoming available and that will be used more and more in the coming years. Another networking technology that is being widely used is LoRaWAN, which stands for low-power wide-area network. This is like Wi-Fi over a long distance and allows for long- range communication between IoT devices and base stations and is extremely useful for industrial data collection in low-power situations.
9
READ ON
It’s a technology that offers a larger arc of connectivity as it can handle connectivity at a distance of up to 15 km, although it doesn’t offer a particularly high bandwidth. Cellular technology is also widely used. Again, we’re getting used to faster speeds here. There’s been a move from 4G to 5G in recent years, often driven by the obsolescence of 2G and 3G services; this has meant a lot of older IoT devices now need replacing. Organisations, however, are looking to do more with 5G. We are now seeing an increasing drive to use it as a connectivity technology in industrial and commercial situations. While 5G has been widely associated with cell phone technology, organisations can benefit from the use of private 5G connectivity – the use of the same technology but confined to a sole organisation.
We are now seeing an increasing drive to use it as a connectivity technology in industrial and commercial situations.
11
There are many advantages of using private 5G; it will offer considerably higher speeds and a more reliable performance. What is particularly significant though, is the improved security that is offered by a private 5G set-up, a vital factor when dealing with intellectual property and customer data. It should be emphasised that there’s nothing inherently more secure about private 5G, it just allows companies to set their own controls. This greater customisation of 5G is a major asset for organisations. For example, it will allow for greater scalability when dealing with unusual traffic patterns, such as seasonal factors. It can allow for configurations related to a particular industry. For example, there may be a need for greater security within some vertical sectors, such as health organisations with the need to handle sensitive medical records or within the defence sector, with its obvious need for heightened protection. As well as some massive advantages to going down the private 5G route, there are also some challenges.
12
It’s relatively high when it comes to power consumption, something that many organisations are considering. It’s more expensive to operate than regular 5G. However, private 5G is lower power consumption than WiFi and by not running multiple radios on a single device, there’s an inherent power saving. When it comes to wireless technology, there can be issues with bandwidth allocation as the spectrum can be rationed. This is not necessarily true when it comes to private 5G, however. Licensing authorities have already allocated chunks of spectrum specifically for license-exempt use by private 5G solutions. That’s only half the story, though. Even when the spectrum has been secured, there’s a need to maintain it, and historically that has meant a need for specialist skills. However, Vodafone has already demonstrated a sub $1,000 base station that can be monitored using Raspberry Pi, so companies are already being equipped for an easier way to handle private 5G.
13
Despite the need for these considerations, any company that does go down the path to private 5G will have many advantages when it comes to future-proofing the infrastructure. Besides the higher bandwidth, tighter security and customisation mentioned earlier, there will also be a more robust network; one that will offer much greater service availability, and crucially will be available in urban and remote areas, where public cellular may not be available. In the future, this could mean that businesses will be able to buy cheaper real estate in off-grid areas and hook it up to private 5G coupled with satellite technology, in order to have a fully functional commercial property at a fraction of the cost of having to lay kilometres of cable under or over the ground. This would benefit industries such as mining and farming, which often operate in locations with poor communications infrastructure and limited ability to lay cable. Private 5G then allows IoT sensors that can be
14
by local machines, or even solar power, to collect operational and efficiency data that becomes invaluable to transforming the whole operation. When coupled with cloud based systems the possibility of greater robotic automation becomes a reality! Another example would be cruise ships and festivals where e-Sims could be offered to provide connectivity to guests and IoT sensors alike without the need to deploy hundreds of wireless access points. That’s just one step forward for private 5G. It’s a technology that has many built-in advantages that, with its approach to customisation and security, offer a more flexible way to handle data.
The Office of the CTO
This matters to organisations. We are seeing now how many companies are looking at how they gather, store, and use data. This is the lifeblood of many businesses, and ways to transfer that data will be looked at closely. Private 5G is going to offer a flexible method to deal with data more efficiently and is definitely technology for the future.
15
As businesses explore the adoption of private 5G networks, here are five actionable steps to consider: 1. Conduct needs analysis: Begin by assessing current network performance, identifying pain points, and forecasting future requirements to tailor private 5G deployment to organisational needs. 2. Invest in infrastructure: Allocate resources to procure 5G equipment, edge computing technologies, and IoT devices, ensuring seamless compatibility and scalable deployment. 3. Foster IT and OT collaboration: Facilitate collaboration between information technology (IT) and operational technology (OT) teams to enable smooth integration and optimise network performance.
4. Prioritise security: Implement robust security protocols, regularly update systems, and provide cybersecurity training to employees to safeguard sensitive data and ensure network integrity. 5. Partner with experienced vendors: Select reputable vendors with expertise in private 5G deployment and management to leverage their knowledge and technologies for successful implementation. By following these strategies, organisations can unlock the full potential of private 5G networks and drive operational efficiency in the evolving digital landscape.
Whitepaper
17
Streamlining apps is the path to smarter investment
Nowadays, there are countless apps available for our smartphones.
There seems to be an app for nearly everything, whether it's for tracking fitness, managing finances, learning languages, or ordering groceries. We often download these apps, and each one provides value in its own way, even if we don't always use all of them. This app consumption model has made its way into the enterprise. Now, teams can easily search for, test, buy, and use applications with no input from IT departments. The simplicity of SaaS apps — requiring no installation and open to any buyer — empowers teams to select tools that align with their unique workflows and objectives. This ease of access has resulted in an increase in the so-called "shadow IT". Applications are adopted without the knowledge and oversight of IT departments. While each application may fulfil a specific organisational need, this decentralised approach has led to considerable "app sprawl" -- the excessive proliferation of tools that can strain IT resources and budgets.
It’s not an insignificant problem, a Salesforce survey estimates that the total number (that’s cloud and on-premise) has jumped from 843 to 1061 within two years. In addition, Insight research suggests that companies are wasting about 30% of their budget on unwanted or unnecessary software. Managing a large number of applications is a significant challenge for IT departments. Each application needs a licence, security measures, integration with current systems, and regular updates. This adds complexity and costs to the organisation. Additionally, some SaaS applications may be underused, which further increases expenses. Because of this growing complexity, application rationalisation has become a key topic in IT discussions. The unchecked growth of the application portfolio is not sustainable. Organisations are increasingly looking to rationalise their application portfolios for several reasons. First, cost efficiency is crucial; as applications accumulate, so do licence fees,
16
maintenance costs, and management overhead—often for tools that are seldom used. Secondly, rationalisation of applications improves security posture and compliance with regulations. Every new application introduced into the business environment increases the potential for security risks and data breaches, especially when done without IT oversight. Finally, application rationalisation improves operational efficiency. A simplified software catalogue is easier to maintain and allows IT departments to concentrate their efforts on innovating rather than managing a complex mix of applications. There are numerous reasons why organisations are trying their best to reduce the number of applications in their IT landscape.
Every new application introduced into the business environment increases the potential for security risks and data breaches
18
First, the economic environment is forcing businesses to save quickly on costs. Second, companies must prioritise their digital transformation investment efforts on tools supporting their business objectives. Finally, reducing the number of applications and the number of subscriptions helps reduce digital waste, contributing to long-term sustainability goals (SaaS apps run in data centres that consume energy and water). Goals should be formalised before starting any application rationalisation initiative. For example, is the priority to reduce costs, increase operational efficiency, improve security, or align technology with business outcomes? Knowing the focus will help guide decisions throughout the process. The scope must also be defined to ensure the approach is manageable and targeted. For example, which business units, types of applications, or regions will be included or excluded? With goals and scope defined, the next step is to discover and create an inventory of all applications in use.
19
The inventory should categorise applications and include critical information such as the frequency of usage, and the licensing spend per application. The list of discovered applications can be extensive. We propose an iterative hypothesis-driven approach to prioritise further research efforts. For instance, during a first sprint, if the main objective was to reduce the number of apps with the same functionality, efforts should be focused on reviewing some of the categories with the most sprawl (eg, BI tools). But if data security were the main concern, efforts should be focused on investigating the applications that may pose the greatest risk to the security of our data, such as cloud storage, translators or Gen AI apps. The next step is to assess the value of the selected applications. This value will be determined by several factors, including functional alignment, technical compatibility, user satisfaction, cost considerations, security risks, and the level of support from the vendor, etc. It is important to gather input from stakeholders, including product owners, end-users, IT security, or enterprise
architecture among others. At this point, redundancies between applications that serve the same functional purpose will begin to emerge. Upon completion of the application evaluations, decisions will be made regarding which applications to retain, consolidate, replace, or retire. These actions should be included in an action plan. This plan may also include actions for consolidation of vendors, co-termination of licence expiration dates and contracts, and licence optimisation (eg., downgrading subscriptions, reducing quantities, etc.). Once the decisions have been made, it is time to execute them. This process involves managing changes effectively and helping users adopt new technologies. Therefore, planning and management of these changes are highly recommended. Finally, to ensure the sustainability of these efforts, we suggest establishing a system for governance and continuous monitoring. By implementing policies for
20
acquiring new software and requiring IT oversight, organisations can prevent application sprawl and ensure that their environments align with strategic objectives. This approach does not restrict the business from choosing its applications; rather, it provides the necessary support during the process of acquiring new software. As digital transformation and economic challenges continue, application rationalisation is a must for organisations that wish to make the most of their application portfolios. A clear understanding of business and IT goals, combined with a structured approach to inventory, evaluation and implementation of changes, can help create a leaner and more effective IT environment, one that closely aligns with the business strategy. A simplified and useful application portfolio not only provides cost savings, but it also helps organisations keeping up with innovation and remain competitive for the future.
21
Rene Sans Senior Manager Software Lifecycle Services
An IT asset management professional with over 12 years of experience, leading the ITAM service practice specialising in implementation and transformation projects, as well as managed services that help global clients optimise technology costs, mitigate risks and improve technology management operations. Technology Lifecycle Services We help clients to achieve value from existing and future investments in technology. Our services span from sourcing to renewals. We perform assessments and optimisation projects as well as delivery of tooling and management of processes to deliver clarity, reduce costs, mitigate risks and enable better decision making across procurement, IT / software asset management and commercial negotiations.
22
To effectively navigate the process of software rationalisation, organisations can implement the following steps: 1. Identify business goals: Begin by defining the overarching goals in the business, such as cost reduction, operational efficiencies enhancement, security improvement, and aligning technology to deliver business outcomes. 2. Define SaaS objectives: Determine the scope of software as a service (SaaS) objectives, considering factors like business units, regions, or types of applications. 3. Discover procurement and usage: Conduct a thorough analysis to identify the software being procured and utilised within the defined scope. 4. Assess application value: Evaluate the value of applications based on factors like functional alignment, technical compatibility, user satisfaction, cost-effectiveness, security risks, and support availability.
5. Create action plans: Develop actionable strategies such as retaining, consolidating, replacing, or retiring software applications based on the assessment outcomes. 6. Establish governance and monitoring: Implement a robust system for governance and continuous monitoring to ensure ongoing alignment with business objectives and efficient software management. Our SaaS Insights service offers a rapid and insightful approach to refine your SaaS management strategy and make informed decisions on key objectives. By analysing SaaS consumption data from a representative sample of end-user devices, we provide valuable insights to support strategic discussions and facilitate prompt decision-making. With quick turnaround times, our services, including discovery, vendor optimisation projects, and continuous managed services, cater to your specific needs efficiently and effectively.
Has FinOps forgotten SaaS?
Why modern organisations should be viewing FinOps and IT Asset Management through the same lens
24
How hybrid deployment will be the future of cloud
Hybrid cloud is now the new normal.
One of the most significant changes in the past few years has been the way that traditional IT vendors have turned to cloud to refresh their product portfolios. Perhaps that should be phrased more specifically as how they have turned to hybrid cloud as the way forward. Hybrid cloud is now the new normal. A mix of public and private cloud and positioning the right workloads in the right place. The established IT vendors --- HPE, Dell, Lenovo --- had an awkward relationship with cloud initially. Some dismissed it as something of a fad - or “vaporware”, as Oracle chief executive Larry Ellison put it. Others saw the potential but were loath to surrender their existing on-prem hold on the market, hence the emergence of so-called private cloud. All of that has changed in little more than a decade. While the public cloud providers are still going strong, the traditional on-prem companies have fought back by adapting on-prem solutions to be more cloud-like and agile to consume.
This change in approach by vendors is mainly being driven by one technological impetus: the increasing deployment of AI. This is a development that needs hybrid cloud to work properly. It needs the flexible capacity to handle the compute and GPU power needed for massive AI computation while, at the same time, ensuring that there are no latency issues, and that intellectual property is being held at the customer’s premise. HPE and Dell both have reference architecture solutions for private AI environments and clients can scale these hybrid environments for the AI use cases they need to run in production in a low latency, secure governed manner whilst connecting to their corporate data and intellectual property securely and safely. What we are increasingly seeing are clients using public cloud for proof of concept for a new project: to get a feeling
23
of how it will run to ascertain any difficulties but then frequently choosing to run corporate applications in a hybrid set-up. They are also increasingly looking to build applications that can be designed to be containerised to work with multicloud providers and provide portability to avoid lock in and the ability to choose where an application lives based on performance, security, governance, technical features/services and cost. A great example of this is retail locations. A large fast-food client uses hybrid cloud in each of its store locations. From the application running the food ordering kiosk devices to the apps and digital content needed for the digital display screens showing the menu choices, each of these are IoT devices running a specific application workload. These apps need to be fast, highly available and have low latency for consumer kiosk transactions and high bandwidth content
What has emerged is a new generation of as-a-service products that have changed the face of hybrid cloud.
25
display for menu choices. Each retail location has a small hyperconverged cluster installed to run these application workloads. They also need to be managed and to be future proof to allow data and AI services to be deployed to the store for management information and customer experience. Imagine the challenge of managing thousands of these platforms in stores. The operational management is crucial and needs to be cloud integrated and have AI ops and lifecycle management built in. This use case could not be delivered purely with public cloud, it's hybrid solutions that unlock and power the client experience. And this is where the traditional infrastructure vendors come into play by adapting and building as-a-service models for on-prem solutions. They have taken on board several elements of cloud technology --- the flexibility/agility of cloud-like procurement and billing, allowing on-prem expansion quickly and easily, emulating the cloud-like experience for on-prem. It became apparent that it wasn’t the cloud providers themselves that were proving to be attractive to users, but what they offered:
26
the self-selection, the quick provision and, of course, the ability to pay for what was being used. What has emerged is a new generation of as-a-service products that have changed the face of hybrid cloud. The vendors have pulled together all the aspects of a modern computing IT infrastructure and have made them their own. This is having a transformative effect on the way that corporate computing is being deployed. Historically, a customer would buy its servers from one company, its storage from another and its networking kit from another, but now, everything has changed. Companies have the option of buying everything within one organisation, pre-engineered to work together as a reference architecture solution. HPE’s Green Lake and Dell’s Apex are engineering reference architectures that simplify the design and deployment of on-prem solutions. These offerings are simplifying hybrid on-prem: workloads such as virtual machines, containerised apps, storage,
modern workplace, databases, data and AI can now be handled by a single vendor, while retaining the benefits of cloud computing. What unites all these products is that there is an acceptance that hybrid is the new normal with a fair split between public cloud and private cloud needs, clients need a platform that allows workloads to be deployed in the right place and managed by an orchestration platform: an example of this is Microsoft Azure Stack HCI. This allows hybrid deployment of Azure resources on-prem and at the edge for customers. HPE claims to have added 3,000 new GreenLake customers in the past quarter. That’s a dizzying rise that reflects the interest in this type of hybrid implementation. According to IDC, 65% of organisations recognise the need to simplify cloud infrastructure to improve resiliency and reduce operational costs and the likes of HPE GreenLake and Dell Apex leading the way. It’s a radical change from the way that cloud services used to be offered and will shape corporate infrastructures for some time to come.
27
There’s another factor that is driving this type of hybrid set-up --- the increasing rise of edge computing. We’ve moved away from the days when all compute was carried out at the centre of an organisation. There is an increasing amount of compute being carried out at the edge, with the need to put the compute, storage, and workloads close to the use case. Factories, manufacturing plants, retail outlets are all examples of where the infrastructure needs to be at the edge locations, linked seamlessly to the public cloud and performing analytics and computation locally. There’s little doubt that this type of hybrid computing is the way that much computing will continue in the future. There were pundits, back in the day, who predicted that all IT would be in the public cloud, but businesses need choice and all the options will be offered by hybrid cloud.
28
Enterprise architect and strategist with over 30 years experience in complex cloud and hybrid on-prem solutions, IT transformation and managed services. Cloud & On-Prem Modern infrastructure based compute and storage is key for underpinning application & data success for our clients. The cloud & on-prem practice focuses on enabling our clients to move to the best modern, multi-hybrid cloud platforms for their application and data needs. It also enables our clients to continually evolve those platforms to optimise innovation, performance, availability, cost and cashflow.
Lee WilkinsonTechnology Lead
29
For organisations looking to embrace hybrid cloud solutions, a starting point can involve implementing the following strategic steps: 1. Develop a cloud strategy: Craft a comprehensive strategy that defines objectives, expectations, and success criteria for hybrid cloud deployment. This should include an assessment of the current IT landscape and a well-defined roadmap for implementation. 2. Ensure security and compliance: Prioritise robust security measures and compliance adherence to safeguard data. Implement encryption, access controls, and regular audits while maintaining continuous monitoring to ensure regulatory compliance.
3. Optimise workload placement: Evaluate workloads to determine the most suitable environment (public cloud, private cloud, or on-premises) based on factors like performance, cost, and security. This strategic approach enhances operational efficiency and cost-effectiveness. 4. Foster communication: Promote open communication and collaboration between IT teams and other departments to address challenges, share best practices, and align the hybrid cloud strategy with overarching business objectives. 5. Leverage automation: Harness the power of automation tools and orchestration platforms to streamline processes, resource management, and performance optimisation. By reducing human error and enhancing efficiency, automation plays a pivotal role in successful hybrid cloud deployment.
31
AI Revolution: Navigating the Security Crossroads
AI is currently at the forefront of technological innovation, captivating everyone with its unprecedented potential.
With nearly every tech company showcasing their AI solutions or integrations, this surge in activity promises to revolutionise the business landscape. According to McKinsey, 65% of organisations now regularly utilise generative AI, a number that has nearly doubled within a year. The majority of these organisations foresee significant transformations within their industries thanks to AI. The future appears bright: enhanced efficiency, streamlined operations, and substantial improvements to the bottom line. However, amidst this optimism, several critical issues emerge that warrant the attention of IT and business leaders worldwide. Firstly, the ethical dimension of AI deployment presents a significant challenge. Policymakers are grappling with the need to foster innovation while minimising risks. This situation echoes the 'tragedy of the commons', where
individual actions deplete shared resources, ultimately harming all. In the context of AI, the shared resource is global safety and ethical development. The race to develop AI for economic, military, and technological superiority often leads countries to bypass crucial safeguards, prioritising speed over security. This haste can result in adverse outcomes such as data misuse, inequality, job displacement, and ethical breaches. Moreover, this race has led to fragmented approaches, with countries and states pursuing their own agendas for short-term gains. This lack of international cooperation on AI regulation poses a long-term threat. For instance, President Trump’s executive order on AI and the EU's AI Act highlight the divergent paths being taken. While the US is focusing on in investing in AI infrastructure, the EU is taking a more cautious approach and working to enforce robust security and data protection measures first. The EU AI Act, with its potential fines of up to €35 million, or seven percent of global annual turnover for breaches, exemplifies this rigorous approach.
30
Critics, however, argue that these strict regulations could create legal uncertainties and slow AI progress in Europe. Another pressing concern is that while AI enhances corporate efficiency and enables new innovation, it simultaneously empowers cybercriminals to craft more sophisticated and harmful threats. AI has democratised hacking, enabling attackers without advanced skills to launch cyber-attacks, making them harder to detect. The UK's Department for Science, Innovation and Technology conducted a comprehensive study in early 2024, identifying several vulnerabilities in many companies:
To mitigate these risks, companies should prioritise data permissions, ensuring that AI models operate within strict data boundaries.
32
Inadequate threat modelling, hindering the identification of all potential threats. Insufficient data privacy safeguards, causing some organisations to halt their AI initiatives. Insecure authentication and authorisation, leading to impersonation of legitimate users. Poor input validation and sanitisation, enabling manipulation of database queries. Inadequate output encoding, exposing web pages to malicious scripts. Weak encryption, risking data interception during transmission.
Lack of robust security architecture, allowing for the injection of malicious code
These vulnerabilities underscore the necessity for companies to implement stringent security measures when deploying generative AI. To mitigate these risks, companies should prioritise data permissions, ensuring that AI models operate within strict data boundaries. Additionally, organisations should recognise their limitations. For instance, deploying a chatbot using existing models might inadvertently suggest competitors to clients — a commercially dangerous outcome. Chief Information Security Officers (CISOs) can turn to resources like the OWASP Foundation's "OWASP Top 10 for LLM" guide. This guide highlights common AI-related attacks and provides strategies for prevention and remediation. Prompt Injection, the first attack listed, is thoroughly detailed, offering valuable insights for defence.
33
The OWASP guide illustrates the wealth of support available to companies navigating this transformative technological shift. Partnering with experts like Insight can further bolster organisational resilience, drawing on their extensive experience with complex changes and security challenges. In conclusion, while AI deployment promises tremendous benefits, it also introduces significant security challenges. By staying informed, leveraging resources like the OWASP guide, and recognising their limitations, companies can safely harness AI's potential and drive future success.
34
Nearly 20 years of experience in cyber security across a number of roles in engineering, design, design authority, pre-sales, and strategic leadership. Security & Compliance The Security and Compliance practice brings together the combined capabilities of the Insight EMEA region to excel as an end-to-end security partner - safeguarding our clients’ operations and securing their digital transformation. We help our clients to assess their current maturity through the use of assessments, we provide a roadmap to help them achieve their Zero Trust ambitions and help to manage and maintain security posture with our managed services.
Rob O'Connor Technology Lead
35
To address the critical aspect of AI security, organisations can undertake the following strategic actions: 1. Assessment: Conduct a comprehensive review of your existing security controls and identify any residual risks. Collaborate to develop a prioritised roadmap aimed at achieving the desired level of security. 2. Plan & design: Translate your business challenges into actionable security projects with assistance. Receive guidance on selecting suitable vendors, products, and services. Engage in envisioning workshops and technical design sessions to align security initiatives with business objectives.
3. Build & implement: Bring security plans to fruition by moving from design to the complete implementation of security controls. Each project is carefully considered within the broader context of your security roadmap. Transition the management of security controls to your internal teams or seamlessly transition to our managed services. 4. Security operations management: Ensure optimal performance of your security controls with ongoing support services. Consider leveraging managed services where Insight assumes responsibility for the management and maintenance of your security controls.
Insight's Data-Centric Security Guide
Insight's Cyber Security Capabilities Overview
Blogs
37
The Evolution of AI Models
Looking back at 2024, it became clear that there has been a real change in the way that AI is being perceived within organisations.
Much of the discussion in 2023 was still at an early stage, driven by early adopters and enthusiasts. Now, we’re beginning to see a real shift that’s being driven by some fundamental changes to AI. The emergence of a new open-source AI model DeepSeek, for example, has highlighted the long-standing debate between open source and proprietary systems as well as challenging the norm around AI compute efficiency. Businesses now have the possibility of cheaper AI deployment by running open-source models on-prem or on the edge. But recent developments have also exacerbated the importance of security, approach, and good governance around open-source systems. Beyond this, there are five key areas in which the AI landscape is going to change:
Multimodal capabilities: AI will increasingly integrate audio, video, and other modalities, driving the technology to new frontiers and making natural language processing more versatile. AI orchestration: Combining multiple AI models and techniques to solve complex problems, including multi-agent systems where different AI agents collaborate to complete tasks. Sustainable AI: While not the primary focus, sustainability in AI for training and inference, for example, is an important consideration, particularly in terms of reducing costs and improving efficiency. Integration with RPA (Robotic Processing Automation): Generative AI will enhance the hyper automation movement, extending its scope and capabilities in making decisions and taking action. This will be achieved through a comprehensive understanding of natural written and spoken language, interpreting the required outcomes, and triggering APIs for action.
36
Evolution of Gen AI technology: Advances in AI and Gen AI models include better training techniques, more efficient models, and the use of specialised hardware.
Some of these areas will be integrated: AI orchestration will be a fit with RPA in a variety of situations, and multimodal capability will be part of all types of AI implementation. AI model evolution The evolution of the models results in more advanced precision or speed, depending on the use case. An efficient LLM can achieve end-task performance comparable to that of a full-precision Transformer LLM with the same model size and training tokens. Additionally, it offers advantages in latency, memory usage, throughput, and energy consumption. Another impactful change is the adoption of specialised hardware, designed to support AI. In addition to the evolution of the GPU, there are even new types of processors called LPU. These are set up to tackle the processing required to understand and generate natural language.
Another impactful change is the adoption of specialised hardware, designed to support AI.
38
AI orchestration Perhaps one of the biggest changes seen is the move from Gen AI tools, the most basic form of chatbots, to a new generation of AI agents; software that has the ability to execute a much more sophisticated chain of control. To give an illustration of what we’re talking about, consider this: a business trip booked from home. To book travel, I will start by talking to my travel Alexa, and ask it to coordinate with my Outlook. I will then have another agent coordinating all the necessary documentation. I may need another agent to manage my house while I’m away, automatically switch lighting and heating off for example. The agents' abilities to do all this requires multi-modal capability, understanding email, telephone conversations, images on websites and more. Previously, that process has been very difficult. Agent-based systems use a new principle, based on very large, unstructured data sets, that can adapt to different scenarios, rather than following set responses to a series
39
of prompts. By using agents, users can direct an AI-driven workflow through natural language processing, rather than through code. This means that in the future we’ll be able to receive (and react to) AI-generated phone calls; an example of the way that changes can be integrated. RPA and Gen AI integration RPA (Robotic Process Automation) is a technique for automating simpler business processes. With many industries already using RPAs, it’s a market that’s growing quickly. According to Statista, the RPA market will reach $13 billion by 2030, more than doubling in size from 2020. RPA’s capability to manage a range of repetitive tasks allows it to be used in various scenarios, including those involving natural language, such as phone calls, which were previously not feasible. Combined with AI agents,
the scope and complexity of business processes which can be automated will now grow exponentially. There are already several low-code business workflow automation platforms integrating LLM agent nodes into their capabilities. Multimodal AI Multimodal AI refers to systems that process text, images, audio, and video. This is a more complex process than previous generations of Gen AI, as there’s a need for the software to interpret data from multiple input forms. It’s not just about ‘understanding’ different modes, it can ‘create’ in different modes too. As an example, multimodal AI could compose or produce a video from a pre-written storyboard. Business sectors that could use multimodal AI include education, for personalised learning, and entertainment, in video production. As the technology improves, its application is sure to multiply.
40
Sustainability AI has many advantages, but one thing it’s not known for is being sustainable. The carbon footprint of generative AI is huge. It’s not just energy consumption, there’s also the amount of water required to run AI. How can a business manage to introduce AI into its culture while keeping carbon emissions low? The most obvious answer is to use the technology itself. By using AI, companies could reduce their carbon footprint. For example, AI could examine the supply chain and work out new delivery routes that are more sustainable. Alternatively, it could examine areas where a lot of waste has been generated and explore ways to reduce that. In other words, look at how processes could be made more efficient.
For all these changes to take place ... ... there will have to be a change in business processes; a concerted effort for different areas of organisations to work together. For example, to get the best out of AI within a legal environment, a company will need to ensure that someone with legal expertise will work with the AI developer. While this sounds obvious, it is new to many organisations which are typically structured too rigidly. There needs to be a willingness to change. At Insight, we are using AI for optimising internal processes, for example, handling RFPs, better supporting our clients, and increasing the effectiveness of our services. We’re clearly at an exciting time for AI development; already observing how it can have an impact on the way that companies operate. Some of the coming developments will really take us into a new world.
41
Consulting and technology manager with more than 20 years of experience in shaping complex IT solutions and delivery transformation programs. Application & Data The Application and Data practice mission is to address some of the biggest IT challenges faced by our clients, from the development of cloud-native applications, modernising existing monolithic applications to using their data and new AI services to monitor, manage and advance their business.
Santo OrlandoPractice Director
42
To explore the evolution of AI and its impact on security measures, organisations can consider the following strategic approaches: 1. Invest in AI training: Offer comprehensive AI training to ensure employees can effectively use AI tools and foster innovation. 2. Integrate AI in operations: Use AI to streamline processes, increase productivity, and reduce costs. 3. Collaborate with experts: Partner with AI specialists like Insight to stay updated on advancements and gain insights.
4. Ensure data quality: Implement strong data governance to maintain accurate and secure data, enhancing AI effectiveness. 5. Adopt ethical practices: Follow ethical guidelines for AI to ensure transparency, fairness, and accountability, building stakeholder trust.
43
The trends discussed in this report, such as sustainable AI, hybrid cloud, security of AI, spatial computing and private 5G, are reshaping the technological landscape, and in the case of AI, changing business strategies. To navigate this environment, organisations must embrace innovation and adapt to the rapid pace of technological advancement. We have seen that keeping ahead of the curve is critical, given the competition in the IT industry. Failure to embrace new technologies may leave companies vulnerable to competitors who are faster to adapt. This report has highlighted that, with the right investment and focus, these technologies have the potential to drive or revolutionise efficiency, enhance security, improve customer experiences, and foster sustainability. However, it is important to acknowledge the challenges,
particularly the ethical and security considerations surrounding AI deployment. Organisations must adopt a balanced approach, harnessing these technologies while mitigating potential risks. By exploring the use cases in this report, and understanding how these trends apply to their situations, organisations can position themselves for success in the evolving IT landscape. Proactive adaptation and strategic implementation of these technologies will be key differentiators in the years to come.
Phil Hawkshaw EMEA CTO & Director of Technology Services