The world is rapidly moving toward a post-digital era, where leaders will need to set their sights beyond their ongoing digital transformations. With digital capabilities alone no longer serving as a differentiator, future-minded business leaders will need more in their technology arsenals to succeed.
Innovative technologies are catalysts for change, offering businesses extraordinary new capabilities. ‘Distributed ledger technology’, ‘Artificial intelligence’, ‘Extended reality’, and ‘Quantum computing’ (known collectively as DARQ technologies) will be the next set of emerging technologies to spark profound change, letting businesses reimagine entire industries. In fact, individual DARQ technologies are already making a difference across industries today. But collectively, the DARQ technologies will also power the innovation and opportunity uniquely associated with the coming post-digital era. As the business landscape becomes increasingly dominated by digital natives and companies that have undergone successful digital transformations, DARQ is the key that will open unimagined new pathways into the future.
Artificial intelligence (AI) today spans a whole host of disciplines, from data science and computer science to electronics and the social sciences. It is a broad field within information technology that is enabling the digital transformation of industry and society by creating computers able to learn to perform tasks normally requiring human intelligence. These tasks include reasoning, problem-solving, understanding language, making predictions or inferences, and perceiving situations or environments. In essence, AI leverages machine-learning algorithms to provide better, deeper, and otherwise practically unachievable insights in an efficient way.
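To make the idea of "learning rather than explicit programming" concrete, here is a deliberately minimal sketch (not any production AI system): a model fits a simple rule from example data by iteratively adjusting its parameters, the core loop behind many machine-learning algorithms.

```python
# Toy illustration of machine learning: the model is never told the rule
# y = 2x + 1; it recovers it from examples by gradient descent.

def train_linear_model(xs, ys, lr=0.01, epochs=2000):
    """Fit y = w*x + b by gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # Gradients of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# The model "learns" the underlying rule from examples alone.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]          # generated by y = 2x + 1
w, b = train_linear_model(xs, ys)
print(round(w, 2), round(b, 2))  # approximately 2.0 and 1.0
```

Real AI systems scale this same learn-from-data loop to millions of parameters and far richer models, which is what enables the insights described above.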
This trend is about the rapid adoption of AI technologies whose increased capabilities and applications have the potential to reshape almost every industry and profession as they fundamentally change the ways in which humans interact with machines. This should be considered a megatrend because of the scale and geographic reach of its potential economic and societal impacts.
The projected impacts of AI are significant. Many different projections and estimates exist, but to give a sense of their magnitude, the UN Conference on Trade and Development’s Digital Economy Report estimates that AI has the potential to generate USD 13 trillion of additional global economic output by 2030, contributing an additional 1.2% to annual GDP growth. The Future Today Institute predicts that the global AI market will grow at a CAGR of 42.2% from 2021 to 2027.
AI enabling next generation applications
Over the coming decades, applications of AI have the potential to change our lives for the better in many ways. Some key examples include:
- Changing the labour market: With AI’s huge potential to boost productivity and economic growth, it will certainly have a significant effect on the labour market. In the short term, automation driven by AI (e.g. ‘Robotics’) could disrupt many current jobs. But in the longer term, AI promises to create a significant number of new jobs, while removing the need for humans to perform unsafe and repetitive tasks (see ‘Effects of automation’).
- Better healthcare: AI is already transforming healthcare in areas such as pathology and radiology, by improving the speed and accuracy of diagnosing diseases such as breast cancer. In future, AI will facilitate personalized medicine and drug development (e.g. allowing tailored drug treatments based on an individual’s genetic markers), help to eradicate infectious diseases, and even predict future disease outbreaks that may originate in animals.[9,10]
- Personalized education: As with personalized healthcare, AI could be used for personalized learning, targeting teaching to the gaps in an individual student’s knowledge and creating customized learning content.
- More efficient production and consumption: AI is widely predicted to increase industry and worker productivity, which is why so many companies are interested in adopting it in some form or other, looking to achieve a competitive advantage. McKinsey estimates that 70% of companies may adopt at least one AI technology by 2030. AI will also increasingly be used to identify more efficient delivery routes or supply chains and to maximize efficiency and sustainability in agriculture, the outcome of which could lead to significant economic growth.
- More efficient and effective governance: AI could help formulate and evaluate the effectiveness of government policies and even be used to perform legal tasks that require the sifting and analysis of huge amounts of data.
Challenges and risks
Although the expected benefits of AI are enormous, good governance of the technology will be essential if we are to realize them. The development of appropriate legal and ethical frameworks for AI will be critical to build societal trust and to mitigate the potential risks and challenges. International Standards will have an important role to play as part of such frameworks to ensure the responsible adoption of AI.
- Social implications: Because the ability to adopt and benefit from AI depends on adequate digital infrastructures, relevant technical skills in the workforce, and appropriate regulatory systems, AI has the potential to widen the technology gap between those that have the capabilities to benefit from it and those that do not. This applies to countries and companies alike (large corporations may be able to determine who has access to AI and its benefits). Those with access to the technology could potentially use it for malicious purposes, for example by creating deepfakes (using AI to alter videos of people) and tailored online communications (a sort of ‘personalized propaganda’) to radicalize or manipulate people. But even without malicious intentions, the sheer amount of data that AI consumes creates potential privacy issues – “As AI evolves, it magnifies the ability to use personal information in ways that can intrude on privacy interests by raising analysis of personal information to new levels of power and speed”. Indeed, AI is advancing so rapidly that it could one day raise difficult questions about what it means to be ‘human’. For example, AI is learning to do things that even humans often find difficult – reading human expressions, interpreting the emotions behind them, and analyzing a person’s level of emotional engagement. Researchers are even working on teaching AI to convincingly exhibit human emotions.[7,9]
- Legal and ethical implications: Advanced AI that can make autonomous decisions could be applied to medical diagnoses or legal judgements, or even used in warfare. This could be problematic because of the risk of bias in AI: if given incorrect or skewed data, AI systems could deploy algorithmic discrimination on a large scale (e.g. some facial analysis AI has been shown to be less accurate at identifying minorities and women, because the data it was trained on was not representative).[16,17] AI has the potential to reduce the impact of human biases, but only if humans can identify and adequately address bias in AI.
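One simple, illustrative way to surface the kind of skew described above is to compare a model’s favourable-outcome rates across groups (the "demographic parity difference"). The group labels and decisions below are invented for the sketch; real bias audits use many more metrics.

```python
# Hedged sketch of a basic bias check: compare positive-outcome rates
# of a classifier's decisions across two (hypothetical) groups.

def positive_rate(predictions):
    """Fraction of decisions that were favourable (1)."""
    return sum(predictions) / len(predictions)

# Hypothetical model decisions (1 = favourable outcome) for two groups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # 75% favourable
group_b = [1, 0, 0, 0, 1, 0, 0, 0]   # 25% favourable

gap = positive_rate(group_a) - positive_rate(group_b)
print(f"Demographic parity gap: {gap:.2f}")  # 0.50, a gap worth auditing
```

A large gap does not by itself prove discrimination, but it is the kind of measurable signal that governance frameworks and standards can require practitioners to monitor and justify.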
What is on the horizon for AI? (only a few examples of many…)
- Edge computing: A model that moves computation nearer to the sources of data, or the ‘Edge’. Moving AI workloads to the Edge (so that AI processing and decision making are performed nearer to where the data is generated, rather than in the Cloud) makes them faster and safer.
- System on a chip: Development of advanced chips with a complex series of components that are designed to work on AI projects and deliver faster and more secure processing.
- Digital twins: The use of AI to significantly improve ‘digital twin’ technology (virtual representations of real-world environments or products).
- AI to detect AI: New measures to regulate the creation and detection of deepfakes will be complemented by AI systems designed to identify deepfakes, whether these counterfeits are text or imagery.
- Emotion AI: Software that can read human vocal and facial expressions and understand the emotions and cognitive states underlying them. Uses will include telehealth, online learning, and virtual meetings/events.
- Published 20 Standards | Developing 32 Projects
- Information technology – Artificial intelligence (AI) – Bias in AI systems and AI aided decision making
- Information technology – Artificial intelligence – Overview of trustworthiness in artificial intelligence
- Information technology – Artificial intelligence (AI) – Use cases
Extended reality (XR) refers to environments that combine the real and the virtual through the use of computer technology and wearable devices. XR encompasses virtual, augmented and mixed reality (VR, AR and MR respectively), each a distinct route towards XR, up to and including the Metaverse. VR is fully digital and immersive, AR digitally enhances our view of the real world and, more recently, MR can create a hybrid reality where virtual and real worlds coexist.
XR technologies are transforming the way that people interact, live and work by offering access to a new mode of social interactions within the digital space.[2,8] The endgame is the full development of the Metaverse, an online digital world where people can interact with each other and with the computerized environment to do a variety of activities as an extension of reality.
Part of the emerging DARQ technologies, XR is a building block in many companies’ innovation strategies, with the power to significantly transform industries. With combined global spending on AR and VR expected to reach USD 160 billion by 2023, and repercussions in both the leisure and business sectors, this is a trend of rapidly increasing significance.
The experience economy: from ownership to usership in the digital space
AR and VR immersive technologies have been in use for some time already (especially in online games), but their application is increasingly business-focused, helping the field’s rapid expansion. Epic Games’ Unreal Engine, for example, used in the popular Fortnite game, created an online digital space for users to exchange and participate in a multiplicity of experiences. The same gaming engine can also be used for business purposes: architectural firms use it to showcase their designs to clients, and Finnair has used it to build a digital twin of Helsinki Airport for staff training.
The uptake of XR technology in business can be linked back to other societal trends, such as the development of ‘The experience economy’. In the experience economy, which is slowly replacing consumerism, businesses sell experiences rather than products. Customers are fully involved in the customization process, shifting their role from ownership to usership. With XR technologies becoming cheaper and more sophisticated, the opportunities for customization are endless, allowing users to immerse themselves in places or situations, whether to shop, interact, work, or travel.
In addition, the COVID-19 pandemic is accelerating the need to move everyday experiences to the digital space in order to limit physical interactions, and technologies are rapidly evolving to offer virtual access to a multiplicity of experiences in response to mobility restrictions and isolation policies such as teleworking. With such technologies, customers can try on outfits in the virtual space and see themselves from multiple angles, or travel and work from the comfort of their armchairs. XR technologies thus open the door to alternative approaches to address current social needs, from wellness tourism to training opportunities and even criminal rehabilitation, simulating real life scenarios to prepare offenders before their reintegration into society.
Innovation in communication and visual technologies accelerating the uptake of XR technologies
This expansion in XR use is driven largely by innovations in communication and visual technologies that are improving the user experience and making these technologies more popular and accessible to the general public. Key enabling developments include portability, high speed Internet access, graphic and sound quality as well as GPS data, which increases the potential reach of those technologies. With the evolution of wearable XR technologies such as smart-glasses or contact lenses that include quality sensors, users can experience their surroundings with additional computer-generated inputs that appear real.
In manufacturing, for example, the development of such sensors and AR glasses can help workers with efficiency and safety by giving hands-free access to user manuals and audio instructions, helping them locate items, tracking stock in real time, warning the wearer of equipment needing maintenance or showing defects. The most recent devices, such as the HoloLens 2 (a pair of MR smart-glasses developed and manufactured by Microsoft), can now understand the characteristics of items in their field of vision rather than simply attesting that they exist, which means they can identify and warn the wearer of hazards rather than simply point to the presence of objects. The extensive applications of such technologies lead experts to predict rapid growth in the use of VR and AR, at an annual rate of over 80% over the next few years. In fact, in the coming decades, electronic communication and information sharing using AR and VR, such as livestreams or videos, are expected to take over from traditional text and images.
The increasing use of cyberspace to perform everyday activities could give more influence and authority to non-traditional actors, possibly leading to the creation of new forms of authority beyond the individual countries. Already, such groups can use social media to exert significant societal pressures – one example of this is how users of Reddit (a social news aggregation, Web content rating and discussion Website), managed to shake the stock market with ‘meme-stocks’ and the coordinated buying of GameStop stocks by retail investors in 2021.
In a pessimistic scenario, expanding the competitive space to the digital realm could also provide a new medium for conflict and warfare, which is already seen with the rise in cyber-terrorism. The virtual arena and XR technologies provide increasing opportunities for misinformation and propaganda, as well as avenues for cyber-attacks and hybrid forms of conflict.
Another risk is that XR technologies might complicate pre-existing issues linked with digital technologies and social media, such as those related to the protection of identity and ownership, as well as the risk of misinformation and bias. Examples of fake news and deepfake videos using deep learning technologies highlight the risks of XR innovations and the increasing difficulty to distinguish what is real and what is digitally constructed.[13,21]
Extended reality technologies such as AR and VR, together with the constant evolution of the digital space towards the Metaverse, form a promising field that can further the user experience in business and leisure alike. Additional trending technologies such as the roll-out of ‘5G’ will further support the development of XR experiences by enabling more people to be connected at the same time and to enjoy a quality experience with minimal latency.
Together with the other DARQ technologies, XR holds the power to radically modify how we behave and interact. Its development will depend directly on innovation in communication and visual technologies, with which it shares similar risks that must be addressed.
- Published 90 Standards | Developing 14 Projects
- Information technology – Computer graphics and image processing – The Virtual Reality Modeling Language – Part 1: Functional specification and UTF-8 encoding
- Information technology – Computer graphics and image processing – The Virtual Reality Modeling Language (VRML) – Part 2: External authoring interface (EAI)
- Information technology – Computer graphics, image processing and environmental representation – Sensor representation in mixed and augmented reality
- Information technology – Computer graphics, image processing and environmental data representation – Mixed and augmented reality (MAR) reference model
- Information technology – Computer graphics, image processing and environmental data representation – Live actor and entity representation in mixed and augmented reality (MAR)
Blockchain technology is a form of distributed ledger technology (DLT), which provides unprecedented potential for removing intermediaries by allowing participating parties to exchange not only information but also value (money, contracts, property rights) without necessitating trust in specific, pre-determined intermediaries such as banks or servers.[5,6,22] This is because DLT enables transaction data to be validated within a system wherein control is distributed among multiple, independent participants and stored in a manner that is tamper-evident and immutable by design. By ensuring system-wide agreement about the state of the ledger, DLT can be used to promote privacy, safety, transparency, and integrity of the transaction process.[22,23]
Distributed ledgers open up many new possibilities; for example, for monitoring the supply chain or managing digital rights. DLT is therefore regarded as a central enabler for digital, self-executing contracts, so-called smart contracts.
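The tamper-evident chaining that underpins these properties can be sketched in a few lines. This is a deliberately minimal illustration, not a real DLT: production systems add distribution across independent participants, consensus, and digital signatures on top of the hash linking shown here.

```python
# Minimal sketch of tamper evidence: each block stores the hash of its
# predecessor, so altering any past record breaks every link after it.
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, data):
    """Link a new block to the current head of the chain."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"data": data, "prev_hash": prev})

def verify(chain):
    """Re-derive every link; any mismatch reveals tampering."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
append_block(chain, {"from": "alice", "to": "bob", "amount": 10})
append_block(chain, {"from": "bob", "to": "carol", "amount": 4})
print(verify(chain))                 # True: intact chain
chain[0]["data"]["amount"] = 1000    # rewrite history...
print(verify(chain))                 # False: the broken link exposes the edit
```

Because every participant in a distributed ledger can run this verification independently, no single intermediary needs to be trusted to vouch for the transaction history.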
Many industry leaders have already achieved significant business benefits through DLTs, including greater transparency, enhanced security, improved traceability, increased efficiency, faster transactions, and reduced costs. Financial services and banking are the sectors most frequently targeted by DLT service providers. Capital markets clearly dominate, followed by insurance and trade finance. Research from Gartner indicates that 300 million blockchain transactions were processed through the end of 2017 and that assets worth more than USD 270 billion were being managed using DLT.[23,26]
Blockchain is best known as the technology behind cryptocurrencies (see ‘New business models’)[1,6], but is increasingly known for its role in facilitating the trading of non-fungible tokens (NFTs). Cryptocurrencies (like physical money) are ‘fungible’, meaning units are equal in value and can be traded or exchanged for one another (one dollar is always worth another dollar; one Bitcoin is always equal to another Bitcoin). NFTs, by contrast, each carry their own digital signature, so no two are equal and they cannot be exchanged one for one (hence, non-fungible). NFTs are digital assets with programmed scarcity, and as such are ideal for representing ownership of unique virtual assets and digital identities in Web 3.0 and the Metaverse.
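The fungible/non-fungible distinction can be made concrete with a toy data model (this is an illustration only, not any real token standard): fungible holdings are interchangeable amounts, while each NFT is a distinct record with its own identifier and owner.

```python
# Illustrative sketch: fungible balances vs. unique, non-fungible tokens.

# Fungible: any unit is interchangeable with any other, so a simple
# balance per holder captures everything.
fungible_balances = {"alice": 5.0, "bob": 3.0}

# Non-fungible: each token is a unique record; names and assets invented.
nft_registry = {
    "token-001": {"owner": "alice", "asset": "digital-artwork"},
    "token-002": {"owner": "bob",   "asset": "virtual-land-plot"},
}

def transfer_nft(registry, token_id, new_owner):
    """Ownership of an NFT moves as a whole; it is never split or merged."""
    registry[token_id]["owner"] = new_owner

transfer_nft(nft_registry, "token-001", "bob")
print(nft_registry["token-001"]["owner"])  # bob
```

On a real blockchain, the registry above would live on the distributed ledger itself, so the provenance and current owner of each unique token are publicly verifiable.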
But blockchain can be used for a much wider variety of applications beyond cryptocurrency and the financial services and banking sectors. Although much focus is still put on monetary uses, there is an increasing interest in non-monetary uses and applications, e.g. digital identity, healthcare, supply chain, and energy. For example:
- In West Africa and Kenya, blockchain has enabled the efficient verification of property records and transactions, and expanded access to credit in some previously informal sectors of the economy.
- The London-based start-up Resonance uses blockchain to automate the transfer of product information between brands, manufacturers, and retailers. According to Resonance, over 30% of product data in product catalogues is wrong, with each error costing an average of USD 60 to fix. The technology ensures that only trustworthy information is forwarded, and that this is done anonymously. Recipients first check the data sheets they receive before integrating the information into their internal systems, such as for material requirements planning.
- In Switzerland, Streamr has developed an anti-theft sticker that protects valuable goods without revealing their location. The sticker is fitted with an array of sensors that track location, acceleration, and temperature. The data collected in this way is managed on Streamr’s blockchain network using smart contracts. The stickers can be used in the transport of goods, for example: customers would only find out where their goods are currently located if the forwarder violates the previously agreed terms and conditions of transportation.
- In Australia Power Ledger has developed a blockchain-based platform that enables users to invest in major, renewable-energy projects. This allows users who want to invest in the expansion of renewable energy to buy small stakes in projects and accelerate their growth. The first offers are parts of a commercial solar park and a grid-connected battery storage project in Australia, which will be offered via cryptocurrencies in the blockchain.
- CSIRO has explored using blockchain to verify food provenance, so consumers can know exactly where their food came from and what has happened to it at each step of the chain.
According to Gartner’s value forecast for the blockchain business, after the first phase of a few high-profile successes in 2018–2021, there will be larger, focused investments and many more successful models in 2022–2026. And these are expected to explode in 2027–2030, reaching more than USD 3 trillion globally. In 2018, China alone accounted for nearly 50% of all patent applications for technology families relating to blockchains, and, together with the United States, represents more than 75% of all such patent applications.
- Published 11 Standards | Developing 5 Projects
Cloud technology allows users to access scalable technology services immediately via the Internet’s existing network, promoting lower costs for infrastructure and inventory, reducing overheads, and creating leaps in computing power and speed, data storage, and bandwidth. However, it has one major problem – the latency (time lag or communication delay over the network) that results from the physical distance between users and the data centres hosting cloud-based services. This problem can be overcome using Edge computing; this is a different technology from cloud computing and its relevance is set to increase as the ‘Internet of Things’ becomes ubiquitous and the sheer amount of data that needs to be moved and processed increases exponentially.
This is because edge computing allows users to overcome the latency issue by performing computations near or at the source of data – data is processed directly on-site using dedicated hardware. Edge thus provides an important advantage when processing time-sensitive data, or when data processing is needed in a remote location where there is limited connectivity. In future, edge computing will be important for health care, automotive and manufacturing applications, because of the increased speed and security of processing data directly on devices (as opposed to sending it into the Cloud).[7,23]
Aside from reducing latency, edge computing has several other advantages, such as saving bandwidth and network costs, and enhancing security and privacy. Microsoft, for example, claims that edge computing enables more industries to safely use the cloud and still meet their compliance requirements. McKinsey finds that the industries with the most edge computing use cases are travel, transportation, and logistics; energy; retail; healthcare; and utilities. Here are just a few examples of applications of edge computing:
- ‘Autonomous vehicles’ can gather the data produced by vehicle sensors and cameras, process it, analyze it and make decisions in just a few milliseconds to keep vehicles and pedestrians safe.
- Intelligent transportation systems enable passenger information systems, vehicle monitoring and tracking, intelligent surveillance of transportation vehicles and stations, intelligent traffic management and more. Fleet management allows organizations to intelligently manage their vehicle fleets with a variety of rich information.
- Remote monitoring of oil and gas assets can be deployed in oil and gas fields where process conditions (such as extreme temperature variations) can be effectively and safely monitored and managed offsite.
- Patient conditions can be tracked in real time, and treatment can be improved through better treatment compliance and early identification of health complications.
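The bandwidth-saving and fast-reaction pattern running through these examples can be sketched simply: a device filters its own sensor stream locally, acts immediately, and forwards only noteworthy readings upstream. The readings and threshold below are invented for the illustration.

```python
# Hedged sketch of an edge-computing pattern: handle normal readings
# on-device and upload only anomalies, avoiding a cloud round-trip
# for every measurement.

def process_at_edge(readings, threshold=80.0):
    """Return (local_alarms, uploads): act locally, upload only anomalies."""
    local_alarms, uploads = [], []
    for value in readings:
        if value > threshold:
            local_alarms.append(f"alarm: {value}")  # immediate local action
            uploads.append(value)                   # only anomalies go upstream
        # normal readings are handled (or discarded) on-device
    return local_alarms, uploads

temperatures = [71.2, 69.8, 85.4, 70.1, 90.3, 68.9]  # hypothetical sensor data
alarms, uploaded = process_at_edge(temperatures)
print(len(uploaded), "of", len(temperatures), "readings sent to the cloud")
```

The latency win comes from the local branch: the alarm fires on the device itself, while the cloud only ever sees the small fraction of readings worth storing or analysing centrally.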
According to a report by Grand View Research, the global edge computing market is anticipated to reach USD 61.14 billion by 2028, expanding at a CAGR of 38.4% over the forecast period.
- Published 26 Standards | Developing 6 Projects
- Cloud computing and distributed platforms – Data flow, data categories and data use – Part 1: Fundamentals
- Information technology – Cloud computing – Edge computing landscape
- Information technology – Cloud computing – Taxonomy based data handling for cloud services
- Information technology – Cloud computing – Common technologies and techniques
Quantum technologies rely on the principles of quantum physics and cover a broad range of applied areas like quantum communication, quantum computing, quantum cryptography, quantum imaging, quantum metrology, quantum sensors, and quantum simulation.
Quantum computing, in particular, could be a game changer and revolutionize the way we perform calculations. Quantum computers are the next generation of computers, which operate according to the laws of quantum mechanics and are made up of quantum circuits. The fundamental building block of the quantum computer is the quantum bit or ‘qubit’, the quantum analogue of the binary digit or classical computing bit. The qubit can exist in two states (analogous to the ‘1’ and ‘0’ of the classical bit) as well as in a superposition state (where it is both ‘1’ and ‘0’ at the same time). Because qubits can exist in multiple states at the same time, the quantum computer has the potential to be a hundred million times faster than a traditional computer. With its help, databases can be searched faster, complex systems such as molecular-level behaviour can be modelled and simulated to make better medicines, and today’s encryption technologies can be strengthened or cracked.[13,23]
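The superposition idea has a compact mathematical form that can be simulated classically for a single qubit. The sketch below is purely illustrative (real quantum hardware is vastly more involved): a qubit's state is a pair of complex amplitudes, and a Hadamard gate puts the ‘0’ state into an equal superposition of ‘0’ and ‘1’.

```python
# Toy state-vector simulation of one qubit: amplitudes, a gate, and
# the resulting measurement probabilities.
import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-element state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate

zero = [1.0, 0.0]                  # qubit prepared in state '0'
superposed = apply_gate(H, zero)   # equal superposition of '0' and '1'

# Measurement probabilities are the squared magnitudes of the amplitudes.
probabilities = [abs(a) ** 2 for a in superposed]
print([round(p, 2) for p in probabilities])  # [0.5, 0.5]: '0' and '1' at once
```

A classical simulation like this needs 2^n amplitudes for n qubits, which is exactly why simulating even a few dozen qubits overwhelms conventional computers, and why native quantum hardware promises such dramatic speed-ups.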
In the future, it will be possible to book and obtain quantum computing power via the Cloud from providers such as Amazon and IBM, triggering the era of hypercomputation.
Even though quantum could be considered the most nascent DARQ technology, investment has been growing rapidly and this investment is happening at multiple levels, e.g. from companies through to supranational institutions and countries. For example, China set up the world’s first quantum cryptographic network (Jinan Project) in 2017. Meanwhile, the European Union launched a quantum flagship initiative in 2018 covering quantum communication, quantum simulation, quantum computing, quantum metrology, and sensing as well as the basic science behind quantum technologies. With a budget of at least EUR 1 billion over ten years, the long-term vision of the flagship initiative is to develop a quantum Web in Europe, where quantum computers, simulators and sensors are interconnected via quantum communication networks.
In terms of private sector advancements, major players like Google, Alibaba, IBM, Baidu and Hewlett Packard are all busy doing their own research. In 2021, IBM Quantum unveiled the Eagle chip, delivering 127 qubits on a single IBM quantum processor for the first time with breakthrough packaging technology. Eagle broke the 100-qubit processor barrier and is leading quantum computers into a new era. IBM anticipates that, with Eagle, users will be able to explore uncharted computational territory – and experience a key milestone on the path towards practical quantum computation.
Despite the excitement and investment, however, quantum technologies are in their very early stages, and it will be a long time before they take over the market. For example, the quantum computer market of the future is only predicted to grow to about the size of today’s supercomputer market, worth around USD 50 billion (as compared to today’s market for classical computing devices, which was already worth over USD 1 trillion in 2019) and, even by 2030, none of the smartphones, tablets and computers in use will be quantum powered.
- Understanding the DNA of DARQ (Accenture, 2020)
- Technology vision 2020. We, the post-digital people (Accenture, 2020)
- Technology vision 2019. The post-digital era is upon us (Accenture, 2019)
- Stanford University launches the institute for human-centered artificial intelligence (Stanford University, 2019)
- Digital megatrends. A perspective on the coming decade of digital disruption (Commonwealth Scientific and Industrial Research Organisation, 2019)
- Digital economy report 2019. Value creation and capture: Implications for developing countries (UN Conference on Trade and Development, 2019)
- 2021 Tech trends report. Strategic trends that will influence business, government, education, media and society in the coming year (Future Today Institute, 2021)
- Beyond the noise. The megatrends of tomorrow's world (Deloitte, 2017)
- Ten trends that will shape science in the 2020s. Medicine gets trippy, solar takes over, and humanity—finally, maybe—goes back to the moon (Smithsonian Magazine, 2020)
- Foresight Africa. Top priorities for the continent 2020-2030 (Brookings Institution, 2020)
- AI in education. Change at the speed of learning (UN Educational, Scientific and Cultural Organization, 2020)
- Global trends 2020. Understanding complexity (Ipsos, 2020)
- Global strategic trends. The future starts today (UK Ministry of Defence, 2018)
- Global risks 2035 update. Decline or new renaissance? (Atlantic Council, 2019)
- Protecting privacy in an AI-driven world (Brookings Institution, 2020)
- The global risks report 2021 (World Economic Forum, 2021)
- What do we do about the biases in AI (Harvard Business Review, 2019)
- A review of extended reality (XR) technologies for manufacturing training (Technologies, 2020)
- Future possibilities report 2020 (UAE Government, 2020)
- The Reddit revolt. GameStop and the impact of social media on institutional investors (The TRADE, 2021)
- When seeing is no longer believing. Inside the Pentagon’s race against deepfake videos (CNN Business, 2019)
- Global connectivity outlook to 2030 (World Bank, 2019)
- AGCS trend compass (Allianz, 2019)
- Top five blockchain benefits transforming your industry (IBM, 2018)
- Global blockchain benchmarking study (University of Cambridge, 2017)
- Will blockchain disrupt financial services (Gartner, 2017)
- What is an NFT? Non-fungible tokens explained (Forbes, 2022)
- Significance of NFTs in Web 3.0 and the Metaverse (Selfkey, 2022)
- Blockchain Opens Up Kenya’s $20 Billion Informal Economy (Bloomberg, 2018)
- Is your honey faking it? (Commonwealth Scientific and Industrial Research Organisation, 2018)
- Forecast. Blockchain business value, Worldwide, 2017-2030 (Gartner, 2017)
- World trade report 2018. The future of world trade: How digital technologies are transforming global commerce (World Trade Organization, 2018)
- Patent analytics report on blockchain innovation (IP Australia, 2018)
- 3 Advantages (and 1 disadvantage) of edge computing (Forbes, 2020)
- Edge computing. What it is and how it's a game-changer (CMS WIRE, 2018)
- New demand, new markets: What edge computing means for hardware companies (McKinsey, 2018)
- Examples of edge computing (Premio, 2021)
- Edge computing market growth & trends (Grand View Research, 2021)
- Future technology for prosperity. Horizon scanning by Europe's technology leaders (European Commission, 2019)
- China set to launch an 'unhackable' internet communication (BBC, 2017)
- IBM unveils breakthrough 127-qubit quantum processor (IBM, 2021)
- Quantum computers. The next supercomputers, but not the next laptops (Deloitte, 2018)