Vocabulary of Emerging Technologies

We collect and provide definitions of modern emerging technologies. Enjoy! If you see something missing, please do let us know!

  • Actionable Analytics

    Actionable analytics means getting access to relevant data in the correct context: the ability to take action immediately and the power to achieve meaningful outcomes. The fast-evolving arena of Business Intelligence and Analytics aims to make analytics ‘invisible’, that is, more accessible and comprehensible to employees across the organization, to enable better, faster and more productive decisions.

  • Adaptive Machine Learning

    In machine learning many algorithms are adaptive or have adaptive variants, which usually means that the algorithm parameters are automatically adjusted according to statistics about the optimisation thus far (e.g. the rate of convergence). Examples include adaptive simulated annealing, adaptive coordinate descent, AdaBoost, and adaptive quadrature.
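
    As a minimal sketch of the idea (assuming an AdaGrad-style rule; the quadratic objective and data are invented for illustration), each parameter's step size below is adjusted using statistics accumulated during the optimisation so far:

        import numpy as np

        # Minimise ||Xw - y||^2 with an AdaGrad-style adaptive step size.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 3))
        true_w = np.array([2.0, -1.0, 0.5])
        y = X @ true_w

        w = np.zeros(3)
        g_accum = np.zeros(3)   # running sum of squared gradients
        base_lr, eps = 0.5, 1e-8

        for _ in range(500):
            grad = 2 * X.T @ (X @ w - y) / len(y)
            g_accum += grad ** 2
            # Per-parameter step size shrinks as gradient statistics accumulate.
            w -= base_lr * grad / (np.sqrt(g_accum) + eps)

        print(w)  # should approach [2.0, -1.0, 0.5]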

  • Adaptive automatic learning

    See Adaptive Machine Learning

  • Advanced Antibodies
  • Advanced driver-assistance systems (ADAS)

    Advanced driver-assistance systems (ADAS) are electronic systems that help the vehicle driver while driving or during parking.

  • Affective Computing

    Affective computing technologies sense the emotional state of a user (via sensors, microphone, cameras and/or software logic) and respond by performing specific, predefined product/service features, such as changing a quiz or recommending a set of videos to fit the mood of the learner.

  • AgTech

    Agtech is the application of new digital technologies with the intention of rapidly improving how efficiently activities at the different stages of agricultural value chains are conducted. Agtech is also seen to represent the application of technology – especially software and hardware technology – to the field of farming.[1] Agtech is a relatively new phenomenon emerging in the last decade to address increasing concerns about food security for the growing global population and the diminishing viability of farming.

  • AI Assistant

    See virtual assistant

  • AI Cloud Services

    AI cloud services are hosted services that allow development teams to incorporate the advantages inherent in AI and machine learning.

  • AI Developer Toolkits

    AI Developer Toolkits are applications and software development kits (SDKs) that abstract data science platforms, frameworks and analytic libraries to enable software engineers to deliver AI-enabled applications. They cover four maturing categories: cloud-based AI as a service (AIaaS), toolkits for virtual assistants, device development kits, and AI serving SDKs.

  • AI Governance

    AI governance is the idea that there should be a legal framework for ensuring that machine learning (ML) technologies are well researched and developed with the goal of helping humanity navigate the adoption of AI systems fairly.

  • AI Marketplaces

    An AI marketplace is an easily accessible place, supported by a technical infrastructure, that facilitates the publication, consumption, and billing of reusable algorithms. Some marketplaces are used within an organization to support the internal sharing of prebuilt algorithms among data scientists.

  • AI PaaS

    AI PaaS is a set of AI and machine learning (ML) platform services for building, training, and deploying AI-powered functionalities for applications.

  • AI for marketing

    AI for marketing comprises systems that, for marketing use cases, change their behavior without being explicitly programmed, based on collected data, usage analysis and other observations. Unprecedented insight, intuition and scale fueled by AI will help marketers deliver relevant experiences to prospects and customers with increasing effectiveness and efficiency.

  • AI-Related C&SI Services

    Artificial Intelligence (AI)-Related Consulting and System Integration (C&SI) Services are a subset of intelligent automation services to help clients ideate use cases, design business or IT processes, select technologies, curate data, build and train models, deploy solutions, assess and mitigate risks, and adapt talent mix to successfully incorporate intelligent solutions. Intelligent solutions must involve one or more advanced technologies, such as machine learning, deep learning or NLP.

  • Analytics and BI Platform as a Service

    Analytics and business intelligence (ABI) platforms are characterized by easy-to-use functionality that supports a full analytic workflow — from data preparation to visual exploration and insight generation — with an emphasis on self-service and augmentation. ABI platforms are no longer differentiated by their data visualization capabilities, which are becoming commodities. Instead, differentiation is shifting to integrated support for enterprise reporting capabilities and augmented analytics. ABI platform functionality includes the following 15 critical capability areas: security, manageability, cloud, data source connectivity, data preparation, model complexity, catalog, automated insights, advanced analytics, data visualization, natural language query, data storytelling, embedded analytics, natural language generation (NLG), and reporting.

  • Analytics and Business Intelligence (ABI)

    Analytics and business intelligence (ABI) is an umbrella term that includes the applications, infrastructure and tools, and best practices that enable access to and analysis of information to improve and optimize decisions and performance.

  • Appendage and Biological Function Augmentation

    In appendage and biological function augmentation, exoskeletons and prosthetics are used to replace or enhance physical capabilities. We can see this in various forms, including surgical augmentation of the eyes of professional golfers and cochlear implants replacing nonfunctioning auditory nerves. Both the cosmetics and pharmaceutical industries are prime examples of what this type of augmentation looks like. Passive implants are used to enhance nails and hair, and even reshape body parts. Nootropics involve the use of natural or synthetic substances that have the potential to enhance a human’s mental skills, which sparks controversy.

  • Application Data Management

    Application data management (ADM) is a technology-enabled business discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, governance, semantic consistency and accountability for data in a business application or suite, such as ERP, custom-made or core banking. Application data is the consistent and uniform set of identifiers and extended attributes maintained and/or used within an application or suite. Examples of such entities include customers, suppliers, products, assets, sites and prices.

  • AR Cloud

    The AR Cloud is a machine-readable, 1:1-scale model of the world that is continuously updated in real time. It is a collection of billions of machine-readable datasets, point clouds, and descriptors, aligned with real-world coordinates; a living, shared, “soft copy” of the world created by scanning physical features around us, in which persistent augmented reality experiences reside.

  • Artificial Emotional Intelligence

    See Affective Computing

  • Artificial General Intelligence

    Artificial General Intelligence (AGI) is AI that is designed to work with people to help solve currently intractable multidisciplinary problems, including global challenges such as climate change and more personalised healthcare and education. Modern AI systems work well for the specific problem on which they’ve been trained, but getting AI systems to help address some of the hardest problems facing the world today is argued to require generalisation and deep mastery of multiple AI technologies.

  • Artificial Intelligence (AI)

    Artificial intelligence (AI) applies advanced analysis and logic-based techniques, including machine learning, to interpret events, support and automate decisions, and take actions.

  • Augmented Analytics

    Augmented analytics is the use of enabling technologies such as machine learning and AI to assist with data preparation, insight generation and insight explanation to augment how people explore and analyze data in analytics and BI platforms. It also augments the expert and citizen data scientists by automating many aspects of data science, machine learning, and AI model development, management and deployment.

  • Augmented Data Management

    Augmented data management leverages ML capabilities and AI engines to make enterprise information management categories including data quality, metadata management, master data management, data integration as well as database management systems (DBMSs) self-configuring and self-tuning.

  • Augmented Intelligence

    Augmented intelligence is a human-centered partnership model of people and artificial intelligence (AI) working together to enhance cognitive performance, including learning, decision making, and new experiences.

  • Augmented Intelligence Scenarios

    Scenarios in which AI works together with humans to enhance cognitive performance (see Augmented Intelligence).

  • Augmented Reality (AR)

    Augmented reality (AR) is the real-time use of information in the form of text, graphics, audio and other virtual enhancements integrated with real-world objects.

  • AutoML

    Automated machine learning (AutoML) is the capability of automating the process of building, deploying, and managing machine learning models.
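
    One common building block of AutoML is automated hyperparameter search and model selection. A toy sketch using scikit-learn's grid search (the dataset and parameter grid are arbitrary assumptions, not a prescribed AutoML recipe):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import GridSearchCV

        X, y = make_classification(n_samples=300, n_features=10, random_state=0)

        # Automate one slice of model building: try each configuration with
        # cross-validation and keep the best-scoring model.
        search = GridSearchCV(
            RandomForestClassifier(random_state=0),
            param_grid={"n_estimators": [50, 100], "max_depth": [3, 5, None]},
            cv=5,
        )
        search.fit(X, y)
        print(search.best_params_, search.best_score_)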

  • Automated Business Processes Discovery (ABPD)

    Automated business process discovery (ABPD) is a complementary approach that overcomes many of the shortcomings of traditional, interview-based process discovery, creating a business process model at a fraction of the time and cost. One major benefit of ABPD is process discovery across the “white space,” the process knowledge gap that exists between departments and functions and at the edges of processes. Modeling can be done by hand, so there may be no cash-flow impact, unlike ABPD, where a tool is necessary.

    By observing the details of how processes are executed in the supporting technology, ABPD uses unstructured, event-level data to automatically build process definitions and models, and to explore process variations. Because of the quantitative nature of this data, the derived process models enable rich and interactive analysis. ABPD techniques start from event logs (audit trails, messages, transactions, databases and so forth) and try to discover patterns that fit a process model to the information obtained from the events. The underlying techniques are strong enough that users don’t have to specify a process model; it is created from scratch by pattern recognition. Moreover, ABPD delivers information on bottlenecks, variances, root causes and the distribution of process instances, thus enabling meaningful simulation. It is all about capturing what has happened in a real-life process; in this sense, ABPD is a form of process optimization.

    ABPD does not, of course, capture informal human tasks that are not automated and represented as events in an audit trail or transaction log. This is why ABPD is best combined with techniques from social BPM and BPA for the masses. However, ABPD accelerates process discovery and identifies previously unseen process patterns. Planners must still conduct interviews to capture informal work practices, but organizations no longer need to spend as much time using interviews to discover the implicit processes previously hidden in automated solutions.

  • Autonomous Driving

    Autonomous driving normally refers to self-driving vehicles or transport systems that move without the intervention of a human driver.

  • Autonomous Driving Level 4

    Level 4 vehicles can operate in self-driving mode. But until legislation and infrastructure evolve, they can only do so within a limited area (usually an urban environment where top speeds average 30 mph). This is known as geofencing. As such, most Level 4 vehicles in existence are geared toward ridesharing. Level 4 vehicles can intervene if things go wrong or there is a system failure. In this sense, these cars do not require human interaction in most circumstances. However, a human still has the option to manually override.

  • Autonomous Driving Level 5

    This level of autonomous driving requires absolutely no human attention. There is no need for pedals, brakes, or a steering wheel, as the autonomous vehicle system controls all critical tasks, monitoring of the environment and identification of unique driving conditions like traffic jams.

  • Autonomous Robot

    An autonomous robot is a robot that performs behaviors or tasks with a high degree of autonomy (without external influence). Autonomous robotics is usually considered to be a subfield of artificial intelligence, robotics, and information engineering.

  • Autonomous Vehicles

    An autonomous vehicle is one that can drive itself from a starting point to a predetermined destination in “autopilot” mode using various in-vehicle technologies and sensors, including adaptive cruise control, active steering (steer by wire), anti-lock braking systems (brake by wire), GPS navigation technology, lasers and radar.

  • Battery recycling

    Battery recycling is a recycling activity that aims to reduce the number of batteries being disposed of as municipal solid waste. Batteries contain a number of heavy metals and toxic chemicals, and disposing of them by the same process as regular trash has raised concerns over soil contamination and water pollution.

  • Bifacial Solar Panel

    A bifacial solar panel is a double-sided energy factory that transforms sunlight into electrical energy on both its top and bottom sides. Bifacial panels differ from monofacial solar panels, which use only one side for solar energy production.

  • Biochips

    Biochips relate to a number of technologies that involve the merging of semiconductor and biological sciences. The most common form is based on an array of molecular sensors arranged on a small surface, typically referred to as a “lab-on-chip.” The underlying mechanism utilizes microfluidic micro-electromechanical systems (MEMS) technology. These devices are used to analyze biological elements such as DNA, ribonucleic acid and proteins, in addition to certain chemicals.

  • Blockchain

    Blockchain is a type of distributed ledger for maintaining a permanent and tamper-proof record of transactional data. A blockchain functions as a decentralized database that is managed by computers belonging to a peer-to-peer (P2P) network. Each of the computers in the distributed network maintains a copy of the ledger to prevent a single point of failure (SPOF) and all copies are updated and validated simultaneously.
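
    A minimal sketch of the tamper-evident chaining at the heart of this (the network, consensus and P2P replication are deliberately left out; block contents are invented):

        import hashlib
        import json

        def block_hash(block):
            # Hash the block's full contents, including the previous block's hash.
            return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

        chain = [{"index": 0, "prev_hash": "0" * 64, "data": "genesis"}]

        def append_block(data):
            prev = chain[-1]
            chain.append({"index": prev["index"] + 1,
                          "prev_hash": block_hash(prev),
                          "data": data})

        append_block("Alice pays Bob 5")
        append_block("Bob pays Carol 2")

        # Tampering with an earlier block breaks every later prev_hash link.
        chain[1]["data"] = "Alice pays Bob 500"
        print(block_hash(chain[1]) == chain[2]["prev_hash"])  # False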

  • Blockchain Business Models

    A blockchain business model is decentralized, allows peer-to-peer transactions, and helps create a trusted network. Blockchain-based business models use tokens to make profits: anyone who helps the blockchain reach consensus receives a token, which carries some value that depends on market conditions.

  • Blockchain Society

    A Blockchain Society is a network that aims to develop the blockchain ecosystem in a given country.

  • Blockchain for Lead Generation

    The technology offers sales organizations the ability to provide users with the most current and highest-quality leads via the exchange of personal and/or business information based on crowdsourcing data collection methods.

  • Blue Collar Software

    It is software targeted at workers, often blue-collar, who do their jobs outside corporate offices. For example, software for billing, scheduling, navigation and many other business workflows.

  • Blue Hydrogen

    Blue hydrogen is hydrogen that meets the low-carbon threshold but is generated from non-renewable sources, typically natural gas with the resulting CO2 emissions captured and stored.

  • Brain Augmentation

    In addition to treating seizures, brain implants are being used for extraordinary things, such as memory storage, decoding neural patterns, and synthesizing speech. The ability for anything to alter the human brain in such a way was once unheard of, so this is a major turning point in the technology world.

  • Business Intelligence

    Business intelligence (BI) comprises the strategies and technologies used by enterprises for the data analysis of business information. BI technologies provide historical, current, and predictive views of business operations. Common functions of business intelligence technologies include reporting, online analytical processing, analytics, data mining, process mining, complex event processing, business performance management, benchmarking, text mining, predictive analytics, and prescriptive analytics. BI technologies can handle large amounts of structured and sometimes unstructured data to help identify, develop, and otherwise create new strategic business opportunities.

  • Business Intelligence (BI) Platforms

    Business intelligence (BI) platforms enable enterprises to build BI applications by providing capabilities in three categories: analysis, such as online analytical processing (OLAP); information delivery, such as reports and dashboards; and platform integration, such as BI metadata management and a development environment.

  • Business Intelligence (BI) Services

    Business intelligence (BI) services are offerings to design, develop and deploy enterprise processes and to integrate, support and manage the related technology applications and platforms. These include business and infrastructure applications for BI platforms, analytics needs and data warehousing infrastructure. Solutions include areas such as corporate performance management (CPM) and analytics, in addition to the traditional BI platform, data warehouse/data infrastructure and data quality areas.

  • Chatbots

    A chatbot is a domain-specific conversational interface that uses an app, messaging platform, social network or chat solution for its conversations. Chatbots vary in sophistication, from simple, decision-tree-based marketing stunts, to implementations built on feature-rich platforms. They are always narrow in scope. A chatbot can be text- or voice-based, or a combination of both.
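
    At the simple, decision-tree end of that spectrum, a chatbot can be little more than a lookup table over dialogue states. A sketch (the dialogue content is invented):

        # Each node holds a prompt and the next node for each recognised answer.
        tree = {
            "start":    {"prompt": "Do you need help with billing or shipping?",
                         "next": {"billing": "billing", "shipping": "shipping"}},
            "billing":  {"prompt": "Is this about an invoice or a refund?",
                         "next": {"invoice": "end", "refund": "end"}},
            "shipping": {"prompt": "Is your order delayed or damaged?",
                         "next": {"delayed": "end", "damaged": "end"}},
            "end":      {"prompt": "Thanks, routing you to an agent.", "next": {}},
        }

        node = "start"
        while tree[node]["next"]:
            answer = input(tree[node]["prompt"] + " ").strip().lower()
            # Unrecognised input keeps the bot at the same node; scope stays narrow.
            node = tree[node]["next"].get(answer, node)
        print(tree[node]["prompt"])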

  • Cloud Computing

    Cloud computing is a general term for anything that involves delivering hosted services over the internet. These services are divided into three main categories: infrastructure as a service (IaaS), platform as a service (PaaS) and software as a service (SaaS).

  • Cloud Federation

    See Federated Cloud

  • Cloud Object Storage

    Cloud object storage is a format for storing unstructured data in the cloud. Object storage is considered a good fit for the cloud because it is elastic, flexible and can more easily scale into multiple petabytes to support unlimited data growth. The architecture stores and manages data as objects, in contrast to block storage, which handles data as blocks and logical volumes, and file storage, which stores data in hierarchical files.

  • Cloud Office Migration

    Cloud office migration refers to the process of consolidating and transferring a collection of workloads. Workloads can include emails, files, calendars, document types, related metadata, instant messages, applications, user permissions, compound structure and linked components. Migration of such workloads from at least one permanent on-premises platform or cloud office to a new cloud office environment, across a variety of similar product classes, is typical. During the migration process, enterprises will choose to cleanse their existing platforms by archiving old and outdated data. Migration tools are able to conduct an analysis of the respective workloads, identify if the data is deemed suitable for migration while maintaining its integrity, migrate the data between the source and target, and ensure governance in the subsequent platform.

  • Cloud operations (CloudOps)

    Cloud operations (CloudOps) is the management, delivery and consumption of software in a computing environment where there is limited visibility into an app’s underlying infrastructure. In the enterprise, CloudOps uses the DevOps principles of continuous integration and continuous deployment (CI/CD) to create best practices for achieving high availability by refining and optimizing business processes that run in a public cloud.

  • Cobots

    Cobots, or collaborative robots, are robots intended to interact with humans in a shared space or to work safely in close proximity.

  • Cognitive Agents

    Cognitive agents are a kind of agent built on the Belief-Desire-Intention (BDI) architecture, which has recently become popular due to its human-like reasoning. Cognitive agents are normally static agents that require greater computation and larger databases. They support autonomic computing.

  • Cognitive Computing

    Cognitive computing is the use of computerized models to simulate the human thought process in complex situations where the answers may be ambiguous and uncertain. The phrase is closely associated with IBM’s cognitive computer system, Watson. Cognitive computing overlaps with AI and involves many of the same underlying technologies to power cognitive applications, including expert systems, neural networks, robotics and virtual reality (VR).

  • Cognitive augmentation

    Cognitive augmentation can occur through accessing information and exploiting applications on traditional computer systems and the emerging multiexperience interface in smart spaces. This includes augmented intelligence scenarios, where humans and artificial intelligence work together to enhance cognitive performance, including decision making and learning. Cognitive augmentation improves the human thought and decision-making process, and some of its technological elements overlap with the brain augmentation category, as they are physical implants that deal with cognitive reasoning.

  • Column-oriented DBMS

    A column-oriented DBMS is a database management system (DBMS) that stores data tables by column.
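
    A sketch of why the layout matters (the table is a made-up example): an aggregate over one attribute touches a single contiguous array instead of striding through every full record.

        # Row-oriented: each record is stored together.
        rows = [("alice", 30, "NL"), ("bob", 25, "DE"), ("carol", 35, "FR")]

        # Column-oriented: each attribute is stored contiguously.
        columns = {
            "name":    ["alice", "bob", "carol"],
            "age":     [30, 25, 35],
            "country": ["NL", "DE", "FR"],
        }

        # A scan over a single column reads only that column's data.
        avg_age = sum(columns["age"]) / len(columns["age"])
        print(avg_age)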

  • Computer Vision

    Computer vision is a field of computer science that works on enabling computers to see, identify and process images in the same way that human vision does, and then provide appropriate output. It is like imparting human intelligence and instincts to a computer. In reality though, it is a difficult task to enable computers to recognize images of different objects. Computer vision is closely linked with artificial intelligence, as the computer must interpret what it sees, and then perform appropriate analysis or act accordingly.

  • Computer-Brain Interface

    Computer-brain interface is a type of user interface, whereby the user voluntarily generates distinct brain patterns that are interpreted by the computer as commands to control an application or device. The best results are achieved by implanting electrodes into the brain to pick up signals. Noninvasive techniques are available commercially that use a cap or helmet to detect the signals through external electrodes.

  • Connected Home

    A connected home is networked to enable the interconnection and interoperability of multiple devices, services and apps, ranging from communications and entertainment to healthcare, security and home automation. These services and apps are delivered over multiple interlinked and integrated devices, sensors, tools and platforms. Connected, real-time, smart and contextual experiences are provided for the household inhabitants, and individuals are enabled to control and monitor the home remotely as well as within it.

  • Content Migration

    Content Migration is the process of moving information stored on a Web content management system (CMS), Digital asset management (DAM), Document management system (DMS), or flat HTML based system to a new system. Flat HTML content can entail HTML files, Active Server Pages (ASP), JavaServer Pages (JSP), PHP, or content stored in some type of HTML/JavaScript based system and can be either static or dynamic content.

  • Continuous Integration Ops (CIOps)

    Continuous Integration Ops (CIOps) requires CI operators or administrators to configure the IT infrastructure required to support new code before deployment continues. The CI system is designed to run builds and tests, then deploy at varying levels of sophistication according to the complexity of the pipeline.

  • Continuous Intelligence

    Continuous intelligence is a design pattern in which real-time analytics are integrated into business operations, processing current and historical data to prescribe actions in response to business moments and other events.

  • Controlled-environment agriculture (CEA)

    Controlled-environment agriculture (CEA) is a technology-based approach toward food production. The aim of CEA is to provide protection and maintain optimal growing conditions throughout the development of the crop. Production takes place within an enclosed growing structure such as a greenhouse or building. Plants are often grown using hydroponic methods in order to supply the proper amounts of water and nutrients to the root zone. CEA optimizes the use of resources such as water, energy, space, capital and labor. CEA technologies include hydroponics, aeroponics, aquaculture, and aquaponics.[1] Different techniques are available for growing food in controlled environment agriculture. The more viable option is vertical farming. Vertical farming has the ability to produce crops all year round in a controlled environment, with the possibility of increased yield by adjusting the amount of carbon and nutrients the plants receive

  • Conversational Marketing

    Conversational marketing technologies enable interactions between companies and customers that mimic human dialogue and do so at scale.

  • Conversational User Interfaces

    A conversational user interface is an interface that enables artificial-intelligence-supported chatbots to have verbal and written interactions with human users.

  • Corporate Cloud

    See Private Cloud

  • Cryptocurrencies

    A cryptocurrency is a digital medium of exchange that uses encryption to secure the processes involved in generating units and conducting transactions.

  • Cryptocurrency Custody Services (Solutions)

    Cryptocurrency custody solutions are independent storage and security systems used to hold large quantities of tokens. Custody solutions are one of the latest innovations to come out of the cryptocurrency ecosystem and have been expected to herald the entry of institutional capital into the industry.

  • Data Catalog

    A data catalog is a collection of metadata, combined with data management and search tools, that helps analysts and other data users find the data they need, serves as an inventory of available data, and provides information to evaluate the fitness of data for intended uses.

    Viewed as a tool, a data catalog is a metadata management tool designed to help organizations find and manage large amounts of data – including tables, files and databases – stored in their ERP, human resources, finance and e-commerce systems as well as other sources like social media feeds.

  • Data Classification

    Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.

  • Data Fabric

    Data Fabric is an architecture and set of data services that provide consistent capabilities across a choice of endpoints spanning on-premises and multiple cloud environments.

  • Data Hub

    A data hub is a collection of data from multiple sources organized for distribution and sharing, often with subsetting. Generally this data distribution takes the form of a hub-and-spoke architecture.

  • Data Integration

    The discipline of data integration comprises the practices, architectural techniques and tools for achieving the consistent access and delivery of data across the spectrum of data subject areas and data structure types in the enterprise to meet the data consumption requirements of all applications and business processes.

  • Data Integration Tools

    The market for data integration tools includes vendors that offer software products to enable the construction and implementation of data access and data delivery infrastructure for a variety of data integration scenarios. For vendors, the demand for traditional data integration capabilities alongside the demand for innovative solutions requires robust, consistent delivery of highly developed solutions. Similarly, data integration tools interoperate and integrate with master data tools, data governance tools and data quality tools.

  • Data Labeling

    Data labeling, in the context of machine learning, is the process of detecting and tagging data samples. The process can be manual but is usually performed or assisted by software.

  • Data Lakes

    A data lake is a concept consisting of a collection of storage instances of various data assets. These assets are stored in a near-exact, or even exact, copy of the source format and are in addition to the originating data stores.

  • Data Literacy

    Data literacy is the ability to read, write and communicate data in context, including an understanding of data sources and constructs, analytical methods and techniques applied, and the ability to describe the use-case application and resulting value.

  • Data Preparation

    Data preparation is an iterative and agile process for exploring, combining, cleaning and transforming raw data into curated datasets for self-service data integration, data science, data discovery, and BI/analytics.

  • Data Quality

    Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it’s up to date. Measuring data quality levels can help organizations identify data errors that need to be resolved and assess whether the data in their IT systems is fit to serve its intended purpose.
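
    A sketch of measuring two of those factors, completeness and validity of format, over a small record set (records and rules are illustrative assumptions):

        import re

        records = [
            {"id": 1, "email": "a@example.com", "age": 31},
            {"id": 2, "email": None,            "age": 27},
            {"id": 3, "email": "not-an-email",  "age": -4},
        ]

        # Completeness: share of records with no missing fields.
        complete = sum(all(v is not None for v in r.values()) for r in records)

        # Validity: share of records passing simple format/range rules.
        valid = sum(
            r["email"] is not None
            and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r["email"]) is not None
            and 0 <= r["age"] <= 130
            for r in records
        )

        print(f"completeness: {complete / len(records):.0%}, "
              f"validity: {valid / len(records):.0%}")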

  • Data Quality Tools

    Data quality tools are the processes and technologies for identifying, understanding and correcting flaws in data that support effective information governance across operational business processes and decision making. The packaged tools available include a range of critical functions, such as profiling, parsing, standardization, cleansing, matching, enrichment and monitoring.

  • Data Storytelling

    Data storytelling is the ability to combine interactive data visualization with narrative techniques in order to package and deliver insights in a compelling, easily understood form for presentation to decision makers.

  • Data Virtualization

    Data virtualization is an umbrella term used to describe an approach to data management that allows an application to retrieve and manipulate data without requiring technical details about the data, such as how the data is formatted or where it is physically located. The goal of data virtualization is to create a single representation of data from multiple, disparate sources without having to copy or move the data.

  • Data and Analytics Services

    Data and analytics services are consulting, implementation and managed services for decision, analytics and information capabilities, executed on a technology platform that supports an organization’s fact-based decision making. Solutions contain information analysis and information management (IM) competencies, with commercial off-the-shelf and proprietary assets. Information analysis solutions focus on use cases and outcomes; IM solutions focus on information infrastructure and governance.

  • Data for Good

    Data for good is a movement in which people and organizations transcend organizational boundaries to use data to improve society.

  • DataOps

    DataOps is a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and data consumers across an organization. The goal of DataOps is to deliver value faster by creating predictable delivery and change management of data, data models and related artifacts. DataOps uses technology to automate the design, deployment and management of data delivery with appropriate levels of governance, and it uses metadata to improve the usability and value of data in a dynamic environment.

  • Database Encryption

    Database encryption tools are used to protect data within relational database management systems (RDBMSs). Encryption can be implemented using native DBMS tools, third-party software tools and network-based appliances, or implemented within storage networks via fabric-based encryption. Database encryption is increasingly being implemented as a regulatory check box by clients that enable built-in encryption. There are two operational scenarios: encrypting the entire database or encrypting specific columns or fields within the RDBMS.
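
    A sketch of the "encrypt specific columns" scenario, done at the application layer with the third-party cryptography package rather than a native DBMS feature (table and field names are invented):

        import sqlite3
        from cryptography.fernet import Fernet  # pip install cryptography

        key = Fernet.generate_key()   # in practice, held in a key management system
        f = Fernet(key)

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE patients (id INTEGER, name TEXT, ssn BLOB)")

        # Encrypt only the sensitive column before it reaches the RDBMS.
        db.execute("INSERT INTO patients VALUES (?, ?, ?)",
                   (1, "Alice", f.encrypt(b"123-45-6789")))

        ciphertext = db.execute("SELECT ssn FROM patients").fetchone()[0]
        print(f.decrypt(ciphertext))  # b'123-45-6789'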

  • DBMS (Database Management System)

    A DBMS is a product used for the storage and organization of data that typically has defined formats and structures. DBMSs are categorized by their basic structures and, to some extent, by their use or deployment.

  • Decentralized Applications (dApps)

    A decentralized application is a computer application that runs on a distributed computing system. DApps have been popularized by distributed ledger technologies (DLT) such as the Ethereum Blockchain, where DApps are often referred to as smart contracts.

  • Decentralized Autonomous Corporation (DAC)

    See Decentralized Autonomous Organization (DAO)

  • Decentralized Autonomous Organization (DAO)

    A decentralized autonomous organization (DAO) is an organization represented by rules encoded as a computer program that is transparent, controlled by shareholders and not influenced by a central government. A DAO’s financial transaction record and program rules are maintained on a blockchain. The precise legal status of this type of business organization is unclear.

  • Decentralized Web

    See Web 3.0.

  • Decentralized finance (commonly referred to as DeFi)

    Decentralized finance (commonly referred to as DeFi) is a blockchain-based form of finance that does not rely on central financial intermediaries such as brokerages, exchanges, or banks to offer traditional financial instruments, and instead utilizes smart contracts on blockchains, the most common being Ethereum.

  • Decision Intelligence

    Decision intelligence is a practical domain framing a wide range of decision-making techniques bringing multiple traditional and advanced disciplines together to design, model, align, execute, monitor and tune decision models and processes. Those disciplines include decision management (including advanced nondeterministic techniques such as agent-based systems) and decision support as well as techniques such as descriptive, diagnostics and predictive analytics.

  • Decision Management

    The discipline of designing, executing and monitoring decision models and processes, including advanced nondeterministic techniques such as agent-based systems (see Decision Intelligence).

  • Deep Neural Network ASIC

    A Deep Neural Network (DNN) application-specific integrated circuit (ASIC) is a purpose-specific processor that accelerates DNN computations.

  • Deep Neural Networks

    A deep neural network is a neural network with a certain level of complexity, typically one with more than two layers. Deep neural networks use sophisticated mathematical modeling to process data in complex ways.
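
    A minimal sketch of "more than two layers": a forward pass through three fully connected layers with nonlinear activations (shapes and weights are arbitrary, and training is omitted):

        import numpy as np

        rng = np.random.default_rng(0)

        def relu(x):
            return np.maximum(0, x)

        # Three weight layers (4 -> 8 -> 8 -> 1).
        W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
        W2, b2 = rng.normal(size=(8, 8)), np.zeros(8)
        W3, b3 = rng.normal(size=(8, 1)), np.zeros(1)

        def forward(x):
            h1 = relu(x @ W1 + b1)   # each layer composes a nonlinear transformation
            h2 = relu(h1 @ W2 + b2)
            return h2 @ W3 + b3

        x = rng.normal(size=(5, 4))  # a batch of 5 inputs
        print(forward(x).shape)      # (5, 1)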

  • Deepfake

    Deep fake (also spelled deepfake) is a type of artificial intelligence used to create convincing image, audio and video hoaxes.

  • Democratization

    Democratization of technology means providing people with easy access to technical or business expertise without extensive (and costly) training. It focuses on four key areas — application development, data and analytics, design and knowledge — and is often referred to as “citizen access,” which has led to the rise of citizen data scientists, citizen programmers and more.

  • DevOps

    The word DevOps is a combination of the terms development and operations, meant to represent a collaborative or shared approach to the tasks performed by a company’s application development and IT operations teams. In its broadest meaning, DevOps is a philosophy that promotes better communication and collaboration between these teams — and others — in an organization. In its most narrow interpretation, DevOps describes the adoption of iterative software development, automation and programmable infrastructure deployment and maintenance. The term also covers culture changes, such as building trust and cohesion between developers and systems administrators and aligning technological projects to business requirements. DevOps can change the software delivery chain, services, job roles, IT tools and best practices.

  • DevSecOps

    DevSecOps (development plus security plus operations) is a management approach that combines application development, security, operations and infrastructure as code (IaC) in an automated, continuous delivery cycle. The main objective of DevSecOps is to automate, monitor and apply security at all phases of the software lifecycle, i.e., plan, develop, build, test, release, deliver, deploy, operate and monitor. Applying security at every stage of the software development process enables continuous integration, reducing the cost of compliance and delivering software more rapidly. DevSecOps means that every employee and team is responsible for security from the outset, and they must make decisions efficiently and put them into action without forfeiting security.

  • Digital Assistant

    See virtual assistant

  • Digital Biomarkers

    Digital biomarkers are defined as objective, quantifiable physiological and behavioral data that are collected and measured by means of digital devices such as portables, wearables, implantables or digestibles. The data collected is typically used to explain, influence and/or predict health-related outcomes. Digital biomarkers also represent an opportunity to capture clinically meaningful, objective data.

  • Digital Ethics

    Digital ethics are the norms dedicated to ensuring that the autonomy and dignity of users are respected on the internet. Digital ethics prescribe how two individuals communicating online should behave, how two corporations should responsibly conduct internet commerce, and how companies should treat their users.

  • Digital Human

    Digital humans are AI-powered lifelike beings that can recreate the best parts of human interaction—conversation, communication and emotional connection. They can work in virtually any industry and across a huge number of use cases.

  • Digital Ops

    DigitalOps is a business process framework designed to simplify, measure, and manage processes across the enterprise. DigitalOps includes RPA, business process management (BPM), workflow engines, decision management suites, process mining, low-code application platforms (LCAPs), and others.

  • Digital Therapeutics (DTx)

    Digital therapeutics (DTx) deliver evidence-based therapeutic interventions to patients that are driven by high quality software programs to prevent, manage, or treat a broad spectrum of physical, mental, and behavioral conditions. Digital therapeutics form an independent category of evidence-based products within the broader digital health landscape, and are distinct from pure-play adherence, diagnostic, and telehealth products.

  • Digital Transformation

    Digital transformation can refer to anything from IT modernization (for example, cloud computing), to digital optimization, to the invention of new digital business models. The term is widely used in public-sector organizations to refer to modest initiatives such as putting services online or legacy modernization.

  • Digital Twin

    A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.

  • Digital Water / Smart Water / Internet of Water / Water 4.0

    May be defined as a group of emerging technology solutions that help water managers operate more effectively. These solutions harness state-of-the-art hardware and software to provide increasing levels of system intelligence, visibility, automation and control, and customer service.

  • Digital Workplace

    The digital workplace is the concept that businesses should use digital transformation (DX) to align technology, employees and business processes to improve operational efficiency and meet organizational goals. Meant to be the virtual complement to the physical office space, the digital workplace is a broad term that includes any device, software or platform that staff members use to execute their jobs.

  • Digital cloning

    Digital cloning is an emerging technology, involving deep-learning algorithms, that allows one to manipulate currently existing audio, photos, and videos to produce hyper-realistic results.

  • Distributed Denial of Service (DDoS) Attack

    A type of cyber-attack in which the perpetrator continuously overwhelms the system with requests in order to prevent service of legitimate requests.

  • Distributed Ledger

    A type of database which spreads across multiple sites, countries, or institutions. Records are stored sequentially in a continuous ledger. Distributed ledger data can be either “permissioned” or “unpermissioned” to control who can view it.

  • Distributed Ledger Technology (DLT)

    Distributed ledger technology (DLT) is a digital system for recording the transaction of assets in which the transactions and their details are recorded in multiple places at the same time. Unlike traditional databases, distributed ledgers have no central data store or administration functionality.

  • Distributed development

    Distributed development is a software development model in which IT teams spread across geographical lines collaborate on applications or various software. These teams often work on separate mini-projects that are brought together for a final software buildout.

  • Document Store DBMSs

    A document store database (document store) is a database that uses a document-oriented model to store data. Document store databases store each record and its associated data within a single document. Each document contains semi-structured data that can be queried against using various query and analytics tools of the DBMS.
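
    A sketch of the document model in plain Python (the documents and the query helper are invented; real document stores such as MongoDB provide this kind of query natively):

        # Each "document" carries its own, possibly differing, structure.
        docs = [
            {"_id": 1, "name": "alice", "address": {"city": "Utrecht"}, "tags": ["admin"]},
            {"_id": 2, "name": "bob",   "address": {"city": "Berlin"}},
            {"_id": 3, "name": "carol", "address": {"city": "Utrecht"}, "age": 35},
        ]

        def find(collection, path, value):
            """Match documents whose nested field (dot-separated path) equals value."""
            def get(doc, path):
                for part in path.split("."):
                    if not isinstance(doc, dict) or part not in doc:
                        return None
                    doc = doc[part]
                return doc
            return [d for d in collection if get(d, path) == value]

        print(find(docs, "address.city", "Utrecht"))  # documents 1 and 3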

  • Domotics

    See Home Automation

  • Drone Swarms

    Drone swarm technology is the ability of drones to autonomously make decisions based on shared information.

  • Edge AI

    Edge AI refers to the use of AI techniques embedded in IoT endpoints, gateways, and edge devices, in applications ranging from autonomous vehicles to streaming analytics.

  • Edge Analytics

    Edge analytics is an approach to data collection and analysis in which an automated analytical computation is performed on data at a sensor, network switch or other device instead of waiting for the data to be sent back to a centralized data store.

  • Edge Computing

    Edge computing is part of a distributed computing topology where information processing is located close to the edge, where things and people produce or consume that information.

  • Electronic Health Record (EHR)

    An electronic health record (EHR) is an individual’s official health document that is shared among multiple facilities and agencies. The role of EHRs is becoming increasingly influential as more patient information becomes digital and larger numbers of consumers express a desire to have mobile access to their health records.

  • Emissions Upcycling

    When we burn coal, oil, or natural gas, CO2 (carbon dioxide) is produced as a waste product. If the CO2 is disposed of in the atmosphere, it contributes to global warming, an extremely serious environmental problem. A better option is to capture the waste CO2 and then use it to make useful products, such as concrete-replacement materials. This is the meaning of ‘emissions upcycling’ (note the similarity to the more familiar term ‘recycling’). Current carbon-capture technology is capable of capturing 85-95% of the CO2 produced by a coal-fired power plant, although it comes at the cost of diverting 10-40% of the energy generated to the CO2 capture and storage process itself.

  • Emotion AI

    See Affective Computing

  • Enterprise automation

    Enterprise automation is the systematic review of an organization’s processes to ensure optimal performance while looking for manual processes that can be replaced with intelligent digital alternatives.

  • Event Stream Processing

    Event stream processing, or ESP, is a set of technologies designed to assist the construction of event-driven information systems. ESP technologies include event visualization, event databases, event-driven middleware, and event processing languages, or complex event processing (CEP).
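
    A sketch of one ESP primitive, a continuous query over a sliding window of an unbounded stream (the event source is simulated and the threshold is arbitrary):

        import random
        from collections import deque

        def sensor_events():
            # Simulated unbounded stream of temperature readings.
            while True:
                yield {"sensor": "t1", "temp": random.gauss(20, 3)}

        window = deque(maxlen=10)  # sliding window of the 10 most recent readings

        for i, event in enumerate(sensor_events()):
            window.append(event["temp"])
            avg = sum(window) / len(window)
            if avg > 22:  # a simple continuous query over the window
                print(f"event {i}: rolling average {avg:.1f} exceeds threshold")
            if i >= 99:   # stop the demo after 100 events
                break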

  • Explainable AI

    Explainable AI (XAI) is artificial intelligence that is programmed to describe its purpose, rationale and decision-making process in a way that can be understood by the average person. XAI is often discussed in relation to deep learning and plays an important role in the FAT ML model (fairness, accountability and transparency in machine learning).

  • Extended Reality (XR)

    Extended reality (XR) is a term referring to all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables, where the ‘X’ represents a variable for any current or future spatial computing technologies.

  • Federated Cloud

    A federated cloud is the deployment and management of multiple external and internal cloud computing services to match business needs. A federation is the union of several smaller parts that perform a common action.

  • Field-Programmable Gate Array (FPGA) Accelerators

    A Field-Programmable Gate Array (FPGA) accelerator is a server-based reconfigurable computing accelerator that delivers extremely high performance by enabling programmable hardware-level application acceleration.

  • File Analysis Software

    File analysis (FA) products analyze, index, search, track and report on file metadata and file content, enabling organizations to take action on files according to what was identified. FA provides detailed metadata and contextual information to enable better information governance and organizational efficiency for unstructured data management. FA is an emerging solution, made of disparate technologies, that assists organizations in understanding the ever-growing volume of unstructured data, including file shares, email databases, enterprise file sync and share, records management, enterprise content management, Microsoft SharePoint and data archives.

  • Flying Autonomous Vehicles

    Flying autonomous vehicles are flying vehicles (planes, drones, etc.) that can fly themselves from a starting point to a predetermined destination in “autopilot” mode using various in-vehicle technologies and sensors.

  • Fork

    A fork creates an alternative version of a blockchain, and forks are often enacted intentionally to apply upgrades to a network. Soft Forks render two chains with some compatibility, while Hard Forks create a new version of the chain that must be adopted to continue participation. In the instance of a contentious Hard Fork, this can create two versions of a blockchain network.

  • General-Purpose Machine Intelligence

    See Artificial General Intelligence

  • Generative Adversarial Networks (GANs)

    A generative adversarial network (GAN) is a machine learning (ML) model in which two neural networks compete with each other to become more accurate in their predictions. GANs typically run unsupervised and use a zero-sum game framework to learn: the generator's gain is the discriminator's loss.
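
    A minimal sketch of the two competing networks in PyTorch, with a generator learning to mimic a 1-D Gaussian (architecture and hyperparameters are arbitrary choices):

        import torch
        import torch.nn as nn

        G = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
        D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
        opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
        opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
        bce = nn.BCELoss()

        for step in range(2000):
            real = torch.randn(64, 1) * 0.5 + 3.0   # samples from the "real" distribution
            fake = G(torch.randn(64, 1))            # generator's forgeries

            # Discriminator: learn to tell real from fake.
            opt_d.zero_grad()
            loss_d = (bce(D(real), torch.ones(64, 1)) +
                      bce(D(fake.detach()), torch.zeros(64, 1)))
            loss_d.backward()
            opt_d.step()

            # Generator: learn to fool the discriminator.
            opt_g.zero_grad()
            loss_g = bce(D(fake), torch.ones(64, 1))
            loss_g.backward()
            opt_g.step()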

  • Genetic Augmentation

    With genetic augmentation, somatic gene and cell therapies are used to treat children with severe combined immune deficiency. CRISPR technologies may soon become more easily accessible and affordable, which will enable broad genetic engineering.

  • Geographic Information System (GIS)

    A GIS (geographic information system) enables you to envision the geographic aspects of a body of data. Basically, it lets you query or analyze a database and receive the results in the form of some kind of map. Since many kinds of data have important geographic aspects, a GIS can have many uses: weather forecasting, sales analysis, population forecasting, and land use planning, to name a few.

  • GitOps

    GitOps is a paradigm designed around Kubernetes that allows developers and IT operations teams to use Git for cluster management and application delivery. Applying a Git workflow to operations and development teams allows developers to use Git pull requests to manage infrastructure provisioning and software deployment. Git is an open source distributed code management and version control system, used to track changes in source code throughout software development as well as for configuration and content management. The part of Git most relevant to GitOps is the pull request: a pull request lets users tell others what changes they’ve pushed to a new branch in a repository through GitHub. When a user opens a pull request, they can review, discuss or follow up on commits before any changes are merged into a base branch.

  • Graph Analytics

    Graph analytics is a category of tools used to apply algorithms that will help the analyst understand the relationship between graph database entries.

  • Graph DBMSs

    Graph DBMSs represent data in graph structures as nodes and edges, where edges are relationships between nodes. They allow easy processing of data in that form and simple calculation of specific properties of the graph, such as the number of steps needed to get from one node to another. Graph DBMSs usually don’t provide indexes on all nodes, so direct access to nodes based on attribute values is not possible in these cases.
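
    A sketch of that "number of steps" computation over nodes and edges, via breadth-first search on an adjacency list (the graph is invented):

        from collections import deque

        # Nodes and edges (relationships) as an adjacency list.
        edges = {
            "alice": ["bob", "carol"],
            "bob":   ["dave"],
            "carol": ["dave"],
            "dave":  ["erin"],
            "erin":  [],
        }

        def steps_between(start, goal):
            """Fewest edges to traverse from start to goal (breadth-first search)."""
            queue, seen = deque([(start, 0)]), {start}
            while queue:
                node, dist = queue.popleft()
                if node == goal:
                    return dist
                for nxt in edges.get(node, []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, dist + 1))
            return None  # unreachable

        print(steps_between("alice", "erin"))  # 3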

  • Graph Database

    See Graph DBMSs

  • Graph-Oriented DBMS

    See Graph DBMSs

  • Graphics-processing units (GPU) Accelerators

    GPU-accelerated computing is the use of a graphics processing unit (GPU) to accelerate highly parallel, compute-intensive portions of workloads in conjunction with a CPU.

  • Haptic Technology / Kinesthetic Communication / 3D Touch

    Haptic technology, also known as kinaesthetic communication or 3D touch,[1] refers to any technology that can create an experience of touch by applying forces, vibrations, or motions to the user.[2] These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics).

  • Hardware Accelerator

    A silicon chip designed to perform the highly parallel operations required by AI; it also enables simultaneous computations.

  • Head-Mounted Display (HMD)

    Head-mounted displays (HMDs) are small displays or projection technology integrated into eyeglasses or mounted on a helmet or hat. Heads-up displays are a type of HMD that does not block the user’s vision, but superimposes the image on the user’s view of the real world. An emerging form of heads-up display is a retinal display that “paints” a picture directly on the sensitive part of the user’s retina. Although the image appears to be on a screen at the user’s ideal viewing distance, there is no actual screen in front of the user, just special optics (for example, modified eyeglasses) that reflect the image back into the eye. Other heads-up displays that are not worn by the user but are projected on a surface (for example, on a car or plane windshield) are not covered in this discussion. Some HMDs incorporate motion sensors to determine direction and movement (for example, to provide context-sensitive geographic information) or as the interface to an immersive virtual reality application.

  • Holographic 3D Printing

    See 3D Volumetric Printing

  • Home Automation

    Home automation is building automation for a home, called a smart home or smart house. A home automation system will control lighting, climate, entertainment systems, and appliances. It may also include home security such as access control and alarm systems. When connected with the Internet, home devices are an important constituent of the Internet of Things (“IoT”).

  • Home Hub

    A home hub is a central device that connects and controls smart home-enabled devices. These can include appliances like refrigerators, washing machines, dryers, and toaster ovens, as well as heating and air conditioning units and lighting devices. Other examples of smart home-enabled electronic devices are audio and video entertainment systems, camera and security systems, and computers, laptops and other mobile devices.

  • Human Augmentation

    The field of human augmentation focuses on creating cognitive and physical improvements as an integral part of the human body. An example is using active control systems to create limb prosthetics with characteristics that can exceed the highest natural human performance.

  • Human-Machine Interface (HMI)

    Human-machine interface (HMI) is a component of certain devices that are capable of handling human-machine interactions. The interface consists of hardware and software that allow user inputs to be translated as signals for machines that, in turn, provide the required result to the user. Human-machine interface technology has been used in different industries like electronics, entertainment, military, medical, etc. Human-machine interfaces help in integrating humans into complex technological systems.

  • Hybrid Cloud

    Hybrid cloud is a cloud computing environment that uses a mix of on-premises, private cloud and third-party, public cloud services with orchestration between the two platforms. By allowing workloads to move between private and public clouds as computing needs and costs change, hybrid cloud gives businesses greater flexibility and more data deployment options.

  • Hybrid automation

    Hybrid automation combines attended and unattended RPA and enables the passing of work to robots automatically or by employee prompt, allowing the robot and employee to work on different tasks simultaneously.

  • Hyper-automation

    Hyper-automation “deals with the application of advanced technologies, including artificial intelligence (AI) and machine learning (ML), to increasingly automate processes and augment humans. Hyper-automation extends across a range of tools that can be automated, but also refers to the sophistication of the automation (i.e., discover, analyse, design, automate, measure, monitor, reassess.)” In simple terms, hyper-automation refers to the mixture of automation technologies that exist to augment and expand human capabilities.

  • Hyperautomation

    Hyperautomation is the combination of multiple machine learning, packaged software and automation tools to deliver work and it deals with the application of advanced technologies including AI and machine learning to increasingly automate processes and augment humans.

  • iBPMSs

    Intelligent business process management suites (iBPMS) combine business process management (BPM) software with additional capabilities such as artificial intelligence (AI) to help companies dynamically automate more types of start-to-finish experiences. These suites are often cloud-enabled and provide low-code tools that help citizen developers create workflow solutions very quickly.

  • iBPMSs Market

    The intelligent business process management suite (iBPMS) market is the natural evolution of the earlier BPMS market, adding more capabilities for greater intelligence within business processes. Capabilities such as validation (process simulation, including “what if”) and verification (logical compliance), optimization, and the ability to gain insight into process performance have been included in many BPMS offerings for several years. iBPMSs have added enhanced support for human collaboration such as integration with social media, mobile-enabled process tasks, streaming analytics and real-time decision management.

  • Immediate Payments

    See Instant Payments

  • Immersive Analytics

    Immersive Analytics investigates how new interaction and display technologies can be used to support analytical reasoning and decision making. The aim is to provide multi-sensory interfaces for analytics approaches that support collaboration and allow users to immerse themselves in their data. Immersive Analytics builds on technologies such as large touch surfaces, immersive virtual and augmented reality environments, haptic and audio displays, and modern fabrication techniques.

  • Immersive Workspaces

    An immersive workspace takes its name from the idea of an immersive experience: the perception of being in one place when you are actually in another, typically achieved with immersive technology and devices such as wearable 3D headsets. Immersive workspaces are a new kind of modern workplace emerging as a result of technological advancement. They enable people in different physical locations to come together in one place for collaboration and communication in a media-rich 3D environment. They allow screen sharing and co-working, so participants can collaborate on documents and make live edits in real time. Actions can be created during a meeting and notes can be taken, all of which are stored in the immersive workspace for review later on.

  • In-Memory Computing

    In-Memory Computing (IMC) is a computing style where applications assume all the data required for processing is located in the main memory of their computing environment.

  • In-Memory DBMS

    An IMDBMS is a DBMS that stores the entire database structure, including the applications, “in memory” and accesses all the data directly.

  • In-Memory Data Grid Software

    In-Memory Data Grid Software (IMDGs) provide a lightweight, distributed, scale-out in-memory object store — the data grid. Multiple applications can concurrently perform transactional and/or analytical operations in the low-latency data grid, thus minimizing access to high-latency, hard-disk-drive-based or solid-state-drive-based data storage. IMDGs maintain data grid durability across physical or virtual servers via replication, partitioning and on-disk persistence. Objects in the data grid are uniquely identified through a primary key, but can also be retrieved via other attributes. The most typical use of IMDGs is for web-scale transaction processing applications. However, adoption for analytics, often in combination with Apache Spark and Hadoop or stream analytics platforms, is growing fast — for example, for fraud detection, risk management, operation monitoring, dynamic pricing and real-time recommendation management.

  • In-database Analytics

    In-database analytics is a technology that allows data processing to be conducted within the database by building analytic logic into the database itself. Doing so eliminates the time and effort required to transform data and move it back and forth between a database and a separate analytics application.
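
    As a minimal sketch of the idea (using Python's built-in sqlite3 module; the table and column names are purely illustrative), the analytic logic below runs inside the database engine, so only the small result set ever leaves it:

    ```python
    import sqlite3

    # An in-memory database stands in for a production DBMS.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)",
                     [("north", 120.0), ("north", 80.0), ("south", 200.0)])

    # The aggregation happens in-database; no rows are exported
    # to a separate analytics application first.
    for region, avg in conn.execute(
            "SELECT region, AVG(amount) FROM sales GROUP BY region"):
        print(region, avg)
    ```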

  • Information Stewardship Applications

    Information stewardship applications are business solutions used by business users acting in the role of information steward. For the most part, these developing solutions represent a combination of a number of disparate, previously IT-centric tools that are already on the market. They are, however, organized and presented in such a way that information stewards can support the work of information policy enforcement — as part of their normal, business-centric day-to-day work — in a range of use cases.

  • Information Technologies Operations (ITOps)

    Information Technologies Operations (ITOps) is the process responsible for acquiring, designing, deploying, configuring, and maintaining the physical and virtual components that comprise your IT infrastructure.

  • Initial Coin Offerings (ICOs)

    An initial coin offering (ICO) or initial currency offering is a type of funding using cryptocurrencies. It is often a form of crowdfunding; however, private ICOs, which do not seek public investment, are also possible. In an ICO, a quantity of cryptocurrency is sold in the form of “tokens” (“coins”) to speculators or investors, in exchange for legal tender or other (generally established and more stable) cryptocurrencies.

  • Insight Engines

    Insight engines apply relevancy methods to describe, discover, organize and analyze data. This allows existing or synthesized information to be delivered proactively or interactively, and in the context of digital workers, customers or constituents at timely business moments. Products in this market use connectors to crawl and index content from multiple sources. They index the full range of enterprise content, from unstructured content such as word processor and video files through to structured content, such as spreadsheet files and database records. Various “pipelines” are used to preprocess content according to type, and to derive from it data that can be indexed for query, extraction and use via a range of touchpoints. Insight engines differ from search engines in terms of capabilities that enable richer indexes, more complex queries, elaborated relevancy methods, and multiple touchpoints for the delivery of data (for machines) and information (for people).

  • Instant Payments

    Instant Payments are electronic retail payment solutions available 24/7/365 and resulting in the immediate or close-to-immediate interbank clearing of the transaction and crediting of the payee’s account with confirmation to the payer (within seconds of payment initiation).

  • Integration Platform as a Service (iPaaS)

    Integration Platform as a Service (iPaaS) is a suite of cloud services enabling development, execution and governance of integration flows connecting any combination of on premises and cloud-based processes, services, applications and data within individual or across multiple organizations.

  • Intelligent Applications

    Intelligent applications are enterprise applications with embedded or integrated AI technologies to support or replace human-based activities via intelligent automation, data-driven insights, and guided recommendations to improve productivity and decision making.

  • Intelligent Avatar Platforms (IAP)

    An Intelligent Avatar Platform (IAP) is an online platform, supported by artificial intelligence, that allows users to create digital clones of themselves.

  • Intelligent Drug Discovery

    Intelligent Drug Discovery is a drug discovery process using AI technologies. If adopted at the drug discovery stage, AI solutions have the potential to kick-start the productivity of the entire R&D process. AI has the potential to:

    • Reduce timelines for drug discovery and improve the agility of the research process
    • Increase the accuracy of predictions on the efficacy and safety of drugs
    • Improve the opportunity to diversify drug pipelines

  • Intelligent Logistics

    See Smart Logistics

  • Intelligent Sensor

    Intelligent sensors utilize advanced signal processing techniques, data fusion techniques, intelligent algorithms, and artificial intelligence concepts to better understand sensor data, for better integration of sensors and better feature extraction, leading to measures that can be used in smart sensing applications.

  • Intelligent Virtual Agent (IVA)

    See Virtual Agent

  • Intelligent process automation (IPA)

    Intelligent process automation (IPA) is a combination of technologies used to manage and automate digital processes. IPA should increase operational efficiency, worker performance and responsiveness to customers. It should also reduce operational risk, with the goal of creating an effective operations environment. IPA is designed to assist human workers by doing manual, repetitive and routine tasks that were previously performed by humans. Technologies combined in IPA include robotic process automation (RPA), artificial intelligence (AI), machine learning and digital process automation (DPA). With these technologies — especially with AI and machine learning — an IPA tool should be able to learn how to adjust and improve the process flow to create an intelligent process. It should be able to learn and improve over time.

  • Internal Cloud

    See Private Cloud

  • IT process automation (ITPA)

    IT process automation (ITPA), also known as run book automation (RBA), is the ability to orchestrate and integrate tools, people and processes through workflow.

  • Knowledge Graphs

    Knowledge Graphs encode information (“knowledge”) as data arranged in a network (“graph”) of nodes and links (“edges”) rather than tables of rows and columns.
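
    A minimal sketch of the idea in plain Python (the entities and relations below are invented for the example): knowledge is stored as subject-predicate-object triples, and queries follow edges instead of joining tables.

    ```python
    # Knowledge as (subject, predicate, object) triples: nodes and edges.
    triples = [
        ("Ada Lovelace", "born_in", "London"),
        ("Ada Lovelace", "field", "Mathematics"),
        ("London", "capital_of", "United Kingdom"),
    ]

    def neighbors(node):
        """Return all edges leaving a node."""
        return [(p, o) for s, p, o in triples if s == node]

    print(neighbors("Ada Lovelace"))
    # -> [('born_in', 'London'), ('field', 'Mathematics')]
    ```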

  • Ledger DBMS

    A Ledger DBMS is a distributed ledger platform that enables information to be shared (not copied) across a network.

  • Leveraging AI to Enhance Cybersecurity Defense

    Security tool vendors are exploiting ML to make improvements to their tools, decision support and response operations. Thanks to research, frameworks and compute power, vendors can more easily access well-designed ML if they can get their hands on large quantities of relevant, high-quality training data. They are able to assess solutions and architectures, and deal with the latest ML-related attack techniques, including data poisoning, adversarial inputs, generative adversarial networks and other security-relevant innovations in ML.

  • Living Maps

    Living maps are maps that are updated in real time.

  • Location Intelligence

    Location intelligence (LI) is the process of deriving meaningful insight from geospatial data relationships to solve a particular problem. It involves layering multiple data sets spatially and/or chronologically, for easy reference on a map, and its applications span industries, categories and organizations.

  • Location Intelligence (LI)

    Location intelligence (LI) is a business intelligence (BI) tool capability that relates geographic contexts to business data. Like BI, location intelligence software is designed to turn data into insight for a host of business purposes. Such tools draw on a variety of data sources, such as geographic information systems (GIS), aerial maps, demographic information and, in some cases, an organization’s own databases.

  • Logical Data Warehouse

    The Logical Data Warehouse (LDW) is a data management architecture for analytics that combines the strengths of traditional repository warehouses with alternative data management and access strategies.

  • Machine Learning (ML)

    Machine learning (ML) is a type of artificial intelligence (AI) that allows software applications to become more accurate at predicting outcomes without being explicitly programmed to do so. Machine learning algorithms use historical data as input to predict new output values.
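
    To make “historical data in, predicted values out” concrete, here is a minimal sketch using scikit-learn (the numbers are toy data, not drawn from any real source):

    ```python
    from sklearn.linear_model import LinearRegression

    # Historical observations: inputs X and known outcomes y.
    X = [[1], [2], [3], [4]]
    y = [2.1, 3.9, 6.2, 7.8]

    model = LinearRegression()
    model.fit(X, y)              # learn a pattern from historical data

    print(model.predict([[5]]))  # predict a new, unseen output value
    ```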

  • Master Data Management (MDM)

    Master data management (MDM) is a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise’s official shared master data assets.

  • Metadata Management Solutions

    Metadata management is about an organization’s management of its data and information assets. Metadata describes the various facets of an information asset that can improve its usability throughout its life cycle. Enterprise metadata management (EMM) encompasses the roles, responsibilities, processes, organization and technology necessary to ensure that the metadata across the enterprise adds value to that enterprise’s data. EMM differs from metadata management, which only operates at the level of a single program, project or initiative; EMM works across all. The market for metadata management solutions comprises vendors that offer one or many metadata management capabilities, such as:

    • Metadata repositories
    • Business glossary
    • Data lineage
    • Impact analysis
    • Rules management
    • Semantic frameworks
    • Metadata ingestion and translation

  • Micro-Electromechanical Systems (MEMS)

    A MEMS (micro-electromechanical system) is a miniature machine that has both mechanical and electronic components. The physical dimension of a MEMS can range from several millimeters to less than one micrometer, a dimension many times smaller than the width of a human hair.

  • Microbiome Therapeutics

    The microbiome is the natural bacterial colonisation of the body – including skin, gut and stomach. Scientists believe that this is a complex network of interactions, which can affect metabolism, neurology, infection, gut disease and other chronic health conditions. Microbiome therapy aims to establish a healthy gut microbiome to restore gut function and prevent dysregulation. Microbiome therapies (MBT) are currently in development to prevent recurrent C. difficile infections, which are primarily contracted in hospitals, leading to diarrhea, dehydration, fever and weight loss.

  • Microservice

    A microservice is a service-oriented application component that is tightly scoped, strongly encapsulated, loosely coupled, independently deployable and independently scalable.
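
    A hedged sketch of the idea using only Python's standard library: one tightly scoped service that owns a single capability, encapsulates its own state, and is deployable independently of any other service (the endpoint, port and price data are illustrative):

    ```python
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import json

    class PriceService(BaseHTTPRequestHandler):
        """A microservice with one narrow responsibility: quoting prices."""
        PRICES = {"widget": 9.99, "gadget": 24.50}  # state owned by this service

        def do_GET(self):
            sku = self.path.strip("/")
            body = json.dumps({"sku": sku, "price": self.PRICES.get(sku)})
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body.encode())

    # Runs as its own process; blocks until interrupted.
    HTTPServer(("localhost", 8080), PriceService).serve_forever()
    ```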

  • Mining

    The process by which “blocks” or transactions are verified and added to a blockchain. In order to verify a block a miner must use a computer to solve a cryptographic problem. Once the computer has solved the problem, the block is considered “mined” or verified. In the Bitcoin or Ethereum blockchain, the first computer to mine or verify the block receives bitcoin or ether, respectively.
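
    A minimal sketch of the “cryptographic problem” described above (simplified: real networks fix the block format and adjust the difficulty by consensus rules; the block data and difficulty here are illustrative): search for a nonce whose hash starts with a required number of zeros.

    ```python
    import hashlib

    def mine(block_data: str, difficulty: int = 4) -> int:
        """Try nonces until the SHA-256 hash has `difficulty` leading zeros."""
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith("0" * difficulty):
                return nonce  # puzzle solved: the block is "mined"
            nonce += 1

    print("winning nonce:", mine("tx1;tx2;tx3"))
    ```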

  • Mixed Reality (MR)

    Mixed reality (MR) is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.

  • Momentary Markets

    The rise of “momentary markets” means that every discrete piece of time is an opportunity: a chance to deliver a new on-demand product or service designed to meet a specific customer’s needs.

  • Multiexperience Development Platforms (MXDP)

    MXDPs serve to centralize life cycle activities — designing, developing, testing, distributing, managing and analyzing — for a portfolio of multiexperience apps. Multiexperience refers to the various permutations of modalities (e.g., touch, voice and gesture), devices and apps that users interact with on their digital journey across the various touchpoints. Multiexperience development involves creating fit-for-purpose apps based on touchpoint-specific modalities, while at the same time ensuring a consistent user experience across web, mobile, wearable, conversational and immersive touchpoints.

  • Multimodal Deep Semantic Understanding

    Multimodal deep semantic understanding takes information from different modalities, such as voice, image, and text, as input, and integrates perception and cognition technologies to achieve a multi-dimensional deep understanding of information. With the rapid development and large-scale application of computer vision, speech, natural language understanding, and knowledge graphs, multimodal deep semantic understanding is gradually maturing, which opens up broader application scenarios.

  • Multimodel DBMSs

    Multimodel DBMSs support several types of data model, such as relational, XML and the common nonrelational types (e.g., document, key-value, search and graph), in a single DBMS. They are intended to reduce complexity in the data access layer by using a single language to access different persistent types, although multiple languages can also be used.

  • Nanohybrids

    A nanohybrid is a combination of two nanomaterials through chemical bonding, while a nanocomposite is the addition of a nanomaterial to a matrix (for example, a polymer) which in itself is not a nanomaterial.

  • Nanoscale 3D Printing

    Nanoscale 3D printing is a technique that can fabricate tiny structures a thousand times faster than conventional two-photon lithography (TPL) techniques, without sacrificing resolution.

  • Natural Language Generation

    Natural-language generation (NLG) is a software process that transforms structured data into natural language. It can be used to produce long form content for organizations to automate custom reports, as well as produce custom content for a web or mobile application. It can also be used to generate short blurbs of text in interactive conversations (a chatbot) which might even be read out by a text-to-speech system.
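
    In its simplest, template-based form, NLG maps structured fields into sentences. The sketch below is a toy illustration with invented sales records (production NLG systems use far richer linguistic models):

    ```python
    # Structured input: one record per store.
    records = [
        {"store": "Berlin", "sales": 120_000, "change": +0.08},
        {"store": "Madrid", "sales": 95_000, "change": -0.03},
    ]

    # Structured data -> natural language.
    for r in records:
        direction = "rose" if r["change"] >= 0 else "fell"
        print(f"Sales at the {r['store']} store {direction} "
              f"{abs(r['change']):.0%} to ${r['sales']:,}.")
    # -> "Sales at the Berlin store rose 8% to $120,000." ...
    ```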

  • Natural Language Query

    Natural Language Query enables users to query data using business terms that are either typed into a search box or spoken.

  • Natural-language Processing (NLP)

    Natural-language processing (NLP) technology involves the ability to turn text or audio speech into encoded, structured information, based on an appropriate ontology. The structured data may be used simply to classify a document, as in “this report describes a laparoscopic cholecystectomy,” or it may be used to identify findings, procedures, medications, allergies and participants.

  • Neobank

    A neobank (also known as an online bank, internet-only bank, or digital bank) is a type of direct bank that operates exclusively online without traditional physical branch networks.

  • Neuromorphic Hardware

    Neuromorphic hardware uses specialized computing architectures that reflect the structure (morphology) of neural networks from the bottom up: dedicated processing units emulate the behavior of neurons directly in hardware, and a web of physical interconnections (bus systems) facilitates the rapid exchange of information.

  • Next-Generation Memory

    Next-generation memory is a common label applied to a major upgrade of a hardware or software product. The next-generation memory market has grown over the last few years because of the increasing demand for faster, more efficient, and cost-effective memory solutions. Emerging big data and artificial intelligence (AI) applications, including machine learning, drive innovations across many industries, and this is driving the adoption of new memory technologies among end-users in sectors such as BFSI and consumer electronics.

  • NoOps

    NoOps (no operations) is the concept that an IT environment can become so automated and abstracted from the underlying infrastructure that there is no need for a dedicated team to manage software in-house.

  • NoSQL

    NoSQL is an approach to database design that can accommodate a wide variety of data models, including key-value, document, columnar and graph formats. NoSQL, which stands for “not only SQL,” is an alternative to traditional relational databases in which data is placed in tables and data schema is carefully designed before the database is built. NoSQL databases are especially useful for working with large sets of distributed data.
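
    As a small illustration of why schema flexibility matters, the toy records below would be awkward in a single rigid relational table but are natural in a document store, here mimicked with plain Python dicts:

    ```python
    # A "document" model: each record keeps its own shape,
    # with no upfront schema design required.
    users = {
        "u1": {"name": "Ana", "email": "ana@example.com"},
        "u2": {"name": "Bo", "phones": ["+1-555-0100"],
               "prefs": {"theme": "dark"}},  # extra nested fields, no migration
    }

    # Key-value access by primary key: the core NoSQL operation.
    print(users["u2"]["prefs"]["theme"])
    ```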

  • Non-fungible tokens or NFTs

    Non-fungible tokens or NFTs are cryptographic assets on blockchain with unique identification codes and metadata that distinguish them from each other. Unlike cryptocurrencies, they cannot be traded or exchanged at equivalency. This differs from fungible tokens like cryptocurrencies, which are identical to each other and, therefore, can be used as a medium for commercial transactions.

  • Optical Character Recognition (OCR)

    Optical character recognition or optical character reader (OCR) is the electronic or mechanical conversion of images of typed, handwritten or printed text into machine-encoded text, whether from a scanned document, a photo of a document, a scene-photo (for example the text on signs and billboards in a landscape photo) or from subtitle text superimposed on an image (for example from a television broadcast).

  • Persistent Memory Server

    New persistent-memory technologies will help reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads. It has the potential to improve application performance, availability, boot times, clustering methods and security practices, while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.

  • Personification

    Personification is the delivery of relevant digital experiences to individuals based on their inferred membership in a defined customer segment, rather than their personal identity.

  • Physical Augmentation

    Physical augmentation alters the person’s inherent physical capability by implanting or hosting a technology within their bodies. This falls into four categories: sensory augmentation, appendage and biological function augmentation, brain augmentation, and genetic augmentation.

  • Platform as a Service (PaaS)

    Platform as a service (PaaS) is a cloud computing model in which a third-party provider delivers hardware and software tools — usually those needed for application development — to users over the internet. A PaaS provider hosts the hardware and software on its own infrastructure. As a result, PaaS frees developers from having to install in-house hardware and software to develop or run a new application.

  • Powered Exosuit

    A powered exoskeleton (also known as power armor, powered armor, powered suit, exoframe, hardsuit or exosuit) is a wearable mobile machine that is powered by a system of electric motors, pneumatics, levers, hydraulics, or a combination of technologies that allow for limb movement with increased strength and endurance. Its design aims to provide back support, sense the user’s motion, and send a signal to motors which manage the gears. The exoskeleton supports the shoulder, waist and thigh, and assists movement for lifting and holding heavy items, while lowering back stress.

  • Predictive Analytics

    Predictive analytics encompasses a variety of statistical techniques from data mining, predictive modelling, and machine learning that analyze current and historical facts to make predictions about future or otherwise unknown events.

  • Prescriptive Analytics

    Prescriptive Analytics is a form of advanced analytics which examines data or content to answer the question “What should be done?” or “What can we do to make _______ happen?”, and is characterized by techniques such as graph analysis, simulation, complex event processing, neural networks, recommendation engines, heuristics, and machine learning.

  • Private Cloud

    Private cloud is a type of cloud computing that delivers similar advantages to public cloud, including scalability and self-service, but through a proprietary architecture. Unlike public clouds, which deliver services to multiple organizations, a private cloud is dedicated to the needs and goals of a single organization.

  • Private Cloud dbPaaS

    Private cloud dbPaaS offerings merge the isolation of private cloud database platforms with the self-service and scalability of the public cloud. Private cloud dbPaaS can play the role of a transition technology as organizations develop their long-term cloud strategy. Private cloud dbPaaS is an option for organizations that are unable or not ready to move to public cloud offerings, due to security, regulatory or other concerns.

  • Proof of Stake (PoS)

    An alternative consensus protocol, in which an individual or “validator” uses their own cryptocurrency to validate transactions or blocks. Validators “stake” their cryptocurrency, such as ether, on whichever transactions they choose to validate. If the individual validates a block (group of transactions) correctly then the individual receives a reward. Typically, if a validator verifies an incorrect transaction then they lose the cryptocurrency that they staked. Proof of Stake requires a negligible amount of computing power compared to Proof of Work consensus (see also Hybrid Consensus Model).
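
    A hedged toy model of validator selection (real protocols add randomness beacons, slashing rules and much more; the names and stake amounts are invented): the chance of being chosen to validate the next block is proportional to the stake posted.

    ```python
    import random

    # Validator -> amount of cryptocurrency staked (illustrative numbers).
    stakes = {"alice": 50, "bob": 30, "carol": 20}

    def pick_validator():
        """Select a validator with probability proportional to stake."""
        names, weights = zip(*stakes.items())
        return random.choices(names, weights=weights)[0]

    print(pick_validator())  # "alice" wins roughly 50% of the time
    ```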

  • Proof of Work (PoW)

    A protocol for establishing consensus across a system that ties mining capability to computational power. Hashing a block, which is in itself an easy computational process, now requires each miner to solve for a set, difficult variable. In effect, the process of hashing each block becomes a competition. This addition of solving for a target increases the difficulty of successfully hashing each block. For each hashed block, the overall process of hashing will have taken some time and computational effort. Thus, a hashed block is considered Proof of Work (see also Hybrid Consensus Model). The nonce-search sketch under “Mining” above illustrates this process.

  • Protecting AI-Powered Systems

    Application leaders should closely monitor ML algorithms and the data they consume. This will help them determine whether “there are extant or potential corruption (‘poisoning’) issues.” Once data is poisoned, the resulting manipulation is very likely to compromise data-driven decisions, for which data quality, integrity, confidentiality and privacy are an absolute must. There are five phases of ML pipelines that require protection: data ingestion; preparation and labeling; model training; inference validation; and production deployment. There are several risks associated with each phase, and enterprises need to ensure that they’re well prepared and armed with knowledge.

  • Quantum Computing

    Quantum computing is an area of study focused on the development of computer technologies based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Quantum computing uses combinations of quantum bits (qubits) to perform specific computational tasks at a much higher efficiency than classical computers. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases, particularly computation-heavy processes such as simulations.
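
    As a toy illustration of the underlying math (a classical simulation with NumPy, not a real quantum computer): a qubit is a 2-component state vector, and a Hadamard gate puts it into an equal superposition of 0 and 1.

    ```python
    import numpy as np

    ket0 = np.array([1.0, 0.0])                   # qubit in state |0>
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    state = H @ ket0                    # superposition of |0> and |1>
    probabilities = np.abs(state) ** 2  # Born rule: measurement odds
    print(probabilities)                # -> [0.5 0.5]
    ```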

  • Real-time Payments

    See Instant Payments

  • Regtech

    RegTech (regulatory technology) is a class of software applications for managing regulatory compliance. Companies invest in RegTech as a way to save time and money, allowing resources that were once devoted to regulatory compliance to be diverted elsewhere, once RegTech takes over those functionalities.

  • Reinforcement Learning

    Reinforcement learning is a training method based on rewarding desired behaviors and/or punishing undesired ones. The learning method has been adopted in artificial intelligence (AI) as a method of directing unsupervised machine learning through rewards and penalties. Reinforcement learning is used in operations research, information theory, game theory, control theory, simulation-based optimization, multi-agent systems, swarm intelligence, statistics and genetic algorithms.
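
    A minimal Q-learning sketch on an invented toy problem: an agent on a 5-cell line earns a reward only at the rightmost cell, and learns which action to prefer in each state from reward signals alone.

    ```python
    import numpy as np

    n_states, actions = 5, [-1, +1]          # move left or right on a line
    Q = np.zeros((n_states, len(actions)))   # learned action values
    alpha, gamma, epsilon = 0.5, 0.9, 0.1

    for episode in range(200):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy: mostly exploit, occasionally explore.
            a = np.random.randint(2) if np.random.rand() < epsilon \
                else int(Q[s].argmax())
            s2 = min(max(s + actions[a], 0), n_states - 1)
            r = 1.0 if s2 == n_states - 1 else 0.0   # reward for desired behavior
            Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
            s = s2

    print(Q.argmax(axis=1))  # learned policy per state (1 = move right)
    ```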

  • Robotic Process Automation Software

    Robotic process automation (RPA) tools perform “if, then, else” statements on structured data, typically using a combination of user interface (UI) interactions or by connecting to APIs to drive client servers, mainframes or HTML code. An RPA tool operates by mapping a process in the RPA tool language for the software “robot” to follow, with runtime allocated to execute the script by a control dashboard.

  • Robotic process automation (RPA)

    Robotic process automation (RPA) is a productivity tool that allows a user to configure one or more scripts (which some vendors refer to as “bots”) to activate specific keystrokes in an automated fashion. The result is that the bots can be used to mimic or emulate selected tasks (transaction steps) within an overall business or IT process. These may include manipulating data, passing data to and from different applications, triggering responses, or executing transactions. RPA uses a combination of user interface interaction and descriptor technologies. The scripts can overlay on one or more software applications.

  • Sensor Fusion

    Sensor fusion is the combining of sensory data, or data derived from disparate sources, such that the resulting information has less uncertainty than would be possible if these sources were used individually.
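
    A worked toy example of the uncertainty claim (the readings and noise levels are invented): fusing two noisy measurements of the same quantity with inverse-variance weighting yields an estimate whose variance is lower than either sensor's alone.

    ```python
    # Two sensors measure the same temperature with different noise levels.
    reading_a, var_a = 20.4, 0.8   # noisier sensor
    reading_b, var_b = 19.9, 0.2   # more precise sensor

    # Inverse-variance weighted fusion.
    w_a, w_b = 1 / var_a, 1 / var_b
    fused = (w_a * reading_a + w_b * reading_b) / (w_a + w_b)
    fused_var = 1 / (w_a + w_b)

    print(fused, fused_var)  # fused variance 0.16 < min(0.8, 0.2)
    ```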

  • Sensory Augmentation

    Sensory augmentation involves hearing, vision, and perception. Examples include virtual reality (VR), augmented reality (AR), and mixed reality (MR). We are witnessing several companies delve into human augmentation, whether it be smart contact lenses that are able to detect glucose levels in tears and intraocular pressure, or an “electronic nose” that mimics the human sense of smell.

  • Servware

    See Software-Defined Services

  • Shadow AI

    Shadow AI refers to AI tools, models and services used outside the ownership or control of IT organizations, by analogy with shadow IT (IT devices, software and services adopted without IT oversight).

  • Silicon Anode Batteries

    Silicon anode batteries are an extension of widely used lithium ion (Li-Ion) batteries. Early generation Li-Ion batteries used lithium as the anode material. This was replaced with carbon/graphite following a number of widely reported overheating and explosion incidents. Next-generation Li-Ion batteries are likely to make use of silicon anodes that utilize silicon nanotubes, or a comparable coating process. This will result in significantly higher energy storage and longer battery life.

  • Smart Assets

    Smart assets are unique virtual currency tokens that may represent a tangible real-world asset or non-tangible ownership that can be purchased, sold, or exchanged as defined by the rules of smart contracts on the blockchain network.

  • Smart Contract

    A smart contract, also known as a cryptocontract, is a computer program that directly controls the transfer of digital currencies or assets between parties under certain conditions. A smart contract not only defines the rules and penalties related to an agreement in the same way that a traditional contract does, but it can also automatically enforce those obligations.
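
    A hedged sketch of the logic only (real smart contracts run on a blockchain virtual machine, e.g. as compiled Solidity; this Python class merely mimics the behavior with invented parties): funds move automatically when the coded condition is met, with no intermediary.

    ```python
    class Escrow:
        """Toy escrow: releases funds only when the agreed condition holds."""
        def __init__(self, buyer, seller, amount):
            self.buyer, self.seller, self.amount = buyer, seller, amount
            self.balances = {buyer: 0, seller: 0}

        def settle(self, goods_delivered: bool):
            # The "contract" enforces its own rule: pay the seller on
            # delivery, refund the buyer otherwise.
            recipient = self.seller if goods_delivered else self.buyer
            self.balances[recipient] += self.amount

    deal = Escrow("alice", "bob", 100)
    deal.settle(goods_delivered=True)
    print(deal.balances)  # {'alice': 0, 'bob': 100}
    ```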

  • Smart Dust

    In nanotechnology, smart dust is an ad hoc network of tiny devices equipped with wireless micro-electromechanical sensors (MEMS).

  • Smart Home Devices

    Smart home-enabled devices can also operate in conjunction with other devices in the home and communicate information to other smart devices.

  • Smart Logistics

    Smart logistics makes supply chains more effective and efficient at each step. With connected devices in the supply chain and intelligent asset tracking tools, smart logistics brings end-to-end visibility and improves the way companies transport goods, control inventory and mobile assets, replenish stock, and manage the retail experience. For example, sensor-driven asset tracking tools can provide insight at each stage of shipment, tracking moisture, heat, and vibration to give companies real-time visibility as a product moves through the supply chain.

  • Smart Matter

    See Smart Dust

  • Smart Robot

    A smart robot is an artificial intelligence (AI) system that can learn from its environment and its experience and build on its capabilities based on that knowledge.

  • Smart Robotics

    Smart robotics is a technology comprising robots that can collaborate with human beings, learn from their environment and experience, and refine or modify their actions accordingly.

  • Smart Speaker

    A smart speaker is a wireless and smart audio playback device that uses several types of connectivity for additional functions. Smart speakers have special features to enhance ease of use, connect to multiple types of audio sources and provide additional functionality.

  • Social Robots

    Social Robots use artificial intelligence (AI) to decide how to act on information received through cameras and other sensors. The ability to respond in ways that seem lifelike has been informed by research into such issues as how perceptions form, what constitutes social and emotional intelligence, and how people can deduce others’ thoughts and feelings. Advances in AI have enabled designers to translate such psychological and neuroscientific insights into algorithms that allow robots to recognize voices, faces and emotions, interpret speech and gestures, respond appropriately to complex verbal and nonverbal cues, make eye contact, speak conversationally, and adapt to people’s needs by learning from feedback, rewards and criticisms.

  • Software as a Service (SaaS)

    Software as a service (SaaS) is a software distribution model in which a third-party provider hosts applications and makes them available to customers over the Internet. SaaS is one of three main categories of cloud computing, alongside infrastructure as a service (IaaS) and platform as a service (PaaS).

  • Software-Defined Services

    In order to address specific business processes or market verticals, service providers are adding software to their services. The results are software-defined services, or “servware,” often taking the form of a platform and/or solution.

  • Spark

    Spark is a general-purpose distributed data processing engine that is suitable for use in a wide range of circumstances. On top of the Spark core data processing engine, there are libraries for SQL, machine learning, graph computation, and stream processing, which can be used together in an application. Programming languages supported by Spark include: Java, Python, Scala, and R. Application developers and data scientists incorporate Spark into their applications to rapidly query, analyze, and transform data at scale. Tasks most frequently associated with Spark include ETL and SQL batch jobs across large data sets, processing of streaming data from sensors, IoT, or financial systems, and machine learning tasks.
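
    A minimal PySpark sketch of the typical query-and-aggregate pattern (assumes a local pyspark installation; the application name, rows and column names are illustrative):

    ```python
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("demo").getOrCreate()

    # Distributed DataFrame built from in-memory rows; in practice the
    # source could be files, sensors or a streaming system.
    df = spark.createDataFrame(
        [("sensor-1", 21.5), ("sensor-1", 22.0), ("sensor-2", 19.8)],
        ["device", "temperature"],
    )

    # SQL-style aggregation executed by the distributed engine.
    df.groupBy("device").avg("temperature").show()
    spark.stop()
    ```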

  • Spectrum Sensing

    Spectrum Sensing is a wireless communications technique that is able to pick a free channel for transmission.

  • Speech Recognition

    Speech recognition is the ability of a machine or program to identify words and phrases in spoken language and convert them to a machine-readable format. Rudimentary speech recognition software has a limited vocabulary of words and phrases, and it may only identify these if they are spoken very clearly. More sophisticated software has the ability to accept natural speech.

  • SQL-on-Hadoop

    SQL-on-Hadoop is a class of analytical application tools that combine established SQL-style querying with newer Hadoop data framework elements.

  • Stablecoin

    Stablecoin is a fixed-price cryptocurrency whose market value is attached to another stable asset. Differing from normal cryptocurrencies, stablecoin can be pegged to assets such as certain fiat currencies that can be traded on exchanges, including the U.S. dollar or the Euro. Some stablecoins can be pegged to other types of assets, including precious metals, such as gold, and even other cryptocurrencies.

  • Strong AI

    See Artificial General Intelligence

  • Sustainable Manufacturing

    Sustainable manufacturing is the creation of manufactured products through economically sound processes that minimize negative environmental impacts while conserving energy and natural resources. Sustainable manufacturing also enhances employee, community and product safety.

  • Synthetic Data

    Synthetic data is information that’s artificially manufactured rather than generated by real-world events. Synthetic data is created algorithmically, and it is used as a stand-in for test datasets of production or operational data, to validate mathematical models and, increasingly, to train machine learning models.
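
    A minimal sketch of the algorithmic idea with NumPy (all numbers invented): estimate the statistics of a small “real” dataset, then sample new, artificial records from the fitted distribution.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for real production data (two correlated columns).
    real = rng.multivariate_normal([10, 50], [[1, 0.8], [0.8, 2]], size=500)

    # Fit simple statistics, then sample synthetic rows from them.
    mean, cov = real.mean(axis=0), np.cov(real, rowvar=False)
    synthetic = rng.multivariate_normal(mean, cov, size=500)

    print(synthetic[:3])  # artificial rows with the same statistical shape
    ```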

  • Synthetic Food

    Synthetic food products are derived from chemically synthesized food substances. They are produced by new technological methods from such nutritive substances as proteins or their component amino acids, carbohydrates, fats, vitamins, and trace elements. Synthetic and artificial food products usually imitate natural food products in appearance, taste, and odor.

  • Techquilibrium

    TechQuilibrium is the balance point where the enterprise has the right mix of traditional and digital capabilities and assets, to power the business model needed to compete most effectively, in an industry that is being digitally revolutionized.

  • Text Analytics

    Text analytics is the automated process of translating large volumes of unstructured text into quantitative data to uncover insights, trends, and patterns. Combined with data visualization tools, this technique enables companies to understand the story behind the numbers and make better decisions.
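
    At its simplest, turning unstructured text into quantitative data can be as basic as counting term frequencies, as in this toy sketch with invented reviews (real text analytics adds tokenization, sentiment, entity extraction and more):

    ```python
    from collections import Counter
    import re

    reviews = [
        "Great battery life, great screen.",
        "Battery died fast; screen is great though.",
    ]

    # Unstructured text -> quantitative data (word frequencies).
    words = re.findall(r"[a-z']+", " ".join(reviews).lower())
    print(Counter(words).most_common(3))
    # -> [('great', 3), ('battery', 2), ('screen', 2)]
    ```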

  • The Digital Twin of an Organization (DTO)

    The digital twin of an organization (DTO) is predicated on using a digital representation of an organization (its business model, strategies, etc.) to better plan and execute a business transformation initiative. The whole idea behind the digital twin concept, and the reason why it is so useful, is that it offers a virtual model that can be analyzed and tweaked more easily than the real thing. The new insights and efficiencies you uncover this way can in turn be used to improve the organization, replicate what works, or address issues before they turn into real-world problems.

  • The internet of things, or IoT

    The internet of things, or IoT, is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.

  • Time Series DBMS

    A Time Series DBMS is a database management system that is optimized for handling time series data: each entry is associated with a timestamp. For example, time series data may be produced by sensors, smart meters or RFIDs in the so-called Internet of Things, or may depict the stock tickers of a high frequency stock trading system.

  • Token

    A Token represents an asset built on an existing blockchain (different from a coin). Tokens are designed to be unique, liquid, secure, instantly transferable, and digitally scarce.

  • Tokenization

    Tokenization refers to a process by which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The sensitive data still generally needs to be stored securely at one centralized location for subsequent reference and requires strong protections around it. The security of a tokenization approach depends on the security of the sensitive values and the algorithm and process used to create the surrogate value and map it back to the original value.
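
    A hedged sketch of the mechanism (a real token vault adds encryption, access control and auditing; the token format here is invented): a protected mapping replaces the sensitive value with a random surrogate, and only the vault can map the token back.

    ```python
    import secrets

    vault = {}  # token -> sensitive value; the one protected store

    def tokenize(card_number: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random surrogate value
        vault[token] = card_number             # original kept only in the vault
        return token

    token = tokenize("4111 1111 1111 1111")
    print(token)         # safe to store or pass around downstream
    print(vault[token])  # detokenization: only via the vault
    ```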

  • Transfer Learning

    Transfer learning is the application of knowledge gained from completing one task to help solve a different, but related, problem. The development of algorithms that facilitate transfer learning processes has become a goal of machine learning technicians as they strive to make machine learning as human-like as possible.
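
    A conceptual sketch only (invented data; real transfer learning typically reuses a pretrained deep network): features learned for one task are frozen, and just a small new “head” is fit for the related task.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Pretend this projection was learned on a large source task; we freeze it.
    pretrained_features = rng.normal(size=(20, 8))

    def featurize(x):
        return np.tanh(x @ pretrained_features)  # frozen feature extractor

    # Small target task: only the new head is trained on its 40 examples.
    X = rng.normal(size=(40, 20))
    y = (X.sum(axis=1) > 0).astype(int)
    head = LogisticRegression().fit(featurize(X), y)
    print(head.score(featurize(X), y))
    ```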

  • Virtual Agent

    A virtual agent is a software program that uses scripted rules and, increasingly, artificial intelligence applications to provide automated service or guidance to humans.

  • Virtual Assistants

    A virtual assistant is an application program that understands natural language voice commands and completes tasks for the user.

  • Virtual Reality (VR)

    Virtual reality (VR) provides a computer-generated 3D environment (including both computer graphics and 360-degree video) that surrounds a user and responds to an individual’s actions in a natural way, usually through immersive head-mounted displays.

  • Visual Data Discovery

    Visual Data Discovery is the process of using visualizations and iterative visual exploration of data to improve time to insight. Visual data discovery improves business decision-making by enabling the exploration of data through interaction with charts and images. Processes such as in-memory computing and mashing or fusing of multiple data sources allow visualizations to be robust and real time, while the exponential growth of data has made their use essential in many organizations.

  • VPA-Enabled Wireless Speakers

    VPA-enabled wireless speakers are cloud-enabled, far-field voice computing devices that connect the user to a virtual personal assistant (VPA) service.

  • Web 3.0

    Web 3.0 is the third generation of internet services for websites and applications that will focus on using a machine-based understanding of data to provide a data-driven and semantic web. The ultimate goal of Web 3.0 is to create more intelligent, connected and open websites.

  • Wide-Column DBMSs

    See Column-oriented DBMS

  • 3D Printing

    3-D printing is a manufacturing process that builds layers to create a three-dimensional solid object from a digital model.

  • 3D Volumetric Printing

    It’s a resin-based process that doesn’t use lasers or even a layer-by-layer approach. Instead, a pattern is projected into a vat of transparent photopolymer liquid repeatedly from all angles. It’s much like a CT scan, except in reverse: the pattern is projected to form the object instead of scanning the object.

  • 5G Networks

    Fifth-generation wireless (5G) is the latest iteration of cellular technology, engineered to greatly increase the speed and responsiveness of wireless networks. With 5G, data transmitted over wireless broadband connections can travel at multigigabit speeds, with potential peak speeds as high as 20 gigabits per second (Gbps) by some estimates. These speeds exceed wireline network speeds and offer latency of 1 millisecond (ms) or lower for uses that require real-time feedback. 5G will also enable a sharp increase in the amount of data transmitted over wireless systems due to more available bandwidth and advanced antenna technology.