Vocabulary of Emerging Technologies
We collect and provide definitions of emerging tech trends. Enjoy! If you see something missing, please let us know, or email us at into.tesler@intetics.com
- Actionable Analytics
Actionable analytics means having access to relevant data in the right context, the ability to take action immediately, and the power to achieve meaningful outcomes. The fast-evolving arena of Business Intelligence and Analytics aims to make analytics ‘invisible’, that is, more accessible and comprehensible to employees across the organization, to enable better, faster, and more productive decisions.
- Adaptive Machine Learning
In machine learning, many algorithms are adaptive or have adaptive variants, which usually means that the algorithm parameters are automatically adjusted according to statistics about the optimization thus far (e.g. the rate of convergence). Examples include adaptive simulated annealing, adaptive coordinate descent, AdaBoost, and adaptive quadrature.
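As a rough, library-agnostic sketch of what “adaptive” means here, the toy gradient descent below adjusts its own step size based on whether the loss is still converging (the function and numbers are purely illustrative):

```python
# Minimal sketch: a step size that adapts to the rate of convergence.
# Hypothetical loss f(x) = (x - 3)^2; the step size shrinks when a step
# makes the loss worse and grows slightly while the loss keeps improving.

def f(x):
    return (x - 3.0) ** 2

def grad(x):
    return 2.0 * (x - 3.0)

x, step = 0.0, 0.5
prev_loss = f(x)
for _ in range(50):
    candidate = x - step * grad(x)
    loss = f(candidate)
    if loss < prev_loss:          # converging: accept the step and be a bit bolder
        x, prev_loss = candidate, loss
        step *= 1.05
    else:                         # not improving: reject the step and shrink it
        step *= 0.5

print(round(x, 4))  # approaches the minimum at x = 3
```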
- Adaptive automatic learning
See Adaptive Machine Learning.
- Advanced Antibodies
Advanced Antibodies are specialized proteins engineered with powerful binding abilities and improved therapeutic properties. Using advanced techniques like genetic engineering, these antibodies precisely target specific antigens. They offer targeted therapies for infectious diseases, cancer, autoimmune disorders, and neurological disorders, leading to more effective treatment strategies.
- Advanced driver-assistance systems (ADAS)
Advanced driver-assistance systems (ADAS) are electronic systems that help the vehicle driver while driving or during parking.
- Affective Computing
Affective computing technologies sense the emotional state of a user (via sensors, microphone, cameras and/or software logic) and respond by performing specific, predefined product/service features, such as changing a quiz or recommending a set of videos to fit the mood of the learner.
- AgTech
Agtech is the application of new digital technologies with the intention of rapidly improving how efficiently activities at the different stages of agricultural value chains are conducted. Agtech is also seen to represent the application of technology – especially software and hardware technology – to the field of farming.[1] Agtech is a relatively new phenomenon emerging in the last decade to address increasing concerns about food security for the growing global population and the diminishing viability of farming.
- AI Assistant
See Virtual Assistants.
- AI Cloud Services
AI cloud services are hosted services that allow development teams to incorporate the advantages inherent in AI and ML.
- AI Developer Toolkits
AI Developer Toolkits are applications and software development kits (SDKs) that abstract data science platforms, frameworks, and analytic libraries to enable software engineers to deliver AI-enabled applications. They cover 4 maturing categories: cloud-based AI as a service (AIaaS), toolkits for virtual assistants, device development kits, and AI serving SDKs.
- AI Governance
AI governance is the idea that there should be a legal framework for ensuring that machine learning technologies are well researched and developed with the goal of helping humanity navigate the adoption of AI systems fairly.
- AI Marketplaces
AI Marketplace is an easily accessible place supported by a technical infrastructure that facilitates the publication, consumption, and billing of reusable algorithms. Some marketplaces are used within an organization to support the internal sharing of prebuilt algorithms among data scientists.
- AI PaaS
AI PaaS is a set of AI and machine learning (ML) platform services for building, training, and deploying AI-powered functionalities for applications.
- AI for marketing
AI for marketing comprises systems that change their behavior, without being explicitly programmed, based on collected data, usage analysis, and other observations, applied to marketing use cases. Unprecedented insight, intuition, and scale fueled by AI will help marketers deliver relevant experiences to prospects and customers with increasing effectiveness and efficiency.
- AI-Related C&SI Services
Artificial Intelligence (AI)-Related Consulting and System Integration (C&SI) Services are a subset of intelligent automation services to help clients ideate use cases, design business or IT processes, select technologies, curate data, build and train models, deploy solutions, assess and mitigate risks, and adapt talent mix to successfully incorporate intelligent solutions. Intelligent solutions must involve one or more advanced technologies, such as machine learning, deep learning, or NLP.
- Altcoin
Altcoin, short for alternative coin, is a term used to denote all cryptocurrencies other than Bitcoin, from Ethereum to Dogecoin. Most altcoins were created to address Bitcoin’s drawbacks and to provide newer versions with competitive benefits.
- Ambient Intelligence
A technology that uses sensors and machine learning algorithms to create intelligent, adaptive environments that respond to the user’s needs and preferences. Ambient intelligence can be used to create immersive and personalized digital environments that adapt to the user’s behavior and preferences.
- An Order-to-Cash Transformation
The order-to-cash process is the sequence of events and data flows that occur when a customer places an order, one of your products or services is purchased, and payment and cash collection are finalized. It is a complex process that cuts across many parts of the organization and relies on several handshakes, a multitude of systems, and data with different attributes and requirements. If not appropriately structured, it can upset your customers and your people and ultimately hurt your bottom line.
An order-to-cash transformation gives you and your organization the ability to work according to common principles that are true to your brand, strategy, and customer needs. You will need to build an agile model that provides the ability to deliver new products to new markets and, ultimately, to revamp your revenue and delivery model. Complex as it is, the order-to-cash process can be transformed into one that is fit for you, your products, and your customers, minimizing inefficiencies and optimizing your customer experience.
- Analytics and BI Platform as a Service
Analytics and business intelligence (ABI) platforms are characterized by easy-to-use functionality that supports a full analytic workflow, from data preparation to visual exploration and insight generation, with an emphasis on self-service and augmentation. ABI platforms are no longer differentiated by their data visualization capabilities, which are becoming commodities. Instead, differentiation is shifting to integrated support for enterprise reporting capabilities and augmented analytics. ABI platform functionality includes the following 15 critical capability areas: security, manageability, cloud, data source connectivity, data preparation, model complexity, catalog, automated insights, advanced analytics, data visualization, natural language query, data storytelling, embedded analytics, natural language generation (NLG), and reporting.
- Analytics and Business Intelligence (ABI)
Analytics and business intelligence (ABI) is an umbrella term that includes the applications, infrastructure and tools, and best practices that enable access to and analysis of information to improve and optimize decisions and performance.
- Angle of Arrival (AOA)
The Angle of Arrival (AOA) is a method used for positioning when providing services such as E911, and for other military and civil radiolocation applications, such as sonar and radar.
- Angular
Angular is an open-source, modern, component-based framework and platform used to build enterprise single-page web applications (SPAs) using HTML and TypeScript. As a framework, Angular implements core and optional functionality as a set of TypeScript libraries that you import into your apps.
- Appendage and Biological Function Augmentation
In appendage and biological function augmentation, exoskeletons and prosthetics are used to replace or enhance physical capabilities. We can see this in various forms, including surgical augmentation of the eyes of professional golfers and cochlear implants replacing nonfunctioning auditory nerves. Both the cosmetics and pharmaceutical industries are prime examples of what this type of augmentation looks like. Passive implants are used to enhance nails and hair and even reshape body parts. Nootropics involve the use of natural or synthetic substances that have the potential to enhance a human’s mental skills, which sparks controversy.
- Application Data Management
Application data management (ADM) is a technology-enabled business discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, governance, semantic consistency and accountability for data in a business application or suite, such as ERP, custom-made applications or core banking systems. Application data is the consistent and uniform set of identifiers and extended attributes maintained and/or used within an application or suite. Examples of such entities include customers, suppliers, products, assets, sites, and prices.
- Application Performance Monitoring (APM)
Application performance monitoring (APM) is a suite of monitoring software comprising digital experience monitoring (DEM), application discovery, tracing and diagnostics, and purpose-built artificial intelligence for IT operations.
- Application Programming Interface (API)
In the context of APIs, the word Application refers to any software with a distinct function. The interface can be thought of as a contract of service between two applications. This contract defines how the two communicate with each other using requests and responses.
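As a minimal sketch of that request/response contract, the snippet below calls a hypothetical REST endpoint (the URL and the temperature field are illustrative, not a real service) and reads the structured reply:

```python
# Minimal sketch of the request/response "contract": a client asks a
# hypothetical REST API for a resource and reads the structured response.
import json
import urllib.request

url = "https://api.example.com/v1/weather?city=Berlin"   # hypothetical endpoint
with urllib.request.urlopen(url) as response:             # send the request
    payload = json.loads(response.read())                 # parse the JSON response

print(payload.get("temperature"))  # consume a field defined by the API contract
```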
- Application Security Orchestration and Correlation (ASOC)
Application security orchestration and correlation (ASOC) is a category of application security (AppSec) solution that helps streamline vulnerability testing and remediation through workflow automation.
- Application security (AppSec)
AppSec is the process of finding, fixing, and preventing security vulnerabilities at the application level, as part of the software development process. This includes adding application security measures throughout the development life cycle, from application planning to production use.
- AR Cloud
AR Cloud is a machine-readable, 1:1 scale model of the world that is continuously updated in real-time. It is a collection of billions of machine-readable datasets, point clouds, and descriptors, aligned with real-world coordinates; a living, shared, “soft copy” of the world created by scanning physical features around us in which persistent augmented reality experiences reside.
- Artificial Emotional Intelligence
See Affective Computing.
- Artificial General Intelligence
Artificial General Intelligence (AGI) is AI that is designed to work with people to help solve currently intractable multidisciplinary problems, including global challenges such as climate change, more personalized healthcare, and education. Modern AI systems work well for the specific problems on which they have been trained, but getting AI systems to help address some of the hardest problems facing the world today is argued to require generalization and deep mastery of multiple AI technologies.
- Artificial Intelligence (AI)
Artificial intelligence (AI) applies advanced analysis and logic-based techniques, including Machine Learning, to interpret events, support and automate decisions, and take action.
- Artificial Intelligence IT Operations (AIOps)
Artificial intelligence for IT operations (AIOps) is an umbrella term for the use of big data analytics, machine learning (ML), and other artificial intelligence (AI) technologies to automate the identification and resolution of common IT issues. The systems, services, and applications in a large enterprise produce immense volumes of log and performance data. AIOps uses this data to monitor assets and gain visibility into dependencies within and outside of IT systems.
- Augmented Analytics
Augmented analytics is the use of enabling technologies such as machine learning and AI to assist with data preparation, insight generation, and insight explanation to augment how people explore and analyze data in analytics and BI platforms. It also augments the expert and citizen data scientists by automating many aspects of data science, machine learning, and AI model development, management, and deployment.
- Augmented Data Management
Augmented data management leverages ML capabilities and AI engines to make enterprise information management categories including data quality, metadata management, master data management, data integration as well as database management systems (DBMSs) self-configuring and self-tuning.
- Augmented Intelligence
Augmented intelligence is a human-centered partnership model of people and artificial intelligence (AI) working together to enhance cognitive performance, including learning, decision making, and new experiences.
- Augmented Intelligence Scenarios
Scenarios in which humans and AI work together; see Augmented Intelligence.
- Augmented Reality (AR)
Augmented reality (AR) is the real-time use of information in the form of text, graphics, audio and other virtual enhancements integrated with real-world objects.
- Augmented data discovery (ADD)
ADD describes tools that use ML to enable citizen data scientists to find, visualize, and narrate findings without having to manually build models or write algorithms, and without the help of IT or data experts.
- Aurelia.js
Aurelia is a collection of modern JavaScript modules, which when used together, function as a powerful platform for building browser, desktop, and mobile applications, all open source and built on open web standards.
- AutoML
Automated machine learning (AutoML) is the capability of automating the process of building, deploying, and managing machine learning models.
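Real AutoML products automate far more (feature engineering, hyperparameter tuning, deployment), but a minimal sketch of the core idea, automatically selecting the best of several candidate models by cross-validation, could look like this (assumes scikit-learn is installed):

```python
# Minimal sketch of automated model selection: try several candidate
# models and keep the one with the best cross-validated score.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "decision_tree": DecisionTreeClassifier(max_depth=3),
}

scores = {name: cross_val_score(model, X, y, cv=5).mean()
          for name, model in candidates.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))  # the automatically chosen model
```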
- Automated Business Processes Discovery (ABPD)
Automated business process discovery (ABPD) is a complementary approach to traditional, interview-based process modeling that overcomes many of its shortcomings, creating a business process model at a fraction of the time and cost. One major benefit of ABPD is process discovery across the “white space,” the process knowledge gap that exists between departments and functions and at the edges of processes. Modeling can be done by hand, so there may be no cash-flow impact, unlike ABPD, where a tool is necessary.
By observing the details of how processes are executed in the supporting technology, ABPD uses unstructured, event-level data to automatically build process definitions and models and to explore process variations. Because of the quantitative nature of this data, the derived process models enable rich and interactive analysis. ABPD techniques start from event logs (audit trails, messages, transactions, databases and so forth) and try to discover patterns that fit a process model to the information obtained from the events. The underlying techniques are strong enough that users don’t have to specify a process model; it is created from scratch by pattern recognition. Moreover, ABPD delivers information on bottlenecks, variances, root causes and the distribution of process instances, thus enabling meaningful simulation. It is all about capturing what has happened in a real-life process. ABPD is a form of process optimization.
Of course, ABPD does not capture informal human tasks that are not automated and represented as events in an audit trail or transaction log. This is why ABPD is best combined with techniques from social BPM and BPA for the masses. However, ABPD accelerates process discovery and identifies previously unseen process patterns. Planners must still conduct interviews to capture informal work practices, but organizations no longer need to spend as much time using interviews to discover the implicit processes previously hidden in automated solutions.
- Automated Storage Tiering
Automated storage tiering (AST) is a storage software management feature that dynamically moves information between different disk types and RAID levels to meet space, performance and cost requirements.
Automated storage tiering features use policies that are set up by storage administrators. For example, a data storage administrator can assign infrequently used data to slower, less-expensive SATA storage but allow it to be automatically moved to higher-performing SAS or solid-state drives (SSDs) as it becomes more active (and vice versa).
- Automotive Hacking
Automotive hacking is the exploitation of vulnerabilities within the software, hardware, and communication systems of automobiles.
- Autonomous Driving
Autonomous driving normally refers to self-driving vehicles or transport systems that move without the intervention of a human driver.
- Autonomous Driving Level 4
Level 4 vehicles can operate in self-driving mode. But until legislation and infrastructure evolve, they can only do so within a limited area (usually an urban environment where top speeds reach an average of 30 mph). This is known as geofencing. As such, most Level 4 vehicles in existence are geared toward ridesharing. Level 4 vehicles can intervene if things go wrong or there is a system failure. In this sense, these cars do not require human interaction in most circumstances. However, a human still has the option to manually override.
- Autonomous Driving Level 5
This level of autonomous driving requires absolutely no human attention. There is no need for pedals, brakes, or a steering wheel, as the autonomous vehicle system controls all critical tasks, monitoring of the environment and identification of unique driving conditions like traffic jams.
- Autonomous Robot
An autonomous robot is a robot that performs behaviors or tasks with a high degree of autonomy (without external influence). Autonomous robotics is usually considered to be a subfield of artificial intelligence, robotics, and information engineering.
- Autonomous Vehicles
An autonomous vehicle is one that can drive itself from a starting point to a predetermined destination in “autopilot” mode using various in-vehicle technologies and sensors, including adaptive cruise control, active steering (steer by wire), anti-lock braking systems (brake by wire), GPS navigation technology, lasers and radar.
- Autonomous technology
Autonomous technology is a technology that can operate or perform tasks with little or no human intervention, using predefined rules, machine learning algorithms, or artificial intelligence.
- Backbone.js
Backbone.js is a lightweight JavaScript library, or framework, that allows developers to build and structure client-side applications that run in a web browser. It is based on the MVC pattern, which abstracts data into models and the DOM into views, and binds the two using events.
- Battery recycling
Battery recycling is a recycling activity that aims to reduce the number of batteries being disposed of as municipal solid waste. Batteries contain a number of heavy metals and toxic chemicals, and disposing of them by the same process as regular trash has raised concerns over soil contamination and water pollution.
- Benchmarking
Benchmarking is the process of measuring products, services, and processes against those of organizations known to be leaders in one or more aspects of their operations.
- Bifacial Solar Panel
A bifacial solar panel is a double-sided energy factory that transforms sunlight into electrical energy on both its top and bottom sides. They are different from monofacial solar panels which only use one side for solar energy production.
- Big Data
Big Data refers to massive complex structured and unstructured data sets that are rapidly growing, generated, and transmitted from a wide variety of sources. These data sets are so voluminous that traditional data processing software just can’t manage them, but these data can be used to address business problems you wouldn’t have been able to tackle before.
- Biochips
Biochips relate to a number of technologies that involve the merging of semiconductor and biological sciences. The most common form is based on an array of molecular sensors arranged on a small surface, typically called “lab-on-chip.” The underlying mechanism utilizes microfluidic micro-electromechanical systems (MEMS) technology. These devices are used to analyze biological elements such as DNA, ribonucleic acid, and proteins, in addition to certain chemicals.
- Biometric Authentication Methods
Biometric authentication methods use biometric characteristics or traits to verify users’ claimed identities when users access endpoint devices, networks, networked applications or Web applications. Across a wide range of use cases, any biometric authentication method may be used in one-to-one comparison mode (when the user enters a user ID), or one-to-many search mode (when the user simply presents his or her biometric characteristic, with no explicit claim of identity, and the system determines his or her user ID from a range of candidates).
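The toy sketch below illustrates the two matching modes, treating biometric templates as tiny numeric feature vectors; real systems use far more sophisticated feature extraction, scoring and liveness checks:

```python
# Toy sketch of one-to-one vs. one-to-many matching. Templates are
# simplistic feature vectors; real biometric systems extract far richer features.
import math

def similarity(a, b):
    """Cosine similarity between an enrolled and a presented template."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

enrolled = {  # hypothetical user ID -> enrolled template
    "alice": [0.9, 0.1, 0.4],
    "bob":   [0.2, 0.8, 0.5],
}
THRESHOLD = 0.95
presented = [0.88, 0.12, 0.41]

# One-to-one: the user claims an identity, compare against that single template.
verified = similarity(presented, enrolled["alice"]) >= THRESHOLD

# One-to-many: no claimed identity, search all candidates for the best match.
best_id = max(enrolled, key=lambda uid: similarity(presented, enrolled[uid]))

print(verified, best_id)
```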
- Bioprinting
A technology that uses 3D printing techniques to create functional living tissues and organs from cells. Bioprinting has potential applications in areas like regenerative medicine and drug development.
- Blockchain
Blockchain is a type of distributed ledger for maintaining a permanent and tamper-proof record of transactional data. A blockchain functions as a decentralized database that is managed by computers belonging to a peer-to-peer (P2P) network. Each of the computers in the distributed network maintains a copy of the ledger to prevent a single point of failure (SPOF) and all copies are updated and validated simultaneously.
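The minimal sketch below shows only the chaining idea: each block stores the hash of the previous block, so tampering with an earlier record invalidates everything after it (real blockchains add consensus, signatures and networking on top):

```python
# Minimal sketch of the "chain" idea: each block stores the hash of the
# previous block, so tampering with any earlier record breaks the chain.
import hashlib
import json

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

chain = [{"index": 0, "data": "genesis", "prev_hash": "0" * 64}]

def add_block(data):
    prev = chain[-1]
    chain.append({"index": prev["index"] + 1,
                  "data": data,
                  "prev_hash": block_hash(prev)})

def is_valid():
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

add_block("Alice pays Bob 5")
add_block("Bob pays Carol 2")
print(is_valid())                          # True
chain[1]["data"] = "Alice pays Bob 500"    # tamper with an old record
print(is_valid())                          # False: stored hashes no longer match
```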
- Blockchain Business Models
A blockchain business model is decentralized, allows peer-to-peer transactions, and helps create a trusted network. Blockchain-based business models use tokens to make profits: anyone who helps the blockchain reach consensus receives a token that carries some value, and this value depends on market conditions.
- Blockchain Platforms
A blockchain platform allows users and developers to create novel uses of existing blockchain infrastructure.
- Blockchain Society
A Blockchain Society is a network that aims to develop the blockchain ecosystem in a given country.
- Blockchain for Lead Generation
Blockchain technology offers sales organizations the ability to provide users with the most current and highest-quality leads via the exchange of personal and/or business information, based on crowdsourced data collection methods.
- Blue Collar Software
Blue collar software is software targeted at workers, often blue-collar, who do their jobs outside corporate offices, for example, software for billing, scheduling, navigation, and many other business workflows.
- Blue Hydrogen
Blue hydrogen is hydrogen that meets a low-carbon threshold but is produced from non-renewable sources, most commonly natural gas combined with carbon capture and storage.
- Brain Augmentation
Brain augmentation uses brain implants not only to treat seizures but also for extraordinary purposes such as memory storage, decoding neural patterns, and synthesizing speech. The ability of anything to alter the human brain in such a way was once unheard of, so this is a major turning point in the technology world.
- Brain-Computer Interface (BCI)
A technology that allows direct communication between the brain and an external device, such as a computer or prosthetic limb. BCI has potential applications in areas like medicine and gaming.
- Bring Your Own Everything (BYOE)
Bring Your Own Everything (BYOE) is a trend among businesses to allow employees to use their own personal mobile phones, tablets, and laptops for work.
- Brochureware
Brochureware is a site that contains general information but offers no user interaction. The website contains some content from print or other offline promotional material such as a brochure and offers links to other sites.
- Built-in Navigation System
A built-in navigation system is a technology that allows the tracking and plotting of directions based on Global Positioning System (GPS) technology. The system maps the current location against a destination to determine the fastest route.
- Business Intelligence (BI)
Business intelligence (BI) comprises the strategies and technologies used by enterprises for the data analysis of business information. BI technologies provide historical, current, and predictive views of business operations. Common functions of business intelligence technologies include reporting, online analytical processing, analytics, data mining, process mining, complex event processing, business performance management, benchmarking, text mining, predictive analytics, and prescriptive analytics. BI technologies can handle large amounts of structured and sometimes unstructured data to help identify, develop, and otherwise create new strategic business opportunities.
- Business Intelligence (BI) Platforms
Business intelligence (BI) platforms enable enterprises to build BI applications by providing capabilities in three categories: analysis, such as online analytical processing (OLAP); information delivery, such as reports and dashboards; and platform integration, such as BI metadata management and a development environment.
- Business Intelligence (BI) Services
Business intelligence (BI) services are offerings to design, develop, and deploy enterprise processes and to integrate, support, and manage the related technology applications and platforms. These include business and infrastructure applications for BI platforms, analytics needs, and data warehousing infrastructure. Solutions include areas such as corporate performance management (CPM) and analytics, in addition to the traditional BI platform, data warehouse/data infrastructure, and data quality areas.
- Carrier Network Infrastructure (CNI)
CNI is the physical infrastructure used to pass data between locations. It includes phone cable and fiber lines, satellite, radio, and microwave transmissions.
- Cell ID (CID)
A GSM Cell ID (CID) is a generally unique number used to identify each base transceiver station (BTS) or sector of a BTS within a location area code (LAC) if not within a GSM network.
- Chaos Engineering
Chaos engineering is a method of testing distributed software that deliberately introduces failure and faulty scenarios to verify its resilience in the face of random disruptions and to confirm that the application remains intact in an emergency.
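A minimal sketch of the idea, with an entirely hypothetical “dependency”: random failures are injected into a call, and the caller’s retry-and-fallback logic is checked to confirm the service degrades gracefully rather than crashing:

```python
# Minimal sketch: deliberately inject random failures into a dependency and
# verify that the caller's retry logic keeps the service up.
import random

def flaky_dependency():
    """Hypothetical downstream call with injected failures."""
    if random.random() < 0.3:                  # chaos: fail 30% of the time
        raise ConnectionError("injected failure")
    return "ok"

def resilient_call(retries=3):
    for _ in range(retries):
        try:
            return flaky_dependency()
        except ConnectionError:
            continue                           # simple retry as the resilience mechanism
    return "fallback"                          # degrade gracefully instead of crashing

results = [resilient_call() for _ in range(1000)]
print(results.count("fallback"))               # should stay small if retries absorb most chaos
```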
- Chatbots
A chatbot is a domain-specific conversational interface that uses an app, messaging platform, social network or chat solution for its conversations. Chatbots vary in sophistication, from simple, decision-tree-based marketing stunts, to implementations built on feature-rich platforms. They are always narrow in scope. A chatbot can be text- or voice-based, or a combination of both.
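A decision-tree chatbot of the simplest kind can be sketched as a dictionary of prompts and transitions; the dialogue below is purely illustrative:

```python
# Minimal sketch of a decision-tree chatbot: each node is a prompt plus a
# mapping from the user's (simplified) reply to the next node.
tree = {
    "start":    {"prompt": "Do you need help with billing or delivery?",
                 "billing": "billing", "delivery": "delivery"},
    "billing":  {"prompt": "Would you like to see your latest invoice? (yes/no)",
                 "yes": "invoice", "no": "goodbye"},
    "delivery": {"prompt": "Is your order late or damaged?",
                 "late": "goodbye", "damaged": "goodbye"},
    "invoice":  {"prompt": "Here is a link to your invoice. Anything else? (no)",
                 "no": "goodbye"},
    "goodbye":  {"prompt": "Thanks for chatting!"},
}

def chat():
    node = "start"
    while node != "goodbye":
        reply = input(tree[node]["prompt"] + " ").strip().lower()
        node = tree[node].get(reply, node)     # unknown replies repeat the prompt
    print(tree["goodbye"]["prompt"])

if __name__ == "__main__":
    chat()
```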
- Circular Economy
The Circular Economy is an evolution of the way the world produces and consumes both goods and services. The circular model redefines the economy around principles of designing out waste, pollution, and biodiversity loss, and keeping products and materials in use for as long as possible.
- Cloud Access Security Brokers (CASBs)
A cloud access security broker (CASB) is a software tool or service that sits between an organization’s on-premises infrastructure and a cloud provider’s infrastructure. CASBs are available as on-premises or cloud-based software, as well as a service.
- Cloud Computing
Cloud computing is a general term for anything that involves delivering hosted services over the internet. These services are divided into three main categories: infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS).
- Cloud Data Warehouse
A cloud data warehouse is a database delivered in a public cloud as a managed service that is optimized for analytics, scale and ease of use.
- Cloud ERP
Cloud ERP is a type of enterprise resource planning software that runs on a provider’s cloud computing platform, rather than on premises in an enterprise’s own data center.
ERP is a modular software system designed to integrate an organization’s business processes, such as accounting, HR, inventory management and purchasing, into a unified system. Before the popularization of modern cloud computing in the late 1990s, ERP systems ran inside the owner’s facility, or “on premises.” The cloud ERP era began in 1998 with the debut of the first ERP system to be delivered over the internet, NetLedger, later renamed NetSuite.
- Cloud Federation
See Federated Cloud.
- Cloud Management Tools & Platforms
Cloud management tools and platforms enable organizations to manage multi-cloud (i.e., on-premises, public cloud, and edge) services and resources. This includes providing governance, life cycle management, brokering, and automation for managed cloud infrastructure resources across multiple functional areas. The tooling can be procured and operated by central organizations (for example, I&O, cloud center of excellence [COE], platform engineering/operations) or within specific lines of business, and can be deployed as an on-premises application or as a SaaS product.
- Cloud Object Storage
Cloud object storage is a format for storing unstructured data in the cloud. Object storage is considered a good fit for the cloud because it is elastic, flexible and it can more easily scale into multiple petabytes to support unlimited data growth. The architecture stores and manages data as objects, as opposed to block storage, which handles data as blocks and logical volumes, and file storage, which stores data in hierarchical files.
- Cloud Office Migration
Cloud office migration refers to the process of consolidating and transferring a collection of workloads. Workloads can include emails, files, calendars, document types, related metadata, instant messages, applications, user permissions, compound structure and linked components. Migration of such workloads from at least one permanent on-premises platform or cloud office to a new cloud office environment, across a variety of similar product classes, is typical. During the migration process, enterprises will choose to cleanse their existing platforms by archiving old and outdated data. Migration tools are able to conduct an analysis of the respective workloads, identify if the data is deemed suitable for migration while maintaining its integrity, migrate the data between the source and target, and ensure governance in the subsequent platform.
- Cloud Security Posture Management (CSPM)
Cloud Security Posture Management (CSPM) is a market segment for IT security tools that are designed to identify misconfiguration issues and compliance risks in the cloud. An important purpose of CSPM programming is to continuously monitor cloud infrastructure for gaps in security policy enforcement.
- Cloud Workload Protection (CWP)
Cloud workload protection is a process of safeguarding workloads spread out across multiple cloud environments. Businesses that use public and private clouds can use a cloud workload protection platform to help defend themselves against cyber attacks.
- Cloud hosting
Cloud hosting provides hosting on a virtual server which can be located across many physical hardware servers and even different locations. It can be useful for managing the need for extra capacity at times of peak website usage or for running different instances of a website for testing.
- Cloud operations (CloudOps)
Cloud operations (CloudOps) is the management, delivery and consumption of software in a computing environment where there is limited visibility into an app’s underlying infrastructure. In the enterprise, CloudOps uses the DevOps principles of continuous integration and continuous deployment (CI/CD) to create best practices for achieving high availability by refining and optimizing business processes that run in a public cloud.
- Cloud-Managed Networks (CMNs)
A cloud-managed network is a network that is managed from the cloud. It represents an innovative network management mode that utilizes the cloud management platform to manage and monitor network devices and service experiences based on emerging technologies such as big data, artificial intelligence, and cloud computing.
- Cloud-Native Platforms (CNPs)
Cloud-native platforms enable teams to develop, run and manage applications without the manual labor associated with building or maintaining the underlying infrastructure. It’s a self-service approach that reduces handovers and potential delays which can otherwise hinder progress. Cloud-native platforms combine systems or concepts such as microservices, cloud platforms, smart data analysis tools, declarative APIs, and continuous delivery into a single seamless ecosystem.
- Cloud-native application
Cloud-native application refers to building and running applications to take advantage of the distributed computing offered by the cloud delivery model. Cloud-native apps are designed and built to exploit the scale, elasticity, resiliency, and flexibility the cloud provides.
- Co-innovation ecosystems
Co-innovation ecosystems refer to collaborative networks of organizations and individuals from different sectors and disciplines that work together to develop new products, services, and technologies.
- Cobots
Cobots, or collaborative robots, are robots intended to interact with humans in a shared space or to work safely in close proximity.
- Cognitive Agents
Cognitive agents are agents that use a Belief-Desire-Intention (BDI) based architecture, which has recently become popular due to its human-like reasoning. Cognitive agents are normally static agents that require higher computation and larger databases. They support autonomic computing.
- Cognitive Augmentation
Cognitive augmentation can occur through accessing information and exploiting applications on traditional computer systems and the emerging multiexperience interface in smart spaces. This includes augmented intelligence scenarios, where humans and artificial intelligence work together to enhance cognitive performance, including decision-making and learning. Cognitive augmentation improves the human thought and decision-making process, and it shares technological elements with the brain augmentation category, where physical implants deal with cognitive reasoning.
- Cognitive Computing
Cognitive computing is the use of computerized models to simulate the human thought process in complex situations where the answers may be ambiguous and uncertain. The phrase is closely associated with IBM’s cognitive computer system, Watson. Cognitive computing overlaps with AI and involves many of the same underlying technologies to power cognitive applications, including expert systems, neural networks, robotics, and virtual reality (VR).
- Column-Oriented DBMS
A column-oriented DBMS is a database management system (DBMS) that stores data tables by column.
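The toy sketch below contrasts the two layouts for the same table; aggregating a single column only has to touch that column’s values in the columnar layout, which is why column stores suit analytics:

```python
# Toy sketch of the storage difference: the same table held row-by-row
# versus column-by-column.
rows = [                       # row-oriented: each record stored together
    {"id": 1, "city": "Berlin", "amount": 120},
    {"id": 2, "city": "Madrid", "amount": 80},
    {"id": 3, "city": "Berlin", "amount": 200},
]

columns = {                    # column-oriented: each column stored together
    "id":     [1, 2, 3],
    "city":   ["Berlin", "Madrid", "Berlin"],
    "amount": [120, 80, 200],
}

# Row store: scan whole records even though only "amount" is needed.
total_row_store = sum(r["amount"] for r in rows)

# Column store: read just the "amount" column (which also compresses well).
total_column_store = sum(columns["amount"])

print(total_row_store, total_column_store)   # 400 400
```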
- Computational Storage
Computational storage is an information technology (IT) architecture in which data is processed at the storage device level to reduce the amount of data that has to move between the storage plane and the compute plane. The lack of movement facilitates real-time data analysis and improves performance by reducing input/output bottlenecks.
- Computer Vision
Computer vision is a field of computer science that works on enabling computers to see, identify and process images in the same way that human vision does, and then provide appropriate output. It is like imparting human intelligence and instincts to a computer. In reality though, it is a difficult task to enable computers to recognize images of different objects. Computer vision is closely linked with artificial intelligence, as the computer must interpret what it sees, and then perform appropriate analysis or act accordingly.
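As a very small illustration of “processing images”, the sketch below detects a vertical edge in a tiny synthetic grayscale image by measuring horizontal intensity changes; production computer vision relies on convolutional filters and learned models rather than hand-rolled loops like this:

```python
# Toy sketch of a basic computer vision operation: detecting a vertical
# edge in a tiny grayscale "image" by looking at horizontal intensity changes.
image = [
    [0, 0, 0, 255, 255],
    [0, 0, 0, 255, 255],
    [0, 0, 0, 255, 255],
]

def horizontal_gradient(img):
    """Absolute intensity difference between neighboring pixels in each row."""
    return [[abs(row[x + 1] - row[x]) for x in range(len(row) - 1)] for row in img]

edges = horizontal_gradient(image)
print(edges[0])   # [0, 0, 255, 0] -> a strong response where the edge sits
```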
- Computer-Aided Design (CAD)
Computer-Aided Design (CAD) is the use of computer-based software to aid in design processes. CAD software is frequently used by different types of engineers and designers. CAD software can be used to create two-dimensional (2-D) drawings or three-dimensional (3-D) models.
- Computer-Brain Interface
Computer-brain interface is a type of user interface, whereby the user voluntarily generates distinct brain patterns that are interpreted by the computer as commands to control an application or device. The best results are achieved by implanting electrodes into the brain to pick up signals. Noninvasive techniques are available commercially that use a cap or helmet to detect the signals through external electrodes.
- Connected Home
A connected home is networked to enable the interconnection and interoperability of multiple devices, services and apps, ranging from communications and entertainment to healthcare, security and home automation. These services and apps are delivered over multiple interlinked and integrated devices, sensors, tools and platforms. Connected, real-time, smart and contextual experiences are provided for the household inhabitants, and individuals are enabled to control and monitor the home remotely as well as within it.
- Container Management
Container management refers to a set of practices that govern and maintain containerization software. Container management tools automate the creation, deployment, destruction and scaling of application or systems containers.
- Container Networking
Container networking is an emerging application sandboxing mechanism, similar in concept to a virtual machine, used in home desktops and web-scale enterprise networking solutions. Isolated from the host and all other containers, each container provides a full-featured Linux environment with its own users, file system, processes, and network stack. Applications inside a container are permitted to access or modify only the files and resources available inside that container.
- Container-Native Storage (CNS)
Container-native storage (CNS) is different from traditional storage because it needs to be application-centric, with self-service features for developers or application owners. CNS also needs to run inside containers and work with the Kubernetes container orchestration platform.
- Content Migration
Content Migration is the process of moving information stored on a Web content management system (CMS), Digital asset management (DAM), Document management system (DMS), or flat HTML based system to a new system. Flat HTML content can entail HTML files, Active Server Pages (ASP), JavaServer Pages (JSP), PHP, or content stored in some type of HTML/JavaScript based system and can be either static or dynamic content.
- Continuous Backup
See Continuous Data Protection (CDP).
- Continuous Data Protection (CDP)
Continuous Data Protection (CDP) or Continuous Backup is a system that backs up data on a computer system every time a change is made. CDP maintains a continuous journal of data changes and makes it possible to restore a system to any previous point in time.
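A minimal sketch of the journaling idea (the keys and timestamps are illustrative): every change is appended to a journal, so the data can be rebuilt as of any earlier moment:

```python
# Minimal sketch of the CDP idea: every change is appended to a journal
# with a timestamp, so the data can be rebuilt as of any point in time.
import time

journal = []   # continuous journal of (timestamp, key, value) changes

def write(key, value):
    journal.append((time.time(), key, value))

def restore(as_of):
    """Replay all changes up to `as_of` to reconstruct that point in time."""
    state = {}
    for ts, key, value in journal:
        if ts <= as_of:
            state[key] = value
    return state

write("report.txt", "draft v1")
checkpoint = time.time()
write("report.txt", "draft v2 (accidentally corrupted)")

print(restore(time.time()))   # current state: draft v2
print(restore(checkpoint))    # rolled back to the moment before the bad change
```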
- Continuous Integration Ops (CIOps)
Continuous Integration Ops (CIOps) requires CI operators or administrators to configure the IT infrastructure needed to support new code before deployment continues. The CI system is designed to run builds and tests and then deploy, at varying levels of sophistication according to the complexity of the pipeline.
- Continuous Intelligence
Continuous intelligence is a design pattern in which real-time analytics are integrated into business operations, processing current and historical data to prescribe actions in response to business moments and other events.
- Continuous innovation
Continuous innovation is a repetitive process of incorporating both modest, incremental improvements and total, revolutionary changes in existing products or services, product designs, and technologies.
- Controlled-environment agriculture (CEA)
Controlled-environment agriculture (CEA) is a technology-based approach to food production. The aim of CEA is to provide protection and maintain optimal growing conditions throughout the development of the crop. Production takes place within an enclosed growing structure such as a greenhouse or building. Plants are often grown using hydroponic methods in order to supply the proper amounts of water and nutrients to the root zone. CEA optimizes the use of resources such as water, energy, space, capital, and labor. CEA technologies include hydroponics, aeroponics, aquaculture, and aquaponics. Different techniques are available for growing food in controlled environment agriculture.
- Conversational Marketing
Conversational marketing technologies enable interactions between companies and customers that mimic human dialogue and do so at scale.
- Conversational User Interfaces
A conversational user interface enables artificial intelligence-supported chatbots to have verbal and written interactions with human users.
- Corporate Cloud
See Private Cloud.
- Cryptocurrencies
A cryptocurrency is a digital medium of exchange that uses encryption to secure the processes involved in generating units and conducting transactions.
- Cryptocurrency Custody Services (Solutions)
Cryptocurrency custody solutions are independent storage and security systems used to hold large quantities of tokens. Custody solutions are one of the latest innovations to come out of the cryptocurrency ecosystem and have been expected to herald the entry of institutional capital into the industry.
- Cryptography
Cryptography is the study of secure communications techniques that allow only the sender and intended recipient of a message to view its contents. The term is derived from the Greek word “kryptos”, which means hidden.
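A minimal symmetric encryption/decryption sketch, assuming the third-party cryptography package is installed, shows the sender-and-recipient idea: only holders of the secret key can read the message:

```python
# Minimal sketch of symmetric encryption/decryption, assuming the
# third-party `cryptography` package is installed (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()               # the shared secret key
cipher = Fernet(key)

token = cipher.encrypt(b"meet at noon")   # only key holders can read this
print(token)                              # ciphertext, safe to transmit or store

plaintext = cipher.decrypt(token)         # requires the same secret key
print(plaintext)                          # b'meet at noon'
```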
- Customer experience (CX)
The customer experience (CX) spans the lifetime of customers’ relationships with a brand, starting before a purchase is made, continuing to active use, and advancing to renewal or repeat purchases. Examples of touchpoints can include opening a product’s packaging, reading the instruction manual, speaking to customer service, having the product repaired, and exchanging the product for a different model.
- Cyber Kill Chain
The Cyber Kill Chain is a traditional security model that describes an old-school scenario, an external attacker taking steps to penetrate a network and steal its data, and breaks the attack down into steps to help organizations prepare. For example, an attacker may create an infected Microsoft Office document that is intended to be delivered via phishing emails.
- Cyber Liability Insurance
Cyber liability insurance is a policy that helps a business financially recover from a cyber attack. Depending on your policy, it helps cover some or all of the costs to help pay ransomware demands, legal fees, regulatory fines, and the fallout from reputation issues. Cyber liability insurance is not a replacement for cyber security.
- Cyber-Physical Systems
Systems that combine physical and cyber components to interact with the physical world in real-time, allowing for autonomous decision-making and control. They have potential applications in areas like robotics, smart cities, and manufacturing.
- Cyberpunk
Cyberpunk is a social-political movement (or philosophy) grounded in dystopic science fiction and defined by the following features: scarce natural resources; heavy use of cybernetic augmentation like implants; cyberspace and the metaverse as the primary spaces where people interact; a world governed mostly by corporations instead of national sovereignties; and large inequalities between the poor and wealthy corporate executives. Cyberpunk has become a rather popular sub-genre in creative fields like literature and film, being the premise of works such as Neuromancer and Blade Runner 2049.
- Cybersecurity mesh
Cybersecurity mesh is a way for advanced, independent cybersecurity systems to work together to prevent cybercrime, such as data hacking. The goal is for the systems to move closer to the assets they protect while being able to assist each other when threats arise.
- Dark Data
Dark Data is the information assets organizations collect, process, and store during regular business activities, but generally fail to use for other purposes (for example, analytics, business relationships, and direct monetizing). Names, addresses, contact numbers, and identification numbers are some examples of frequently used business data. Email correspondences, log files, and other machine data are usually left unused.
- Data Catalog
A data catalog is an organized inventory of an organization’s data assets: a collection of metadata, combined with data management and search tools, that helps analysts and other data users find the data they need, serves as an inventory of available data, and provides information to evaluate the fitness of data for intended uses. A data catalog tool is designed to help organizations find and manage large amounts of data, including tables, files, and databases, stored in their ERP, human resources, finance, and e-commerce systems, as well as other sources like social media feeds.
- Data Classification
Data classification is the process of organizing data into categories that make it easy to retrieve, sort and store for future use.
- Data Fabric
Data Fabric is an architecture and set of data services that provide consistent capabilities across a choice of endpoints spanning on-premises and multiple cloud environments.
- Data Hub
A data hub is a collection of data from multiple sources organized for distribution, sharing, and often subsetting. Generally, this data distribution takes the form of a hub-and-spoke architecture.
- Data Integration
The discipline of data integration comprises the practices, architectural techniques and tools for achieving the consistent access and delivery of data across the spectrum of data subject areas and data structure types in the enterprise to meet the data consumption requirements of all applications and business processes.
- Data Integration Tools
The market for data integration tools includes vendors that offer software products to enable the construction and implementation of data access and data delivery infrastructure for a variety of data integration scenarios. For vendors, the demand for traditional data integration capabilities alongside the demand for innovative solutions requires robust, consistent delivery of highly developed solutions. Similarly, data integration tools interoperate and integrate with master data tools, data governance tools and data quality tools.
- Data Labeling
Data labeling, in the context of machine learning, is the process of detecting and tagging data samples. The process can be manual but is usually performed or assisted by software.
- Data Lakes
A data lake is a concept consisting of a collection of storage instances of various data assets. These assets are stored in a near-exact, or even exact, copy of the source format and are in addition to the originating data stores.
- Data Literacy
Data literacy is the ability to read, write and communicate data in context, including an understanding of data sources and constructs, analytical methods and techniques applied, and the ability to describe the use-case application and resulting value.
- Data Obfuscation
The act of deliberately making data difficult to understand or read. This can be done for various reasons, such as to protect sensitive data from unauthorized access or to make data unreadable by automated systems that might be used for data mining.
- Data Preparation
Data preparation is an iterative-agile process for exploring, combining, cleaning, and transforming raw data into curated datasets for self-service data integration, data science, data discovery, and BI/analytics.
- Data Quality
Data quality is a measure of the condition of data based on factors such as accuracy, completeness, consistency, reliability and whether it’s up to date. Measuring data quality levels can help organizations identify data errors that need to be resolved and assess whether the data in their IT systems is fit to serve its intended purpose.
- Data Quality Tools
Data quality tools are the processes and technologies for identifying, understanding and correcting flaws in data that support effective information governance across operational business processes and decision making. The packaged tools available include a range of critical functions, such as profiling, parsing, standardization, cleansing, matching, enrichment and monitoring.
- Data Storytelling
The ability to combine interactive data visualization with narrative techniques in order to package and deliver insights in a compelling, easily understood form for presentation to decision makers.
- Data Virtualization
Data virtualization is an umbrella term used to describe an approach to data management that allows an application to retrieve and manipulate data without requiring technical details about the data, such as how the data is formatted or where it is physically located. The goal of data virtualization is to create a single representation of data from multiple, disparate sources without having to copy or move the data.
- Data and Analytics Services
Data and analytics services are consulting, implementation and managed services for decision, analytics and information capabilities, executed on a technology platform that supports an organization’s fact-based decision making. Solutions contain information analysis and information management (IM) competencies, with commercial off-the-shelf and proprietary assets. Information analysis solutions focus on use cases and outcomes; IM solutions focus on information infrastructure and governance.
- Data as a service (DaaS)
Data as a service (DaaS) is a data management strategy that uses the cloud to deliver data storage, integration, processing, and/or analytics services via a network connection.
- Data for Good
Data for good is a movement in which people and organizations transcend organizational boundaries to use data to improve society.
- Data-driven approach
The data-driven approach means making strategic decisions based on data analysis and interpretation. A data-driven approach enables companies to examine and organize their data with the goal of better serving their customers and consumers.
- DataOps
DataOps is a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and data consumers across an organization. The goal of DataOps is to deliver value faster by creating predictable delivery and change management of data, data models and related artifacts. DataOps uses technology to automate the design, deployment and management of data delivery with appropriate levels of governance, and it uses metadata to improve the usability and value of data in a dynamic environment.
- Database Audit and Protection
Database audit and protection (DAP) tools provide comprehensive security for relational database management systems (RDBMSs). DAP tools have their roots in the basic monitoring capabilities of database activity monitoring (DAM) tools. In response to expanded client requirements, vendors have added additional capabilities, such as data discovery and classification, threat and vulnerability management, application-level analysis, intrusion prevention and activity blocking, and identity and access management analysis.
- Database Encryption
Database encryption tools are used to protect data within relational database management systems (RDBMSs). Encryption can be implemented using native DBMS tools, third-party software tools and network-based appliances, or implemented within storage networks via fabric-based encryption. Database encryption is increasingly being implemented as a regulatory check box by clients that enable built-in encryption. There are two operational scenarios: encrypting the entire database or encrypting specific columns or fields within the RDBMS.
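Native DBMS and network-based encryption work inside the engine or the storage fabric, but the column-level scenario can also be illustrated at the application layer: the sketch below (assuming the third-party cryptography package is installed) encrypts just one sensitive field before writing it to an in-memory SQLite table:

```python
# Sketch of application-side column encryption: only the sensitive column
# is encrypted before it is written to the database.
import sqlite3
from cryptography.fernet import Fernet

key = Fernet.generate_key()
cipher = Fernet(key)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (name TEXT, card_number BLOB)")

# Encrypt the sensitive column, leave the rest in plaintext.
db.execute("INSERT INTO customers VALUES (?, ?)",
           ("Alice", cipher.encrypt(b"4111 1111 1111 1111")))

name, encrypted_card = db.execute(
    "SELECT name, card_number FROM customers").fetchone()
print(name, cipher.decrypt(encrypted_card))   # decrypt only when authorized
```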
- Database Hosting
Database hosting provides a managed server that is optimized for running a relational database. Almost any host can run a web server, mail server, file share, or other file-based application. Databases are different in that they demand specific hardware and software configurations to correctly support the database.
- Database-as-a-service (DBaaS)
Database-as-a-service (DBaaS) is a cloud computing service. As a hosted, managed service, users don’t have to worry about setting up hardware or installing software; the service provider handles everything related to managing the database.
- Datafication
Datafication is a technological trend used actively in the Big Data industry. Datafication is the transformation of social action into online quantified data; it aims to turn most aspects of a business into quantifiable data that can be tracked, monitored, analyzed, and optimized. It refers to the use of tools and processes to turn an organization into a data-driven enterprise.
- DBMS (Database Management System)
A DBMS is a product used for the storage and organization of data that typically has defined formats and structures. DBMSs are categorized by their basic structures and, to some extent, by their use or deployment.
- Debugging
Debugging is the process by which we correct the bugs found during testing.
- Decarbonization
Decarbonization refers to the process of reducing carbon dioxide (CO2) intensity in the atmosphere by lowering the amount of greenhouse gas emissions produced by the burning of fossil fuels and shifting to low-carbon energy sources.
- Decentralization
Decentralization is the process of redistributing or dispersing functions. For example, a database that allows information to be securely recorded on a network of computers rather than verified and controlled by centralized entities. Central to this evolution of the Internet are four distinct blockchain-enabled applications: cryptocurrencies (fungible tokens), non-fungible tokens (NFTs), decentralized autonomous organizations (DAOs), and metaverses. In other words, it’s a P2P network, not a hierarchy of IP clients/domains/root Internet servers.
- Decentralized Applications (dApps)
A decentralized application is a computer application that runs on a distributed computing system. DApps have been popularized by distributed ledger technologies (DLT) such as the Ethereum Blockchain, where DApps are often referred to as smart contracts.
- Decentralized Autonomous Corporation (DAC)
See Decentralized Autonomous Organization (DAO).
- Decentralized Autonomous Organization (DAO)
A decentralized autonomous organization (DAO) is an organization represented by rules encoded as a computer program that is transparent, controlled by shareholders and not influenced by a central government. A DAO’s financial transaction record and program rules are maintained on a blockchain. The precise legal status of this type of business organization is unclear.
- Decentralized finance (commonly referred to as DeFi)
Decentralized finance (commonly referred to as DeFi) is a blockchain-based form of finance that does not rely on central financial intermediaries such as brokerages, exchanges, or banks to offer traditional financial instruments, and instead utilizes smart contracts on blockchains, the most common being Ethereum.
- Decision Intelligence
Decision intelligence is a practical domain framing a wide range of decision-making techniques bringing multiple traditional and advanced disciplines together to design, model, align, execute, monitor and tune decision models and processes. Those disciplines include decision management (including advanced nondeterministic techniques such as agent-based systems) and decision support as well as techniques such as descriptive, diagnostics and predictive analytics.
- Decision Management
Decision management is the discipline of designing, building, and managing automated decision-making systems, including advanced nondeterministic techniques such as agent-based systems.
- Decryption
Decryption is the reverse process of encryption. It decodes encrypted information so that only an authorized user can decrypt the data, because decryption requires a secret key or password.
- Deep Neural Network ASIC
Deep Neural Network (DNN) application-specific integrated circuit (ASIC) is a purpose-specific processor that accelerates DNN computations.
- Deep Neural Networks
A deep neural network is a neural network with a certain level of complexity, generally one with more than two layers. Deep neural networks use sophisticated mathematical modeling to process data in complex ways.
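A minimal forward-pass sketch, assuming NumPy is installed, shows what “more than two layers” looks like; the weights are random placeholders that training would adjust:

```python
# Minimal sketch of a deep (multi-layer) forward pass using NumPy.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0, x)

x = rng.normal(size=(1, 4))                       # one input sample with 4 features
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)     # hidden layer 1
W2, b2 = rng.normal(size=(8, 8)), np.zeros(8)     # hidden layer 2
W3, b3 = rng.normal(size=(8, 1)), np.zeros(1)     # output layer

h1 = relu(x @ W1 + b1)            # each layer: linear step plus nonlinearity
h2 = relu(h1 @ W2 + b2)
output = h2 @ W3 + b3             # "deep" = several stacked layers

print(output.shape)               # (1, 1)
```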
- Deepfake
Deepfake (also spelled deep fake) is a type of artificial intelligence (AI) used to create convincing image, audio, and video hoaxes.
- Democratization
Democratization of technology means providing people with easy access to technical or business expertise without extensive (and costly) training. It focuses on four key areas — application development, data and analytics, design and knowledge — and is often referred to as “citizen access,” which has led to the rise of citizen data scientists, citizen programmers and more.
- Democratization of tech
Democratization of tech refers to the process of making technology more accessible, affordable, and user-friendly to a wider range of people, including those who have traditionally been excluded or marginalized from the benefits of technology.
- Desktop as a Service (DaaS)
Desktop as a service (DaaS) is an offering that provides users with an on-demand, virtualized desktop experience delivered from a remotely hosted location. It includes provisioning, patching and maintenance of the management plane and resources to host workloads.
- DevOps
DevOps is a set of practices that combine software development (Dev) and IT operations (Ops) to improve the speed and reliability of software delivery. The word DevOps is a combination of the terms development and operations, meant to represent a collaborative or shared approach to the tasks performed by a company’s application development and IT operations teams. In its broadest meaning, DevOps is a philosophy that promotes better communication and collaboration between these teams — and others — in an organization. In its most narrow interpretation, DevOps describes the adoption of iterative software development, automation and programmable infrastructure deployment and maintenance. The term also covers culture changes, such as building trust and cohesion between developers and systems administrators and aligning technological projects to business requirements. DevOps can change the software delivery chain, services, job roles, IT tools and best practices.
- DevSecOps
DevSecOps (development plus security plus operations) is a management approach that combines application development, security, operations, and infrastructure as code (IaC) in an automated, continuous delivery cycle. The main objective of DevSecOps is to automate, monitor and apply security at all phases of the software lifecycle, i.e., plan, develop, build, test, release, deliver, deploy, operate and monitor. Applying security at every stage of the software development process enables continuous integration, reducing the cost of compliance and delivering software more rapidly. DevSecOps means that every employee and team is responsible for security from the outset, and they must make decisions efficiently and put them into action without forfeiting security.
- Dynamic Host Configuration Protocol (DHCP)
Dynamic Host Configuration Protocol (DHCP) is a network management protocol used to automate the process of configuring devices on IP networks, thus allowing them to use network services such as DNS, NTP, and any communication protocol based on UDP or TCP.
- Differential Privacy
Differential Privacy – a privacy-preserving technique that adds random noise to datasets, making it difficult to identify specific individuals within the data.
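A minimal sketch of one common mechanism (the Laplace mechanism); the epsilon, sensitivity, and data values below are illustrative choices for this example, not prescriptions:

```python
# Sketch of the Laplace mechanism: random noise scaled to sensitivity/epsilon is
# added to a query result before release, masking any single individual's record.
import numpy as np

rng = np.random.default_rng()

def private_count(values, predicate, epsilon=0.5, sensitivity=1.0):
    true_count = sum(1 for v in values if predicate(v))
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 52, 47, 31]
print(private_count(ages, lambda a: a > 40))   # noisy answer; the exact count stays hidden
```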
- Digital Assistant
See Virtual Assistant.
- Digital Biomarkers
Digital biomarkers are defined as objective, quantifiable physiological and behavioral data that are collected and measured by means of digital devices such as portables, wearables, implantables or digestibles. The data collected is typically used to explain, influence and/or predict health-related outcomes. Digital biomarkers also represent an opportunity to capture clinically meaningful, objective data.
- Digital Divide
The Digital Divide is a growing chasm between people who have access to the latest information and communication technologies and those who don’t. This term describes the unequal spread of technology and the opportunities it affords between different socioeconomic groups in a society. These technologies include, but are not limited to, smartphones, computers, and the internet.
- Digital Ethics
Digital ethics are the norms dedicated to ensuring that the autonomy and dignity of users are respected on the internet. Digital ethics prescribe how two individuals communicating online should behave, how two corporations should responsibly conduct internet commerce, and how companies should treat their users.
- Digital Human
Digital humans are AI-powered lifelike beings that can recreate the best parts of human interaction—conversation, communication and emotional connection. They can work in virtually any industry and across a huge number of use cases.
- Digital Ops
DigitalOps is a business process framework designed to simplify, measure, and manage processes across the enterprise. DigitalOps includes RPA, business process management (BPM), workflow engines, decision management suites, process mining, low-code application platforms (LCAPs), and others.
- Digital Therapeutics (DTx)
Digital therapeutics (DTx) deliver evidence-based therapeutic interventions to patients that are driven by high quality software programs to prevent, manage, or treat a broad spectrum of physical, mental, and behavioral conditions. Digital therapeutics form an independent category of evidence-based products within the broader digital health landscape, and are distinct from pure-play adherence, diagnostic, and telehealth products.
- Digital Transformation
Digital transformation can refer to anything from IT modernization (for example, cloud computing), to digital optimization, to the invention of new digital business models. The term is widely used in public-sector organizations to refer to modest initiatives such as putting services online or legacy modernization.
- Digital Twin
A digital twin is a digital representation of a real-world entity or system. The implementation of a digital twin is an encapsulated software object or model that mirrors a unique physical object, process, organization, person or other abstraction. Data from multiple digital twins can be aggregated for a composite view across a number of real-world entities, such as a power plant or a city, and their related processes.
- Digital Water / Smart Water / Internet of Water / Water 4.0
Digital water may be defined as a group of emerging technology solutions that help water managers operate more effectively. These solutions harness state-of-the-art hardware and software to provide increasing levels of system intelligence, visibility, automation and control, and customer service.
- Digital Workplace
The digital workplace is the concept that businesses should use digital transformation (DX) to align technology, employees and business processes to improve operational efficiency and meet organizational goals. Meant to be the virtual complement to the physical office space, the digital workplace is a broad term that includes any device, software or platform that staff members use to execute their jobs.
- Digital cloning
Digital cloning is an emerging technology, involving deep-learning algorithms, that allows one to create hyper-realistic manipulations of existing audio, photos, and videos.
- Digital learning
Digital learning is any type of learning that is accompanied by technology or instructional practice that makes effective use of technology. It encompasses the application of a wide spectrum of practices, including blended and virtual learning. Digital learning is sometimes confused with online learning or E-learning.
- Disaster Recovery as a Service (DRaaS)
Disaster recovery as a service (DRaaS) is the replication and hosting of physical or virtual servers by a third party to provide failover in the event of a natural catastrophe, power outage or another type of business disruption.
- Distributed Control Systems
A distributed control system (DCS) is a digital automated industrial control system (ICS) that uses geographically distributed control loops throughout a factory, machine or control area. Unlike a centralized control system that operates all machines, a DCS allows each section of a machine to have its own dedicated controller that runs the operation. A DCS has several local controllers located throughout the area that are connected by a high-speed communication network. While each controller works autonomously, there is central supervisory control run by an operator.
- Distributed Denial of Service (DDoS) Attack
A type of cyberattack in which the perpetrator continuously overwhelms the target system with requests, typically from many distributed sources, in order to prevent the servicing of legitimate requests.
- Distributed Ledger
A type of database which spreads across multiple sites, countries, or institutions. Records are stored sequentially in a continuous ledger. Distributed ledger data can be either “permissioned” or “unpermissioned” to control who can view it.
- Distributed Ledger Technology (DLT)
Distributed ledger technology (DLT) is a digital system for recording the transaction of assets in which the transactions and their details are recorded in multiple places at the same time. Unlike traditional databases, distributed ledgers have no central data store or administration functionality.
- Distributed development
Distributed development is a software development model in which IT teams spread across geographical lines collaborate on applications or various software. The work is often split into mini-projects that are brought together for a final software buildout.
- Document Store DBMSs
A document store database (document store) is a database that uses a document-oriented model to store data. Document store databases store each record and its associated data within a single document. Each document contains semi-structured data that can be queried against using various query and analytics tools of the DBMS.
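As a hedged illustration of the document model, the sketch below assumes a local MongoDB instance and the pymongo driver are available; the database, collection, and field names are invented for the example.

```python
# A sketch of the document model: each record and its associated data live in
# one document, and documents in the same collection may carry different fields.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["shop"]["orders"]

orders.insert_one({"order_id": 1, "customer": "Ann", "items": ["laptop"], "express": True})
orders.insert_one({"order_id": 2, "customer": "Bo", "items": ["mouse", "hub"]})

print(orders.find_one({"customer": "Ann"}))        # query against document content
print(orders.count_documents({"express": True}))   # simple analytics over documents
```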
- Domotics
See Home Automation.
- Drone
A drone is a flying robot that can be remotely controlled or fly autonomously using software-controlled flight plans in its embedded systems, working in conjunction with onboard sensors and a global positioning system (GPS). While using the words drone and UAV synonymously is common, there is a distinct difference: drones must have autonomous flight capabilities — meaning they can operate without any human intervention — while UAVs do not.
- Drone Swarms
Drone swarm technology is the ability of drones to autonomously make decisions based on shared information.
- Dual Persona
Dual Persona refers to mobile phones that enable people to maintain separate environments for personal and business use on the same device. Users can have both a work and home profile simultaneously, and by separating these two personas, they can segment and protect personal and corporate data.
- Dynamic Application Security Testing (DAST)
DAST is a process designed to detect security vulnerabilities in an application in its running state. Most DAST solutions limit testing to the exposed HTTP and HTML interfaces of Web-enabled applications; others are designed for non-Web protocol and data malformation (remote procedure calls, Session Initiation Protocol [SIP] vulnerabilities, etc.)
- E-learning
E-learning is a type of learning conducted digitally via electronic media on the internet. It can be accessed via most electronic devices including a computer, laptop, tablet, or smartphone. The teacher and students do not meet face-to-face. All coursework and communication are handled via email, forums, chat, or video conferencing.
- Edge AI
Edge AI refers to the use of AI techniques embedded in IoT endpoints, gateways, and edge devices, in applications ranging from autonomous vehicles to streaming analytics.
- Edge Analytics
Edge analytics is an approach to data collection and analysis in which an automated analytical computation is performed on data at a sensor, network switch or other device instead of waiting for the data to be sent back to a centralized data store.
- Edge Computing
Edge computing is an emerging computing paradigm that refers to a range of networks and devices at or near the user. Edge is about processing data closer to where it’s being generated, enabling processing at greater speeds and volumes, leading to greater action-led results in real-time.
- Edge Networking
Edge networking is a distributed computing paradigm that brings computation and data storage as close to the point of request as possible in order to deliver low latency and save bandwidth.
- Electronic Health Record (EHR)
An electronic health record (EHR) is an individual’s official health document that is shared among multiple facilities and agencies. The role of EHRs is becoming increasingly influential as more patient information becomes digital and larger numbers of consumers express a desire to have mobile access to their health records.
- Ember.js
Ember.js is an open-source, free JavaScript client-side framework used for developing web applications. It allows building client-side JavaScript applications by providing a complete solution that contains data management and an application flow.
- Emissions Upcycling
When we burn coal, oil, or natural gas, CO2 (carbon dioxide) is produced as a waste product. If the CO2 is disposed of in the atmosphere, it contributes to global warming, an extremely serious environmental problem. A better option is to capture the waste CO2, and then use it to make useful products, like concrete-replacement materials, for example. This is the meaning of ’emissions upcycling’ (note the similarity to the more familiar term ‘recycling’). Current carbon-capture technology is capable of capturing 85-95% of the CO2 produced by a coal-fired power plant, although it comes at the cost of diverting 10-40% of the energy generated to the CO2 capture and storage process itself.
- Emotion AI
See Affective Computing.
- Encryption
Encryption is the method by which information is converted into secret code that hides the information’s true meaning.
- Endpoint Protection Platform (EPP)
An endpoint protection platform (EPP) is a solution deployed on endpoint devices to prevent file-based malware attacks, detect malicious activity, and provide the investigation and remediation capabilities needed to respond to dynamic security incidents and alerts.
Detection capabilities will vary, but advanced solutions will use multiple detection techniques, ranging from static IOCs to behavioral analysis. Desirable EPP solutions are primarily cloud-managed, allowing the continuous monitoring and collection of activity data, along with the ability to take remote remediation actions, whether the endpoint is on the corporate network or outside of the office. In addition, these solutions are cloud-data-assisted, meaning the endpoint agent does not have to maintain a local database of all known IOCs, but can check a cloud resource to find the latest verdicts on objects that it is unable to classify.
- Endpoint detection and response (EDR)
Endpoint detection and response (EDR), also known as endpoint threat detection and response (ETDR) is a cyber security layer that combines real-time continuous monitoring and collection of endpoint data with rules-based automated response and analysis capabilities. EDR monitors and analyzes the activity on your network to establish a baseline for what’s normal and picks up where your firewall and antivirus leave off. Those tools (and many others) block known threats – EDR battles the unknown.
- Enterprise automation
Enterprise automation is the systematic review of an organization’s processes to ensure optimal performance while looking for manual processes that can be replaced with intelligent digital alternatives.
- Event Stream Processing
Event stream processing (ESP) is a software capability designed to support the implementation of event-driven architectures. In an environment where some software entities produce events, other entities manage events, and still others consume events, event stream processing may be properly ascribed to the event manager. This kind of software entity is constantly fielding events and deciding whether or not to forward them to event consumers for subsequent reactions, which can range from simple logging to triggering a complex collection of error handling, recovery, clean-up, and other related routines.
ESP technologies include event visualization, event databases, event-driven middleware, and event processing languages, or complex event processing (CEP).
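A toy sketch of the producer/manager/consumer flow described above; all function names, event fields, and thresholds here are hypothetical:

```python
# A toy event stream processor: events flow in from a producer, the manager
# decides which ones to forward, and a consumer reacts to the forwarded events.
from typing import Iterable

def sensor_events() -> Iterable[dict]:
    # Stand-in for an event producer (e.g., messages arriving on a queue).
    yield {"type": "temperature", "value": 21.5}
    yield {"type": "temperature", "value": 98.0}
    yield {"type": "heartbeat", "value": None}

def handle_overheat(event: dict) -> None:
    print("ALERT: overheating detected:", event)    # consumer reaction

def process(events: Iterable[dict]) -> None:
    for event in events:
        # Event manager logic: forward only the events consumers care about.
        if event["type"] == "temperature" and event["value"] > 90:
            handle_overheat(event)
        # other events might be logged or dropped

process(sensor_events())
```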
- Everything As A Service (Xaas)
XaaS refers to the variety of products and tools that can be purchased using an “as a service” consumption model. In the acronym XaaS, the ‘X’ stands for “anything.” It is an extremely wide-ranging term that refers to any tools, applications, services, games, etc., which are delivered to your laptop or other devices via the cloud rather than obtained on-premises or in a physical format. The only difference between XaaS and simply selling tools is the method of payment: instead of an upfront fee, you pay for the product as a service for long-term usage. XaaS can include a variety of technology service models, including Software as a Service (SaaS), Device as a Service (DaaS), Infrastructure as a Service (IaaS), and Data Center as a Service (DCaaS).
- Expert system
An expert system is a computer program that uses AI technologies to simulate the judgment and behavior of a human or an organization that has expertise and experience in a particular field. Expert systems are usually intended to complement, not replace, human experts.
- Explainable AI
Explainable AI (XAI) is artificial intelligence that is programmed to describe its purpose, rationale, and decision-making process in a way that can be understood by the average person. XAI is often discussed in relation to deep learning and plays an important role in the FAT ML model (fairness, accountability, and transparency in machine learning).
- Explainable Graph Neural Networks (xGNN)
xGNN is a type of neural network that uses graphs to model relationships between data, with the ability to provide clear explanations for its decisions.
- Extended Detection and Response (XDR)
Extended Detection and Response (XDR) is “a SaaS-based, vendor-specific, security threat detection and incident response tool that natively integrates multiple security products into a cohesive security operations system that unifies all licensed components.”
XDR enables an enterprise to go beyond typical detective controls by providing a holistic and yet simpler view of threats across the entire technology landscape. XDR delivers the real-time information needed to respond to threats to business operations for better, faster outcomes.
- Extended Reality (XR)
Extended reality (XR) is a term referring to all real-and-virtual combined environments and human-machine interactions generated by computer technology and wearables, where the ‘X’ represents a variable for any current or future spatial computing technologies.
- Eye-Tracking
A technology that uses cameras to track the movement of the user’s eyes, allowing for more natural interaction with digital environments. Eye-tracking can be used to control the movement of the virtual camera, to enable selective focus or depth of field effects, and to provide more accurate targeting in games or other applications.
- Fabs or Semiconductor Fabs
“Fabs” is shorthand for semiconductor fabrication plants, in which raw silicon wafers are turned into integrated circuits; the term has come into frequent use due to the ongoing global chip crisis. The shortage of fab capacity is believed to be behind the silicon shortage that ultimately affected the deliveries of cars, smartphones, and other electronic devices.
- Federated Cloud
A federated cloud is the deployment and management of multiple external and internal cloud computing services to match business needs. A federation is the union of several smaller parts that perform a common action.
- Federated enterprise tech buying
Federated enterprise tech buying refers to a decentralized approach to technology procurement in which individual departments or business units within a large organization are given more autonomy to make their own technology purchasing decisions, rather than relying on centralized IT departments.
- Femtech
Femtech refers to software, diagnostics, products, and services that use technology to support women’s health.
- Fetch
Fetch is the retrieval of data by a software program, script, or hardware device. After being retrieved, the data is moved to an alternate location or displayed on a screen; the fetch step is also the first stage of a processor’s fetch-decode-execute cycle.
- Field-Programmable Gate Array (FPGA) Accelerators
A Field-Programmable Gate Array (FPGA) accelerator is a server-based reconfigurable computing accelerator that delivers extremely high performance by enabling programmable hardware-level application acceleration.
- File Analysis Software
File analysis (FA) products analyze, index, search, track and report on file metadata and file content, enabling organizations to take action on files according to what was identified. FA provides detailed metadata and contextual information to enable better information governance and organizational efficiency for unstructured data management. FA is an emerging solution, made of disparate technologies, that assists organizations in understanding the ever-growing volume of unstructured data, including file shares, email databases, enterprise file sync and share, records management, enterprise content management, Microsoft SharePoint and data archives.
- FinTech
FinTech or financial technology is a catch-all term referring to software, mobile applications, and other technologies created to improve and automate traditional forms of finance for businesses and consumers alike.
- Firewall
A firewall is typically an appliance device that sits on the edge of your local network and connects you securely to the internet. A firewall will keep threats out of your network and can also provide services like VPN, content filtering, and integrity checking of the websites your end users are visiting.
- Firewall as a service
Firewall as a service (FWaaS), also known as a cloud firewall, is a service that provides cloud-based network traffic inspection capabilities to customers seeking to decommission or augment their existing network firewall appliances. This approach reduces the burden on on-premises data center equipment and reduces the management burden for internal cybersecurity teams.
- Flying Autonomous Vehicles
Flying autonomous vehicles are flying vehicles (planes, drones, etc.) that can fly themselves from a starting point to a predetermined destination in “autopilot” mode using various in-vehicle technologies and sensors.
- Fog Computing
Fog computing is a decentralized computing infrastructure in which data, computing, storage, and applications are located somewhere between the data source and the cloud. Like edge computing, fog computing brings the advantages and power of the cloud closer to where data is created and acted upon.
- FoodTech
FoodTech is an ecosystem made of all the agrifood entrepreneurs and startups (from production to distribution) innovating on the products, distribution, marketing, or business model.
- Fork
A fork creates an alternative version of a blockchain, and forks are often enacted intentionally to apply upgrades to a network. Soft Forks render two chains with some compatibility, while Hard Forks create a new version of the chain that must be adopted to continue participation. In the instance of a contentious Hard Fork, this can create two versions of a blockchain network.
- Full Self-Driving Beta (FSD)
Full Self-Driving (FSD) is the software Tesla offers for its electric cars that enables a vehicle to drive autonomously to a destination entered in its navigation system. It is mostly based on AI and machine learning, which the American company is testing in a beta program.
- Function as a service (FaaS)
Function as a service (FaaS) is a computing service that provides an event-driven platform that allows developers to build, run, and manage applications without the need to maintain infrastructures.
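A minimal sketch of such a function, using the AWS Lambda Python handler convention as one example of how FaaS platforms invoke developer code; the event fields below are invented:

```python
# A sketch of an event-driven FaaS function: the platform, not the developer,
# provisions and runs the servers and calls this handler when an event arrives.
import json

def handler(event, context):
    # "event" carries the trigger payload (e.g., an HTTP request or queue message).
    name = (event or {}).get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local test invocation; in production the cloud platform calls handler() for us.
print(handler({"name": "Intetics"}, None))
```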
- Functional Material
Functional materials are a group of engineered and advanced materials designed with specific characteristics and synthesized for a specific function, with proper surface morphology and tailored properties that have an impact on selected reactions, such as ferroelectricity, piezoelectricity, magnetism, or energy storage functions.
- Fungibility
Fungibility is the ability of a good or asset to be interchanged with other individual goods or assets of the same type. Fungible assets simplify the exchange and trade processes, as fungibility implies equal value between the assets.
- Game theory
Game theory is the study of mathematical models of strategic interactions among rational agents. It has applications in all fields of social science, as well as in logic, systems science and computer science.
- GameFi
GameFi refers to the combination of video games and decentralized finance (DeFi) where gamers earn cryptocurrencies (like Ethereum) by playing online video games. Axie Infinity is one such NFT-based gaming platform with around 2 million daily active users.
- Gamification
Gamification is the application of gaming principles to everyday activities. Gamification utilizes game thinking and mechanics in a non-game context to engage users and elicit greater participation. For example, a company could gamify its sales process by giving employees points for each sale they make.
- General-Purpose Machine Intelligence
- Generative AI
Generative AI refers to Artificial Intelligence that can generate novel content, rather than simply analyzing or acting on existing data. Generative AI models produce text and images: blog posts, program code, poetry, and artwork.
- Generative Adversarial Networks (GANs)
A generative adversarial network (GAN) is a machine learning model in which two neural networks compete with each other to become more accurate in their predictions. GANs typically run unsupervised and use a zero-sum game framework to learn, in which one network’s gain is the other’s loss.
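A compact sketch of the adversarial training loop, assuming PyTorch is installed; the toy data distribution, network sizes, and hyperparameters are arbitrary choices for illustration:

```python
# Minimal GAN sketch: the generator maps noise to fake samples, the discriminator
# scores real vs. fake, and the two are trained with opposing objectives.
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 2
generator = nn.Sequential(nn.Linear(latent_dim, 32), nn.ReLU(), nn.Linear(32, data_dim))
discriminator = nn.Sequential(nn.Linear(data_dim, 32), nn.ReLU(), nn.Linear(32, 1))

bce = nn.BCEWithLogitsLoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)

for step in range(1000):
    real = torch.randn(64, data_dim) * 0.5 + 2.0        # toy "real" distribution
    fake = generator(torch.randn(64, latent_dim))

    # Discriminator tries to label real samples 1 and generated samples 0.
    d_loss = bce(discriminator(real), torch.ones(64, 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(64, 1))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # Generator tries to make the discriminator label its samples as real.
    g_loss = bce(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()
```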
- Generative Pre-trained Transformer 1 (GPT-1)
The first version of the GPT series was released in 2018. It had 117 million parameters and was pre-trained on a large corpus of text data to generate human-like text.
- Generative Pre-trained Transformer 2 (GPT-2)
The second version of the GPT series was released in 2019. It had 1.5 billion parameters, making it much more powerful than GPT-1. GPT-2 was capable of generating high-quality text in various styles and genres, including news articles and poetry.
- Generative Pre-trained Transformer 3 (GPT-3)
GPT-3 – a large-scale language processing AI model capable of generating human-like text and responding to prompts.
- Generative Pre-trained Transformer 4 (GPT-4)
GPT-4 is a large multimodal model created by OpenAI and announced on March 14, 2023. Multimodal models can encompass more than just text – GPT-4 also accepts images as input. Meanwhile, GPT-3 and GPT-3.5 only operated in one modality, text, meaning users could only ask questions by typing them out.
- Genetic Augmentation
With genetic augmentation, somatic gene and cell therapies are used to treat children with severe combined immune deficiency. CRISPR technologies may soon become more easily accessible and affordable, which will enable broad genetic engineering.
- Geofencing
Geofencing is a location-based service in which an app or other software program uses radio frequency identification (RFID), Wi-Fi, GPS, or cellular data to trigger a targeted marketing action (such as a text, email, social media advertisement, app notification) when a mobile device or RFID tag enters or exits a virtual geographic boundary, known as a geofence.
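A sketch of the core boundary check behind a geofence trigger; the coordinates, radius, and notification action below are made up for illustration:

```python
# Geofence check: compute the great-circle distance from the device to the fence
# centre and trigger an action when the device crosses inside the boundary.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))        # Earth radius ~6371 km

FENCE_CENTRE = (40.7580, -73.9855)           # hypothetical store location
FENCE_RADIUS_KM = 0.5

def on_location_update(lat, lon):
    if haversine_km(lat, lon, *FENCE_CENTRE) <= FENCE_RADIUS_KM:
        print("Device entered the geofence: send the promotional notification")

on_location_update(40.7590, -73.9840)        # inside the fence -> triggers the action
```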
- Geographic Information System (GIS)
A GIS (geographic information system) enables you to envision the geographic aspects of a body of data. Basically, it lets you query or analyze a database and receive the results in the form of some kind of map. Since many kinds of data have important geographic aspects, a GIS can have many uses: weather forecasting, sales analysis, population forecasting, and land use planning, to name a few.
- GitOps
GitOps is a paradigm designed around Kubernetes that allows developers and IT operations teams to use Git for cluster management and application delivery. Applying a Git workflow to operations and development teams allows developers to use Git pull requests to manage infrastructure provisioning and software deployment. Git is an open-source distributed code management and version control system. Git is used to track changes in source code throughout software development and is also used for configuration and content management. One Git feature central to GitOps is the pull request. A pull request lets users tell others what changes they’ve pushed to a new branch in a repository through GitHub. When a user opens a pull request, they can review, discuss or follow up on commits before any changes are merged into a base branch.
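Outside of any real tooling (GitOps operators such as Argo CD or Flux are not shown here), the purely hypothetical Python sketch below illustrates the core idea: desired state lives in version control, and a reconciler continuously drives the live system toward it.

```python
# Hypothetical reconciliation loop in the GitOps spirit: desired state is read
# from a repository and the live system is nudged until the two match.
desired_state = {          # stand-in for manifests stored in a Git repository
    "web": {"replicas": 3, "image": "shop/web:1.4.2"},
    "api": {"replicas": 2, "image": "shop/api:2.0.0"},
}

actual_state = {           # stand-in for what is currently running in the cluster
    "web": {"replicas": 2, "image": "shop/web:1.4.1"},
}

def reconcile(desired, actual):
    for name, spec in desired.items():
        if actual.get(name) != spec:
            print(f"apply {name}: {spec}")   # a real operator would call the cluster API
            actual[name] = spec
    for name in set(actual) - set(desired):
        print(f"delete {name}")              # resources removed from the repo get removed too
        actual.pop(name)

reconcile(desired_state, actual_state)
```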
- Global Infrastructure Services (GIS)
Global Infrastructure Services (GIS) has been established to address the shortage of key skills in a range of industries. GIS operates in the following skill areas: project planning and project scheduling; cost management and cost control; project controls and reporting.
- Global Positioning System (GPS)
Global Positioning System (GPS) is a U.S.-owned utility that provides users with positioning, navigation, and timing (PNT) services that use satellite signals to find the location of a radio receiver on or above the earth’s surface.
- Graph Analytics
Graph analytics is a category of tools used to apply algorithms that will help the analyst understand the relationship between graph database entries.
- Graph DBMSs
Graph DBMSs represent data in graph structures as nodes and edges, which are relationships between nodes. They allow easy processing of data in that form, and simple calculation of specific properties of the graph, such as the number of steps needed to get from one node to another. Graph DBMSs usually don’t provide indexes on all nodes, so direct access to nodes based on attribute values is not possible in those cases.
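To illustrate the “number of steps between nodes” property mentioned above outside of any particular graph DBMS, here is a plain-Python breadth-first search over a small, invented adjacency map:

```python
# Breadth-first search over nodes and edges: counts the hops from one node to another.
from collections import deque

edges = {                       # nodes and their relationships (hypothetical data)
    "Alice": ["Bob", "Carol"],
    "Bob": ["Dave"],
    "Carol": ["Dave"],
    "Dave": ["Erin"],
    "Erin": [],
}

def steps_between(graph, start, goal):
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbour in graph.get(node, []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    return None                 # no path between the two nodes

print(steps_between(edges, "Alice", "Erin"))   # 3 hops: Alice -> Bob -> Dave -> Erin
```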
- Graph Database
See Graph DBMSs.
- Graphical User Interface (GUI) Design
A graphical user interface (GUI) is a type of user interface through which users interact with electronic devices via visual indicator representations. With a GUI, users perform functions by clicking and moving buttons, icons, and menus with a pointing device, such as a mouse or trackpad.
- Graphics-processing units (GPU) Accelerators
GPU-accelerated computing is the use of a graphics processing unit (GPU) to accelerate highly parallel, compute-intensive portions of workloads in conjunction with a CPU.
- GreenTech or Green Technology
GreenTech refers to a type of technology that is considered environmentally friendly based on its production process or its supply chains like clean energy production, the use of alternative fuels, and technologies that are less harmful to the environment than fossil fuels.
- Haptic Technology / Kinesthetic Communication / 3D Touch
Haptic technology, also known as kinesthetic communication or 3D touch, refers to any technology that can create an experience of touch by applying forces, vibrations, or motions to the user. These technologies can be used to create virtual objects in a computer simulation, to control virtual objects, and to enhance remote control of machines and devices (telerobotics).
- Hardware Accelerator
Silicon chip designed to perform highly parallel operations required by AI; also enables simultaneous computations.
- Hash
A hash is a unique, algorithm-generated string of numbers and letters that encodes a specific block of data or asset; in a blockchain, each block’s hash is an amalgamation of all the transaction hashes that came before it.
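A small sketch of hash chaining with Python's standard hashlib; the transactions are invented and the scheme is deliberately simplified compared with a real blockchain:

```python
# Each block's hash folds in the hash that came before it, so altering any
# earlier transaction changes every later hash.
import hashlib

def block_hash(previous_hash: str, transactions: str) -> str:
    return hashlib.sha256((previous_hash + transactions).encode()).hexdigest()

h0 = block_hash("0" * 64, "alice pays bob 5")
h1 = block_hash(h0, "bob pays carol 2")
h2 = block_hash(h1, "carol pays dave 1")
print(h2)   # unique string of numbers and letters tied to the whole history
```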
- Head-Mounted Display (HMD)
Head-mounted displays (HMDs) are small displays or projection technology integrated into eyeglasses or mounted on a helmet or hat. Heads-up displays are a type of HMD that does not block the user’s vision but superimposes the image on the user’s view of the real world. An emerging form of heads-up display is a retinal display that “paints” a picture directly on the sensitive part of the user’s retina. Although the image appears to be on a screen at the user’s ideal viewing distance, there is no actual screen in front of the user, just special optics (for example, modified eyeglasses) that reflect the image back into the eye. Other heads-up displays are not worn by the user but are projected onto a surface (for example, a car or plane windshield). Some HMDs incorporate motion sensors to determine direction and movement (for example, to provide context-sensitive geographic information) or as the interface to an immersive virtual reality application.
- High availability
High availability means building software that is always up and functioning and never goes down, a welcome technical challenge for engineers.
- Holographic 3D Printing
- Home Automation
Home automation is building automation for a home, called a smart home or smart house. A home automation system will control lighting, climate, entertainment systems, and appliances. It may also include home security such as access control and alarm systems. When connected to the Internet, home devices are an important constituent of the Internet of Things (“IoT”).
- Home Hub
A home hub is a central device or application that connects smart home-enabled devices and lets users monitor and control them from one place. Smart home-enabled devices can include appliances like refrigerators, washing machines, dryers, and toaster ovens, as well as heating and air conditioning units and lighting devices. Other examples of smart home-enabled electronic devices are audio and video entertainment systems, camera and security systems, and computers, laptops and other mobile electronic devices.
- Homomorphic Encryption
An encryption technique that allows data to be processed without being decrypted. This allows computations to be performed on encrypted data without ever revealing the original plaintext data.
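As a purely illustrative sketch (textbook Paillier with tiny, insecure parameters, not a production scheme), the code below shows a computation performed on ciphertexts that decrypts to the sum of the plaintexts:

```python
# Toy additively homomorphic encryption: E(a) * E(b) mod n^2 decrypts to a + b,
# so the addition happens without ever revealing the plaintexts.
import math
import random

p, q = 293, 433                      # tiny primes; real deployments use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)         # Carmichael's lambda(n)
g = n + 1                            # standard choice of generator
mu = pow(lam, -1, n)                 # modular inverse of L(g^lam mod n^2) = lam

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n   # the "L function" L(x) = (x - 1) / n
    return (L * mu) % n

a, b = encrypt(20), encrypt(22)
print(decrypt((a * b) % n2))         # prints 42: addition was done on encrypted data
```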
- Human Augmentation
The field of human augmentation focuses on creating cognitive and physical improvements as an integral part of the human body. An example is using active control systems to create limb prosthetics with characteristics that can exceed the highest natural human performance.
- Human-Machine Interface (HMI)
Human-machine interface (HMI) is a component of certain devices that are capable of handling human-machine interactions. The interface consists of hardware and software that allow user inputs to be translated as signals for machines that, in turn, provide the required result to the user. Human-machine interface technology has been used in different industries like electronics, entertainment, military, medical, etc. Human-machine interfaces help in integrating humans into complex technological systems.
- Human-centric
Human-centric means that it’s technology or change or transformation that’s focused on humans and is adaptable by humans.
- Human-in-the-Loop
Human-in-the-Loop – a machine learning approach that involves human input and oversight to improve model accuracy and reduce bias.
- Hybrid Cloud
Hybrid cloud is a cloud computing environment that uses a mix of on-premises, private cloud and third-party, public cloud services with orchestration between the two platforms. By allowing workloads to move between private and public clouds as computing needs and costs change, hybrid cloud gives businesses greater flexibility and more data deployment options.
- Hybrid Cloud Hosting
Hybrid cloud hosting is a type of cloud hosting model that utilizes public and private cloud hosting tools and techniques to provide a cloud hosting environment or solution. The most common hybrid cloud example is combining a private cloud environment with a public cloud provider such as Google Cloud.
- Hybrid Cloud Storage
Hybrid cloud storage is an approach to managing cloud storage that uses both local and off-site resources. The hybrid cloud storage infrastructure is often used to supplement internal data storage with public cloud storage. Policy engines keep frequently used data on-site while simultaneously moving inactive data to the cloud in a transparent manner.
- Hybrid WAN
A hybrid WAN is a wide area network that sends traffic over two or more connection types. Hybrid WANs permit dynamic traffic engineering across both private and public domains, using a variety of connectivity options to make the best use of network resources. In its strictest sense, a hybrid WAN uses dedicated multiprotocol label switching (MPLS) circuits, plus carrier Ethernet, plus T3 links.
- Hybrid automation
Hybrid automation combines attended and unattended RPA and enables the passing of work to robots automatically or by employee prompt, allowing the robot and employee to work on different tasks simultaneously.
- Hyper Text Markup Language (HTML)
HTML is an acronym that stands for Hyper Text Markup Language, which is used for creating web pages and web applications.
- Hyperautomation
Hyperautomation deals with the application of advanced technologies, including artificial intelligence (AI) and machine learning (ML), to increasingly automate processes and augment humans. Hyperautomation extends across a range of tools that can be automated, but also refers to the sophistication of the automation (i.e., discover, analyze, design, automate, measure, monitor, reassess). In simple terms, hyperautomation refers to the mixture of automation technologies that exist to augment and expand human capabilities.
- Hyperconverged Infrastructure Software
Hyperconverged infrastructure (HCI) software provides virtualized compute, storage, networking and associated (cloud) management — from a single instantiation running on server hardware. When the first HCI platforms emerged a decade ago, they were sold primarily as alternatives to expensive and complex storage arrays for VMware environments. These initial HCI solutions were sold primarily as appliances by a single hardware vendor.
- Hyperdata
Hyperdata refers to the means by which a dataset is linked to other datasets housed in other locations or information silos, much in the same way that hypertext indicates the relationship between texts scattered throughout the internet.
- Hypersonic
Hypersonic is an adjective that relates to anything that can travel faster than five times the speed of sound.
- Hypertext Transfer Protocol (HTTP)
HTTP is an application protocol for distributed, collaborative, hypermedia information systems that allows users to communicate data on the World Wide Web.
- iBPMSs
Intelligent business process management suites (iBPMS) combine business process management (BPM) software with additional capabilities such as artificial intelligence (AI) to help companies dynamically automate more types of start-to-finish experiences. These suites are often cloud-enabled and provide low-code tools that help citizen developers create workflow solutions very quickly.
- iBPMSs Market
The intelligent business process management suite (iBPMS) market is the natural evolution of the earlier BPMS market, adding more capabilities for greater intelligence within business processes. Capabilities such as validation (process simulation, including “what if”) and verification (logical compliance), optimization, and the ability to gain insight into process performance have been included in many BPMS offerings for several years. iBPMSs have added enhanced support for human collaboration such as integration with social media, mobile-enabled process tasks, streaming analytics and real-time decision management.
- Identity and Access Management (IAM)
Identity and access management (IAM) is the discipline that enables the right individuals to access the right resources at the right times for the right reasons.
IAM addresses the mission-critical need to ensure appropriate access to resources across increasingly heterogeneous technology environments and to meet increasingly rigorous compliance requirements. IAM is a crucial undertaking for any enterprise. It is increasingly business-aligned, and it requires business skills, not just technical expertise.
- Immediate Payments
See Instant Payments.
- Immersive Analytics
Immersive Analytics investigates how new interaction and display technologies can be used to support analytical reasoning and decision-making. The aim is to provide multi-sensory interfaces for analytics approaches that support collaboration and allow users to immerse themselves in their data. Immersive Analytics builds on technologies such as large touch surfaces, immersive virtual and augmented reality environments, haptic and audio displays, and modern fabrication techniques.
- Immersive Audio
Audio technology that creates a 3D sound environment, enhancing the immersive experience of VR and AR applications. Immersive audio can create the illusion of sound coming from specific directions or locations in the virtual or physical environment.
- Immersive Workspaces
An immersive workspace takes its name from the immersive experience: the perception of being in one place when you are actually in another, typically achieved with immersive technology and devices such as wearable 3D headsets. Immersive workspaces are a new kind of modern workplace emerging as a result of technology advancement. They enable people in different physical locations to come together in one place for collaboration and communication in a media-rich 3D environment. They allow screen sharing and co-working, with participants collaborating on documents and performing live edits in real time. Actions can be created during a meeting and notes can be taken, all of which are stored on the immersive workspace’s website for later review.
- Immersive technology
Immersive technology refers to new ways of creating, displaying, and interacting with applications, content, and experiences. Immersive technology has transformed the digital experience by bringing together the virtual with users’ sight, sound, and even touch.
- In-Memory Computing (IMC)
In-Memory Computing (IMC) is a computing style where applications assume all the data required for processing is located in the main memory of their computing environment. IMC stores data in RAM rather than in databases hosted on disks. This eliminates the I/O and ACID transaction requirements of OLTP applications and exponentially speeds data access because RAM-stored data is available instantaneously, while data stored on disks is limited by network and disk speeds. IMC can cache massive amounts of data, enabling extremely fast response times, and store session data, which can help achieve optimum performance.
- In-Memory DBMS
An IMDBMS is a DBMS that stores the entire database structure, including the applications, “in memory” and accesses all the data directly.
- In-Memory Data Grid Software
In-Memory Data Grid Software (IMDGs) provide a lightweight, distributed, scale-out in-memory object store — the data grid. Multiple applications can concurrently perform transactional and/or analytical operations in the low-latency data grid, thus minimizing access to high-latency, hard-disk-drive-based or solid-state-drive-based data storage. IMDGs maintain data grid durability across physical or virtual servers via replication, partitioning and on-disk persistence. Objects in the data grid are uniquely identified through a primary key, but can also be retrieved via other attributes. The most typical use of IMDGs is for web-scale transaction processing applications. However, adoption for analytics, often in combination with Apache Spark and Hadoop or stream analytics platforms, is growing fast — for example, for fraud detection, risk management, operation monitoring, dynamic pricing and real-time recommendation management.
- In-database Analytics
In-database analytics is a technology that allows data processing to be conducted within the database by building analytic logic into the database itself. Doing so eliminates the time and effort required to transform data and move it back and forth between a database and a separate analytics application.
- Indoor navigation
Indoor navigation is a base map through which you can get directions within buildings. An Indoor Positioning System (IPS), on the other hand, has the same function as GPS in an indoor environment. IPS serves to accurately locate people or objects digitally, often on a mobile device.
- Industry 4.0
Industry 4.0 refers to a new phase in the Industrial Revolution that focuses heavily on interconnectivity, automation, Machine Learning, and real-time data.
- Information Stewardship Applications
Information stewardship applications are business solutions used by business users acting in the role of information steward. For the most part, these developing solutions represent a combination of a number of disparate, previously IT-centric tools that are already on the market. They are, however, organized and presented in such a way that information stewards can support the work of information policy enforcement — as part of their normal, business-centric day-to-day work — in a range of use cases.
- Information Technologies Operations (ITOps)
Information Technologies Operations (ITOps) is the process responsible for acquiring, designing, deploying, configuring, and maintaining the physical and virtual components that comprise your IT infrastructure.
- Infrastructure and Operations (I&O)
Infrastructure and Operations, I&O, teams are broadly responsible for the administration and management of technology, information, and data. These teams manage a variety of elements including computers, servers, processes, networking, storage, data, software, security, and cloud-based services.
- Infrastructure as Code (IaC)
Infrastructure as code, also referred to as IaC, is an IT practice that codifies and manages underlying IT infrastructure as software. The purpose of infrastructure as code is to enable developers or operations teams to automatically manage, monitor and provision resources, rather than manually configure discrete hardware devices and operating systems. Infrastructure as code is sometimes referred to as programmable or software-defined infrastructure.
- Infrastructure as a Service (IaaS)
Infrastructure as a Service (IaaS), also known as cloud infrastructure services, consists of automated and easily scalable computing resources. This allows businesses access to all the IT infrastructure they need, whether on a temporary or long-term basis. The favorable pricing of this service model offers companies the advantage of not having to invest in hardware that may only be used a few times.
- Initial Coin Offerings (ICOs)
An initial coin offering (ICO) or initial currency offering is a type of funding using cryptocurrencies. It is often a form of crowdfunding; however, private ICOs, which do not seek public investment, are also possible. In an ICO, a quantity of cryptocurrency is sold in the form of “tokens” (“coins”) to speculators or investors, in exchange for legal tender or other cryptocurrencies (generally established and more stable ones).
- Innovation Factory
The Innovation Factory is a new concept in industrial design, bringing the latest technologies for design, modeling, prototyping, and alternative manufacturing to Main Street. IF’s aim is to provide the technology, expertise, and infrastructure to enable designers with fresh ideas to move quickly and seamlessly from design concept to production, in a stimulating and supportive environment.
- Insights Engines
Insight engines apply relevancy methods to describe, discover, organize and analyze data. This allows existing or synthesized information to be delivered proactively or interactively, and in the context of digital workers, customers or constituents at timely business moments. Products in this market use connectors to crawl and index content from multiple sources. They index the full range of enterprise content, from unstructured content such as word processor and video files through to structured content, such as spreadsheet files and database records. Various “pipelines” are used to preprocess content according to type, and to derive from it data that can be indexed for query, extraction and use via a range of touchpoints. Insight engines differ from search engines in terms of capabilities that enable richer indexes, more complex queries, elaborated relevancy methods, and multiple touchpoints for the delivery of data (for machines) and information (for people).
- Instant Payments
Instant Payments are electronic retail payment solutions available 24/7/365 and resulting in the immediate or close-to-immediate interbank clearing of the transaction and crediting of the payee’s account with confirmation to the payer (within seconds of payment initiation).
- Integration Platform as a Service (iPaaS)
An iPaaS, or integration Platform as a Service, is a platform that standardizes how applications are integrated into an organization, making it easier to automate business processes and share data across applications.
- Intelligent Applications
Intelligent applications refer to software applications that use artificial intelligence, machine learning, and other advanced technologies to automate and optimize business processes, enhance decision-making, and provide personalized experiences to users.
- Intelligent Avatar Platforms (IAP)
Intelligent Avatar Platform (IAP) can be defined as an online platform supported by artificial intelligence that allows one to create a clone of themselves.
- Intelligent Drug Discovery
Intelligent Drug Discovery is a drug discovery process using AI technologies. If adopted at the drug discovery stage, AI solutions have the potential to kick-start the productivity of the entire R&D process. AI has the potential to: reduce timelines for drug discovery and improve the agility of the research process; increase the accuracy of predictions on the efficacy and safety of drugs; improve the opportunity to diversify drug pipelines.
- Intelligent Logistics
See Smart Logistics.
- Intelligent Process Automation (IPA)
Intelligent process automation (IPA) is a combination of technologies used to manage and automate digital processes. IPA should increase operational efficiency, worker performance, and the efficiency of responses to customers. It should also reduce operational risk, with the goal of creating an effective operations environment. IPA is designed to assist human workers by doing manual, repetitive and routine tasks that were previously performed by humans. Technologies combined in IPA include robotic process automation (RPA), artificial intelligence (AI), machine learning and digital process automation (DPA). With these technologies — especially with AI and machine learning — an IPA tool should be able to learn how to adjust and improve the process flow to create an intelligent process. It should be able to learn and improve over time.
- Intelligent Sensor
Intelligent sensors utilize advanced signal processing techniques, data fusion techniques, intelligent algorithms, and artificial intelligence concepts to better understand sensor data, for better integration of sensors and better feature extraction, leading to measures that can be used in smart sensing applications.
- Intelligent Virtual Agent (IVA)
See Virtual Agent.
- Intelligent retrieval
Intelligent retrieval planning is the application of artificial intelligence techniques to the task of efficient retrieval of information from very large databases. Using such techniques, significant increases in efficiency can be obtained.
- Intent-Based Networking
Intent-based networking (IBN) is a form of network administration that incorporates artificial intelligence (AI), network orchestration, and machine learning (ML) to automate administrative tasks across a network. The goal of IBN is to reduce the complexity of creating, managing, and enforcing network policies and reduce the manual labor associated with traditional configuration management.
- Interactive Application Security Testing (IAST)
IAST is a form of application security testing that instruments a running application from within (typically via agents or sensors) so that security vulnerabilities can be detected while the application is exercised, combining elements of static and dynamic testing.
- Interactive voice response (IVR)
IVR is an automated telephone system that combines pre-recorded messages or text-to-speech technology with a dual-tone multi-frequency (DTMF) interface to engage callers, allowing them to provide and access information without a live agent.
- Internal Cloud
See Private Cloud.
- IoT Platform
An IoT platform is an on-premises software suite or a cloud service (IoT platform as a service [PaaS]) that monitors and may manage and control various types of endpoints, often via applications business units deploy on the platform. The IoT platform usually provides (or provisions) Web-scale infrastructure capabilities to support basic and advanced IoT solutions and digital business operations.
- ISO 27001
ISO 27001 (formally known as ISO/IEC 27001:2005) is a specification for an information security management system (ISMS). An ISMS is a framework of policies and procedures that includes all legal, physical, and technical controls involved in an organization’s information risk management processes.
- IT Service Management (ITSM) Tools
IT Service Management (ITSM) Tools enable IT operations organizations, specifically I&O managers, to better support the production environment. ITSM tools facilitate the tasks and workflows associated with the management and delivery of quality IT services. These are most heavily used by IT service desks and IT service delivery functions. Some non-I&O departments (such as HR or facilities) adapt generic ticket-handling and workflow capabilities for their own use. ITSM tools are classified based on ITSM capabilities and integration with ITOM solutions and include: basic ITSM tools that have some ITSM capabilities and limited integration with ITOM solutions; intermediate ITSM tools that have good ITSM capabilities, and provide some basic ITOM functions or integrate with intermediate third-party ITOM solutions; advanced ITSM tools that have a full range of ITSM capabilities, and provide broad ITOM functionality natively or integrate with advanced third-party ITOM solutions.
- IT process automation (ITPA)
IT process automation (ITPA), also known as run book automation (RBA), is the ability to orchestrate and integrate tools, people and processes through workflow.
- IT solution
An information technology (IT) solution is a set of related software programs and/or services that are sold as a single package. IT vendors, service providers and value-added resellers (VARs) market their software suites and service bundles under the label solution to promote the idea that purchasing the product will help the customer successfully solve a problem or deal with a complicated situation.
- JavaScript
JavaScript is a text-based programming language used both on the client-side and server-side that allows you to make web pages interactive.
- Knowledge Graphs
Knowledge Graphs encode information (“knowledge”) as data arranged in a network (“graph”) of nodes and links (“edges”) rather than tables of rows and columns.
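A minimal sketch of the idea using plain Python triples and an invented two-hop query; real knowledge graphs use dedicated graph stores and query languages:

```python
# Knowledge encoded as (node, edge, node) triples rather than table rows,
# with a simple traversal query across the links.
triples = [
    ("Ada Lovelace", "born_in", "London"),
    ("Ada Lovelace", "field", "Mathematics"),
    ("London", "capital_of", "United Kingdom"),
]

def neighbours(node, edge):
    return [o for s, p, o in triples if s == node and p == edge]

# "Which country contains Ada Lovelace's birthplace?" -- answered in two hops.
for city in neighbours("Ada Lovelace", "born_in"):
    print(city, "->", neighbours(city, "capital_of"))
```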
- Knowledge engineering
Knowledge engineering is a field of artificial intelligence (AI) that tries to emulate the judgment and behavior of a human expert in a given field. Knowledge engineering is the technology behind the creation of expert systems to assist with issues related to their programmed field of knowledge.
- LabTech
LabTech is a managed service tool for remote monitoring, management, and automation, developed by a managed service provider (MSP) for MSPs. Its affordable, agent-based solution closely emulates what technicians do in the field, so that techs can provide the same support remotely.
- Large Language Models (LLMs)
A large language model (LLM) is a type of machine learning model that can perform a variety of natural language processing (NLP) tasks, including generating and classifying text, answering questions in a conversational manner, and translating text from one language to another.
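A hedged example of text generation with a pre-trained language model via the Hugging Face transformers pipeline, assuming the package is installed and model weights can be downloaded; the small GPT-2 model stands in here for a much larger LLM:

```python
# Generate a short continuation of a prompt with a pre-trained language model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Emerging technologies are", max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])   # model-written continuation of the prompt
```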
- Ledger DBMS
A ledger DBMS is a distributed ledger platform that enables information to be shared (not copied) across a network.
- Leveraging AI to Enhance Cybersecurity Defense
Security tool vendors exploit ML to improve their tools, decision support, and response operations. Thanks to research, frameworks and compute power, vendors can more easily access well-designed ML if they can get their hands on large quantities of relevant, high-quality training data. They are able to assess solutions and architecture, and deal with the latest ML-related attack techniques, including data poisoning, adversarial inputs, generative adversarial networks, and other security-relevant innovations in ML.
- Light Field Displays
A type of display technology that creates a three-dimensional image by capturing the full light field of a scene, allowing for more realistic and immersive representations of digital environments.
- Liquid Cooling
Liquid Cooling is a cooling technology used in high-performance computing (HPC) that involves immersing computer components in a dielectric fluid to dissipate heat.
- Living Maps
Living maps are maps that are updated in real time.
- Location Intelligence (LI)
Location intelligence is the process of deriving meaningful insight from geospatial data relationships to solve a particular problem. It is also a business intelligence (BI) tool capability that relates geographic context to business data. Like BI, location intelligence software is designed to turn data into insight for a host of business purposes. Such tools draw on a variety of data sources, such as geographic information systems (GIS), aerial maps, demographic information, and, in some cases, an organization’s own databases.
- Location-Based Services (LBS)
Location-based services are services based on the location of a mobile user as determined by using network and/or mobile-device-based technology. Supporting technologies include cell of origin (also known as cell ID), angle of arrival (AOA), time of arrival (TOA), enhanced observed time difference (EOTD), and GPS or assisted GPS.
- Logical Data Warehouse
The Logical Data Warehouse (LDW) is a data management architecture for analytics that combines the strengths of traditional repository warehouses with alternative data management and access strategies.
- Low-code/No-code platforms
Low-code and no-code platforms are tools that enable developers to create applications with minimal coding, often by using drag-and-drop interfaces or visual programming.
- Machine Learning (ML)
Machine learning is a type of AI that allows software applications to become more accurate at predicting outcomes without being explicitly programmed to do so. Machine learning algorithms use historical data as input to predict new output values.
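As a minimal illustration of learning from historical data instead of explicit rules, the sketch below assumes the scikit-learn library is available; the numbers are made up:

```python
# The model is not explicitly programmed with a pricing rule;
# it infers one from historical input/output pairs.
from sklearn.linear_model import LinearRegression

# Historical inputs (square metres) and outputs (prices) -- toy numbers.
X = [[50], [80], [120], [200]]
y = [150_000, 240_000, 360_000, 600_000]

model = LinearRegression().fit(X, y)
print(model.predict([[100]]))  # predicted price for an unseen 100 m2 flat
```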
- Managed Detection and Response (MDR)
Managed detection and response (MDR) services are a collection of network-, host- and endpoint-based cybersecurity technologies that a third-party provider manages for a client organization. The provider typically installs technology on premises at the client organization and provides additional external and automated services through software.
- Mashups
Mashups are combinations of multiple emerging technologies that boost their capabilities and drive exponential improvements. When the intelligence of AI mashes up with the technical capabilities of robotics, for example, companies can see major transformations in their workflows. The term “mashup” can also refer to a device that combines two or more innovations, or an application that runs various actions from multiple web sources.
- Master Data Management (MDM)
Master data management (MDM) is a technology-enabled discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise’s official shared master data assets.
- Metadata Management Solutions
Metadata management is about an organization’s management of its data and information assets. Metadata describes the various facets of an information asset that can improve its usability throughout its life cycle. Enterprise metadata management (EMM) encompasses the roles, responsibilities, processes, organization and technology necessary to ensure that the metadata across the enterprise adds value to that enterprise’s data. EMM is different from metadata management that operates only at the level of a single program, project or initiative; EMM works across all of them. The market for metadata management solutions comprises vendors that offer one or more metadata management capabilities such as:
- Metadata repositories
- Business glossary
- Data lineage
- Impact analysis
- Rules management
- Semantic
- Metadata ingestion and translation
- Metaplex
Metaplex is an open-source protocol that allows for the creation and minting of NFTs using a standardized format across wallets and applications. It consists of two core components: an on-chain program, meaning sellers and buyers do not need to trust a centralized authority, and a self-hosted web3 application.
- Metaverse
Metaverse: a virtual 3D world where people can interact with a computer-generated environment and other users. Popular blockchain-based environments that comprise the Metaverse include Decentraland and The Sandbox, virtual gaming worlds where users can buy and sell land as NFTs.
- Micro-Electromechanical Systems (MEMS)
A MEMS (micro-electromechanical system) is a miniature machine that has both mechanical and electronic components. The physical dimension of a MEMS can range from several millimeters to less than one micrometer, a dimension many times smaller than the width of a human hair.
- Microbiome Therapeutics
The microbiome is the natural bacterial colonisation of the body – including skin, gut and stomach. Scientists believe that this is a complex network of interactions, which can affect metabolism, neurology, infection, gut disease and other chronic health conditions. Microbiome therapy aims to establish a healthy gut microbiome to restore gut function and prevent dysregulation. Microbiome therapies (MBT) are currently in development to prevent recurrent C. difficile infections, which are primarily contracted in hospitals, leading to diarrhea, dehydration, fever and weight loss.
- Microsegmentation
Microsegmentation is a technique used to divide a network into logical and secure units through the application of policies that dictate how data and applications are accessed and controlled.
- Microservice
A microservice is a service-oriented application component that is tightly scoped, strongly encapsulated, loosely coupled, independently deployable and independently scalable.
- Microservices
Microservices, or microservices architecture, is an approach to application development in which a large application is built from modular components or services. Each module supports a specific task or business goal and uses a simple, well-defined interface, such as an application programming interface (API), to communicate with other sets of services.
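As an illustration of a tightly scoped service behind a simple, well-defined interface, here is a toy sketch using only Python’s standard library; the endpoint, port and data are hypothetical:

```python
# A single-purpose "inventory" microservice exposing one HTTP/JSON endpoint.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

class InventoryService(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/stock/sku-42":
            body = json.dumps({"sku": "sku-42", "on_hand": 17}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)  # other services consume this API, not the code
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), InventoryService).serve_forever()
```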
- Mining
The process by which “blocks” or transactions are verified and added to a blockchain. In order to verify a block a miner must use a computer to solve a cryptographic problem. Once the computer has solved the problem, the block is considered “mined” or verified. In the Bitcoin or Ethereum blockchain, the first computer to mine or verify the block receives bitcoin or ether, respectively.
- Mithril.js
Mithril.js is a small (<10 KB gzipped), fast, modern client-side JavaScript framework for building single-page applications. Out of the box, Mithril covers the features that other popular frameworks provide, such as DOM elements, components, routing, and XHR. XHR is simply the mechanism the application uses to communicate with the server.
- Mixed Reality (MR)
Mixed reality (MR) is the merging of real and virtual worlds to produce new environments and visualizations, where physical and digital objects co-exist and interact in real time.
- Mobile Threat Defense (MTD)
Mobile Threat Defense (MTD) software is meant to protect organizations and individual users from security threats on mobile platforms.
MTD protects against attacks made specifically for mobile devices and operating systems such as Apple iOS and Google Android. Hackers may also use malware, phishing or network attacks to compromise a user’s device, which could then be used to steal data or to purposely cause a negative business impact. MTD’s goal is to protect users from such attacks.
- Momentary Markets
The rise of “momentary markets” means that every discrete piece of time is an opportunity: a chance to deliver a new on-demand product or service designed to meet a specific customer’s needs.
- Multi-Factor Authentication (MFA)
Multi-factor Authentication (MFA) is an authentication method that, rather than just asking for a username and password, requires users to provide one or more additional verification factors to gain access to a resource such as an application, online account, or a VPN. MFA doesn’t replace other security layers; it is one more layer that helps protect you from unwanted intrusions. MFA is a core component of a strong identity and access management (IAM) policy and decreases the likelihood of a successful cyber attack.
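One widely used additional factor is a time-based one-time password (TOTP). The sketch below, using only the Python standard library, shows the idea behind generating such a code; the secret shown is a placeholder, not a real credential:

```python
# Minimal TOTP sketch (RFC 6238 style): HMAC of the current 30-second window.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # current time step
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # placeholder shared secret
```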
- Multi-cluster
Multi-cluster is a strategy for deploying an application on or across multiple Kubernetes clusters with the goal of improving availability, isolation, and scalability.
- Multiexperience Development Platforms (MXDP)
MXDPs serve to centralize life cycle activities — designing, developing, testing, distributing, managing and analyzing — for a portfolio of multiexperience apps. Multiexperience refers to the various permutations of modalities (e.g., touch, voice and gesture), devices and apps that users interact with on their digital journey across the various touchpoints. Multiexperience development involves creating fit-for-purpose apps based on touchpoint-specific modalities, while at the same time ensuring a consistent user experience across web, mobile, wearable, conversational and immersive touchpoints.
- Multimodal Deep Semantic Understanding
Multimodal deep semantic understanding takes information from different modalities such as voice, image, and text as input, and integrates perception and cognition technologies to achieve a multi-dimensional, deep understanding of that information. With the rapid development and large-scale application of computer vision, speech, natural language understanding, and knowledge graphs, multimodal deep semantic understanding is gradually maturing, opening up broader application scenarios.
- Multimodel DBMSs
Multimodel DBMSs support several types of data model, such as relational, XML and the common nonrelational types (e.g., document, key-value, search and graph), in a single DBMS. They are intended to reduce complexity in the data access layer by using a single language to access different persistent types, although multiple languages can also be used.
- Nanobots
Tiny robots – usually only a few nanometers to a few micrometers in size. Nanobots have the potential to perform various tasks, such as delivering drugs to specific parts of the body, cleaning up pollution, and building micro-electronic components.
- Nanohybrids
A nanohybrid is a combination of two nanomaterials joined through chemical bonding, whereas a nanocomposite is the addition of a nanomaterial to a matrix, for example a polymer, which is not itself a nanomaterial.
- Nanoscale 3D Printing
Nanoscale 3D printing is a technique that can fabricate tiny structures a thousand times faster than conventional two-photon lithography (TPL) techniques, without sacrificing resolution.
- Nanotechnology
The manipulation of matter at the nanoscale, which is typically between 1 and 100 nanometers. Nanotechnology has potential applications in fields like medicine, electronics, and energy.
- Natural Language Generation
Natural-language generation (NLG) is a software process that transforms structured data into natural language. It can be used to produce long form content for organizations to automate custom reports, as well as produce custom content for a web or mobile application. It can also be used to generate short blurbs of text in interactive conversations (a chatbot) which might even be read out by a text-to-speech system.
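A toy sketch of the core idea, turning one structured record into a sentence with a template; real NLG systems add grammar handling, aggregation, and variation on top of this (the field names and figures are invented):

```python
# Structured data in, a natural-language sentence out.
record = {"region": "EMEA", "quarter": "Q3", "revenue_m": 14.2, "change_pct": 8}

def to_sentence(r):
    direction = "grew" if r["change_pct"] >= 0 else "declined"
    return (f"In {r['quarter']}, revenue in {r['region']} {direction} "
            f"{abs(r['change_pct'])}% to ${r['revenue_m']}M.")

print(to_sentence(record))
# In Q3, revenue in EMEA grew 8% to $14.2M.
```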
- Natural Language Processing
Natural-language processing (NLP) technology involves the ability to turn text or audio speech into encoded, structured information, based on an appropriate ontology. The structured data may be used simply to classify a document, as in “this report describes a laparoscopic cholecystectomy,” or it may be used to identify findings, procedures, medications, allergies and participants.
- Natural Language Query
Natural Language Query enables users to query data using business terms that are either typed into a search box or spoken.
- Neobank
A neobank (also known as an online bank, internet-only bank, or digital bank) is a type of direct bank that operates exclusively online without traditional physical branch networks.
- Net Neutrality
Net neutrality is the principle that internet service providers (ISPs) should treat all data on the Internet equally. They should not differentiate or charge differently by user, content, website, platform, application, type of attached equipment, or method of communication.
- Network Attached Storage (NAS)
A network-attached storage (NAS) is a server for computer data storage that operates at the file level, rather than the block level. It is connected to a computer network and enables data access to a diverse range of clients. The term “NAS” can encompass both the technology and systems involved, or specifically refer to a dedicated device designed for this purpose.
- Network Traffic Analysis
Network traffic analysis (NTA) is the process of intercepting, recording and analyzing network traffic communication patterns in order to detect and respond to security threats.
- Network as a Service (NaaS)
Network as a Service (NaaS) provides businesses with a scalable method of supporting, maintaining, expanding, and securing today’s commercial networks. NaaS offerings deliver a fully managed networking solution with cloud integration, security, switching, Wi-Fi, management, monitoring, and SD-WAN.
- Network functions virtualization (NFV)
Network functions virtualization (NFV) is an initiative to virtualize network services traditionally run on proprietary, dedicated hardware. With NFV, functions like routing, load balancing, and firewalls are packaged as a Virtual Machine on commodity hardware. Individual virtual network functions, or VNFs, are an essential component of NFV architecture.
- Neural nets
Neural nets are a means of doing machine learning, in which a computer learns to perform some task by analyzing training examples. Usually, the examples have been hand-labeled in advance. An object recognition system, for instance, might be fed thousands of labeled images of cars, houses, coffee cups, and so on, and it would find visual patterns in the images that consistently correlate with particular labels.
- Neuralink
Neuralink is a company developing implanted brain-machine interfaces that allow users to control computers or other devices using their thoughts. The technology has the potential to enable highly immersive and intuitive interactions with digital environments.
- Neurolinguistic Programming
Neurolinguistic Programming – a technique used to understand and influence human behavior by studying the relationship between language, behavior, and thought.
- Neuromorphic Computing
A type of computing that uses hardware that is designed to mimic the structure and function of the human brain. Neuromorphic computing is still in the experimental stages but has the potential to lead to significant advances in areas like artificial intelligence and robotics.
- Neuromorphic Hardware
Neuromorphic hardware uses specialized computing architectures that reflect the structure (morphology) of neural networks from the bottom up: dedicated processing units emulate the behavior of neurons directly in hardware, and a web of physical interconnections (bus-systems) facilitate the rapid exchange of information.
- Next-Generation Memory
Next-generation memory is a common label for emerging memory technologies that offer a major upgrade over established hardware. The next-generation memory market has grown over the last few years because of the increasing demand for faster, more efficient, and cost-effective memory solutions. Emerging big data and artificial intelligence (AI) applications, including machine learning (ML), drive innovation across many industries, and this is accompanied by new memory technologies for end users in sectors such as BFSI and consumer electronics.
- Next-generation digital learning NGDLE
The NGDLE is a framework outlined in a research report by EDUCAUSE. The NGDLE is an ecosystem of interconnected and customizable applications that support digital learning through five aspects: interoperability; personalization; analytics, advising, and learning assessment; collaboration; and accessibility and universal design in order to establish cost and time-effective quality transfer of knowledge to students and employees.
- NoOps
NoOps (no operations) is the concept that an IT environment can become so automated and abstracted from the underlying infrastructure that there is no need for a dedicated team to manage software in-house.
- NoSQL
NoSQL is an approach to database design that can accommodate a wide variety of data models, including key-value, document, columnar and graph formats. NoSQL, which stands for “not only SQL,” is an alternative to traditional relational databases in which data is placed in tables and data schema is carefully designed before the database is built. NoSQL databases are especially useful for working with large sets of distributed data.
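A toy illustration of the document model in plain Python: records are schema-less and keyed by ID rather than forced into predefined columns (the field names are invented):

```python
# A tiny in-memory "collection": document_id -> document.
store = {}

store["u1"] = {"name": "Ana", "email": "ana@example.com"}
store["u2"] = {"name": "Bo", "tags": ["admin", "beta"]}  # different fields are fine

# Query by matching fields instead of SQL over fixed columns.
admins = [doc for doc in store.values() if "admin" in doc.get("tags", [])]
print(admins)
```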
- Node.js
Node.js is an open-source, server-side runtime environment and library that is used for easily building fast and scalable network applications outside the client’s browser.
- Non-fungible tokens or NFTs
Non-fungible tokens or NFTs are cryptographic assets on a blockchain with unique identification codes and metadata that distinguish them from each other. Unlike fungible tokens such as cryptocurrencies, which are identical to each other and can therefore be used as a medium for commercial transactions, NFTs cannot be traded or exchanged at equivalency.
- NVMe-oF
NVMe over Fabrics, also known as NVMe-oF and non-volatile memory express over fabrics, is a protocol specification designed to connect hosts to storage across a network fabric using the NVMe protocol.
The protocol is designed to enable data transfers between a host computer and a target solid-state storage device or system over a network, accomplished through NVMe message-based commands. Data can be transferred over transports such as Ethernet, Fibre Channel (FC) or InfiniBand.
- Object Storage
Object storage, also called object-based storage, is an approach to addressing and manipulating data storage as discrete units, called objects. Objects are kept inside a single repository and are not nested as files inside a folder inside other folders.
- Offshore Dedicated Team®
Offshore Dedicated Team® is a progressive delivery model, where Intetics creates a dedicated team, engaging the relevant software engineers and experts to achieve your product goals by using relevant technology, processes, and tools based on your product specifications for the entire duration of the project.
- Online Learning
It means that the majority of the coursework will be done through the internet — through forums, shared documents, email, chat, etc. However, it doesn’t imply that there is no face-to-face interaction between students and the teacher. It can include doing coursework in a classroom or at a distance, so long as communication is primarily online.
- Open Source Software (OSS)
Open Source Software (OSS) is code that is designed to be publicly accessible. Anyone can see, modify, and distribute the code as they see fit. OSS is developed in a decentralized and collaborative way, relying on peer review and community production rather than a single author or company. OSS is often cheaper, more flexible, and has more longevity than its proprietary peers.
- Optical Character Recognition (OCR)
Optical character recognition or optical character reader (OCR) is the electronic or mechanical conversion of images of typed, handwritten or printed text into machine-encoded text, whether from a scanned document, a photo of a document, a scene-photo (for example the text on signs and billboards in a landscape photo) or from subtitle text superimposed on an image (for example from a television broadcast).
- Order to cash (OTC or O2C)
Order to cash (OTC or O2C) is a set of business processes that involve receiving and fulfilling customer requests for goods or services. It is a top-level, or context-level, term used by management to describe the finance-related component of customer sales. Other context-level business processes include marketing to lead, procure to pay (P2P), hire to retire, and concept to launch and sustain and retain.
- OS Containers
OS containers are virtual environments that share the kernel of the host operating system but provide user space isolation. For all practical purposes, you can think of OS containers as VMs. You can install, configure and run different applications, libraries, etc., just as you would on any OS. Just as a VM, anything running inside a container can only see resources that have been assigned to that container.
- Peer-to-peer (P2P)
A peer-to-peer (P2P) service is a decentralized platform whereby two individuals interact directly with each other, without intermediation by a third party. Instead, the buyer and the seller transact directly with each other via the P2P service.
- Persistent Memory Server
New persistent-memory technologies will help reduce costs and complexity of adopting in-memory computing (IMC)-enabled architectures. Persistent memory represents a new memory tier between DRAM and NAND flash memory that can provide cost-effective mass memory for high-performance workloads. It has the potential to improve application performance, availability, boot times, clustering methods, and security practices while keeping costs under control. It will also help organizations reduce the complexity of their application and data architectures by decreasing the need for data duplication.
- Personalization
Personalization is the concept of customizing the information presented to a user of a product. It is often applied by large companies in the form of targeted ads or recommendation tools, such as Google Ads, Amazon’s product recommendations, or Facebook’s friend recommendations. To personalize a product, companies collect and analyze large amounts of data about how you use it and tailor your experience accordingly.
- Personification
Personification is the delivery of relevant digital experiences to individuals based on their inferred membership in a defined customer segment, rather than their personal identity.
- Physical Augmentation
Physical augmentation alters the person’s inherent physical capability by implanting or hosting a technology within their bodies.
- Platform as a Service (PaaS)
Platform as a service (PaaS) is a cloud computing model in which a third-party provider delivers hardware and software tools — usually those needed for application development — to users over the internet. A PaaS provider hosts the hardware and software on its own infrastructure. As a result, PaaS frees developers from having to install in-house hardware and software to develop or run a new application.
- Point of Interest
A Point of Interest is a data set containing GPS coordinates that represent important locations in navigation applications. It is a quick, easy and accurate way to populate a mapping project with important places, features, buildings, and landmarks.
- Polymer.js
Polymer.js is an open-source JavaScript library created by Google that allows the reuse of HTML elements for building applications with components.
- Positioning, Navigation, and Timing (PNT)
Positioning, Navigation, and Timing (PNT) refers to the combined capability to determine location, plan and follow routes, and keep accurate time. When PNT is used in combination with map data and other information (weather or traffic data, for instance), the result is the most popular and recognizable service: modern satellite navigation, best known through the Global Positioning System (GPS).
- Powered Exosuit
A powered exoskeleton (also known as power armor, powered armor, powered suit, exoframe, hardsuit, or exosuit) is a wearable mobile machine that is powered by a system of electric motors, pneumatics, levers, hydraulics, or a combination of technologies that allow for limb movement with increased strength and endurance. Its design aims to provide back support, sense the user’s motion, and send a signal to motors which manage the gears. The exoskeleton supports the shoulder, waist and thigh, and assists movement for lifting and holding heavy items, while lowering back stress.
- Predictive Analytics
Predictive analytics encompasses a variety of statistical techniques from data mining, predictive modeling, and machine learning, that analyze current and historical facts to make predictions about future or otherwise unknown events.
- Predictive Software Engineering (PSE)
Predictive Software Engineering (PSE) is a framework that addresses bottlenecks of custom software product development. It reconstructs a reliable approach to delivering software development services.
- Prescriptive Analytics
Prescriptive Analytics is a form of advanced analytics that examines data or content to answer the question “What should be done?” or “What can we do to make it happen?”, and is characterized by techniques such as graph analysis, simulation, complex event processing, neural networks, recommendation engines, heuristics, and machine learning.
- Principle of Least Privilege (PoLP)
Principle of Least Privilege (PoLP), also known as the principle of minimal privilege or the principle of least authority, requires that in a particular abstraction layer of a computing environment, every module (such as a process, a user, or a program, depending on the subject) must be able to access only the information and resources that are necessary for its legitimate purpose. In practice, least privilege means that a security-minded IT engineer assigns users or resources only the minimal privileges required for them to do their jobs.
- Private Cloud
A private cloud is a cloud architecture used exclusively by a single organization where the services and infrastructure are maintained on a private network.
- Private Cloud dbPaaS
Private cloud dbPaaS offerings merge the isolation of private cloud database platforms with the self-service and scalability of the public cloud. Private cloud dbPaaS can play the role of a transition technology as organizations develop their long-term cloud strategy. Private cloud dbPaaS is an option for organizations that are unable or not ready to move to public cloud offerings, due to security, regulatory or other concerns.
- Product-led growth
Product-led growth is a business strategy in which a company focuses on building a high-quality product that drives user acquisition, retention, and revenue growth, rather than relying on traditional sales and marketing techniques.
- Programmable Matter
Programmable Matter – a concept that refers to a material or substance that can change its physical properties, structure, or shape in response to external stimuli or instructions. This could potentially enable the creation of objects or devices that can self-assemble, self-repair, or even change their function or appearance on the fly.
- Proof of Stake (PoS)
An alternative consensus protocol, in which an individual or “validator” uses their own cryptocurrency to validate transactions or blocks. Validators “stake” their cryptocurrency, such as ether, on whichever transactions they choose to validate. If the individual validates a block (group of transactions) correctly then the individual receives a reward. Typically, if a validator verifies an incorrect transaction then they lose the cryptocurrency that they staked. Proof of Stake requires a negligible amount of computing power compared to Proof of Work consensus (see also Hybrid ConsenSys Model).
- Proof of Work (PoW)
A protocol for establishing consensus across a system that ties mining capability to computational power. Hashing a block, which is in itself an easy computational process, now requires each miner to solve for a set, difficult variable. In effect, the process of hashing each block becomes a competition. This addition of solving for a target increases the difficulty of successfully hashing each block. For each hashed block, the overall process of hashing will have taken some time and computational effort. Thus, a hashed block is considered Proof of Work (see also Hybrid ConsenSys Model).
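The sketch below shows the idea in miniature: search for a nonce whose hash meets a difficulty target. Real networks use binary targets and vastly higher difficulty; the block contents here are invented:

```python
# Toy proof-of-work: find a nonce so the block's SHA-256 hash starts
# with a set number of zero hex digits.
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce, digest  # this (nonce, hash) pair is the proof of work
        nonce += 1

nonce, digest = mine("block #1: Alice pays Bob 5")
print(nonce, digest)
```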
- Protecting AI-Powered Systems
Application leaders should closely monitor ML algorithms and the data they consume. This will help them determine whether there are existing or potential corruption (“poisoning”) issues. Once data is poisoned, the manipulation is very likely to compromise data-driven decisions, for which data quality, integrity, confidentiality and privacy are an absolute must. There are five phases of ML pipelines that require protection: data ingestion; preparation and labeling; model training; inference validation; and production deployment. There are several risks associated with each phase, and enterprises need to ensure that they are well prepared and armed with knowledge.
- Public cloud
The public cloud is a cloud architecture that is owned and managed by a cloud service provider and whose services users access over the internet, typically through web browsers or APIs.
- Python
Python is a general-purpose, high-level, interpreted object-oriented programming language popularly used for website development, data analytics, and automation.
- Quantified Self
The Quantified Self is a self-tracking and measurement movement geared toward gathering data about aspects of your daily life using technology such as applications and wearable smart devices, for example smart-watch activity trackers or sleep monitors. The collected data is then used to optimize your behavior.
- Quantum Computing
Quantum computing is an area of study focused on the development of computer technologies based on the principles of quantum theory, which explains the nature and behavior of energy and matter at the quantum (atomic and subatomic) level. Quantum computers use quantum bits (qubits) to perform specific computational tasks with much higher efficiency than classical computers. The development of quantum computers marks a leap forward in computing capability, with massive performance gains for specific use cases, particularly computation-heavy processes such as simulations.
- Quantum Key Distribution (QKD)
QKD is a technique for secure communication that uses quantum mechanics to transmit cryptographic keys between two parties.
- Ransomware
Ransomware refers to malware that is designed to deny a user or organization access to files on their computer. By encrypting these files, the malware holds sensitive data hostage until a ransom has been paid.
- React
React is a declarative, efficient, and flexible JavaScript library for building user interfaces. It lets you compose complex UIs from small and isolated pieces of code called “components”.
- Real-time Payments
See Instant Payments.
- Real-time data (RTD)
Real-time data (RTD) is information that is delivered immediately after collection with no delay in the timeliness of the information provided. Real-time data is often used for navigation or tracking.
- Regtech
RegTech (regulatory technology) is a class of software applications for managing regulatory compliance. Companies invest in RegTech as a way to save time and money, allowing resources that were once devoted to regulatory compliance to be diverted elsewhere, once RegTech takes over those functionalities.
- Reinforcement Learning
Reinforcement learning is a training method based on rewarding desired behaviors and/or punishing undesired ones. The learning method has been adopted in artificial intelligence (AI) as a method of directing unsupervised machine learning through rewards and penalties. Reinforcement learning is used in operations research, information theory, game theory, control theory, simulation-based optimization, multi-agent systems, swarm intelligence, statistics, and genetic algorithms.
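A minimal sketch of the reward-driven update at the heart of one common method, Q-learning; the states, actions and reward values are invented for illustration:

```python
# Q-learning keeps an estimate Q[(state, action)] of expected future reward
# and nudges it toward "reward now + discounted best future reward".
from collections import defaultdict

alpha, gamma = 0.1, 0.9            # learning rate, discount factor
Q = defaultdict(float)

def update(state, action, reward, next_state, actions=("left", "right")):
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])

update("s0", "right", reward=1.0, next_state="s1")
print(Q[("s0", "right")])  # 0.1 after one rewarded step
```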
- Remote In-Sourcing®
Remote In-Sourcing® is a new business model that represents an innovative approach to outsourcing practices that goes beyond traditional ODT. Essentially, it allows providers to build custom, specialized teams that perfectly match clients’ sourcing needs while keeping costs and risks low and talent availability high.
- Restaurant and Kitchen Automation
Restaurant and Kitchen Automation is broadly defined as the process of using robots or automated equipment that is designed for cooking food with precision while ensuring the utmost quality and hygiene.
- Rise of digital marketplaces
The rise of digital marketplaces refers to the increasing popularity and importance of online platforms that connect buyers and sellers, offering a range of products and services, and enabling transactions and interactions between them.
- Robotic Process Automation Software
Robotic process automation (RPA) tools perform “if, then, else” statements on structured data, typically using a combination of user interface (UI) interactions or by connecting to APIs to drive client servers, mainframes, or HTML code. An RPA tool operates by mapping a process in the RPA tool language for the software “robot” to follow, with runtime allocated to execute the script by a control dashboard.
- Robotic process automation (RPA)
Robotic process automation (RPA) is a productivity tool that allows a user to configure one or more scripts (which some vendors refer to as “bots”) to activate specific keystrokes in an automated fashion. The result is that the bots can be used to mimic or emulate selected tasks (transaction steps) within an overall business or IT process. These may include manipulating data, passing data to and from different applications, triggering responses, or executing transactions. RPA uses a combination of user interface interaction and descriptor technologies. The scripts can overlay on one or more software applications.
- Robust and scalable
Robust and scalable means building software that keeps working correctly under errors and unexpected inputs (robustness) and can handle growing workloads or user numbers without degrading (scalability).
- SASE
Secure Access Service Edge, also known as SASE (pronounced “sassy”) – is a cloud architecture model that bundles network and security-as-a-service functions together and delivers them as a single cloud service. SASE allows organizations to unify their network and security tools in a single management console. This provides a simple security and networking tool that is independent of where employees and resources are located.
- Scalability
Scalability is the measure of a system’s ability to increase or decrease in performance and cost in response to changes in application and system processing demands.
- SEO strategy
An SEO strategy is a process of organizing a website’s content by topic to improve the likelihood of appearing in search results. Essentially, it is the process you follow in order to maximize the opportunity to gain organic traffic from search engines.
- Security Orchestration, Automation and Response (SOAR)
SOAR refers to technologies that enable organizations to collect inputs monitored by the security operations team. For example, alerts from the SIEM system and other security technologies — where incident analysis and triage can be performed by leveraging a combination of human and machine power — help define, prioritize and drive standardized incident response activities. SOAR tools allow an organization to define incident analysis and response procedures in a digital workflow format.
- Security information and event management (SIEM)
SIEM is a solution that helps organizations detect, analyze, and respond to security threats before they harm business operations.
- Self-Sovereign Identity (SSI)
SSI is a decentralized digital identity system that allows individuals to own and control their identity data, without relying on centralized authorities.
- Self-service business intelligence (BI)
Self-service business intelligence (BI) is an approach to data analytics that enables business users to access and explore data sets even if they don’t have a background in BI or related functions like data mining and statistical analysis.
- Sensor Fusion
Sensor fusion is the combining of sensory data, or data derived from disparate sources, such that the resulting information has less uncertainty than would be possible if these sources were used individually.
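A minimal sketch of one simple fusion rule, inverse-variance weighting of two noisy estimates of the same quantity; the sensor readings are invented:

```python
# Weight each estimate by the inverse of its variance; the fused variance
# is lower than either input, i.e. uncertainty is reduced.
def fuse(est_a, var_a, est_b, var_b):
    w_a, w_b = 1 / var_a, 1 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1 / (w_a + w_b)
    return fused, fused_var

# e.g. GPS says 10.2 m (variance 4.0), an odometer says 9.6 m (variance 1.0)
print(fuse(10.2, 4.0, 9.6, 1.0))  # (9.72, 0.8) -- less uncertain than either sensor
```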
- Sensory Augmentation
Sensory augmentation involves hearing, vision, and perception. Examples include virtual reality (VR), augmented reality (AR), and mixed reality (MR). We are witnessing several companies delve into human augmentation, whether it be smart contact lenses that are able to detect glucose levels in tears and intraocular pressure, or an “electronic nose” that mimics the human nose.
- Sentiment analysis
Sentiment analysis, also referred to as opinion mining, is an approach to natural language processing (NLP) that identifies the emotional tone behind a body of text. This is a popular way for organizations to determine and categorize opinions about a product, service, or idea.
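A toy lexicon-based scorer illustrates the idea; production systems use trained NLP models rather than a handful of hand-picked words:

```python
# Count positive and negative words to classify the tone of a sentence.
POSITIVE = {"great", "love", "excellent", "good"}
NEGATIVE = {"bad", "terrible", "hate", "slow"}

def sentiment(text: str) -> str:
    words = [w.strip(".,!?") for w in text.lower().split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("The new release is great, but support is slow"))  # neutral
```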
- Serverless Computing
Serverless computing is an event-driven application design and deployment paradigm in which computing resources are provided as scalable cloud services. In traditional application deployments, the server’s computing resources represent fixed and recurring costs, regardless of the amount of computing work that is actually being performed by the server. In a serverless computing deployment, the cloud customer only pays for service usage; there is never any cost associated with idle capacity or downtime.
- Servware
- Shadow AI
Shadow AI refers to AI tools, models and services used within an organization outside the ownership or control of the IT organization, by analogy with shadow IT (IT devices, software and services outside the ownership or control of IT organizations).
- Shift left
Shift left, also referred to as “test earlier” or “start testing earlier,” is an approach to software development that aims to shift the focus of quality assurance testing from the end of the software development life cycle towards the beginning. The goal is to prevent defects or errors in a product before they are deployed and to save time and money.
- Sidechain
A sidechain is a separate blockchain that acts as an extension to the parent blockchain, often referred to as the main chain. To create a sidechain it is necessary to introduce an off-chain process that will propagate the data between both blockchains. This will allow for the transfer of the assets or synchronization of any kind of data between the blockchains.
- Silicon Anode Batteries
Silicon anode batteries are an extension of widely used lithium ion (Li-Ion) batteries. Early generation Li-Ion batteries used lithium as the anode material. This was replaced with carbon/graphite following a number of widely reported overheating and explosion incidents. Next-generation Li-Ion batteries are likely to make use of silicon anodes that utilize silicon nanotubes, or a comparable coating process. This will result in significantly higher energy storage and longer battery life.
- Simulated Reality
A technology that combines elements of VR, AR, and AI to create a completely immersive and interactive virtual environment that simulates reality. Simulated reality is often used in training and simulation applications.
- Single-page application (SPA)
A single-page application (SPA) is a web app implementation that loads only a single web document and then updates the body content of that single document via JavaScript, using APIs such as XMLHttpRequest and Fetch, when different content is to be shown.
- Site Reliability Engineering (SRE)
Site Reliability Engineering (SRE) is a software engineering approach to IT operations that helps to improve the reliability of a system today, while also improving it as it grows over time. SRE teams use software as a tool to manage systems, solve problems, and automate operations tasks.
- SMAC (social, mobile, analytics and cloud)
SMAC (social, mobile, analytics and cloud) is the concept that the convergence of four technologies is currently driving business innovation. SMAC is the basis for an ecosystem that enables a business to transition from e-business to digital business.
- Small data
Small data is data in a volume and format that makes it accessible, informative, and actionable. Small data typically provides information that answers a specific question or addresses a specific problem.
- Smart Assets
Smart assets are unique virtual currency tokens that may represent a tangible real-world asset or non-tangible ownership that can be purchased, sold, or exchanged as defined by the rules of smart contracts on the blockchain network.
- Smart Contract
A smart contract, also known as a cryptocontract, is a computer program that directly controls the transfer of digital currencies or assets between parties under certain conditions. A smart contract not only defines the rules and penalties related to an agreement in the same way that a traditional contract does, but it can also automatically enforce those obligations.
- Smart Dust
In nanotechnology, smart dust is an ad hoc network of tiny devices equipped with wireless micro-electromechanical sensors (MEMS).
- Smart Home Devices
Smart home devices are connected devices that can operate in conjunction with other devices in the home and communicate information to other smart devices.
- Smart Logistics
Smart logistics makes supply chains more effective and efficient at each step. With connected devices in the supply chain and intelligent asset tracking tools, smart logistics brings end-to-end visibility and improves the way companies transport goods, control inventory and mobile assets, replenish stock, and manage the retail experience. For example, sensor-driven asset tracking tools can provide insight at each stage of shipment, tracking moisture, heat, and vibration to give companies real-time visibility as a product moves through the supply chain.
- Smart Manufacturing
See Industry 4.0.
- Smart Matter
See Smart Dust.
- Smart Robot
A smart robot is an artificial intelligence (AI) system that can learn from its environment and its experience and build on its capabilities based on that knowledge.
- Smart Robotics
Smart robotics is a technology comprising robots that can collaborate with human beings, learn from their environment and experience, and refine or modify their actions accordingly.
- Smart Speaker
A smart speaker is a wireless and smart audio playback device that uses several types of connectivity for additional functions. Smart speakers have special features to enhance ease of use, connect to multiple types of audio sources and provide additional functionality.
- Smart energy
Smart energy is the process of using devices for energy efficiency. It focuses on powerful, sustainable renewable energy sources that promote greater eco-friendliness while reducing costs.
- Smishing
Smishing, also known as SMS phishing, is a type of phishing cybersecurity attack carried out over mobile text messaging that relies on exploiting human trust rather than technical exploits. SMS phishing can be assisted by malware or fraudulent websites.
- Social Robots
Social Robots use artificial intelligence to decide how to act on information received through cameras and other sensors. The ability to respond in ways that seem lifelike has been informed by research into such issues as how perceptions form, what constitutes social and emotional intelligence, and how people can deduce others’ thoughts and feelings. Advances in AI have enabled designers to translate such psychological and neuroscientific insights into algorithms that allow robots to recognize voices, faces, and emotions, interpret speech and gestures, respond appropriately to complex verbal and nonverbal cues, make eye contact, speak conversationally, and adapt to people’s needs by learning from feedback, rewards, and criticisms.
- Software as a Service (SaaS)
Software as a service (SaaS) is a software distribution model in which a third-party provider hosts applications and makes them available to customers over the Internet. SaaS is one of three main categories of cloud computing, alongside infrastructure as a service (IaaS) and platform as a service (PaaS).
- Software-Defined Cloud Interconnect
Software-defined cloud interconnects (SDCI) provide secure, automated connectivity to any infrastructure-as-a-service (IaaS) environment. SDCI uses dispersed sites (PoPs) and SD-WANs to interconnect sites and connect sites to cloud infrastructures.
- Software-Defined Services
A software-defined application services platform provides load balancing, security, and application analytics services across a variety of application environments. Software-defined application services are more flexible and efficient than hardware appliances or virtual load balancers.
- Software-Defined Storage (SDS)
Software-defined storage (SDS) is a computer program that manages data storage resources and functionality and has no dependencies on the underlying physical storage hardware.
- Software-Defined WAN (SD-WAN)
Software-defined WAN (SD-WAN) is technology that uses software-defined networking (SDN) concepts to distribute network traffic across a wide area network (WAN). An SD-WAN follows configured policies to automatically determine the most effective way to route application traffic between branch offices and data center sites.
- Solid-State Arrays
Solid State Arrays, or SSAs, are shared storage device resources comprised of solid state drives (SSDs). Solid State Arrays typically provide lower latency and better longevity compared to hard disk drive (HDD) storage, and SSAs have become increasingly popular in data centers as a result of taking up far less space, utilizing less energy (a tenth of the energy compared to hard drives), and providing much better reliability relative to HDD storage.
- Spark
Spark is a general-purpose distributed data processing engine that is suitable for use in a wide range of circumstances. On top of the Spark core data processing engine, there are libraries for SQL, machine learning, graph computation, and stream processing, which can be used together in an application. Programming languages supported by Spark include: Java, Python, Scala, and R. Application developers and data scientists incorporate Spark into their applications to rapidly query, analyze, and transform data at scale. Tasks most frequently associated with Spark include ETL and SQL batch jobs across large data sets, processing of streaming data from sensors, IoT, or financial systems, and machine learning tasks.
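A minimal word-count sketch, assuming the pyspark package and a local Spark runtime are available; the input lines and application name are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("word-count-sketch").getOrCreate()
lines = spark.sparkContext.parallelize(["to be or not to be", "to do is to be"])

counts = (lines.flatMap(lambda line: line.split())   # split lines into words
               .map(lambda word: (word, 1))          # pair each word with 1
               .reduceByKey(lambda a, b: a + b))     # sum counts per word in parallel

print(dict(counts.collect()))
spark.stop()
```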
- Spatial Computing
A term used to describe the integration of virtual and physical environments, enabling users to interact with digital content as if it were part of the physical world. Spatial computing is often used in AR and MR applications.
- Spectrum Sensing
Spectrum Sensing is a wireless communications technique that is able to pick a free channel for transmission.
- Speech Recognition
Speech recognition is the ability of a machine or program to identify words and phrases in spoken language and convert them to a machine-readable format. Rudimentary speech recognition software has a limited vocabulary of words and phrases, and it may only identify these if they are spoken very clearly. More sophisticated software has the ability to accept natural speech.
- Spoofing
Spoofing is when an attacker disguises a communication so that it appears to come from a known, trusted source. A common spoofing scenario happens when an email is sent from a fake sender address, asking the recipient to provide sensitive data. Typically, the recipient is prompted to click on a link to log into their account and update personal and financial details.
- SQL-on-Hadoop
SQL-on-Hadoop is a class of analytical application tools that combine established SQL-style querying with newer Hadoop data framework elements.
- Stablecoin
Stablecoin is a fixed-price cryptocurrency whose market value is attached to another stable asset. Differing from normal cryptocurrencies, a stablecoin can be pegged to assets such as certain fiat currencies that can be traded on exchanges, including the U.S. dollar or the Euro. Some stablecoins can be pegged to other types of assets, including precious metals, such as gold, and even other cryptocurrencies.
- Static Application Security Testing (SAST)
Static application security testing (SAST) is a program designed to analyze application (app) source code in order to find security vulnerabilities or weaknesses that may open an app up to a malicious attack. Software developers have been using SAST for over a decade to find and fix flaws in app source code early in the software development life cycle (SDLC), before the final release of the app.
- Stonks
Stonk, a deliberate misspelling of stock (meaning “a share of the value of a company which can be bought, sold, or traded as an investment”), was coined in a 2017 meme. It is mostly used for humorous purposes to imply a vague understanding of financial transactions or poor financial decisions.
- Storage as a Service (STaaS)
Storage as a Service (STaaS) is the practice of using public cloud storage resources to store your data. Using STaaS is more cost-efficient than building private storage infrastructure, especially when you can match data types to cloud storage offerings.
- Strategic Portfolio Management
Strategic Portfolio Management (SPM) is the collection of processes and capabilities that enable an enterprise to align all execution with strategy and more effectively drive business agility. Strategic Portfolio Management provides an integrated approach to planning, funding and overseeing all discretionary investments, allowing every investment to be managed as part of the overall enterprise strategy. Strategic Portfolio Management is critical for today’s businesses because it provides a top-down approach to linking business strategies to business outcomes regardless of how those outcomes are achieved, across multiple ways of working (including the tri-modal reality) and multiple investment types: projects, programs, products and capabilities.
- Strong AI
- Sustainable Manufacturing
Sustainable manufacturing is the creation of manufactured products through economically-sound processes that minimize negative environmental impacts while conserving energy and natural resources. Sustainable manufacturing also enhances employee, community and product safety.
- Sustainable business
Sustainable business refers to a business model that focuses on creating long-term value for all stakeholders, including shareholders, employees, customers, and the environment, by integrating environmental, social, and governance (ESG) considerations into its operations and decision-making processes.
- Swarm Robotics
A field of robotics that focuses on the coordination of large numbers of robots to perform tasks in a collective and decentralized manner. Swarm robotics has potential applications in areas like agriculture, construction, and search and rescue.
- Synthetic Biology
A field of biology that uses engineering principles to design and build new biological systems and organisms with specific functions. Synthetic biology has potential applications in areas like biomanufacturing, agriculture, and healthcare.
- Synthetic Data
Synthetic data is information that’s artificially manufactured rather than generated by real-world events. Synthetic data is created algorithmically, and it is used as a stand-in for test datasets of production or operational data, to validate mathematical models and, increasingly, to train machine learning models.
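A minimal sketch, assuming NumPy is installed: the records below follow a chosen statistical pattern instead of being copied from real people, so they can stand in for production data in tests or model training:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 1_000
age = rng.integers(18, 80, size=n)                       # uniformly distributed ages
income = 20_000 + 600 * age + rng.normal(0, 5_000, n)    # income loosely tied to age

synthetic = np.column_stack([age, income])
print(synthetic[:3])  # artificial rows, no real-world individuals behind them
```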
- Synthetic Food
Synthetic food products are derived from chemically synthesized food substances. They are produced by new technological methods from such nutritive substances as proteins or their component amino acids, carbohydrates, fats, vitamins, and trace elements. Synthetic and artificial food products usually imitate natural food products in appearance, taste, and odor.
- System Security Plan (SSP)
SSP is a set of directives that provide an overview of the security requirements of a system and describe the controls (both those already in place and those planned), responsibilities, and expected behavior of all individuals who access the system.
- TETRA™
TETRA™ is a consultative service for in-depth software quality assessment. It is the result of 6K+ hours of original research conducted by Intetics. It is a unique system of metrics and measurements based on ISO standards and Intetics’ 29+ years of project expertise. The results are transparent, easy – and vital – for everyone on your team to understand, from top management to tech specialists.
- Techno-nationalism
Techno-nationalism is the policy of promoting or protecting domestic technology industries through government intervention or regulation, often for national security or economic competitiveness.
- Technological Unemployment
Technological Unemployment is the notion of humans losing jobs to machines.
- Techquilibrium
TechQuilibrium is the balance point where the enterprise has the right mix of traditional and digital capabilities and assets, to power the business model needed to compete most effectively, in an industry that is being digitally revolutionized.
- Testing
Testing is the process by which we find errors and bugs in software.
- Text Analytics
Text analytics is the automated process of translating large volumes of unstructured text into quantitative data to uncover insights, trends, and patterns. Combined with data visualization tools, this technique enables companies to understand the story behind the numbers and make better decisions.
- Text mining
The process of extracting information from collections of textual data and utilizing it for business objectives.
- The Digital Twin of an Organization (DTO)
It is predicated on using a digital representation of an organization (its business model, strategies etc.) to better plan and execute a business transformation initiative. The whole idea behind the digital twin concept, and the reason why it is so useful, is that it offers a virtual model that can be analyzed and tweaked more easily than the real thing. The new insights and efficiencies you uncover this way can in turn be used to improve the organization, replicate what works, or address issues before they turn into real-world problems.
- The internet of things, or IoT
The internet of things, or IoT, is a system of interrelated computing devices, mechanical and digital machines, objects, animals or people that are provided with unique identifiers (UIDs) and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction.
- Thin clients
A thin client (or lean client) is a virtual desktop computing model that runs on the resources stored on a central server instead of a computer’s resources. Normally thin clients take the form of low-cost computing devices that heavily rely on a server for computation. The term can also describe software applications that use the client-server model in which the server performs all the processing.
- Time Series DBMS
A Time Series DBMS is a database management system that is optimized for handling time series data: each entry is associated with a timestamp. For example, time series data may be produced by sensors, smart meters or RFIDs in the so-called Internet of Things, or may depict the stock tickers of a high frequency stock trading system.
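The toy sketch below uses SQLite from the Python standard library to show the access pattern (timestamped entries, range queries over time); dedicated time series DBMSs add compression, retention policies and time-based partitioning on top of this idea. The table and sensor names are invented:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE readings (ts TEXT, sensor TEXT, value REAL)")
db.execute("CREATE INDEX idx_ts ON readings (ts)")   # queries are driven by time

rows = [("2024-05-01T10:00:00", "meter-1", 3.2),
        ("2024-05-01T10:01:00", "meter-1", 3.4),
        ("2024-05-01T10:02:00", "meter-1", 3.1)]
db.executemany("INSERT INTO readings VALUES (?, ?, ?)", rows)

cur = db.execute("SELECT ts, value FROM readings WHERE ts BETWEEN ? AND ? ORDER BY ts",
                 ("2024-05-01T10:00:00", "2024-05-01T10:01:30"))
print(cur.fetchall())
```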
- Time of arrival (TOA or ToA)
Time of arrival (TOA or ToA) is the absolute time instant when a radio signal emanating from a transmitter reaches a remote receiver. The time span elapsed since the time of transmission (TOT or ToT) is the time of flight (TOF or ToF). Time difference of arrival (TDOA) is the difference between TOAs.
- Token
A Token represents an asset built on an existing blockchain (different from a coin). Tokens are designed to be unique, liquid, secure, instantly transferable, and digitally scarce.
- Tokenization
Tokenization refers to a process by which a piece of sensitive data, such as a credit card number, is replaced by a surrogate value known as a token. The sensitive data still generally needs to be stored securely at one centralized location for subsequent reference and requires strong protections around it. The security of a tokenization approach depends on the security of the sensitive values and the algorithm and process used to create the surrogate value and map it back to the original value.
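A toy illustration of the surrogate-value idea in Python; a real token vault would add encryption, access control and auditing around this mapping, and the card number shown is a standard test value:

```python
import secrets

vault = {}  # token -> original sensitive value (must itself be stored securely)

def tokenize(card_number: str) -> str:
    token = "tok_" + secrets.token_hex(8)   # random surrogate, reveals nothing
    vault[token] = card_number
    return token

def detokenize(token: str) -> str:
    return vault[token]

t = tokenize("4111 1111 1111 1111")
print(t)                 # safe to pass to downstream systems
print(detokenize(t))     # only the vault can map it back
```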
- Transfer Learning
Transfer learning is the application of knowledge gained from completing one task to help solve a different, but related, problem. The development of algorithms that facilitate transfer learning processes has become a goal of machine learning technicians as they strive to make machine learning as human-like as possible.
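A minimal sketch, assuming PyTorch and torchvision are installed: features learned on ImageNet (the source task) are reused, and only a new output layer is trained for a hypothetical five-class target task:

```python
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")   # knowledge gained on the source task
for param in model.parameters():
    param.requires_grad = False                    # freeze the transferred features
model.fc = nn.Linear(model.fc.in_features, 5)      # new head for the related target task
# ...only model.fc is then trained on the (smaller) target dataset.
```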
- Transportation industry
Transportation industries refer to industries in the for-hire transportation and warehousing sector, such as air, rail, water, and truck transportation.
- Travel Tech
Travel tech is the use of IT and e-commerce solutions in the travel, tourism, and hospitality industry with the goal of automating travel, saving time, reducing costs, and creating a seamless travel experience for consumers, including before, during, and after a trip.
- uCPE
uCPE, or Universal Customer Premises Equipment, is a general-purpose platform that delivers multiple network functions, previously provided by dedicated physical appliances, virtually to any site on a network.
- Ultrasound Haptics
A technology that uses ultrasound waves to create haptic sensations on the user’s skin, enabling the user to “feel” virtual objects or textures. Ultrasound haptics can be used in conjunction with VR and AR applications to enhance the sense of immersion.
- Unified Endpoint Management
Unified endpoint management (UEM) is an approach to securing and controlling desktop computers, laptops, smartphones, and tablets in a connected, cohesive manner from a single console. Unified endpoint management typically relies on the mobile device management (MDM) application programming interfaces (APIs) in desktop and mobile operating systems.
- Unified Storage
Unified storage, sometimes called network unified storage or multiprotocol storage, is a storage system that makes it possible to run and manage files and applications from a single device.
- Unified communications as a service (UCaaS)
Unified communications as a service (UCaaS) is a cloud delivery model that offers a variety of communication and collaboration applications and services.
- Unmanned Aerial Vehicle (UAV)
UAV stands for unmanned aerial vehicle (or uncrewed aerial vehicle), commonly referred to as a drone: an aircraft piloted by remote control or by onboard computers.
- User Behavior Analytics
User behavior analytics (UBA) is the tracking, collecting and assessing of user data and activities using monitoring systems.
- User Experience (UX)
The User Experience (UX) is how users feel, the emotions they experience, and the ease of use they encounter when interacting with a product or service. It also covers the communication between the customer and the product, and the design of a service.
- User Interface (UI)
The User Interface (UI) comprises the parts of an application, website, computer, etc. that the user can manipulate. Everything from a website menu to buttons, and even keyboards or touchscreens, can be part of the User Interface (UI).
- Vehicle-to-Home (V2H) Charging
V2H is when a bidirectional EV charger is used to supply power from an electric vehicle's battery to a house or, possibly, another kind of building.
- Vibrotactile Feedback
A type of haptic technology that uses vibrations to create sensations on the skin, allowing users to feel physical feedback from digital environments. Vibrotactile feedback can enhance the sense of immersion in VR and AR applications.
- Virtual Agent
A virtual agent is a software program that uses scripted rules and, increasingly, artificial intelligence applications to provide automated service or guidance to humans.
- Virtual Assistant (VA)
A virtual assistant is a person who provides support services to a business from a remote location. Administrative tasks are most common for virtual assistants, but they may also assist with any other elements required by the business.
- Virtual Assistants
A virtual assistant is an application program that understands natural language voice commands and completes tasks for the user.
- Virtual Firewall
A virtual firewall is a firewall device or service that provides network traffic filtering and monitoring for virtual machines (VMs) in a virtualized environment. Like a traditional network firewall, a virtual firewall inspects packets and uses security policy rules to block unapproved communication between VMs. A virtual firewall is often deployed as software.
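As a rough illustration of the policy-rule idea (not any vendor's API), the sketch below filters east-west traffic between VMs using a first-match rule list with a default deny.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Rule:
    src_vm: str          # "*" matches any source VM
    dst_vm: str          # "*" matches any destination VM
    port: Optional[int]  # None matches any port
    action: str          # "allow" or "deny"

# Hypothetical policy for traffic between VMs on one virtualized host.
RULES = [
    Rule("web-vm", "db-vm", 5432, "allow"),  # only the web tier may reach the DB
    Rule("*",      "db-vm", None, "deny"),   # everything else to the DB is blocked
]

def filter_packet(src_vm: str, dst_vm: str, port: int) -> str:
    """Return the action of the first matching rule; default is deny."""
    for rule in RULES:
        if (rule.src_vm in ("*", src_vm)
                and rule.dst_vm in ("*", dst_vm)
                and rule.port in (None, port)):
            return rule.action
    return "deny"

print(filter_packet("web-vm", "db-vm", 5432))   # allow
print(filter_packet("test-vm", "db-vm", 5432))  # deny (unapproved east-west traffic)
```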
- Virtual Machine
A virtual machine (VM) is a software implementation of a hardware-like architecture, which executes predefined instructions in a fashion similar to a physical central processing unit (CPU). VMs can be used to create a cross-platform computing environment that loads and runs on computers independently of their underlying CPUs and operating systems.
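As a hedged, minimal illustration of "executing predefined instructions like a CPU", the toy stack-based VM below runs the same tiny bytecode program on any host, independently of the real CPU underneath. The instruction set is invented for this sketch.

```python
def run(program):
    """Toy stack-based virtual machine for an invented instruction set."""
    stack = []
    for op, *args in program:
        if op == "PUSH":      # push a constant onto the stack
            stack.append(args[0])
        elif op == "ADD":     # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":     # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":   # print the value on top of the stack
            print(stack[-1])
    return stack

# The same "bytecode" behaves identically on any host that runs the VM.
bytecode = [("PUSH", 2), ("PUSH", 3), ("ADD",), ("PUSH", 4), ("MUL",), ("PRINT",)]
run(bytecode)  # prints 20, i.e. (2 + 3) * 4
```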
- Virtual Private LAN Services (VPLS)
Virtual private LAN service (VPLS) is a telecom carrier-provided service that makes it possible for customers to create a logical local area network (LAN) structure between geographically separate sites. All services in a VPLS appear to be on the same LAN, regardless of location.
- Virtual Private Network (VPN)
VPN stands for "virtual private network": a service that creates an encrypted tunnel for your data, protects your online identity, and hides your IP address by redirecting your traffic through a specially configured remote server run by a VPN host. This means that if you surf online with a VPN, the VPN server becomes the source of your data, and your Internet Service Provider (ISP) and other third parties cannot see which websites you visit or what data you send and receive online.
- Virtual Reality (VR)
Virtual reality (VR) provides a computer-generated 3D environment (including both computer graphics and 360-degree video) that surrounds a user and responds to an individual’s actions in a natural way, usually through immersive head-mounted displays.
- Visual Data Discovery
Visual Data Discovery is the process of using visualizations and iterative visual exploration of data to improve time to insight. Visual data discovery improves business decision-making by enabling the exploration of data through interaction with charts and images. Processes such as in-memory computing and the mashing or fusing of multiple data sources allow visualizations to be robust and real-time, while the exponential growth of data has made their use essential in many organizations.
- Voice bots
Voice bots are software powered by artificial intelligence (AI) that allow a caller to navigate an interactive voice response (IVR) system with their voice, generally using natural language. Callers don’t have to listen to menus and press corresponding numbers on their keypads.
- Voice over Internet Protocol (VOIP)
Voice over Internet Protocol (VoIP) refers to telephone or other communication services delivered over the internet, for example Skype, WhatsApp, Google Hangouts, and Viber.
- Voice-as-User Interface (VUI)
Voice-as-User Interface (VUI) is the term encompassing voice commands, voice search, and Voice Assistant devices using voice as the primary means users interact with a computer system or device. For example, “Ok, Google” or “Hey Siri”.
- VPA-Enabled Wireless Speakers
VPA-enabled wireless speakers are cloud-enabled, far-field voice computing devices that connect the user to a virtual personal assistant (VPA) service.
- Vue.js
Vue.js is a progressive JavaScript framework, which is used to build simple or complex UIs (User Interfaces) and SPAs (Single-page Applications).
- Vulnerability Scanning
Vulnerability scanning is a process that proactively looks for known weaknesses in your network and flaws in systems and software running on them. This is an integral component of a vulnerability management program, which has one overarching goal – to protect the organization from breaches and the exposure of sensitive data.
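As a toy illustration only (real scanners such as OpenVAS or Nessus work against large, continuously updated vulnerability databases), the sketch below compares software versions discovered on a host against a small, entirely hypothetical advisory list.

```python
# Hypothetical advisories: product -> versions known to be vulnerable.
KNOWN_VULNERABLE = {
    "openssh": {"8.1", "8.2"},
    "nginx": {"1.18.0"},
}

# Pretend inventory gathered from a host during a scan (hypothetical data).
discovered = [("openssh", "8.2"), ("nginx", "1.24.0"), ("redis", "7.2.4")]

def scan(inventory):
    """Flag any discovered software whose version appears in the advisories."""
    findings = []
    for product, version in inventory:
        if version in KNOWN_VULNERABLE.get(product, set()):
            findings.append(f"{product} {version}: known vulnerable version")
    return findings

for finding in scan(discovered):
    print(finding)  # e.g. "openssh 8.2: known vulnerable version"
```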
- Wantrepreneur
A wantrepreneur is someone who has an idea for a business or product and would like to become an entrepreneur but never follows through on it, typically because of practical obstacles to achieving those aspirations.
- Wearable technology
Wearable technology is any kind of electronic device designed to be worn on the user’s body. Such devices can take many different forms, including jewelry, accessories, medical devices, and clothing or elements of clothing.
- Web 3.0
An inaccurate variant of the correct term Web3; see Web3.
- Web Real-Time Communication (Web RTC)
Web RTC is an open framework by the W3C (World Wide Web Consortium) for browser-to-browser real-time communications via voice calling, video chat, and P2P file sharing that does not require plugins or other extensions.
- Web1
Web1, commonly known as the World Wide Web, was the first iteration of the web and is often referred to as the "read-only web." It was characterized by static websites that displayed information, with little to no user interaction or user-generated content.
- Web2
Emerging in the 1990s, Web2, the "read-write web," is characterized by user-generated content and improved user interfaces. This led to the creation of blogs, social media platforms, and sites like Wikipedia and YouTube. Web2 placed more emphasis on user experience and interoperability between different applications and websites, giving us the vast network of connected websites and resources we are familiar with today.
- Web3
Web3 is the third generation of the evolution of web technologies. This next iteration of the web, which is being ushered in as we speak, leverages blockchain technology, open-source applications, and the decentralization of data and information. Web3 aims to remove control of the web from monopolistic tech companies and return data and content ownership to its users. Also referred to as the "read-write-trust web."
- Wi-Fi 6 (802.11ax)
Wi-Fi 6, also known as 802.11ax, is the latest generation and standard for wireless internet, replacing the 802.11ac, or Wi-Fi 5, standard. Prior to the release of Wi-Fi 6, Wi-Fi standards were identified by version numbers ranging from 802.11b to 802.11ac. However, the Wi-Fi Alliance decided to rename past standards, along with the latest iteration, using user-friendly naming conventions to reduce confusion and make device compatibility easier to identify. Wi-Fi 6 combines the high speed of gigabit Ethernet wireless with the reliability of licensed radio.
- Wide-Column DBMSs
See Column-oriented DBMS.
- Zero Trust Network Access
Zero trust network access (ZTNA) is a product or service that creates an identity- and context-based, logical access boundary around an application or set of applications. The applications are hidden from discovery, and access is restricted via a trust broker to a set of named entities. The broker verifies the identity, context and policy adherence of the specified participants before allowing access and prohibits lateral movement elsewhere in the network. This removes application assets from public visibility and significantly reduces the surface area for attack.
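A rough sketch of the trust-broker decision, with invented policy fields rather than any real product's API: access to a hidden application is granted only when identity, device context, and policy all check out, and everything else is denied by default.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str                # verified identity
    device_compliant: bool   # context, e.g. a managed and patched device
    application: str         # the hidden application behind the broker

# Hypothetical policy: which named identities may reach which application.
POLICY = {"payroll-app": {"alice", "bob"}}

def broker_decision(req: AccessRequest) -> bool:
    """Allow only named, compliant identities; deny everything else.
    The application is never exposed to discovery, only brokered access."""
    allowed_users = POLICY.get(req.application, set())
    return req.user in allowed_users and req.device_compliant

print(broker_decision(AccessRequest("alice", True, "payroll-app")))    # True
print(broker_decision(AccessRequest("mallory", True, "payroll-app")))  # False
print(broker_decision(AccessRequest("alice", False, "payroll-app")))   # False
```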
- 360-Degree Video
Video content that captures a complete 360-degree view of a scene, allowing viewers to explore the environment as if they were physically present. 360-degree video can be experienced through a VR headset or on a computer or mobile device.
- 3D Printing
3-D printing is a manufacturing process that builds layers to create a three-dimensional solid object from a digital model.
- 3D Volumetric Printing
It’s a resin-based process that doesn’t use lasers or even a layer-by-layer approach. Instead, a pattern is projected into a vat of transparent photopolymer liquid repeatedly from all angles. It’s much like a CT scan, except in reverse: the pattern is projected to form the object instead of scanning the object.
- 4D Printing
4D printing is the process through which a 3D-printed object transforms itself into another structure under the influence of an external energy input such as temperature, light, or other environmental stimuli.
- 4th Industrial Revolution (4IR)
The Fourth Industrial Revolution is the current and developing environment in which disruptive technologies and trends such as the Internet of Things (IoT), robotics, virtual reality (VR), and artificial intelligence (AI) are changing the way modern people live and work.
- 5G Networks
Fifth-generation wireless (5G) is the latest iteration of cellular technology, engineered to greatly increase the speed and responsiveness of wireless networks. With 5G, data transmitted over wireless broadband connections can travel at multigigabit speeds, with potential peak speeds as high as 20 gigabits per second (Gbps) by some estimates. These speeds exceed wireline network speeds and offer latency of 1 millisecond (ms) or lower for uses that require real-time feedback. 5G will also enable a sharp increase in the amount of data transmitted over wireless systems due to more available bandwidth and advanced antenna technology.
- 6G Networks
6G is the sixth generation of mobile communications, the implementation of which is expected in 2026-2030, based on telecommunications standards following 5G standards. In theory, 6G is planned to be significantly faster than 5G and will not only support the current range of mobile devices as 5G does, but also AR and VR applications.