3d chip stacking | The process of building integrated circuits with both horizontal and vertical interconnections between transistors. This brings elements of the chip physically closer together, increasing density ...
additive manufacturing | A computer-controlled process in which successive layers of material are deposited to create a part that matches a 3d design. |
adversarial machine learning | A broad collection of techniques used to exploit vulnerabilities across the entire machine learning stack and lifecycle. Adversaries may target the data sets, algorithms, or models that an ml syst...
ai assurance | The defensive science of protecting ai applications from attack or malfunction. |
ai digital ecosystem | A technology stack driving the development, testing, fielding, and continuous update of ai-powered applications. The ecosystem is managed as a multilayer collection of shared ai essential building...
ai governance | The actions to ensure stakeholder needs, conditions, and options are evaluated to determine balanced, agreed-upon enterprise objectives; setting direction through prioritization and decision-makin... |
ai lifecycle | The steps for managing the lifespan of an ai system: 1) Specify the system’s objective. 2) Build a model. 3) Test the ai system. 4) Deploy and maintain the ai system. 5) Engage in a feedback loop ...
ai stack | Ai can be envisioned as a stack of interrelated elements: talent, data, hardware, algorithms, applications, and integration.5 |
anonymization | Also referred to as data de-identification, this is the process of removing or replacing with synthetic values any identifiable information in data. This is intended to make it impossible to deriv...
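A minimal sketch of the idea in Python (the record fields, values, and replacement scheme below are illustrative placeholders, not drawn from the glossary): direct identifiers are dropped and replaced with synthetic tokens.

```python
import uuid

# Illustrative records; the fields and values are hypothetical.
records = [
    {"name": "Jane Doe", "ssn": "123-45-6789", "zip": "20301", "diagnosis": "flu"},
    {"name": "John Roe", "ssn": "987-65-4321", "zip": "22301", "diagnosis": "cold"},
]

def anonymize(record, direct_identifiers=("name", "ssn")):
    """Drop direct identifiers and substitute a synthetic token."""
    cleaned = {k: v for k, v in record.items() if k not in direct_identifiers}
    cleaned["record_id"] = uuid.uuid4().hex  # synthetic value, not derived from identity
    return cleaned

print([anonymize(r) for r in records])
```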
augmented reality | Enhanced digital content, spanning visual, auditory, or tactile information, overlaid onto the physical world.9
authorization to operate (ato) | The official management decision given by a senior o...
automation bias | An unjustified degree of reliance on automated systems or their outcomes. |
biometric technologies | Technologies that leverage physical or behavioral human characteristics that can be used to digitally identify a person and grant access to systems, devices, or data, such as face, voice, and gait... |
biosensors | Biosensors consist of three parts: a component that recognizes the analyte and produces a signal, a signal transducer, and a reader device.11
carbon nanotubes | - Nano-scale structures that can be used to make transistors and could potentially replace silicon transistors in the future. Compared to existing silicon transistors, carbon nanotube transistor...
cloud computing | - The act of running software within information technology environments that abstract, pool, and share scalable resources across a network.13
- A type of computing that uses groups of ...
cloud infrastructure | The components needed for cloud computing, which include hardware, abstracted resources, storage, and network resources.14
commonsense reasoning | The process of forming a conclusion based on the basic ability to perceive, understand, and judge things that are shared by (“common to”) most people and can reasonably be expected without need fo... |
computational thinking | The thought processes involved in formulating problems so their solutions can be represented as computational steps and algorithms.16 |
continuous integration | A process that aims to minimize the duration and effort required by “each” integration episode and deliver at any moment a product version suitable for release. In practice, this requires an integ...
data architecture | The structure of an organization’s logical and physical data assets and data management resources.20 |
de-anonymization | Matching anonymous data (also known as de-identified data) with publicly available information, or auxiliary data, in order to discover the individual to whom the data belong.23 (see anonymization... |
deep learning | - A machine learning implementation technique that exploits large quantities of data, or feedback from interactions with a simulation or the environment, as training sets for a network with mult...
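As an illustration of the "multiple internal layers" that distinguish deep learning, here is a minimal sketch of a small multi-layer network in PyTorch (the layer sizes and inputs are arbitrary placeholders, and the torch package is assumed to be installed):

```python
import torch
import torch.nn as nn

# A network with multiple internal (hidden) layers; sizes are arbitrary.
model = nn.Sequential(
    nn.Linear(10, 32), nn.ReLU(),   # hidden layer 1
    nn.Linear(32, 32), nn.ReLU(),   # hidden layer 2
    nn.Linear(32, 1),               # output layer
)

x = torch.randn(4, 10)              # a batch of 4 example inputs
print(model(x).shape)               # torch.Size([4, 1])
```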
deepfake | Computer-generated video or audio (particularly of humans) so sophisticated that it is difficult to distinguish from reality.24 Deepfakes have also been referred to as synthetic media.
deployed ai | Ai that has been fielded for its intended purpose within its relevant operational environment. |
devsecops | Enhanced engineering practices that improve the lead time and frequency of delivery outcomes, promoting a more cohesive collaboration between development, security, and operations teams as they wo... |
differential privacy | A criterion for a strong, mathematical definition of privacy in the context of statistical and machine learning analysis used to enable the collection, analysis, and sharing of a broad range of st... |
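One common building block behind this criterion is the Laplace mechanism, which adds calibrated noise to a query result so that any single individual's contribution is masked. A toy sketch follows; the data set, sensitivity, and epsilon value are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
ages = np.array([34, 29, 41, 52, 38])   # illustrative private data

def dp_count_over_40(data, epsilon=1.0):
    """Release a count with Laplace noise; the sensitivity of a counting query is 1."""
    true_count = int(np.sum(data > 40))
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

print(dp_count_over_40(ages))   # a noisy answer near the true count of 2
```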
digital ecosystem | The stakeholders, systems, tools, and enabling environments that together empower people and communities to use digital technology to gain access to services, engage with each other, and pursue mi... |
digital infrastructure | The foundational components that enable digital technologies and services. Examples of digital infrastructure include fiber-optic cables, cell towers, satellites, data centers, software platforms,...
domain-specific hardware architectures | Hardware that is specifically designed to fulfill certain narrow functions, seeking performance gains through specialization. |
edge computing | A distributed-computing paradigm that brings computation and data storage closer to the location where it is needed (i.e., the network edge where smart sensors, devices, and systems reside along w... |
explainability | - A characteristic of an ai system in which there is provision of accompanying evidence or reasons for system output in a manner that is meaningful or understandable to individual users (as well...
|
federated data repository | A virtual data repository that links data from distributed sources (e.g., other repositories), providing a common access portal for finding and accessing data.
field-programmable gate array (fpga) | ...
gallium nitride | An alternative material to silicon for transistors. Gallium nitride transistors feature higher electron mobility than silicon and are capable of faster switching speed, higher thermal conductivity...
homomorphic encryption | A technique that allows computation to be performed directly on encrypted data without requiring access to a secret key. The result of such a computation remains in encrypted form and can at a lat...
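A toy Paillier-style sketch in Python illustrates the additive case: multiplying two ciphertexts yields an encryption of the sum of the underlying plaintexts, without decrypting either input. The tiny primes are for illustration only; this is not usable cryptography.

```python
import math, random

# Toy Paillier keypair with tiny primes (illustration only, not secure).
p, q = 1009, 1013
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1
mu = pow(lam, -1, n)                       # valid because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return ((pow(c, lam, n2) - 1) // n * mu) % n

c1, c2 = encrypt(12), encrypt(30)
c_sum = (c1 * c2) % n2                     # homomorphic addition on ciphertexts
print(decrypt(c_sum))                      # 42, computed without decrypting c1 or c2
```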
information operations | The tactics, techniques, and procedures employed in both the offensive and defensive use of information to pursue a competitive advantage.43
internet of things (iot) | A global infrastructure for t...
intelligent sensing | Utilizing advanced signal processing techniques, data fusion techniques, intelligent algorithms, and ai concepts to better understand sensor data for better integration of sensors and better featu... |
interpretability | The ability to understand the value and accuracy of system output. Interpretability refers to the extent to which a cause and effect can be observed within a system o...
legacy systems | Outdated systems still in operation that are hard to maintain owing to a shortage of skill sets and obsolete architecture.47
machine learning (ml) | The study or the application of computer algorithm...
mlops | Enhanced engineering practices that combine ml model development and ml model operations technologies to support continuous integration and delivery of ml-based solutions.49 |
modeling and simulation | Modeling the physical world to support the study, optimization, and testing of operations through simulation without interfering with or interrupting ongoing processes. Modeling and simulation can be u...
multimodal data | Data comprising several signal or communication types, such as speech and body gestures during human-to-human communication. |
multi-party federated learning | An ml setting where many clients (e.g., mobile devices or whole organizations) collaboratively train a model under the orchestration of a central server (e.g., service provider) while keeping the ... |
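A minimal numpy sketch of the federated averaging idea (the client data, model, and number of rounds are placeholders): each client computes an update on its own local data, and only model parameters, never the raw data, are sent to the server, which averages them.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Each client holds its own local data, which never leaves the client.
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

def local_update(w, X, y, lr=0.1, steps=20):
    """One client's local gradient steps on a least-squares objective."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

w_global = np.zeros(2)
for _ in range(10):
    local_weights = [local_update(w_global, X, y) for X, y in clients]
    w_global = np.mean(local_weights, axis=0)   # server averages client models

print(w_global)   # approaches [2, -1]
```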
multi-source data | Data obtained and aggregated from different origins. |
generative adversarial networks (gan) | Two networks are trained in tandem: one is designed to be a generative network (the forger) and the other a discriminative network (the forgery detector). The objective is for each network to train and better itself off the other, r...
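A compact PyTorch sketch of this two-network setup, fitting a generator to a one-dimensional Gaussian; the architecture, data, and hyperparameters are illustrative placeholders, and torch is assumed to be installed.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Generator (the "forger") maps noise to fake samples; the discriminator
# (the "forgery detector") scores samples as real (1) or fake (0).
G = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()

def real_batch(n=64):
    return torch.randn(n, 1) * 0.5 + 3.0   # "real" data drawn from N(3, 0.5)

for step in range(2000):
    # Train the discriminator to separate real samples from generated ones.
    real = real_batch()
    fake = G(torch.randn(64, 4)).detach()
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + loss_fn(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator to make the discriminator label its output as real.
    fake = G(torch.randn(64, 4))
    g_loss = loss_fn(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

samples = G(torch.randn(1000, 4))
print(samples.mean().item(), samples.std().item())   # should drift toward 3.0 and 0.5
```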
neuromorphic computing | Computing that mimics the human brain or neural network.52
object recognition | The algorithmic process of finding objects in the real world from an image, typically using object models which are known a priori.53
one shot (or few shot) learning | An approach to machine learni...
planning and optimization | Determining necessary steps to complete a series of tasks, which can save time and money and improve safety. |
platform environment | Provides an application developer or user secured access to resources and tools (e.g., workflows, data, software tools, storage, and compute) on which applications can be developed or run. |
polymorphic malware | A type of malware that constantly changes its identifiable features (i.e., signatures) in order to evade detection. Many of the common forms of malware can be polymorphic, including viruses, worms...
pseudonymization | A data management technique to strip identifiers linking data to an individual. Concern exists that such data could still be linked with other data that allows for a person’s identity to be redisc...
pytorch | A free and open-source software library for training neural networks and other machine learning architectures, initially developed by Facebook AI Research.
quantum computer | A machine that relies on the properties of quantum mechanics to perform computations. Quantum computers encode information in qubits, which can exist in a linear combination of two states. These s...
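A small numpy sketch of the "linear combination of two states" idea: a single qubit's state is a length-2 complex vector, a gate is a unitary matrix, and measurement probabilities come from squared amplitudes. The choice of the Hadamard gate here is just for illustration.

```python
import numpy as np

# A qubit state is a unit vector over the basis states |0> and |1>.
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ ket0

probs = np.abs(state) ** 2          # Born rule: measurement probabilities
print(state)                        # [0.707+0j, 0.707+0j]
print(probs)                        # [0.5, 0.5]
```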
reinforcement learning | - A method of training algorithms to make suitable actions by maximizing rewarded behavior over the course of its actions.61 This type of learning can take place in simulated environments, such ...
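A minimal tabular Q-learning sketch on a tiny chain environment (the environment, rewards, and hyperparameters are made up for illustration): the agent learns, through trial and error in a simulated setting, which action maximizes reward from each state.

```python
import random

random.seed(0)
N_STATES, ACTIONS = 5, [0, 1]        # move left (0) or right (1) along a 5-state chain
Q = [[0.0, 0.0] for _ in range(N_STATES)]

def step(state, action):
    """Reward of 1 only for reaching the rightmost state."""
    nxt = max(0, min(N_STATES - 1, state + (1 if action == 1 else -1)))
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

alpha, gamma, eps = 0.1, 0.9, 0.2
for episode in range(500):
    s = 0
    for _ in range(20):
        a = random.choice(ACTIONS) if random.random() < eps else max(ACTIONS, key=lambda x: Q[s][x])
        s2, r, done = step(s, a)
        # Q-learning update: move toward reward plus discounted best future value.
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2
        if done:
            break

print([max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_STATES)])  # mostly 1s: go right
```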
|
reliable ai | An ai system that performs in its intended manner within the intended domain of use. |
responsible ai | An ai system that aligns development and behavior to goals and values. This includes developing and fielding ai technology in a manner that is consistent with democratic values.62
robust ai | An ai system that is resilient in real-world settings, such as an object-recognition application that is robust to significant changes in lighting. The phrase also refers to resilience when it com...
self-healing robots | Robots that use structural materials to self-identify damage and initiate healing on their own, repeatedly.64 |
self-replicating robots | A means of manufacturing, so that fleets of autonomous rovers can extract water and metals from local terrain—say on the moon or mars—to construct new industrial robots autonomously and continue t... |
self-supervised machine learning | A collection of machine learning techniques that are used to train models or learn embedded representations without reliance on costly labeled data; rather, an approach is to withhold part of each... |
semiconductor photonics | As it relates to semiconductors, this refers to the use of light, rather than electricity, to transfer information on a chip. This allows for much faster data transfer speeds, resulting in signifi...
semiconductors | The silicon-based integrated circuits that drive the operations and functioning of computers and most electronic devices. |
semi-supervised machine learning | A process for training an algorithm on a combination of labeled and unlabeled data. Typically, this combination will contain a very small amount of labeled data and a very large amount of unlabele...
smart sensors | Devices capable of pre-processing raw data and prioritizing the data to transmit and store, which is especially helpful in degraded or low-bandwidth environments. |
smart systems | Information technology systems with autonomous functions enabled by ai. |
supervised machine learning | - A process for training algorithms by example. The training data consists of inputs paired with the correct outputs. During training, the algorithm will search for patterns in the data that cor...
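A minimal illustration of training by example using numpy least squares: inputs are paired with correct outputs, and the fitted parameters then generalize to new inputs. The data here are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training examples: inputs X paired with the correct outputs y.
X = rng.normal(size=(100, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# "Training": find the pattern (weights) that maps inputs to outputs.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Apply the learned pattern to a new, unseen input.
x_new = np.array([1.0, 0.0, 2.0])
print(w)               # close to [1.5, -2.0, 0.5]
print(x_new @ w)       # prediction for the new input (about 2.5)
```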
|
symbolic logic | A tool for creating and reasoning with symbolic representations of objects and propositions based on clearly defined criteria for logical validity.69 |
synthetic data generation | The process of creating artificial data to mimic real sample data sets. It includes methods for data augmentation that automate the process for generating new example data from an existing data se...
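A small sketch of the data-augmentation side of this: new example data are generated from an existing example by simple label-preserving transformations. The array below stands in for an image, and the transformations shown are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(8, 8))       # stand-in for a real training image

augmented = [
    np.fliplr(image),                                           # horizontal flip
    np.rot90(image),                                            # 90-degree rotation
    np.clip(image + rng.normal(0, 10, image.shape), 0, 255),    # additive noise
]
print(len(augmented), "new synthetic examples from one original")
```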
technical baseline | The government’s capability to understand underlying technology well enough to make successful acquisition decisions independent of contractors.70 |
tensorflow | A free and open-source software library for training neural networks and other machine learning architectures, initially developed by Google Brain.
test and evaluation, verification and validation...
unintended bias | Ways in which algorithms might perform more poorly than expected (e.g., higher false positives or false negatives), particularly when disparate outcomes are produced (e.g., across categories, class...
unsupervised machine learning | - A process for training a model in which the model learns from the data itself without any data labels. Two common approaches are clustering (in which inherent groupings are discovered) and ass...
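A minimal numpy sketch of the clustering case: the model discovers groupings in unlabeled data (two well-separated blobs here) with no labels used anywhere. The data and cluster count are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled data: two groups, but no labels are given to the algorithm.
data = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(5, 0.5, (50, 2))])

# Plain k-means with k = 2, initialized from two of the data points.
centers = data[[0, -1]].copy()
for _ in range(10):
    dists = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)                       # discovered groupings
    centers = np.array([data[labels == k].mean(axis=0) for k in range(2)])

print(centers)   # roughly [0, 0] and [5, 5]
```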
|