AI / Machine Learning (=)
AI is the broader concept of machines acting in a way that we would consider “smart”. Machine learning is a form of AI based on giving machines access to data and letting them learn for themselves. It includes neural networks, deep learning, and language processing. A possible application is fraud detection.
AI Augmented Development (=)
Use of AI and NLP in the development environment: debugging, testing (mutation, fuzzing), generation of code/documentation, augmented coding, recommendations for refactoring, …
EDA (=)
An Event-Driven Architecture (EDA) can offer many advantages over more traditional approaches. Events and asynchronous communication can make a system much more responsive and efficient. Moreover, the event model often better resembles the actual business data coming in.
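A minimal in-process sketch of the publish/subscribe pattern behind an EDA; the bus, topic name, and handlers are invented for illustration, and a real system would use a broker such as Kafka or RabbitMQ.

```python
from collections import defaultdict

class EventBus:
    """Toy in-memory event bus: producers publish, consumers subscribe."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        # Producers do not know who consumes the event (loose coupling).
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
bus.subscribe("payment.received", lambda e: print("update ledger:", e))
bus.subscribe("payment.received", lambda e: print("notify customer:", e))
bus.publish("payment.received", {"amount": 42, "currency": "EUR"})
```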
Generative AI (=)
Generative AI is technology that creates new content by learning from existing text, audio, or images. With generative AI, computers detect the underlying patterns in the input and produce similar content.
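As a toy illustration of the pattern-learning idea (not of how modern generative models work internally), this sketch learns word-transition patterns from input text and generates similar text with a Markov chain; the sample corpus is made up.

```python
import random
from collections import defaultdict

def train(text):
    """Learn which word tends to follow which (the 'underlying pattern')."""
    words = text.split()
    model = defaultdict(list)
    for current, following in zip(words, words[1:]):
        model[current].append(following)
    return model

def generate(model, start, length=10):
    """Produce new content that mimics the training text."""
    word, output = start, [start]
    for _ in range(length):
        choices = model.get(word)
        if not choices:
            break
        word = random.choice(choices)
        output.append(word)
    return " ".join(output)

corpus = "the cat sat on the mat and the cat saw the dog"
model = train(corpus)
print(generate(model, "the"))
```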
Crypto-agility (+)
Crypto-agility allows an information security system to switch to alternative cryptographic primitives and algorithms without making significant changes to the system’s infrastructure. Crypto-agility facilitates system upgrades and evolution.
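A minimal sketch of the idea: callers depend on a named-algorithm indirection, so swapping SHA-256 for SHA3-256 is a configuration change rather than a code change. The config dict is hypothetical; Python’s hashlib.new() accepts an algorithm name.

```python
import hashlib

# Hypothetical configuration, e.g. loaded from a file; changing the
# algorithm here requires no change to any calling code.
CONFIG = {"hash_algorithm": "sha256"}   # later: "sha3_256"

def fingerprint(data: bytes) -> str:
    # Callers never name a concrete algorithm: that is the agility.
    h = hashlib.new(CONFIG["hash_algorithm"], data)
    return h.hexdigest()

print(fingerprint(b"hello"))
```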
AI for Security (+)
Non-traditional methods for improving analysis in the security technology of systems and applications (e.g., user behaviour analytics, improved detection of potential attacks from system logs).
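One hedged sketch of the user-behaviour-analytics idea: flag log activity that deviates strongly from a user’s historical baseline. The counts and threshold are made up; real products use far richer models.

```python
from statistics import mean, stdev

def is_anomalous(history, today, threshold=3.0):
    """Flag today's event count if it lies more than `threshold`
    standard deviations from the user's historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

# Hypothetical daily failed-login counts for one user.
baseline = [2, 1, 3, 2, 2, 1, 3]
print(is_anomalous(baseline, 2))    # False: normal behaviour
print(is_anomalous(baseline, 40))   # True: possible attack
```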
Causal AI (+)
Causal AI techniques make it possible to understand the causes of a prediction outcome. They encompass methods such as causal Bayesian networks, causal rules, and combinations of symbolic and neural AI.
Verifiable Credentials (N)
Verifiable credentials (VCs) can represent all of the same information that physical credentials represent. Additional technologies, such as digital signatures, make VCs more tamper-evident and more trustworthy than their physical counterparts. VCs are typically stored in digital wallets.
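A minimal sketch of the tamper-evidence mechanism, assuming the third-party `cryptography` package: an issuer signs a credential, and anyone holding the public key can detect modification. The credential fields are invented; real VCs follow the W3C data model.

```python
import json
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Issuer side: sign the credential payload.
issuer_key = Ed25519PrivateKey.generate()
credential = {"subject": "did:example:alice", "degree": "MSc Informatics"}
payload = json.dumps(credential, sort_keys=True).encode()
signature = issuer_key.sign(payload)

# Verifier side: any change to the payload invalidates the signature.
public_key = issuer_key.public_key()
try:
    public_key.verify(signature, payload)
    print("credential intact")
except InvalidSignature:
    print("credential was tampered with")
```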
Confidential Computing (–)
Confidential computing allows an entity to do computations on data without having access to the data itself. This can be realised in a centralised way with homomorphic encryption or a trusted execution environment (TEE), or in a decentralised way with secure multiparty computation.
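A toy sketch of the decentralised flavour (secure multiparty computation via additive secret sharing): each party sees only random-looking shares, yet the sum of all shares reveals the sum of the secrets. Real protocols are far more involved.

```python
import random

P = 2**31 - 1  # arithmetic is done modulo a public prime

def share(secret, n_parties=3):
    """Split a secret into n random shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

# Two hospitals want the total of their patient counts without
# revealing their own count to anyone.
shares_a = share(1200)
shares_b = share(3400)

# Party i adds the i-th shares it received; no party learns a secret.
partial_sums = [(a + b) % P for a, b in zip(shares_a, shares_b)]
print(sum(partial_sums) % P)  # 4600
```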
NLP (–)
Natural Language Processing (NLP), part of AI, includes techniques to distil information from unstructured textual data, with the aim of using that information inside analytics algorithms. Used for text mining, sentiment analysis, entity recognition, and Natural Language Generation (NLG).
Graph Analytics (=)
Graph Analytics is the process of investigating relational structures (i.e., relations between entities such as people, companies, addresses, …) using network and graph theory. When the entities include people, we talk about SNA (Social Network Analytics).
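A small sketch of one basic graph-analytics measure, degree centrality (how connected each entity is), over a hypothetical entity graph; libraries such as NetworkX provide this and much more.

```python
# Hypothetical relations between entities (people, companies, ...).
edges = [("alice", "acme"), ("bob", "acme"),
         ("alice", "bob"), ("carol", "acme")]

# Build an undirected adjacency structure.
graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

# Degree centrality: entities with many relations stand out, e.g. a
# single address shared by many companies in a fraud investigation.
for node, neighbours in sorted(graph.items(), key=lambda kv: -len(kv[1])):
    print(node, len(neighbours))
```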
AI/ML Engineering (=)
In machine learning ‘by hand’, a lot of time is lost between training a model, putting it into production, and then waiting for feedback for potential retraining. CD4ML (continuous delivery for ML) attempts to automate this process, working towards Adaptive AI.
API Economy (=)
APIs, which connect services within and across multiple systems, or even to third parties, are becoming prevalent and push a new business model centred around the integration of readily available data and services. They also help with loose coupling between components.
Augmented Data Quality (=)
By adding AI, machine learning, knowledge graphs, NLP, … to data quality tools and technologies, results can be delivered more efficiently for the business.
SuperApps (=)
Some mobile apps, like WeChat and AliPay, have become entire ecosystems of pluggable mini-apps. Users can greatly customise their experience within the superapp, and integration between mini-apps is much tighter than that of normal smartphone apps. Popular now in China, but may be coming here soon.
Synthetic Data (=)
Synthetic Data is concerned with creating a fictitious dataset that mimics a real one in format, look, and statistical properties. It can be used to further minimise the need to share sensitive or protected data.
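A minimal sketch of the statistical-mimicry idea for a single numeric column: generate fake values with the same mean and spread as the real data. The “real” values are made up, and serious generators also preserve correlations and formats.

```python
import random
from statistics import mean, stdev

real_salaries = [2800, 3100, 2950, 3600, 3300, 2700]  # sensitive data

# Fit the real column's basic statistics...
mu, sigma = mean(real_salaries), stdev(real_salaries)

# ...and sample a fictitious column with similar properties.
synthetic = [round(random.gauss(mu, sigma)) for _ in range(6)]
print(synthetic)            # shareable, contains no real record
print(mean(synthetic), mu)  # similar statistical properties
```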
Rules as Code (+)
Rules as Code is a LegalTech concept whose goal is to semi-automate the link between (suitably formalised) regulations on the one hand, and the derived code (implementations, verification, or compliance processes) on the other.
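A toy sketch of the idea: a regulation expressed as executable code, so that implementations and compliance checks can be derived from the same formalisation. The rule and its thresholds are invented for illustration.

```python
from datetime import date

# Invented rule text: "A person is eligible for the benefit if they are
# at least 18 years old and their yearly income is below EUR 25,000."
AGE_LIMIT = 18
INCOME_CEILING = 25_000

def eligible(birth_date: date, yearly_income: float, today: date) -> bool:
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    return age >= AGE_LIMIT and yearly_income < INCOME_CEILING

# The same rule serves the implementation and automated compliance tests.
assert eligible(date(2000, 5, 1), 20_000, today=date(2024, 6, 1))
assert not eligible(date(2010, 5, 1), 20_000, today=date(2024, 6, 1))
```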
Human Augmentation (+)
Human Augmentation is the enhancement of human capabilities, such as senses, actions, or cognition, using technology and science. It includes medical advancements, wearables (e.g. intelligent glasses), genetic engineering, and brain-computer interfaces.
Data-centric AI (N)
A machine learning approach that consists of systematically applying data engineering best practices, with a strong focus on data quality, in order to improve the quality of a model.
Prompt Engineering (N)
A prompt is a natural language description defining the context in which a Large Language Model (LLM) operates and outputs text. Small changes in the prompt can cause great changes in behaviour. Prompt engineering tries to find optimal prompts to achieve the desired LLM behaviour.
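A small sketch of what “engineering” a prompt means in practice: the task is fixed, but context, role, format constraints, and examples are varied and compared. The templates are invented, and no particular LLM API is assumed.

```python
# Two candidate prompts for the same extraction task; in practice each
# variant would be sent to an LLM and scored on a labelled test set.
BARE = "Extract the dates from: {text}"

ENGINEERED = (
    "You are a careful data-extraction assistant.\n"
    "Extract every calendar date from the text below.\n"
    "Answer with one ISO-8601 date (YYYY-MM-DD) per line, nothing else.\n"
    "Example: 'due on 3 May 2024' -> 2024-05-03\n"
    "Text: {text}"
)

text = "The contract starts on 1 January 2025 and ends on 31/12/2026."
for name, template in [("bare", BARE), ("engineered", ENGINEERED)]:
    print(f"--- {name} ---")
    print(template.format(text=text))
```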
Zero Trust Architecture (–)
The main concept behind zero trust is “never trust, always verify,” which means that devices should not be trusted by default, even if they are connected to a managed corporate network such as the corporate LAN, and even if they were previously verified. Also known as “perimeterless security.”
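A minimal sketch of “never trust, always verify”: every request is authenticated individually, regardless of where it comes from. The token table and request handler are hypothetical stand-ins for real identity and policy services.

```python
# Hypothetical stand-in for a real identity provider / policy engine.
VALID_TOKENS = {"token-abc": "alice"}

def handle_request(source_network: str, token: str, resource: str):
    # Zero trust: source_network is deliberately ignored; being on the
    # corporate LAN grants nothing. The token is verified every time.
    user = VALID_TOKENS.get(token)
    if user is None:
        return "403 Forbidden"
    return f"200 OK: {user} may read {resource}"

print(handle_request("corporate-lan", "token-abc", "/payroll"))
print(handle_request("corporate-lan", "stolen-or-expired", "/payroll"))
```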
Back Tracking Anomalies (–)
A method to detect the causes of data quality problems in data flows between information systems and to improve them structurally. The ROI is important, and the approach facilitates a win-win between institutions. To monitor the anomalies and transactions, an extension to the existing DBMS has to be built.
Big Data Processing (–)
Big data analytics solutions require an architecture that 1) executes calculations where the data is stored, 2) spreads data and calculations over several nodes, and 3) uses a data warehouse architecture that makes all types of data available to analytical tools in a transparent way.
Data Virtualisation (–)
Methods and tools to access databases with heterogeneous models and to facilitate user access through a virtual logical view.
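A toy sketch of the virtual-view idea: two sources with different models are exposed to users through one logical shape, without copying the data into a new store. The sources and field names are invented.

```python
# Source 1: a relational-style list of rows.
sql_customers = [{"cust_id": 1, "cust_name": "Alice"}]

# Source 2: a document-style nested structure.
doc_customers = [{"id": "c2", "profile": {"displayName": "Bob"}}]

def virtual_customer_view():
    """One logical view (id, name) over heterogeneous sources;
    mapping happens at query time, data is never duplicated."""
    for row in sql_customers:
        yield {"id": str(row["cust_id"]), "name": row["cust_name"]}
    for doc in doc_customers:
        yield {"id": doc["id"], "name": doc["profile"]["displayName"]}

for customer in virtual_customer_view():
    print(customer)
```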
Knowledge Graphs (–)
Knowledge Graphs relate entities in a meaningful graph structure to facilitate various processes, from information retrieval to business analytics. Knowledge graphs typically integrate data from heterogeneous sources such as databases, documents, and even human input.
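A minimal sketch of the underlying data structure: facts as subject-predicate-object triples that can be queried by pattern. The facts are invented; real systems use RDF stores or property-graph databases.

```python
# Facts from heterogeneous sources, normalised to (subject, predicate, object).
triples = [
    ("alice", "works_for", "acme"),
    ("acme", "located_in", "Brussels"),
    ("alice", "knows", "bob"),
]

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None is a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

print(query(s="alice"))       # everything known about alice
print(query(p="located_in"))  # all location facts
```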
Microservices (–)
Independently maintainable and deployable services, kept very small (hence ‘micro-’), make an application, or even large groups of related systems, much more flexibly scalable. They provide functional agility, allowing a system to rapidly support new business opportunities.
Reactive Computing (–)
The flow of (incoming) data, and not an application’s (or CPU’s) regular control flow, governs the architecture. This is a new paradigm, sometimes even driven by new hardware, that opposes the traditional control-flow way of working. Also known as Dataflow Architecture and related to EDA.
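A small sketch of the dataflow idea using Python generators: each stage reacts to values flowing in rather than being called in a fixed control flow. The sensor readings are invented; real reactive systems use frameworks such as RxPY.

```python
def sensor_readings():
    # Stand-in for an asynchronous source of incoming data.
    yield from [21.0, 21.5, 80.0, 22.0]

def above_threshold(stream, limit):
    for value in stream:          # reacts to each arriving value
        if value > limit:
            yield value

def alerts(stream):
    for value in stream:
        yield f"ALERT: reading {value} exceeds limit"

# The pipeline is declared once; data flowing through it drives execution.
for message in alerts(above_threshold(sensor_readings(), limit=50)):
    print(message)
```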
Data-Centric Security (=)
An approach to protect sensitive data uniquely and centrally, regardless of format or location (using e.g. data anonymisation or tokenisation technologies in conjunction with centralised policies and governance).
Multimedia Data Protection (=)
Protection of multimedia data has gained importance with social media and remote working, but also with the development of powerful AI models. Detecting falsification is critical: for instance, one should be able to detect forgery of images (e.g., faces used for biometrics).
Augmented Data Science (=)
Augmented data science and machine learning (augmented DSML) uses artificial intelligence to help automate and assist key aspects of a DSML process. These aspects include data access and preparation, feature engineering, model operationalisation, and model tuning and management.
Data Observability (=)
Monitoring and management of performance and “system incidents”, together with real-time monitoring of data errors and their lineage, in order to automatically resolve the cause (only bugs and formal causes) in the software components of the various interlinked information systems.
Edge Computing (=)
Information processing and content collection and delivery are placed closer to the endpoints, to avoid high WAN costs and the unacceptable latency of the cloud. Edge computing is also becoming more relevant in the context of AI solutions (cf. tinyML).
GitOps (=)
Best practices coming from DevOps, applied to Operations. This means, for instance, that all configuration is specified in files that are kept under version control and that are machine-readable, so that tools can automate as many things as possible.
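A toy sketch of the reconciliation idea behind GitOps tooling: desired state lives in a versioned, machine-readable file, and an agent continuously drives the actual state towards it. The file contents, fields, and “actual state” dict are invented.

```python
import json

# Desired state, as it would be committed to Git (e.g. a deploy.json file).
desired_state = json.loads('{"web": {"replicas": 3}, "worker": {"replicas": 1}}')

# Hypothetical actual state of the running system.
actual_state = {"web": {"replicas": 1}}

def reconcile(desired, actual):
    """Drive the actual state towards the declared desired state."""
    for service, spec in desired.items():
        current = actual.get(service, {}).get("replicas", 0)
        if current != spec["replicas"]:
            print(f"scaling {service}: {current} -> {spec['replicas']}")
            actual[service] = {"replicas": spec["replicas"]}

reconcile(desired_state, actual_state)
print(actual_state)
```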
Privacy by Design (=)
Privacy by design calls for privacy to be taken into account throughout the whole engineering process. The European GDPR regulation incorporates privacy by design. An example of an existing methodology is LINDDUN.
Process Mining (=)
Includes automated process discovery (extracting process models from an information system’s event log), and also offers possibilities to monitor, check, and improve processes. Often used in preparation for RPA and other business process initiatives (in the context of digital transformation).
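A minimal sketch of the core of process discovery: counting which activity directly follows which in an event log. The log is invented; real tools build full process models from exactly this kind of relation.

```python
from collections import Counter

# Hypothetical event log: one list of activities per case (process instance).
event_log = [
    ["receive", "check", "approve", "pay"],
    ["receive", "check", "reject"],
    ["receive", "check", "approve", "pay"],
]

# Count the directly-follows relation: the raw material of discovery.
follows = Counter()
for trace in event_log:
    for a, b in zip(trace, trace[1:]):
        follows[(a, b)] += 1

for (a, b), count in follows.most_common():
    print(f"{a} -> {b}: {count}x")
```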
Visual Analytics (=)
A methodology and enabling tools that combine data visualisation and analytics, allowing data to be rapidly explored, analysed, and forecast. This helps with modelling in advanced analytics and with building modern, interactive, self-service BI applications.
Voice of the Citizen Applications (=)
A number of approaches to capture and analyse explicit or implicit feedback from users, in order to improve systems and remove friction.
Remote Identity Verification (–)
Remote identity verification comprises the processes and tools to remotely verify someone’s identity, without the need for the person to physically present themselves to an authority.
Composable Applications (–)
Applications composed of business-oriented building blocks, where these modular, reusable blocks are independent of one another and can be configured by business and IT into a solution. The main advantage is supporting the business’s agility in the face of change while maintaining resilience.
Collaborative MDM (–)
In Master Data Management, the collaborative and organised management of anomalies stemming from distributed authentic sources, by their official owners.
Living Documentation (–)
Living documentation actively co-evolves with code, making it constantly up-to-date without requiring separate maintenance. It lives with the code and can be automatically used by tools to generate publishable specifications. An example can be found in some forms of annotations.
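One concrete, hedged illustration: Python’s built-in doctest keeps usage examples inside the code, where a test runner verifies them, so the documentation fails loudly when it drifts from the behaviour it describes. The function is invented.

```python
def vat_inclusive(amount: float, rate: float = 0.21) -> float:
    """Return the amount including VAT.

    The example below is documentation AND an executable test:

    >>> vat_inclusive(100.0)
    121.0
    """
    return round(amount * (1 + rate), 2)

if __name__ == "__main__":
    import doctest
    doctest.testmod(verbose=True)   # fails if docs and code diverge
```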
Hexagonal Architecture (=)
Also called ‘Onion Architecture’: a set of architectural principles that makes the domain model code central to everything and dependent on no other code or framework. Other parts of the program code can depend on the domain code. It has gained a lot of popularity in the community recently.
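A minimal ports-and-adapters sketch: the domain code depends only on an abstract port it defines itself, while a concrete adapter (here an in-memory one, invented for illustration) lives outside the hexagon and plugs in.

```python
from abc import ABC, abstractmethod

# --- Domain (centre of the hexagon): no framework imports here ---
class AccountRepository(ABC):            # a 'port' owned by the domain
    @abstractmethod
    def balance(self, account_id: str) -> int: ...

def can_withdraw(repo: AccountRepository, account_id: str, amount: int) -> bool:
    """Pure domain rule; knows nothing about databases or HTTP."""
    return repo.balance(account_id) >= amount

# --- Infrastructure (outside the hexagon): an adapter for the port ---
class InMemoryAccounts(AccountRepository):
    def __init__(self, data): self._data = data
    def balance(self, account_id): return self._data[account_id]

repo = InMemoryAccounts({"be-001": 150})
print(can_withdraw(repo, "be-001", 100))   # True
print(can_withdraw(repo, "be-001", 200))   # False
```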
Web3 (=)
Web3 is an idea for a new iteration of the World Wide Web which incorporates concepts such as decentralisation, blockchain technologies, and token-based economics. It promises to give back control to citizens over their assets, as well as over their identity.