Data Mapping cfxdm - dm:enrich

Enrich the data using dictionaries

dm:enrich: This cfxdm function enriches an existing dataset by looking up additional datasets (dictionaries) and bringing in the enrichment columns the user selects from them.

dm:enrich syntax:

  • dict (mandatory): Name of the dictionary (a saved/named dataset) that holds the additional enrichment data.

  • src_key_cols (mandatory): Key columns of the source (named) dataset, comma separated.

  • dict_key_cols (mandatory): Key columns of the dictionary (named dataset), comma separated.

  • enrich_cols (mandatory): Column names to bring in from the dictionary selected with the 'dict' option, comma separated.

The number of columns specified for src_key_cols and dict_key_cols must be the same.

i.e., if two columns are specified in src_key_cols, make sure two columns are specified in dict_key_cols as well. A minimal invocation is sketched below.
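
The sketch uses the dataset names from Example 1 further down ('app-services-list' is the saved source dataset being recalled, 'app-processes-list' is the dictionary); any pair of saved datasets with matching key columns can be used the same way:

@dm:recall name = 'app-services-list'
    --> @dm:enrich dict = 'app-processes-list' & src_key_cols = 'ip_address,pid' & dict_key_cols = 'ip_address,pid' & enrich_cols = 'process_name'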

Example 1:

Step 1: Create an empty pipeline named dm_enrich_example_1 using AIOps Studio, as shown in the screenshot below.

Step 2: Add the following pipeline code/commands to the pipeline created above, as shown in the screenshot below.

You can copy the code below into your pipeline and execute it in your environment.

###### Pipeline creates two datasets simulating app-processes-list and
###### app-services-list from a Windows environment/inventory.
###### Pipeline uses the dm:enrich function to enrich datasets that are stored in RDA.
##### Pipeline also uses dm:recall.

####### app-processes-list dataset
@dm:empty
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'System Idle Process' & pid = 0
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'svchost.exe' & pid = 1020
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'svchost.exe' & pid = 1020
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'silsvc.exe' & pid = 1064
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'smbhash.exe' & pid = 1152
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'svchost.exe' & pid = 1172
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'taskhostex.exe' & pid = 1432
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'Microsoft AD' & pid = 1500
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'svchost.exe' & pid = 1520
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'dwm.exe' & pid = 1524
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'dfsrs.exe' & pid = 1556
    --> @dm:addrow ip_address = '10.95.108.100' & process_name = 'svchost.exe' & pid = 776
    --> @dm:save name = 'app-processes-list'

@dm:empty
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'ADWS' & State = 'Running' & pid = 1500
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'AppHostSvc' & State = 'Running' & pid = 1524
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'BFE' & State = 'Running' & pid = 1172
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'BITS' & State = 'Running' & pid = 1020
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'BrokerInfrastructure' & State = 'Running' & pid = 776
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'ComsysApp' & State = 'Running' & pid = 3584
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'CertPropsSvc' & State = 'Running' & pid = 1020
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'CryptSvc' & State = 'Running' & pid = 820
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'DFSR' & State = 'Running' & pid = 1556
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'DNS' & State = 'Running' & pid = 1624
    --> @dm:addrow ip_address = '10.95.108.100' & service_name = 'DPS' & State = 'Running' & pid = 1172
    --> @dm:save name = 'app-services-list'
    --> @c:new-block
    --> @dm:recall name = 'app-services-list'
    --> @dm:enrich dict = 'app-processes-list' & src_key_cols = 'ip_address,pid' & dict_key_cols = 'ip_address,pid' & enrich_cols = 'process_name'

Step 3: Click the Verify button to verify the pipeline. RDA verifies the pipeline without any errors (as shown below).

Step 4: Click the Execute button to execute the pipeline. RDA executes the pipeline without any errors (as shown below).

Step 5: Verify that the output data is enriched with the requested columns and print the output, as shown in the screenshot below.

As shown in the screenshot above, 'app-services-list' is enriched using the additional dictionary 'app-processes-list', matching on the key columns shared by both datasets, and the pipeline adds the process_name column to the output. In this example only a single column is brought in from the process list, but the same enrichment can be extended to additional columns and to other dictionaries that users want to enrich from.
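
As a hedged illustration of that extension, further dm:enrich steps can be chained after the first, each with its own dictionary, key mapping, and column list. In the sketch below, 'host-inventory' and its 'hostname' and 'os_version' columns are hypothetical names used only to show the shape of a second enrichment; substitute a saved dataset and columns that actually exist in your environment.

@dm:recall name = 'app-services-list'
    --> @dm:enrich dict = 'app-processes-list' & src_key_cols = 'ip_address,pid' & dict_key_cols = 'ip_address,pid' & enrich_cols = 'process_name'
    --> @dm:enrich dict = 'host-inventory' & src_key_cols = 'ip_address' & dict_key_cols = 'ip_address' & enrich_cols = 'hostname,os_version'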
