.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. SPDX-License-Identifier: CC-BY-4.0
.. Copyright (C) 2022 Nordix


DMaaP Adapter
~~~~~~~~~~~~~

************
Introduction
************

This is a generic information producer using the Information Coordination Service (ICS) Data Producer API. It can get information from DMaaP (ONAP) or directly from Kafka topics and deliver the
information to data consumers using REST calls (POST).

The DMaaP Adapter registers itself as an information producer, along with its information types, in the Information Coordination Service (ICS).
The information types are defined in a configuration file.

A data consumer can create an information job (data subscription) using the ICS consumer API (for rApps) or the A1-EI (Enrichment Information) API (for Near-RT RICs), based on the registered information types.
This service will get data from DMaaP MR or Kafka topics and deliver it to the data consumers based on their created subscription jobs.

In this way, a data consumer is decoupled from DMaaP and/or Kafka.
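
As an illustration, a data consumer could create an information job with a REST call like the sketch below: a PUT of a job object to the ICS consumer API (PUT /data-consumer/v1/info-jobs/{infoJobId}). The job owner and result URI are hypothetical examples.

.. code-block:: javascript

    {
      "info_type_id": "DmaapInformationType",
      "job_owner": "example-consumer",
      "job_result_uri": "https://example-consumer:8088/data-delivery",
      "job_definition": {}
    }

The job_definition object carries the type-specific parameters described under Information Job Parameters below.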

The service is implemented in Java Spring Boot (DMaaP Adapter Service).

.. image:: ./Architecture.png
   :width: 500pt

******************
Configuration File
******************

The configuration file defines which DMaaP and Kafka topics should be listened to and registered as subscribable information types.
There is an example configuration file in config/application_configuration.json.

Each entry will be registered as a subscribable information type in ICS. The following attributes can be used in each entry:

* id, the information type identifier.

* dmaapTopicUrl, a URL to use to retrieve information from DMaaP. Defaults to not listening to any topic.

* kafkaInputTopic, a Kafka topic to listen to. Defaults to not listening to any topic.

* useHttpProxy, indicates if an HTTP proxy shall be used for data delivery. Defaults to false.
  This parameter is only relevant if an HTTP proxy is configured in the application.yaml file.

* dataType, can be set to "pmData", which enables special filtering of PM data.

These attributes determine which parameter schema is registered for the type. The schema defines which parameters can be used when creating an information job/data subscription.

Below follows an example of a configuration file.

.. code-block:: javascript

    {
      "types": [
        {
          "id": "DmaapInformationType",
          "dmaapTopicUrl": "/dmaap-topic-1",
          "useHttpProxy": true
        },
        {
          "id": "KafkaInformationType",
          "kafkaInputTopic": "TutorialTopic"
        },
        {
          "id": "PmInformationType",
          "dmaapTopicUrl": "/dmaap-topic-2",
          "dataType": "pmData"
        }
      ]
    }

**************************
Information Job Parameters
**************************

When an information consumer creates an information job, it can provide type-specific parameters. The allowed parameters are defined by a JSON schema.
The following schemas can be used by the component (they are located in dmaapadapter/src/main/resources):

====================
typeSchemaDmaap.json
====================
This schema will be registered when dmaapTopicUrl is defined for the type. You can provide two parameters when creating the job, which are
used for filtering the data.

* filterType, selects the type of filtering that will be done. This can be one of: "regexp", "json-path", "jslt".

  * regexp is for standard regexp matching of text. Objects that contain a match of the expression will be pushed to the consumer.
  * json-path can be used for extracting relevant data from JSON.
  * jslt, an open source language for JSON processing. It can be used both for selecting matching JSON objects and for extracting or even transforming JSON data. This is very powerful.

* filter, the value of the filter expression.

Below follow some examples of filters.

.. code-block:: javascript

    {
      "filterType":"regexp",
      "filter": ".*"
    }


.. code-block:: javascript

    {
      "filterType":"jslt",
      "filter": "if(.event.commonEventHeader.sourceName == \"O-DU-1122\") .event.perf3gppFields.measDataCollection.measInfoList[0].measValuesList[0].measResults[0].sValue"
    }


.. code-block:: javascript

    {
      "filterType":"json-path",
      "filter": "$.event.perf3gppFields.measDataCollection.measInfoList[0].measTypes.sMeasTypesList[0]"
    }
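
To make the jslt and json-path examples concrete: given the hypothetical, heavily truncated input event below (all names and values are examples only), the jslt filter above would deliver "256", and the json-path filter would extract "succImmediateAssignProcs".

.. code-block:: javascript

    {
      "event": {
        "commonEventHeader": { "sourceName": "O-DU-1122" },
        "perf3gppFields": {
          "measDataCollection": {
            "measInfoList": [
              {
                "measTypes": { "sMeasTypesList": ["succImmediateAssignProcs"] },
                "measValuesList": [
                  { "measResults": [ { "p": 1, "sValue": "256" } ] }
                ]
              }
            ]
          }
        }
      }
    }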


==========================
typeSchemaPmDataDmaap.json
==========================
This schema will be registered when dmaapTopicUrl is defined and the dataType is "pmData" for the type.
This extends the filtering capabilities so that a special filter for PM data can be used. Here it is possible to
define which meas types to get from which resources.

The filterType parameter is extended with the value "pmdata", which can be used for PM data filtering. The following filter properties are available:

* sourceNames, an array of source names for wanted PM reports.
* measObjInstIds, an array of meas object instance ids for wanted PM reports. If the given filter value is contained in the meas object instance id of the report, it will match (partial matching).
  For instance, the value "NRCellCU" will match "ManagedElement=seliitdus00487,GNBCUCPFunction=1,NRCellCU=32".
* measTypes, selects the meas types to get.
* measuredEntityDns, partial match of measured entity DNs.

All PM filter properties are optional, and an omitted property results in "match all".
The result of the filtering still follows the structure of a 3GPP PM report.

Below follows an example of a PM filter.

.. code-block:: javascript

    {
      "filterType":"pmdata",
      "filter": {
        "sourceNames":[
          "O-DU-1122"
        ],
        "measObjInstIds":[
          "UtranCell=dGbg-997"
        ],
        "measTypes":[
          "succImmediateAssignProcs"
        ],
        "measuredEntityDns":[
          "ManagedElement=RNC-Gbg-1"
        ]
      }
    }


====================
typeSchemaKafka.json
====================
This schema will be registered when kafkaInputTopic is defined for the type.

* filterType, see above.
* filter, see above.
* bufferTimeout, can be used to buffer several JSON objects received from the Kafka topic into one JSON array. This contains:

  * maxSize, the maximum number of objects to collect before delivery to the consumer.
  * maxTimeMiliseconds, the maximum time to delay delivery (to buffer).

* maxConcurrency, the maximum number of parallel REST calls the consumer wishes to receive. The default value 1 means sequential delivery. A higher value may increase throughput.

If bufferTimeout is used, the delivered data will be a JSON array of the objects received. If not, each received object will be delivered in a separate call.

Below follows an example.

.. code-block:: javascript

    {
      "bufferTimeout":{
        "maxSize":123,
        "maxTimeMiliseconds":456
      },
      "maxConcurrency":1
    }

==========================
typeSchemaPmDataKafka.json
==========================
This schema will be registered when kafkaInputTopic is defined and the dataType is "pmData" for the type.

This schema allows all the parameters described above:

* filterType (one of: "regexp", "json-path", "jslt" or "pmdata").
* filter, see above.
* bufferTimeout, see above.
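
For illustration, a job definition for such a type could combine the PM filter with buffering, as in the sketch below. All values are examples only.

.. code-block:: javascript

    {
      "filterType":"pmdata",
      "filter": {
        "sourceNames":[
          "O-DU-1122"
        ],
        "measTypes":[
          "succImmediateAssignProcs"
        ]
      },
      "bufferTimeout":{
        "maxSize":100,
        "maxTimeMiliseconds":5000
      }
    }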