.. This work is licensed under a Creative Commons Attribution 4.0 International License.
.. SPDX-License-Identifier: CC-BY-4.0
.. Copyright (C) 2023 Nordix


Non-RT RIC PM Producer
~~~~~~~~~~~~~~~~~~~~~~

************
Introduction
************

The task of the PM Producer is to process PM reports and to distribute requested information to subscribers.
The main use case is:

* The PM Producer receives a Json object from Kafka which notifies that a new PM report has been fetched and is available for processing.

* The actual PM report is in a file, which is stored in an S3 Object Store bucket or in the file system (in a mounted volume). The file has the same structure as 3GPP TS 32.432/3GPP TS 32.435, but is converted to Json and is extended with the information that is encoded in the 3GPP measurement report XML file name.

* The PM Producer loads the file and distributes its contents to the subscribers over Kafka according to their subscription parameters. These subscription parameters define the wanted measurement types from given parts of the network.

The PM Producer registers itself as an information producer of PM measurement data in the Information Coordination Service (ICS).

A data consumer can create an information job (data subscription) using the ICS consumer API (for rApps) or the A1-EI (Enrichment Information) API (for NearRT-RICs).
The PM Producer will be notified when information jobs of type 'PM measurements' are created.

The service is implemented in Java Spring Boot.

.. image:: ./Architecture.png
   :width: 500pt

This product is a part of :doc:`NONRTRIC <nonrtric:index>`.

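As a sketch of the consumer side, an rApp could create such an information job through the ICS consumer API. In the snippet below the host, port, job identifier, topic name, and the field names of the request body are illustrative assumptions; consult the ICS API documentation for the authoritative interface.

```python
import json

# Assumed ICS endpoint and job id -- adjust to the actual deployment.
ICS_BASE_URL = "http://ics-service:8083"
JOB_ID = "pm-subscription-1"

# Job parameters following the typeSchemaPmData.json schema
# (see "Information Job Parameters" below).
job = {
    "info_type_id": "PmDataOverKafka",
    "job_owner": "my-rapp",
    "job_definition": {
        "filter": {
            "sourceNames": ["O-DU-1122"],
            "measTypeSpecs": [
                {
                    "measuredObjClass": "UtranCell",
                    "measTypes": ["succImmediateAssignProcs"],
                }
            ],
        },
        "deliveryInfo": {"topic": "pm-output-topic"},
    },
}

# The job would then be created with an HTTP PUT, for example
# (requires the third-party 'requests' package):
#
#   requests.put(f"{ICS_BASE_URL}/data-consumer/v1/info-jobs/{JOB_ID}",
#                data=json.dumps(job),
#                headers={"Content-Type": "application/json"})
print(json.dumps(job, indent=2))
```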
**************
Delivered data
**************
When a data consumer (e.g. an rApp) creates an Information Job, a Kafka topic is given as output for the job.
After filtering, the data will be delivered to the output topic.

The format of the delivered PM measurements is the same as the input format (which in turn is a Json mapping done from
3GPP TS 32.432/3GPP TS 32.435).
The data can be delivered gzipped or in cleartext (indicated by an element in the Kafka header).

The result of the PM filtering preserves the structure of a 3GPP PM report.
Here follows an example of a resulting delivered PM report.

.. code-block:: javascript

   {
      "event":{
         "commonEventHeader":{
            "domain":"perf3gpp",
            "eventId":"9efa1210-f285-455f-9c6a-3a659b1f1882",
            "eventName":"perf3gpp_gnb-Ericsson_pmMeasResult",
            "sourceName":"O-DU-1122",
            "reportingEntityName":"",
            "startEpochMicrosec":951912000000,
            "lastEpochMicrosec":951912900000,
            "timeZoneOffset":"+00:00"
         },
         "perf3gppFields":{
            "perf3gppFieldsVersion":"1.0",
            "measDataCollection":{
               "granularityPeriod":900,
               "measuredEntityUserName":"RNC Telecomville",
               "measuredEntityDn":"SubNetwork=CountryNN,MeContext=MEC-Gbg-1,ManagedElement=RNC-Gbg-1",
               "measuredEntitySoftwareVersion":"",
               "measInfoList":[
                  {
                     "measInfoId":{
                        "sMeasInfoId":"PM=1,PmGroup=NRCellDU_GNBDU"
                     },
                     "measTypes":{
                        "sMeasTypesList":[
                           "succImmediateAssignProcs"
                        ]
                     },
                     "measValuesList":[
                        {
                           "measObjInstId":"RncFunction=RF-1,UtranCell=Gbg-997",
                           "suspectFlag":"false",
                           "measResults":[
                              {
                                 "p":1,
                                 "sValue":"1113"
                              }
                           ]
                        },
                        {
                           "measObjInstId":"RncFunction=RF-1,UtranCell=Gbg-998",
                           "suspectFlag":"false",
                           "measResults":[
                              {
                                 "p":1,
                                 "sValue":"234"
                              }
                           ]
                        },
                        {
                           "measObjInstId":"RncFunction=RF-1,UtranCell=Gbg-999",
                           "suspectFlag":"true",
                           "measResults":[
                              {
                                 "p":1,
                                 "sValue":"789"
                              }
                           ]
                        }
                     ]
                  }
               ]
            }
         }
      }
   }

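To illustrate how a delivered report can be consumed, here is a small sketch that extracts the values of one counter from a report with the structure above (Python is used for illustration only; the product itself is a Java Spring Boot service). The ``p`` field refers to the 1-based position of the measurement type in ``sMeasTypesList``.

```python
def counter_values(report, meas_type):
    """Collect (measObjInstId, value) pairs for one measurement type
    from a delivered PM report structured as in the example above."""
    coll = report["event"]["perf3gppFields"]["measDataCollection"]
    found = []
    for info in coll["measInfoList"]:
        types = info["measTypes"]["sMeasTypesList"]
        if meas_type not in types:
            continue
        p = types.index(meas_type) + 1  # "p" is 1-based in measResults
        for values in info["measValuesList"]:
            for result in values["measResults"]:
                if result["p"] == p:
                    found.append((values["measObjInstId"], result["sValue"]))
    return found

# A minimal report with the same shape as the example above:
report = {
    "event": {
        "perf3gppFields": {
            "measDataCollection": {
                "measInfoList": [
                    {
                        "measTypes": {"sMeasTypesList": ["succImmediateAssignProcs"]},
                        "measValuesList": [
                            {
                                "measObjInstId": "RncFunction=RF-1,UtranCell=Gbg-997",
                                "suspectFlag": "false",
                                "measResults": [{"p": 1, "sValue": "1113"}],
                            }
                        ],
                    }
                ]
            }
        }
    }
}
print(counter_values(report, "succImmediateAssignProcs"))
```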
==================
Sent Kafka headers
==================

For each filtered result sent to a Kafka topic, the following properties will be set in the Kafka header:

* type-id, this property is used to indicate the ID of the information type. The value is a string.
* gzip, if this property exists the object is gzipped (otherwise not). The property has no value.
* source-name, the name of the source RAN traffic-handling element from which the measurements originate.

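A consumer can use the ``gzip`` header to decide whether a received record must be decompressed first. A minimal sketch, assuming the header names listed above and leaving the actual Kafka client wiring out:

```python
import gzip

def decode_payload(headers, payload):
    """Return the record payload as text, decompressing it when the
    'gzip' header is present (as noted above, the header has no value)."""
    keys = {key for key, _value in headers}
    if "gzip" in keys:
        payload = gzip.decompress(payload)
    return payload.decode("utf-8")

# Example: a gzipped record and a cleartext record.
compressed = gzip.compress(b'{"event":{}}')
print(decode_payload([("type-id", b"PmDataOverKafka"), ("gzip", b"")], compressed))
print(decode_payload([("type-id", b"PmDataOverKafka")], b'{"event":{}}'))
```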
*************
Configuration
*************

The component is configured by a configuration file and by the normal Spring Boot configuration file (application.yaml).

==================
Configuration file
==================

The configuration file defines Kafka topics that should be listened to and registered as information types which can be subscribed to.
There is an example configuration file in config/application_configuration.json.

Each entry will be registered as a subscribable information type in ICS. The following attributes can be used in each entry:

* id, the information type identifier.

* kafkaInputTopic, a Kafka topic to listen to for new file events.

* inputJobType, the information type for the new file events subscription.

* inputJobDefinition, the parameters for the new file events subscription.

The last two parameters are used to create the subscription for the input to this component (subscription of file-ready events).


Below follows an example of a configuration file.

.. code-block:: javascript

   {
      "types": [
         {
            "id": "PmDataOverKafka",
            "kafkaInputTopic": "FileReadyEvent",
            "inputJobType": "xml-file-data-to-filestore",
            "inputJobDefinition": {
               "kafkaOutputTopic": "FileReadyEvent",
               "filestore-output-bucket": "pm-files-json",
               "filterType": "pmdata",
               "filter": {
                  "inputCompression": "xml.gz",
                  "outputCompression": "none"
               }
            }
         }
      ]
   }

================
application.yaml
================

An example application.yaml configuration file: ":download:`link <../config/application.yaml>`"


**************************
Information Job Parameters
**************************

The schema for the parameters for the PM measurements subscription is defined in the file src/main/resources/typeSchemaPmData.json.

=====================
typeSchemaPmData.json
=====================

The type specific Json schema for the subscription of PM measurements:

.. code-block:: javascript

   {
      "$schema": "http://json-schema.org/draft-04/schema#",
      "type": "object",
      "additionalProperties": false,
      "properties": {
         "filter": {
            "type": "object",
            "additionalProperties": false,
            "properties": {
               "sourceNames": {
                  "type": "array",
                  "items": [
                     {
                        "type": "string"
                     }
                  ]
               },
               "measObjInstIds": {
                  "type": "array",
                  "items": [
                     {
                        "type": "string"
                     }
                  ]
               },
               "measTypeSpecs": {
                  "type": "array",
                  "items": [
                     {
                        "type": "object",
                        "properties": {
                           "measuredObjClass": {
                              "type": "string"
                           },
                           "measTypes": {
                              "type": "array",
                              "items": [
                                 {
                                    "type": "string"
                                 }
                              ]
                           }
                        },
                        "required": [
                           "measuredObjClass"
                        ]
                     }
                  ]
               },
               "measuredEntityDns": {
                  "type": "array",
                  "items": [
                     {
                        "type": "string"
                     }
                  ]
               },
               "pmRopStartTime": {
                  "type": "string"
               },
               "pmRopEndTime": {
                  "type": "string"
               }
            }
         },
         "deliveryInfo": {
            "type": "object",
            "additionalProperties": false,
            "properties": {
               "topic": {
                  "type": "string"
               },
               "bootStrapServers": {
                  "type": "string"
               }
            },
            "required": [
               "topic"
            ]
         }
      },
      "required": [
         "filter", "deliveryInfo"
      ]
   }

The following properties are defined:

* filter, the value of the filter expression. This selects which data to subscribe for. All fields are optional and excluding a field means that everything is selected.

  * sourceNames, selection of the names of the reporting RAN traffic-handling nodes.
  * measObjInstIds, selection of the measured resources. This is the Relative Distinguished Name (RDN) of the MO that
    has the counter.
    If a given value is contained in the filter definition, it will match (partial matching).
    For instance a value like "NRCellCU" will match "ManagedElement=seliitdus00487,GNBCUCPFunction=1,NRCellCU=32".
  * measTypeSpecs, selection of measurement types (counters). This consists of:

    * measuredObjClass, the name of the class of the measured resources.
    * measTypes, the names of the measurement types (counters). A measurement type name is only
      unique in the scope of an MO class (measured resource).

  * measuredEntityDns, selection of DNs for the RAN traffic-handling elements.

  * pmRopStartTime, if this parameter is specified, already collected PM measurement files will be scanned to retrieve historical data.
    The start time is the time from when the information shall be returned.
    In this case, the query is only done for files from the given "sourceNames".
    If this parameter is excluded, only "new" reports will be delivered as they are collected from the RAN traffic-handling nodes.
    How old information can be retrieved depends on the retention time of the storage (if MinIO is used, it is an S3 bucket).

  * pmRopEndTime, for querying already collected PM measurements. Only relevant if pmRopStartTime is given.
    If this parameter is given, no reports will be sent as new files are collected.

* deliveryInfo, defines where the subscribed PM measurements shall be sent.

  * topic, the name of the Kafka topic.
  * bootStrapServers, reference to the Kafka bus to be used. This is optional; if it is omitted the default configured Kafka bus is used (which is configured in the application.yaml file).


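The partial matching of measObjInstIds described above can be sketched as follows (Python, for illustration only; this mirrors the documented matching rule, not the product's actual implementation):

```python
def matches_meas_obj(filter_ids, meas_obj_inst_id):
    """True when the measured object instance id matches the filter.

    An empty (or absent) filter selects everything; otherwise a filter
    entry matches when it occurs as a substring of the instance id
    (partial matching, as described above).
    """
    if not filter_ids:
        return True
    return any(wanted in meas_obj_inst_id for wanted in filter_ids)

print(matches_meas_obj(
    ["NRCellCU"],
    "ManagedElement=seliitdus00487,GNBCUCPFunction=1,NRCellCU=32"))
```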
Below follow examples of some filters.

.. code-block:: javascript

   {
      "filter":{
         "sourceNames":[
            "O-DU-1122"
         ],
         "measObjInstIds":[
            "UtranCell=Gbg-997"
         ],
         "measTypeSpecs":[
            {
               "measuredObjClass":"UtranCell",
               "measTypes":[
                  "succImmediateAssignProcs"
               ]
            }
         ]
      }
   }

Here follows an example of a filter that will
match two counters from all cells in two RAN traffic-handling nodes.

.. code-block:: javascript

   {
      "filterType":"pmdata",
      "filter": {
         "sourceNames":[
            "O-DU-1122", "O-DU-1123"
         ],
         "measTypeSpecs":[
            {
               "measuredObjClass":"NRCellCU",
               "measTypes":[
                  "pmCounterNumber0", "pmCounterNumber1"
               ]
            }
         ]
      }
   }

****************************
PM measurements subscription
****************************

The sequence is that a "new file event" is received (from a Kafka topic).
The file is read from storage (file storage or S3 object store). For each job, the specified PM filter is applied to the data
and the result is sent to the Kafka topic specified by the job (by the data consumer).

.. image:: ./dedicatedTopics.png
   :width: 500pt

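The sequence above can be sketched as a small pipeline (Python, for illustration; the event format, the storage access, and the Kafka producer are simplified stand-ins, not the product's actual interfaces):

```python
def handle_file_event(event, read_file, jobs, send):
    """Process one "new file event": load the report and, for each job,
    apply the job's filter and deliver the result to the job's topic.

    read_file: callable path -> report dict (stands in for file/S3 access)
    jobs: list of dicts with "filter" (a predicate) and "topic"
    send: callable (topic, report) -> None (stands in for a Kafka producer)
    """
    report = read_file(event["filename"])
    for job in jobs:
        filtered = job["filter"](report)
        if filtered is not None:
            send(job["topic"], filtered)

# Example wiring with in-memory stand-ins:
sent = []
handle_file_event(
    {"filename": "report.json"},
    read_file=lambda path: {"sourceName": "O-DU-1122"},
    jobs=[{"filter": lambda r: r if r["sourceName"] == "O-DU-1122" else None,
           "topic": "pm-output-topic"}],
    send=lambda topic, report: sent.append((topic, report)),
)
print(sent)
```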
=========================================
Several Jobs sharing the same Kafka topic
=========================================

If several jobs publish to the same Kafka topic (shared topic), the resulting filtered output will be an aggregate of all matching filters.
So, each consumer may then get more data than requested.

.. image:: ./sharedTopics.png
   :width: 500pt