Differences in query logs when read by STH-Comet caused by Cygnus version upgrade


I am investigating Cygnus and STH-Comet and would like to know the impact of the Cygnus version upgrade.

I created an environment with the following docker-compose.yml, and switched the Cygnus version via the commented-out image line to compare the two versions.

docker-compose.yml:

version: "3"

services:
  orion:
    container_name: orion
    image: fiware/orion:3.7.0
    links:
      - mongo
    ports:
      - "1026:1026"
    command: -dbhost mongo -logLevel DEBUG

  cygnus:
    container_name: cygnus
    image: fiware/cygnus-ngsi:2.20.0
    # image: fiware/cygnus-ngsi:3.2.0
    links:
      - mongo
    ports:
      - "5051:5051"
      - "5080:5080"
    environment:
      - CYGNUS_MONGO_HOSTS=mongo:27017
      - CYGNUS_API_PORT=5080
      - CYGNUS_MONGO_DATA_MODEL=dm-by-service-path
      - CYGNUS_STH_DATA_MODEL=dm-by-service-path

  sth:
    container_name: sth
    image: fiware/sth-comet:2.10.0
    depends_on:
      - "mongo"
    links:
      - mongo
    ports:
      - "8666:8666"
    environment:
      - STH_HOST=0.0.0.0
      - DB_URI=mongo:27017
      - DATA_MODEL=collection-per-service-path

  mongo:
    container_name: mongo
    image: mongo:4.4
    ports:
      - "27017:27017"
    command: --logpath=/var/log/mongodb/mongo.log --profile=1 --slowms=-1

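With this file in place, the stack can be brought up and sanity-checked roughly as follows (a sketch, assuming the standard /version endpoints of Orion and STH-Comet):

docker compose up -d            # or docker-compose up -d, depending on the installed CLI
curl localhost:1026/version     # Orion should report 3.7.0
curl localhost:8666/version     # STH-Comet should report 2.10.0
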
I created the following entity and the following subscription.

entity:

{
    "id": "Car1",
    "type": "Car",
    "speed": {
        "type": "Float",
        "value": 80,
        "metadata": {}
    }
}

subscription:

{
  "description": "Test",
  "subject": {
    "entities": [
      {
        "idPattern": "Car1$",
        "type": "Car"
      }
    ]
  },
  "notification": {
    "http": {
      "url": "http://cygnus:5051/notify"
    },
    "attrsFormat": "legacy"
  },
  "expires": "2040-01-01T14:00:00.000Z"
}
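
For reference, a minimal sketch of how the entity and subscription above might be registered in Orion via the NGSIv2 API, assuming the fiware-service "example" and service path "/" used by the STH-Comet query below, and that the subscription payload is saved in a (hypothetical) subscription.json file:

# Create the entity
curl -X POST 'http://localhost:1026/v2/entities' \
  -H 'Content-Type: application/json' \
  -H 'fiware-service: example' \
  -H 'fiware-servicepath: /' \
  -d '{"id": "Car1", "type": "Car", "speed": {"type": "Float", "value": 80, "metadata": {}}}'

# Create the subscription pointing at Cygnus
curl -X POST 'http://localhost:1026/v2/subscriptions' \
  -H 'Content-Type: application/json' \
  -H 'fiware-service: example' \
  -H 'fiware-servicepath: /' \
  -d @subscription.json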

After creating the above subscription, I updated the entity several times to generate historical data, as sketched below.
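
The updates can be generated with the standard NGSIv2 attribute value endpoint, for example (a sketch using arbitrary speed values):

# Update the speed value a few times so Cygnus persists several samples
for v in 85 90 95 100 105; do
  curl -X PUT "http://localhost:1026/v2/entities/Car1/attrs/speed/value?type=Car" \
    -H 'Content-Type: text/plain' \
    -H 'fiware-service: example' \
    -H 'fiware-servicepath: /' \
    -d "$v"
done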

I ran the following command to get the historical data.

curl "localhost:8666/STH/v2/entities/Car1/attrs/speed?type=Car&lastN=5" -H 'fiware-service:example' -H 'fiware-servicepath:/'

The following MongoDB query logs were produced when reading the historical data through STH-Comet.

// Log of STH-Comet query to get data written by Cygnus 2.20.0
{"t":{"$date":"2023-06-28T03:54:01.910+00:00"},"s":"I",  "c":"COMMAND",  "id":51803,   "ctx":"conn11","msg":"Slow query","attr":{"type":"command","ns":"sth_example.$cmd","command":{"listCollections":1,"filter":{"name":"sth_/"},"cursor":{},"nameOnly":true,"lsid":{"id":{"$uuid":"a7f6f881-1db3-42f0-86b6-c14fc3425419"}},"$db":"sth_example"},"numYields":0,"reslen":166,"locks":{"ParallelBatchWriterMode":{"acquireCount":{"r":1}},"FeatureCompatibilityVersion":{"acquireCount":{"r":1}},"ReplicationStateTransition":{"acquireCount":{"w":1}},"Global":{"acquireCount":{"r":1}},"Database":{"acquireCount":{"r":1}},"Collection":{"acquireCount":{"r":2}},"Mutex":{"acquireCount":{"r":1}}},"protocol":"op_msg","durationMillis":0}}
{"t":{"$date":"2023-06-28T03:54:01.916+00:00"},"s":"I",  "c":"COMMAND",  "id":51803,   "ctx":"conn11","msg":"Slow query","attr":{"type":"command","ns":"sth_example.sth_/","command":{"count":"sth_/","query":{"entityId":"Car1","entityType":"Car","attrName":"speed"},"skip":0,"limit":0,"$db":"sth_example"},"planSummary":"COUNT_SCAN { entityId: 1, entityType: 1, attrName: 1, recvTime: 1 }","keysExamined":3,"docsExamined":0,"numYields":0,"queryHash":"41C2776D","planCacheKey":"2D54FA2D","reslen":45,"locks":{"FeatureCompatibilityVersion":{"acquireCount":{"r":1}},"ReplicationStateTransition":{"acquireCount":{"w":1}},"Global":{"acquireCount":{"r":1}},"Database":{"acquireCount":{"r":1}},"Collection":{"acquireCount":{"r":1}},"Mutex":{"acquireCount":{"r":1}}},"storage":{},"protocol":"op_msg","durationMillis":0}}
{"t":{"$date":"2023-06-28T03:54:01.920+00:00"},"s":"I",  "c":"COMMAND",  "id":51803,   "ctx":"conn11","msg":"Slow query","attr":{"type":"command","ns":"sth_example.sth_/","command":{"find":"sth_/","filter":{"entityId":"Car1","entityType":"Car","attrName":"speed"},"sort":{"recvTime":-1},"limit":5,"returnKey":false,"showRecordId":false,"lsid":{"id":{"$uuid":"a7f6f881-1db3-42f0-86b6-c14fc3425419"}},"$db":"sth_example"},"planSummary":"IXSCAN { entityId: 1, entityType: 1, attrName: 1, recvTime: 1 }","keysExamined":2,"docsExamined":2,"cursorExhausted":true,"numYields":0,"nreturned":2,"queryHash":"C0A2C615","planCacheKey":"26B66C8B","reslen":380,"locks":{"FeatureCompatibilityVersion":{"acquireCount":{"r":1}},"ReplicationStateTransition":{"acquireCount":{"w":1}},"Global":{"acquireCount":{"r":1}},"Database":{"acquireCount":{"r":1}},"Collection":{"acquireCount":{"r":1}},"Mutex":{"acquireCount":{"r":1}}},"storage":{},"protocol":"op_msg","durationMillis":0}}

// Log of STH-Comet query to get data written by Cygnus 3.2.0
{"t":{"$date":"2023-06-28T03:43:44.631+00:00"},"s":"I",  "c":"COMMAND",  "id":51803,   "ctx":"conn11","msg":"Slow query","attr":{"type":"command","ns":"sth_example.$cmd","command":{"listCollections":1,"filter":{"name":"sth_/"},"cursor":{},"nameOnly":true,"lsid":{"id":{"$uuid":"8dde283d-302e-4374-aa83-a3c78a19080c"}},"$db":"sth_example"},"numYields":0,"reslen":166,"locks":{"ParallelBatchWriterMode":{"acquireCount":{"r":1}},"FeatureCompatibilityVersion":{"acquireCount":{"r":1}},"ReplicationStateTransition":{"acquireCount":{"w":1}},"Global":{"acquireCount":{"r":1}},"Database":{"acquireCount":{"r":1}},"Collection":{"acquireCount":{"r":2}},"Mutex":{"acquireCount":{"r":1}}},"protocol":"op_msg","durationMillis":0}}
{"t":{"$date":"2023-06-28T03:43:44.634+00:00"},"s":"I",  "c":"COMMAND",  "id":51803,   "ctx":"conn11","msg":"Slow query","attr":{"type":"command","ns":"sth_example.sth_/","command":{"count":"sth_/","query":{"entityId":"Car1","entityType":"Car","attrName":"speed"},"skip":0,"limit":0,"$db":"sth_example"},"planSummary":"COLLSCAN","keysExamined":0,"docsExamined":2,"numYields":0,"queryHash":"41C2776D","planCacheKey":"2D54FA2D","reslen":45,"locks":{"FeatureCompatibilityVersion":{"acquireCount":{"r":1}},"ReplicationStateTransition":{"acquireCount":{"w":1}},"Global":{"acquireCount":{"r":1}},"Database":{"acquireCount":{"r":1}},"Collection":{"acquireCount":{"r":1}},"Mutex":{"acquireCount":{"r":1}}},"storage":{},"protocol":"op_msg","durationMillis":0}}
{"t":{"$date":"2023-06-28T03:43:44.636+00:00"},"s":"I",  "c":"COMMAND",  "id":51803,   "ctx":"conn11","msg":"Slow query","attr":{"type":"command","ns":"sth_example.sth_/","command":{"find":"sth_/","filter":{"entityId":"Car1","entityType":"Car","attrName":"speed"},"sort":{"recvTime":-1},"limit":5,"returnKey":false,"showRecordId":false,"lsid":{"id":{"$uuid":"8dde283d-302e-4374-aa83-a3c78a19080c"}},"$db":"sth_example"},"planSummary":"IXSCAN { recvTime: 1, entityId: 1, entityType: 1, attrName: 1, attrType: 1, attrValue: 1 }","keysExamined":2,"docsExamined":2,"cursorExhausted":true,"numYields":0,"nreturned":2,"queryHash":"C0A2C615","planCacheKey":"26B66C8B","reslen":380,"locks":{"FeatureCompatibilityVersion":{"acquireCount":{"r":1}},"ReplicationStateTransition":{"acquireCount":{"w":1}},"Global":{"acquireCount":{"r":1}},"Database":{"acquireCount":{"r":1}},"Collection":{"acquireCount":{"r":1}},"Mutex":{"acquireCount":{"r":1}}},"storage":{},"protocol":"op_msg","durationMillis":0}}

When STH-Comet runs the following code (the count query), the "planSummary" changes from "COUNT_SCAN" for the data written by Cygnus 2.20.0 to "COLLSCAN" for the data written by Cygnus 3.2.0.

https://github.com/telefonicaid/fiware-sth-comet/blob/2.10.0/lib/database/sthDatabase.js#L510-L523
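
The plan difference can also be reproduced directly in the mongo shell by explaining the same count that STH-Comet issues (a sketch, assuming the sth_example database and sth_/ collection from the logs above):

// With the Cygnus 2.20.0 indexes this reports COUNT_SCAN; with the 3.2.0
// indexes (recvTime leading the compound index) it falls back to COLLSCAN.
db.getSiblingDB('sth_example')['sth_/']
    .explain('queryPlanner')
    .count({ entityId: 'Car1', entityType: 'Car', attrName: 'speed' })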

The more historical data there is, the slower that query becomes. Is this a bug, or have I misconfigured something?

Thank you in advance.

The indexes created by Cygnus 2.20.0 are shown below.

> db['sth_/'].getIndexes()
[
        {
                "v" : 2,
                "key" : {
                        "_id" : 1
                },
                "name" : "_id_"
        },
        {
                "v" : 2,
                "key" : {
                        "entityId" : 1,
                        "entityType" : 1,
                        "attrName" : 1,
                        "recvTime" : 1
                },
                "name" : "cyg_raw_opt"
        },
        {
                "v" : 2,
                "key" : {
                        "recvTime" : 1
                },
                "name" : "cyg_raw_exp",
                "expireAfterSeconds" : NumberLong(1576800000)
        }
]

The indexes created by Cygnus 3.2.0 are shown below. Note that I added the "entityId_1_entityType_1_attrName_1_recvTime_1" index myself as an attempted improvement.

rs0:PRIMARY> db['sth_/'].getIndexes()
[
    {
        "v" : 2,
        "key" : {
            "_id" : 1
        },
        "name" : "_id_"
    },
    {
        "v" : 2,
        "key" : {
            "entityId" : 1,
            "entityType" : 1,
            "attrName" : 1,
            "recvTime" : 1
        },
        "name" : "entityId_1_entityType_1_attrName_1_recvTime_1"
    },
    {
        "v" : 2,
        "key" : {
            "recvTime" : 1,
            "entityId" : 1,
            "entityType" : 1,
            "attrName" : 1,
            "attrType" : 1,
            "attrValue" : 1
        },
        "name" : "cyg_raw_opt"
    },
    {
        "v" : 2,
        "key" : {
            "recvTime" : 1
        },
        "name" : "cyg_raw_exp",
        "expireAfterSeconds" : NumberLong(1576800000)
    }
]

There is 1 answer below.


We have created an issue in the Cygnus repository to analyze this in detail and eventually provide a fix.

The workaround for the moment, based on @ctc-watanabe's feedback, is to manually create this index:

{
    "v" : 2,
    "key" : {
        "entityId" : 1,
        "entityType" : 1,
        "attrName" : 1,
        "recvTime" : 1
    },
    "name" : "entityId_1_entityType_1_attrName_1_recvTime_1"
}
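
For reference, a sketch of creating that index from the mongo shell, assuming the sth_example database and sth_/ collection used in the question:

// Compound index matching the filter and sort used by STH-Comet's raw-data queries
db.getSiblingDB('sth_example')['sth_/'].createIndex(
    { entityId: 1, entityType: 1, attrName: 1, recvTime: 1 },
    { name: "entityId_1_entityType_1_attrName_1_recvTime_1" }
)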