#StackBounty: #mongodb #mongodb-query #nosql #aggregation-framework #nosql-aggregation How to create complicated mongodb queries?

Bounty: 50

I am a novice when it comes to Mongo, as I have traditionally only worked with Oracle databases.
I have a Mongo database storing Bitbucket data in fields like so:

_id | _class | collectorItemId| firstEverCommit | scmUrl | scmBranch | scmAuthor | scmCommitTimestamp

There are a few more fields in there that I've omitted for the sake of time. The scmBranch field is populated with one of two strings: "master" or "develop".

Here is the document view of one of the rows:

{
    "_id" : ObjectId("5e39d6a0330c130006a042c6"),
    "collectorItemId" : ObjectId("5e33a6b9887ef5000620a0c0"),
    "firstEverCommit" : false,
    "scmUrl" : "sampleRepo1",
    "scmBranch" : "master",
    "scmRevisionNumber" : "a2ad6842468eb55bffcbe7d700b6addd3eb11629",
    "scmAuthor" : "son123",
    "scmCommitTimestamp" : NumberLong(1580841662000)
}

I am now trying to formulate mongo queries that will get me the following data:

 1. For each scmUrl: if max(scmCommitTimestamp) where scmBranch = "develop" is greater than max(scmCommitTimestamp) where scmBranch = "master", then count the number of documents (i.e. commits) where scmBranch = "develop" AND scmCommitTimestamp > max(scmCommitTimestamp) where scmBranch = "master".

 2. For the commits found in #1, find the oldest commit and the newest commit.

So far, the best mongo query I’ve been able to come up with is the following:

db.bitbucket.aggregate([{
    "$group": {
        "_id": {
            "scmUrl": "$scmUrl",
            "scmBranch": "$scmBranch"
        },
        "MostRecentCommit": {
            "$max": {"$toDate":"$scmCommitTimestamp"}
        }
    }
},{
    "$project": {
        "RepoName": {"$substr": ["$_id.scmUrl",39,-1]},
        "Branch": "$_id.scmBranch",
        "MostRecentCommit": "$MostRecentCommit"
    }
},{
    "$sort": {
        "RepoName": 1,
        "Branch": 1
    }
}])

But this only gets me back the most recent commit on the develop branch and the master branch of each scmUrl (i.e. repo).

Ideally, I’d like to get back a table of results with the following columns:

scmUrl/RepoName | number of commits on the develop branch that are not on the master branch | oldest commit on develop that's not on master | newest commit on develop that's not on master

How can I modify my mongo query to extract the data that I want?
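
For reference, here is a hedged sketch (untested, using only the sample fields above) of one possible shape for the pipeline: group per repo, remember the latest master timestamp, then $filter the develop commits that are newer than it. Note that repos with no master commits at all would count every develop commit, because null sorts lowest in BSON comparisons.

db.bitbucket.aggregate([
    {"$group": {
        "_id": "$scmUrl",
        // latest master commit per repo (null if the repo has no master commits)
        "maxMaster": {"$max": {"$cond": [{"$eq": ["$scmBranch", "master"]}, "$scmCommitTimestamp", null]}},
        "commits": {"$push": {"branch": "$scmBranch", "ts": "$scmCommitTimestamp"}}
    }},
    {"$project": {
        // develop commits newer than the repo's latest master commit
        "ahead": {"$filter": {
            "input": "$commits",
            "as": "c",
            "cond": {"$and": [
                {"$eq": ["$$c.branch", "develop"]},
                {"$gt": ["$$c.ts", "$maxMaster"]}
            ]}
        }}
    }},
    {"$project": {
        "numCommits": {"$size": "$ahead"},
        "oldestCommit": {"$min": "$ahead.ts"},
        "newestCommit": {"$max": "$ahead.ts"}
    }},
    // keep only repos where develop is actually ahead of master
    {"$match": {"numCommits": {"$gt": 0}}}
])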


Get this bounty!!!

#StackBounty: #web-apps #database #modeling #mongodb #er-diagram MongoDB ERD Tool

Bounty: 50

I am looking for a tool to generate an ERD (Entity Relationship Diagram) from JSON for MongoDB.

Requirements:

  • Works with MongoDB (Other NoSQL databases would be nice as well)
  • Allows you to create a MongoDB schema visually and present it as an ERD
  • Free
  • Preferably web based, but desktop will work as well.
  • Not made in 1980

I’ve looked into quite a few, but most I’ve seen are very expensive and aren’t web based.


Get this bounty!!!

#StackBounty: #java #spring #mongodb #spring-data-jpa #spring-data-mongodb How to utilize Pageable when running a custom delete query i…

Bounty: 50

I am working on creating a tool allowing admins to purge data from the database. One of our collections has millions of records, which makes deletes seize up the system. Originally I was just running a query that returns a Page and dropping that into the standard delete. Ideally I'd prefer to run the query and delete in one go.

@Query(value = "{ 'timestamp' : {$gte : ?0, $lte: ?1 }}")
public Page deleteByTimestampBetween(Date from, Date to, Pageable pageable);

Is this possible? Using the above code, the system behaves the same: the program doesn't continue past the delete call and the data isn't removed from Mongo. Or is there a better approach?
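
For comparison, here is a hedged mongo-shell sketch of doing the delete in bounded batches at the database level, which avoids paging millions of documents into application memory (the collection name events, the field timestamp, and the batch size are all assumptions). Spring Data MongoDB also has a delete attribute on @Query that turns a query method into a delete, which may be worth checking for your version.

// delete one bounded batch at a time so no single operation runs too long
var from = ISODate("2020-01-01T00:00:00Z");
var to = ISODate("2020-02-01T00:00:00Z");
var deleted;
do {
    var ids = db.events.find({ timestamp: { $gte: from, $lte: to } }, { _id: 1 })
                       .limit(1000)
                       .toArray()
                       .map(function (d) { return d._id; });
    deleted = db.events.deleteMany({ _id: { $in: ids } }).deletedCount;
} while (deleted > 0);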


Get this bounty!!!

#StackBounty: #node.js #angular #mongodb #express #mongoose Server-side pagination using ngx-pagination

Bounty: 200

I got the ngx-pagination module working with all the listings in the GET, but I want the pagination to work server-side too, and I'm unsure how to implement it further than what I have. I've been looking at the ngx-pagination documentation, but I'm a little bit confused. Here's what I have.

html

<body [ngClass]="[(this.isOpen && this.mobile) || (this.isOpen && this.tablet) ? 'hideContent' : 'showContent']">
    <div class="loading">
        <!-- <mat-spinner class="loader" *ngIf="isLoading"></mat-spinner> -->

        <ngx-spinner id="loadingIcon" *ngIf="isLoading" type="cog" size="large" color="#3071a9">


            <p class="loadingTitle">Loading...</p>
        </ngx-spinner>

    </div>

    <div class="spacing"></div>
    <div class="container">
        <div class="row no-gutters"
            *ngIf="!this.isOpen && this.mobile || this.isOpen && !this.mobile || !this.isOpen && !this.mobile">
            <div class="class col-md-7"></div>

        </div>

        <!-- /|slice:0:show -->
        <!--; let i = index-->
        <div class="row"
            *ngFor="let auction of posts | paginate: { itemsPerPage: 10, currentPage: p, totalItems: this.posts.count }">
            <div class="col-md-12 col-centered">

                <div class="listingCard" [@simpleFadeAnimation]="'in'">

                    <div class=container>

                        <div class="row">
                            <div class="col-md-3">

                            </div>
                            <div class="col-md-6">
                                <div id="title">{{auction.title}}</div>
                            </div>

                        </div>

                    </div>



                </div>
            </div>

        </div>

    </div>

    <pagination-controls (pageChange)="p = $event"></pagination-controls>

</body>


component

 p: number = 1;

ngOnInit(){
    this.submitListingService.getListings(this.postsPerPage, this.currentPage);
    this.listingService
      .getPostUpdateListener()
      .pipe(takeUntil(this.destroy))
      .subscribe((postData: { listing: Listing[]; postCount: number }) => {
        this.isLoading = false;
        this.totalPosts = postData.postCount;
        this.posts = postData.listing;
        this.filteredPosts = postData.listing;
      });
}

angular service

getListings(postsPerPage: number, currentPage: number) {
    let listings = "Get Listings";
    const params = new HttpParams().set("listings", listings);
    const queryParams = `?pagesize=${postsPerPage}&page=${currentPage}`;
    this.http
      .get<{ message: string; posts: any; maxPosts: number }>(
        "http://localhost:3000/api/listings" + queryParams,
        { params }
      )
      .pipe(
        map(postData => {
          return {
            posts: postData.posts.map(post => {
              return {
                title: post.title,
                id: post._id
              };
            }),
            maxPosts: postData.maxPosts
          };
        })
      )
      .pipe(takeUntil(this.destroy))
      .subscribe(transformedPostData => {
        this.posts = transformedPostData.posts;
        this.postsUpdated.next({
          listing: [...this.posts],
          postCount: transformedPostData.maxPosts
        });
      });
  }

app.js

app.get("/api/listings", (req, res, next) => {
  Post.find({ auctionEndDateTime: { $gte: Date.now() } })
      .populate("creator", "username")
      .then(documents => {
        req.params.Id = mongoose.Types.ObjectId(req.params.Id);
        res.status(200).json({
          message: "Auction listings retrieved successfully!",
          posts: documents
        });
      });
});
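
The handler above ignores the pagesize and page query parameters the Angular service is already sending. Here is a hedged sketch (assuming the same Post model and response shape, plus the maxPosts count the service already expects) of applying them with skip/limit; the template's totalItems would then bind to the returned count (e.g. the component's totalPosts) rather than this.posts.count:

app.get("/api/listings", (req, res, next) => {
  const pageSize = +req.query.pagesize;
  const currentPage = +req.query.page;
  const filter = { auctionEndDateTime: { $gte: Date.now() } };
  let postQuery = Post.find(filter).populate("creator", "username");
  if (pageSize && currentPage) {
    // skip the earlier pages and return a single page worth of documents
    postQuery = postQuery.skip(pageSize * (currentPage - 1)).limit(pageSize);
  }
  let fetchedPosts;
  postQuery
    .then(documents => {
      fetchedPosts = documents;
      return Post.countDocuments(filter); // total matches, for the pager
    })
    .then(count => {
      res.status(200).json({
        message: "Auction listings retrieved successfully!",
        posts: fetchedPosts,
        maxPosts: count
      });
    });
});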


Get this bounty!!!

#StackBounty: #mongodb #nosql #dimensional-modeling #node.js Cascading One-to-Many relationships modeling

Bounty: 50

Are there any drawbacks or better alternatives to this non-relational model? I note that it remains easy to understand, but the concern is with the code interacting with it.

First, as I was introduced to the NoSQL world, I confronted one-to-many relationships between entities on many occasions. Today, I have a relatively cascading example that might grow in the future.

Based on assumptions about functionality, I came up with a simple snowflake model, specifically cascading one-to-many relationships with some describing data:

[User] 1 — * [Session] 1 — * [Execution] 1 — * [Report]

The data model seems easy to deal with at first, but I found that acting on the data using Mongoose (a NodeJS library) can become complex and less performant, especially in a web application context (request and response cycle). The first way of thinking is simply to refer to parents from children, in a normalization fashion. Another way to implement this data model is the document-embedding approach (https://docs.mongodb.com/manual/tutorial/model-embedded-one-to-many-relationships-between-documents/), which is easier to interact with if you just model everything in one entity; however, this comes at the expense of performance, because whenever you load a user you load all of its sessions, executions and reports with it.

I found a compromise between a normalized model and the one using embedded documents, modeled here:

(diagram: "Normalize and embed")

The compromise consists of embedding a minimal variant of the child entity, e.g. Executions as ExecutionsMini inside Sessions, while keeping the full Executions entity separate; a sketch of what this could look like follows.
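
As a concrete illustration, here is a minimal Mongoose sketch of that compromise; every schema and field name below is an illustrative assumption, not taken from the actual application.

// Session embeds a minimal ExecutionMini subdocument for cheap overview reads,
// while the full Execution document lives in its own collection.
const mongoose = require("mongoose");

const executionMiniSchema = new mongoose.Schema({
  execution: { type: mongoose.Schema.Types.ObjectId, ref: "Execution" },
  status: String,   // just enough to render a session overview
  startedAt: Date
}, { _id: false });

const sessionSchema = new mongoose.Schema({
  user: { type: mongoose.Schema.Types.ObjectId, ref: "User" },
  executions: [executionMiniSchema]   // the embedded minimal variant
});

const executionSchema = new mongoose.Schema({
  session: { type: mongoose.Schema.Types.ObjectId, ref: "Session" },
  reports: [{ type: mongoose.Schema.Types.ObjectId, ref: "Report" }],
  logs: [String]    // heavy fields live only here, loaded on demand
});

module.exports = {
  Session: mongoose.model("Session", sessionSchema),
  Execution: mongoose.model("Execution", executionSchema)
};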

The concern grows because, between Users and Loggings, other entities might be added, one-to-many or otherwise, and this could further complicate the solution (not the data model).


Get this bounty!!!

#StackBounty: #node.js #angular #mongodb #express #mongoose GET not returning sent message. Only inbox items

Bounty: 300

When a user sends a message, it generates a messageTrackingId. Right now the pipeline $unwinds the creatorName so it comes back as a unique value in the inbox, and I want only one entry per user, with no duplicates. Currently, though, a user can send multiple messages if the other user hasn't responded, generating new messageTrackingIds as a result. How can I make the initial sent message appear in the inbox as well, so that I can use its messageTrackingId instead of generating new ones? I've been stuck on this for a while, so I appreciate any help.

app.get

app.get("/api/messages", (req, res, next) => {
  query = {};
  inbox = false;
  messageId = false;
  if (req.query.recipientId) {
    query = { recipientId: req.query.recipientId };
    inbox = true;

    Messages.aggregate(
      // Pipeline
      [
        {
          $lookup: {
            from: "users", // other table name
            localField: "creator", // name of users table field
            foreignField: "_id", // name of userinfo table field
            as: "creatorName" // alias for userinfo table
          }
        },
        { $unwind: "$creatorName" },
        {
          $match: {
            recipientId: { $eq: req.query.recipientId }
          }
        },

        // Stage 1
        {
          $group: {
            _id: "$messageTrackingId",
            message: { $addToSet: "$message" },
            recipientId: { $addToSet: "$recipientId" },
            creator: { $addToSet: "$creator" },
            messageTrackingId: { $addToSet: "$messageTrackingId" },
            creatorName: { $addToSet: "$creatorName.instagramName" },
            creationDate: { $addToSet: "$creationDate" }
          }
        },

        // Stage 2
        {
          $project: {
            _id: 1,
            message: { $arrayElemAt: ["$message", 0] },
            recipientId: { $arrayElemAt: ["$recipientId", 0] },
            creator: { $arrayElemAt: ["$creator", 0] },
            messageTrackingId: { $arrayElemAt: ["$messageTrackingId", 0] },
            creatorName: { $arrayElemAt: ["$creatorName", 0] },
            creationDate: { $arrayElemAt: ["$creationDate", 0] }
          }
        }
      ]
    )
      //.populate('creator', 'instagramName')

      .then(documents => {
        if (res.subject === "Test") {
          console.log("Nice");
        }
        if (inbox === false && messageId === false) {
          res.status(200).json({
            message: "User's Sent Messages Retrieved!",
            posts: documents
          });
        }
        if (inbox === true) {
          res.status(200).json({
            message: "User's Inbox Retrieved!",
            posts: documents
          });
        }
        if (messageId === true) {
          res.status(200).json({
            message: "Message Chain Retrieved!",
            posts: documents
          });
        }
      });

    //   .exec((err, locations) => {
    //     if (err) throw err;
    //     console.log(locations);
    // });
  } else if (req.query.creator) {
    query = { creator: req.query.creator };
    inbox = false;
    Messages.find(query)
      .populate("creator", "instagramName")
      .then(documents => {
        if (inbox === false && messageId === false) {
          res.status(200).json({
            message: "User's Sent Messages Retrieved!",
            posts: documents
          });
        }
        if (inbox === true) {
          res.status(200).json({
            message: "User's Inbox Retrieved!",
            posts: documents
          });
        }
        if (messageId === true) {
          res.status(200).json({
            message: "Message Chain Retrieved!",
            posts: documents
          });
        }
      });
  } else if (req.query.messageId) {
    query = { messageTrackingId: req.query.messageId };
    console.log(req.query.messageId);
    messageId = true;
    console.log("MESSAGE ID IS TRUE");
    Messages.find(query)
      .populate("creator", "instagramName")
      .then(documents => {
        if (inbox === false && messageId === false) {
          res.status(200).json({
            message: "User's Sent Messages Retrieved!",
            posts: documents
          });
        }
        if (inbox === true) {
          res.status(200).json({
            message: "User's Inbox Retrieved!",
            posts: documents
          });
        }
        if (messageId === true) {
          res.status(200).json({
            message: "Message Chain Retrieved!",
            posts: documents
          });
        }
      });
  }
});

app.post

app.post("/api/messages", checkAuth, (req, res, next) => {
  console.log("Made It")
  messagingTrackingIDValue = "";

  const messaging = new Messages({
    creator: req.userData.userId,
    recipient: req.body.recipient,
    recipientId: req.body.recipientId,
    message: req.body.message,
    //message: req.body.message,
    messageTrackingId: req.body.messageTrackingId,
    creatorName: req.userData.username,
    creationDate: req.body.creationDate
  });

  //saves to database with mongoose
  messaging.save().then(result => {
    if (result.creator !== messaging.creator) {
    } else if (result.creator === req.userData.userId) {
    }
    console.log(result);
    res.status(201).json({
      message: "Message Sent Successfully!",
      postId: result._id
    });
  });
});

angular service

  sendMessage(
    recipient: string,
    message: string,
    creationDate: Date,
    recipientId: string,
    creatorName: string,
    messageTrackingId: string
  ) {
    const messaging: Messages = {
      id: null,
      recipient: recipient,
      message: message,
      creationDate: creationDate,
      creator: null,
      recipientId: recipientId,
      creatorName: creatorName,
      messageTrackingId: messageTrackingId
    };

    this.http
      .post<{ message: string; messagingId: string; creator: string }>(
        "http://localhost:3000/api/messages",
        messaging
      )
      .subscribe(responseData => {
        console.log(responseData);
        const id = responseData.messagingId;
        messaging.id = id;

        console.log("Message sent successfully!");

        //   window.location.reload();
        //  this.posts.push();
        //  this.postsUpdated.next([...this.posts]);
      });
  }




  replyToMessage(
    recipient: string,
    message: string,
    creationDate: Date,
    recipientId: string,
    creatorName: string,
    messageTrackingId: string
  ) {
    const messaging: Messages = {
      id: null,
      recipient: recipient,
      message: message,
      creationDate: creationDate,
      creator: null,
      recipientId: recipientId,
      creatorName: creatorName,
      messageTrackingId: messageTrackingId
    };

    this.http
      .post<{ message: string; messagingId: string; creator: string }>(
        "http://localhost:3000/api/messages",
        messaging
      )
      .subscribe(responseData => {
        console.log(responseData);
        const id = responseData.messagingId;
        messaging.id = id;

        console.log("Message sent successfully!");
      });
  }

  getMessages(recipientId: string) {
    return this.http
      .get<{
        message: string;
        posts: any;
        maxPosts: number;
        messageList: string;
      }>("http://localhost:3000/api/messages?recipientId=" + recipientId)
      .pipe(
        map(retrievedData => {
          return {
            posts: retrievedData.posts.map(post => {
              return {
                creator: post.creator,
                recipientId: post.recipientId,
                creationDate: post.creationDate,
                messageTrackingId: post.messageTrackingId,
                creatorName: post.creatorName,
                id: post._id
              };
            }),
            maxPosts: retrievedData.maxPosts
          };
        })
      );
  }

Here's an example of the recipient replying to a message, so the sender gets a messageTrackingId to use:

First the initial message, then the reply. Since the recipient replied, the sender has the messageTrackingId to use for the next message to the same user.

Made It
{ _id: 5e0674ddd55aae5294370870,
  creator: 5df0014e25ee451beccf588a,
  recipient: 'joe',
  recipientId: '5df00d08c713f722909c99c1',
  message: 'This is the initial message',
  messageTrackingId: '3cb3f5bb-5e17-49a7-8aca-4a61ddd1d847',
  creatorName: 'andy',
  creationDate: 2019-12-27T21:17:17.155Z,
  __v: 0 }
Made It
{ _id: 5e067529d55aae5294370872,
  creator: 5df00d08c713f722909c99c1,
  recipient: 'andy',
  recipientId: '5df0014e25ee451beccf588a',
  message: 'This is the reply message',
  messageTrackingId: '3cb3f5bb-5e17-49a7-8aca-4a61ddd1d847',
  creatorName: 'joe',
  creationDate: 2019-12-27T21:18:33.947Z,
  __v: 0 }

If the recipient never replies and the sender sends another message, this happens:

Made It
{ _id: 5e06756bd55aae5294370873,
  creator: 5df00d08c713f722909c99c1,
  recipient: 'andy',
  recipientId: '5df0014e25ee451beccf588a',
  message: 'This is the first message',
  messageTrackingId: '2077a8e6-844c-4639-a4fa-7aee0b8beaf4',
  creatorName: 'joe',
  creationDate: 2019-12-27T21:19:39.217Z,
  __v: 0 }
Made It
{ _id: 5e06757cd55aae5294370874,
  creator: 5df00d08c713f722909c99c1,
  recipient: 'andy',
  recipientId: '5df0014e25ee451beccf588a',
  message: 'This is another message to same user.',
  messageTrackingId: 'feeb0e20-432e-4c9a-9f59-45913c194edc',
  creatorName: 'joe',
  creationDate: 2019-12-27T21:19:56.257Z,
  __v: 0 }
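
One hedged way to attack this, sketched against the fields shown above rather than verified code: in the POST handler, look up the most recent message in either direction between the two users and reuse its messageTrackingId, only falling back to the client-supplied id when no prior conversation exists. That way the initial sent message and all follow-ups share one chain.

app.post("/api/messages", checkAuth, (req, res, next) => {
  // most recent message in either direction between these two users
  Messages.findOne({
    $or: [
      { creator: req.userData.userId, recipientId: req.body.recipientId },
      { creator: req.body.recipientId, recipientId: String(req.userData.userId) }
    ]
  })
    .sort({ creationDate: -1 })
    .then(existing => {
      const messaging = new Messages({
        creator: req.userData.userId,
        recipient: req.body.recipient,
        recipientId: req.body.recipientId,
        message: req.body.message,
        // reuse the existing chain's id; only mint a new one for a new pair
        messageTrackingId: existing
          ? existing.messageTrackingId
          : req.body.messageTrackingId,
        creatorName: req.userData.username,
        creationDate: req.body.creationDate
      });
      return messaging.save();
    })
    .then(result => {
      res.status(201).json({
        message: "Message Sent Successfully!",
        postId: result._id
      });
    });
});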


Get this bounty!!!

#StackBounty: #node.js #mongodb #mongoose #latency Why are mongodb queries to a localhost instance of mongo so much faster than to a cl…

Bounty: 100

I’m using this code to run the tests outlined in this blog post.

(For posterity, relevant code pasted at the bottom).

What I’ve found is that if I run these experiments with a local instance of Mongo (in my case, using docker)

docker run -d -p 27017:27017 -v ~/data:/data/db mongo

Then I get pretty good performance, similar results as outlined in the blog post:

finished populating the database with 10000 users
default_query: 277.986ms
query_with_index: 262.886ms
query_with_select: 157.327ms
query_with_select_index: 136.965ms
lean_query: 58.678ms
lean_with_index: 65.777ms
lean_with_select: 23.039ms
lean_select_index: 21.902ms
[nodemon] clean exit - waiting 

However, when I switch do using a cloud instance of Mongo, in my case an Atlas sandbox instance, with the following configuration:

CLUSTER TIER
M0 Sandbox (General)
REGION
GCP / Iowa (us-central1)
TYPE
Replica Set - 3 nodes
LINKED STITCH APP
None Linked

(Note that I’m based in Melbourne, Australia).

Then I get much worse performance.

default_query: 8353.110ms
query_with_index: 8114.474ms
query_with_select: 3603.191ms
query_with_select_index: 4609.637ms
lean_query: 8455.082ms
lean_with_index: 7885.048ms
lean_with_select: 4209.963ms
lean_select_index: 3798.596ms

I get that there's obviously going to be some round-trip overhead between my computer and the Mongo instance, but I would expect that to add 200ms at most.

It seems the round-trip time must be getting added multiple times, or it's something else entirely that I'm not aware of. Can someone explain just what would cause this to blow out?

A good answer might involve doing an explain plan, and explaining that in terms of network latency.
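
As a rough way to test the multiple-round-trips idea, here is a hedged sketch (assuming the same mongoose setup as the test code below, and run inside the same async IIFE): compare one bare ping round trip against a query, and force larger batches. By default the first batch of a find() holds at most 101 documents, with the rest fetched via getMore, so a 10,000-user result pays the client-to-Atlas latency more than once.

// inside the same async IIFE as the test code below
const t0 = Date.now()
await mongoose.connection.db.admin().ping() // a single bare round trip
console.log(`base RTT ~${Date.now() - t0} ms`)

const t1 = Date.now()
const docs = await User.find(query).batchSize(5000).lean() // fewer, larger batches
console.log(`${docs.length} docs in ${Date.now() - t1} ms`)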

The test code:

const mongoose = require('mongoose');
// Note: the User / UserWithIndex models and init() come from the blog post's
// setup code, which is omitted here.

(async () => {
  try {
    await mongoose.connect('mongodb://localhost:27017/perftest', {
      useNewUrlParser: true,
      useCreateIndex: true
    })

    await init()

    // const query = { age: { $gt: 22 } }
    const query = { favoriteFruit: 'potato' }

    console.time('default_query')
    await User.find(query)
    console.timeEnd('default_query')

    console.time('query_with_index')
    await UserWithIndex.find(query)
    console.timeEnd('query_with_index')

    console.time('query_with_select')
    await User.find(query)
      .select({ name: 1, _id: 1, age: 1, email: 1 })
    console.timeEnd('query_with_select')

    console.time('query_with_select_index')
    await UserWithIndex.find(query)
      .select({ name: 1, _id: 1, age: 1, email: 1 })
    console.timeEnd('query_with_select_index')

    console.time('lean_query')
    await User.find(query).lean()
    console.timeEnd('lean_query')

    console.time('lean_with_index')
    await UserWithIndex.find(query).lean()
    console.timeEnd('lean_with_index')

    console.time('lean_with_select')
    await User.find(query)
      .select({ name: 1, _id: 1, age: 1, email: 1 })
      .lean()
    console.timeEnd('lean_with_select')

    console.time('lean_select_index')
    await UserWithIndex.find(query)
      .select({ name: 1, _id: 1, age: 1, email: 1 })
      .lean()
    console.timeEnd('lean_select_index')
    process.exit(0)
  } catch (err) {
    console.error(err)
  }
})()


Get this bounty!!!

#StackBounty: #mongodb #mongodb-query How can I reduce time spent acquiring the mongodb schema lock?

Bounty: 50

We have a MongoDB cluster in production: a 3-node replica set consisting of a master and two slaves. One of the slaves has another service co-located with it that queries the slave extensively. While addressing some slowness in that co-located service, I'm seeing a lot of surprisingly slow queries. This one took 3.3 seconds:

  find: "myColl",
  filter: { myField: "myValue" },
  projection: { name: 1 },
  $db: "myDb",
  $clusterTime: { clusterTime: Timestamp(1568198047, 3), signature: { hash: BinData(0, 0000000000000000000000000000000000000000), keyId: 0 } },
  lsid: { id: UUID("2ed823aa-e6af-4898-a4c1-c039d28a32ab") },
  $readPreference: { mode: "secondary" } }
  planSummary: IXSCAN { myField: 1 } keysExamined:0 docsExamined:0 cursorExhausted:1 numYields:0 nreturned:0 reslen:232
  locks:{ Global: { acquireCount: { r: 1 } },
          Database: { acquireCount: { r: 1 } },
          Collection: { acquireCount: { r: 1 } } }
  storage:{ data: { bytesRead: 355, timeReadingMicros: 4 }, timeWaitingMicros: { schemaLock: 3284692 }

The line that stands out to me here is the last one, indicating that the query spent nearly all of its time (about 3.28 of the 3.3 seconds) waiting to acquire something called a schema lock.

I checked this particular database and collection and it turns out the collection had 50 items at query time. Furthermore, there’s also an index on myField.

After searching a little, I get the impression that the schema lock has to do with the schema validation offered by MongoDB. This is something we use, and we have a handful of such (very simple) rules, but none of them are for myColl, which this particular query targets. These validation rules have been in place for years, and no new rules were being applied when this query ran.

Is the schema lock tied to changes to the collection schemas at all? Why is a read query waiting to acquire a schema lock? What can I do to eliminate this long wait?
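
For digging further, here is a small hedged mongo-shell sketch (database and collection names taken from the log above) that lists in-flight operations on the collection and what they are waiting on, which can help show whether other operations stall at the same moments:

// run on the slow secondary while the co-located service is querying
db.currentOp({ secs_running: { $gte: 1 }, ns: "myDb.myColl" }).inprog.forEach(function (op) {
    printjson({
        opid: op.opid,
        secs: op.secs_running,
        waitingForLock: op.waitingForLock,
        locks: op.locks
    });
});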


Get this bounty!!!

#StackBounty: #python #mongodb #aggregate #pymongo #updates MongoDB collection update with $set and aggregate

Bounty: 50

I need to convert the timestamp_ms field from string to double, storing the result in a new FixedDate field. Then I want to turn that timestamp into an ISO date in a NewDate field.

I used this code:

collection.update({
    "FixedDate": {"$exists": False}
},[{
    "$set":
        {"FixedDate":
            {"$convert":
                {"input": "$timestamp_ms",
                "to": "double"
                }
            }
        }
    },
    {"$set":
        {"NewDate":
                {"$toDate": "$FixedDate"
                }
            }
    }
], multi=True)

It gives this error message:

TypeError: document must be an instance of dict, bson.son.SON, or other type that inherits from collections.Mapping

My data looks like this:

{'_id': ObjectId('5afea920d326051990a7f337'), 'created_at': 'Fri May 18 10:21:07 +0000 2018', 'timestamp_ms': '1526638867739'}
{'_id': ObjectId('5afea920d326051990a7f339'), 'created_at': 'Fri May 18 10:21:08 +0000 2018', 'timestamp_ms': '1526638868310'}
{'_id': ObjectId('5afea972d326051c5c05bc11'), 'created_at': 'Fri May 18 10:22:30 +0000 2018', 'timestamp_ms': '1526638950799'}
{'_id': ObjectId('5afea974d326051c5c05bc16'), 'created_at': 'Fri May 18 10:22:32 +0000 2018', 'timestamp_ms': '1526638952160'}
{'_id': ObjectId('5afea974d326051c5c05bc17'), 'created_at': 'Fri May 18 10:22:32 +0000 2018', 'timestamp_ms': '1526638952841'}
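
A likely explanation, hedged: pipeline-style updates require MongoDB 4.2+ and, in PyMongo, update_many() (or update_one()) rather than the deprecated update(), which only accepts a plain update document and therefore raises the TypeError above when handed a list. In mongo-shell syntax, the intended operation looks like this:

// same filter and pipeline as the Python attempt above
db.collection.updateMany(
    { FixedDate: { $exists: false } },
    [
        { $set: { FixedDate: { $convert: { input: "$timestamp_ms", to: "double" } } } },
        { $set: { NewDate: { $toDate: "$FixedDate" } } }
    ]
)

From Python, the equivalent call is collection.update_many(filter, pipeline) with the same two arguments; multi=True is implied by update_many and should be dropped.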


Get this bounty!!!

#StackBounty: #linux #ssh #proxy #mongodb #ssh-tunnel RoboMongo Proxy server with two ssh keys | RoboMongo Proxyjump

Bounty: 50

We have two servers, one with internet access (srv1) and one without (srv2). MongoDB is running on the server that has no internet access (srv2). The only way to access srv2 is to ssh to srv1 and then ssh to srv2 with a key (srv2-key); logging into srv1 also needs a key (srv1-key).

For normal ssh or scp to both servers we use an ssh config file like the one below, with the help of ProxyJump:

Host srv1
     Hostname [srv1-ip]
     User ec2-user
     IdentityFile /location/of/srv1/key.pem

Host srv2
     Hostname [srv2-ip]
     User ec2-user
     ProxyJump srv1                   
     IdentityFile /location/of/srv2/key.pem

The problem: is it possible to use Robomongo to connect to the MongoDB instance running on the second server (srv2)?
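
One hedged approach, using standard OpenSSH port forwarding rather than anything Robomongo-specific: since the ProxyJump config above already chains through srv1, a local port forward to srv2's mongod lets Robomongo connect to localhost with no SSH settings of its own (the local port 27018 is an arbitrary choice):

# forward local port 27018 to mongod on srv2, hopping via srv1 automatically
ssh -N -L 27018:localhost:27017 srv2
# then point Robomongo at localhost:27018 (no SSH tab needed)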


Get this bounty!!!