What's the downside of using a global mongo (mgo) database in Go?

Are there any downsides if we use a global variable to handle database operations instead of passing it as an argument to functions and methods or storing it as a field in structs?

What are these downsides (if any)?

Let's say we create a package inside the project called database, define a variable var DB *mgo.Database inside that package, and then assign our mongo database to it in the project's main function:

func main() {
    session, err := mgo.Dial("localhost")
    if err != nil {
        fmt.Println(err)
        return
    }
    defer session.Close()
    database.DB = session.DB("mydatabase")
    // project code
}

After that, we use database.DB to interact with our database.

Note that there will be lots of goroutines using database.DB (if it makes any difference)

The question is not opinion-based; please take the time to read and understand it.

1 answer

  • answered 2019-07-10 23:34 Markus W Mahlberg

    Using a globally initialized database handle that is handed down to the places where it is needed is a well-established and reasonable approach.

    However, just using database.DB in a manner like

    _ = database.DB.C(foo).Find(q).One(&bar)
    

    all over the place holds a significant disadvantage: you only use one connection of the underlying connection pool, practically ensuring that all requests are processed sequentially.

    So what you rather want to do is something like this:

    s := database.DB.Session.Copy()
    defer s.Close()
    _ = database.DB.C(foo).With(s).Find(q).One(&bar)
    

    for "parallel" requests (there are some caveats about parallel requests, which I leave out for the sake of brevity).