You will have no problem using Postgres for your JSON data. It takes some time to get used to the JSON and array functions in Postgres, but they are great and will cover most of your needs. Postgres' JSON support is mature and awesome.
Think it over before storing data in JSON fields, though. It is the easy way, but it can come back and bite you if your data is actually relational. The good old normal forms still apply today.
I have used MongoDB in several projects, and I doubt I will ever use it again. I think most real-world use cases are actually relational. Postgres has struck (in my opinion) the perfect balance between relational and document storage with its great JSON support.
If you just want to keep a "blob" of JSON data, you can use the json data type. But if you know you will need to manipulate objects or pull values from them, you should use jsonb. jsonb stores a parsed binary representation, so operations on it and indexing (e.g. with GIN indexes) are more efficient.
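To make that concrete, here is a minimal sketch (table and column names are made up for illustration) of a jsonb column with a GIN index and a query that pulls a value out of the document:

```sql
-- Hypothetical table: use jsonb when you need to query inside the document.
CREATE TABLE events (
    id      serial PRIMARY KEY,
    payload jsonb NOT NULL
);

-- A GIN index speeds up containment (@>) and key-existence (?) queries.
CREATE INDEX events_payload_idx ON events USING gin (payload);

-- ->> extracts a field as text; @> tests JSON containment.
SELECT payload->>'user_id' AS user_id
FROM events
WHERE payload @> '{"type": "login"}';
```

Note that the same query against a plain json column would work but could not use the GIN index, since json is stored as unparsed text.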
Using the json data type or a JSON array as an input parameter to Postgres functions is fantastic, imo. I often get JSON from my API server that needs to be split into smaller pieces when the data should be stored relationally, and this is easy to do in Postgres. I also sometimes go the other way: a Postgres function merges relational data and hands ready-made JSON objects to REST for output. That way I can hide the fact that I actually store the data in a normalized database.
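Both directions can be sketched like this (the table, function, and field names are my own examples, not from the original answer):

```sql
-- Hypothetical normalized table behind the API.
CREATE TABLE orders (
    id       int PRIMARY KEY,
    customer text,
    amount   numeric
);

-- Split a JSON array from the API into relational rows.
CREATE FUNCTION import_orders(data jsonb) RETURNS void AS $$
    INSERT INTO orders (id, customer, amount)
    SELECT (elem->>'id')::int,
           elem->>'customer',
           (elem->>'amount')::numeric
    FROM jsonb_array_elements(data) AS elem;
$$ LANGUAGE sql;

-- The reverse: serve the normalized rows back as one JSON document.
SELECT jsonb_agg(
         jsonb_build_object('id', id, 'customer', customer, 'amount', amount)
       ) AS orders_json
FROM orders;
```

`jsonb_array_elements` unnests the incoming array into one row per element, and `jsonb_agg` / `jsonb_build_object` assemble the outgoing document, so the caller never sees the relational schema.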