Our current database is almost 200 MB, but once the application launches we expect it to grow considerably, perhaps to 20-30 GB of data.
We plan to use "dacpac" (generated by the database project - SSDT) ββfor deployment on a production server. A database will be created with multiple tables and a variety of source data in lookup tables.
However, the problem is in future deployments, when we will use "dacpac" (generated by the database project - SSDT) ββto update the database on the production server.
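For context, this is roughly how I expect the deployment step to be invoked. It is only a sketch under my current assumptions: SqlPackage.exe is available on the build/deploy machine, and the server, database, and file names below are placeholders, not our real values.

```python
# Sketch of the planned deployment step: publish a dacpac with SqlPackage.exe.
# Server, database, and file paths are placeholders.
import subprocess


def publish_dacpac(dacpac_path: str, server: str, database: str) -> None:
    """Apply the schema in the dacpac to the target database."""
    cmd = [
        "SqlPackage.exe",
        "/Action:Publish",
        f"/SourceFile:{dacpac_path}",
        f"/TargetServerName:{server}",
        f"/TargetDatabaseName:{database}",
        # Fail the deployment rather than applying changes that could drop data.
        "/p:BlockOnPossibleDataLoss=True",
    ]
    subprocess.run(cmd, check=True)


if __name__ == "__main__":
    publish_dacpac("MyDatabase.dacpac", "prod-sql-server", "MyDatabase")
```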
Since I have no experience deploying with dacpacs, could anyone advise on the following:
- Does deployment time depend on the data volume, or only on the schema changes? For example, if the target database is 20-30 GB, roughly how long could an update take?
- How does the dacpac apply schema changes to the existing database?
- Is it possible to roll back the update if something goes wrong? (See the sketch after this list for the approach I currently have in mind.)
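Regarding the rollback question: my current understanding (please correct me if this is wrong) is that SqlPackage can generate the upgrade script instead of applying it directly, so we could review it and take a backup before running anything against production. A minimal sketch of that idea, again with placeholder names:

```python
# Sketch: generate the deployment script instead of executing it, so the
# changes can be reviewed and a backup taken before touching production.
# Paths and names are placeholders.
import subprocess


def script_dacpac_deployment(dacpac_path: str, server: str,
                             database: str, out_sql: str) -> None:
    """Write the T-SQL that SqlPackage would run, without executing it."""
    subprocess.run(
        [
            "SqlPackage.exe",
            "/Action:Script",
            f"/SourceFile:{dacpac_path}",
            f"/TargetServerName:{server}",
            f"/TargetDatabaseName:{database}",
            f"/OutputPath:{out_sql}",
        ],
        check=True,
    )


if __name__ == "__main__":
    script_dacpac_deployment("MyDatabase.dacpac", "prod-sql-server",
                             "MyDatabase", "upgrade.sql")
```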
And finally, would it be better to write SQL scripts manually to update the database instead?