When we kicked off our latest project (a large system), we started out with ProGet as our package server and TeamCity for builds (Visual Studio Team Services is our source control). One solution in the system contains about 60 libraries of our own code that implement everything from Redis access to wrappers around external APIs and common models. Each of these libraries is a NuGet package. Early in the project it was very easy to make a change to a core library, check it in, let TeamCity build it and push it to ProGet, run a quick update in the consumers, and you were off and running.
Before long, though, this became unmanageable, so the team decided that during development the NuGet packages in the shared-library solution would not reference each other as packages but as direct project references instead. That certainly sped up development, but it had an unpleasant side effect on the consuming applications. Because the shared libraries are now direct references, the 7 main parts of our microservice infrastructure (a Web API, several MVC apps, and some worker roles) end up with multiple copies of the same core libraries that all the other libs depend on whenever they update any of our internal packages.
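For illustration only, here is roughly what the two reference styles look like in a classic .csproj; the paths, target framework folder, and project GUID are placeholders, not our actual setup:

```xml
<!-- Lib1.csproj: original style, CoreLib consumed as a NuGet package restored from ProGet -->
<ItemGroup>
  <Reference Include="CoreLib">
    <HintPath>..\packages\CoreLib.1.0.17\lib\net45\CoreLib.dll</HintPath>
  </Reference>
</ItemGroup>

<!-- Lib1.csproj: the faster style we switched to, CoreLib consumed as a direct project reference -->
<ItemGroup>
  <ProjectReference Include="..\CoreLib\CoreLib.csproj">
    <Project>{00000000-0000-0000-0000-000000000000}</Project>
    <Name>CoreLib</Name>
  </ProjectReference>
</ItemGroup>
```

The consequence of the second style is that CoreLib.dll now gets copied into each library's build output and packed into its package, instead of being declared as a package dependency.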
For example, there is a single lib named "core" that provides the building blocks on which almost everything in the shared libraries is built. It holds many of the interfaces and so on. Since all the other libraries reference it directly, they each ship their own copy of core in their release, and to make matters worse our TeamCity server handles versioning for us, so not only does each package carry a copy of core, but that copy's version number is the same as that of the NuGet package containing it.
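Roughly what Lib1's package definition ends up looking like under this scheme (a trimmed, illustrative .nuspec; the file paths and framework folder are assumptions):

```xml
<!-- Lib1.nuspec (trimmed): CoreLib.dll is packed into the package itself,
     stamped with the same TeamCity-generated version as Lib1 -->
<package>
  <metadata>
    <id>Lib1</id>
    <version>1.0.18</version>
    <!-- note: no <dependencies> entry for CoreLib; other required metadata omitted -->
  </metadata>
  <files>
    <file src="bin\Release\Lib1.dll" target="lib\net45" />
    <file src="bin\Release\CoreLib.dll" target="lib\net45" />
  </files>
</package>
```

If CoreLib were still consumed as a package, the nuspec would instead declare `<dependency id="CoreLib" version="1.0.18" />` and not pack the DLL at all.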
While not ideal, that alone is still not a deal breaker. The problem is that during NuGet updates in the consuming applications, each library in the application may end up referencing a different version of core, depending on the order in which the packages were updated, which sometimes leads to build errors and a hunt for rogue references.
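In practice this kind of version skew tends to surface as "found conflicts between different versions of the same dependent assembly" build warnings, and the usual workaround is a binding redirect in the consuming app's config; a minimal sketch, assuming CoreLib is strong-named (the public key token and version numbers below are placeholders):

```xml
<!-- app.config / web.config of a consuming service -->
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="CoreLib" publicKeyToken="0123456789abcdef" culture="neutral" />
        <!-- force every library's copy of CoreLib to resolve to the single version actually deployed -->
        <bindingRedirect oldVersion="0.0.0.0-1.0.99.0" newVersion="1.0.99.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```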
Now that the project is entering its final phase, I want to solve this once and for all, but I'm not sure how.
If the NuGet packages go back to consuming each other as NuGet packages, a single update can take several hours: when one dependency package is updated you rebuild it, which produces a new NuGet package, which in turn requires the next package up the chain to be updated and rebuilt, and so on.
However, correct versioning is critical, because when we make breaking changes we want to rely on the NuGet dependency versions to block incompatible updates where necessary.
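For example, this is the kind of dependency constraint we want to be able to lean on, using NuGet's version-range notation (the exact range here is illustrative):

```xml
<!-- Lib1.nuspec: pin CoreLib to the 1.x line so a breaking 2.x release
     cannot be pulled in by an update -->
<dependencies>
  <dependency id="CoreLib" version="[1.0.18,2.0.0)" />
</dependencies>
```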
Has anyone else run into this and solved it? It seems like it would be fairly common for any development team producing a sizeable project that has fully embraced NuGet.
Update:
An example of what happens under the hood:

- CoreLib (interfaces, etc.)
- Lib1 (references CoreLib directly; current version v1.0.17)
- Lib2 (references CoreLib directly; current version v1.0.99)
Both Lib1 and Lib2 are NuGet packages. An update is made to Lib1 that includes a breaking change in CoreLib. When Lib1 is checked in, TeamCity kicks off a build and a new NuGet package (v1.0.18) is created.
When Lib1 is updated in the consuming projects, Lib1's embedded copy of CoreLib (also v1.0.18, because the AssemblyVersion is managed by TeamCity) has a lower version number than the copy carried by Lib2 (v1.0.99), even though it is actually the newer code.
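So a consuming service's packages.config ends up looking something like this sketch (target framework is a placeholder), with each package carrying its own differently-versioned copy of CoreLib.dll; only one of those copies can actually win in the build output, and which one it is becomes exactly the kind of thing we end up chasing:

```xml
<!-- packages.config of a consuming service after the Lib1 update (sketch) -->
<packages>
  <!-- ships its own CoreLib.dll, assembly version 1.0.18 (the newer code, lower number) -->
  <package id="Lib1" version="1.0.18" targetFramework="net45" />
  <!-- ships its own CoreLib.dll, assembly version 1.0.99 (the older code, higher number) -->
  <package id="Lib2" version="1.0.99" targetFramework="net45" />
</packages>
```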
The ultimate goal is to have all of these interdependent packages rebuilt, updated, and repackaged, but how to do that automatically is what really eludes me.