
This question has been asked a few times before, but the answers seem a bit outdated (~5 years old), so I’m asking again specifically for .NET 6 and up.

Paket is an option but I’d like to hear about others (Paket has its own issues we’d like to steer away from).

I’m using VS Code and the dotnet CLI. In the official tutorials they seem to reference projects directly from the .csproj (and, I think, even projects from other solutions).

Reading around, the idiomatic way nowadays seems to be NuGet, maybe even setting up a local feed (a simple file/folder hierarchy).
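For reference, this is roughly what I mean by a local feed: just a nuget.config pointing at a shared folder (the source name and path below are made up):

    <!-- nuget.config at the repository root; "LocalPackages" and its path are placeholders -->
    <configuration>
      <packageSources>
        <!-- keep nuget.org for third-party packages -->
        <add key="nuget.org" value="https://api.nuget.org/v3/index.json" />
        <!-- a plain folder (or network share) acting as a package feed -->
        <add key="LocalPackages" value="C:\NuGet\LocalPackages" />
      </packageSources>
    </configuration>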

Are there even better ways?

The problem I encountered with referencing projects directly is that once the code base gets big and complicated enough, compilation sometimes fails because of dependency hell (projects trying to reference and compile the same sources at the same time, or projects not compiling in order and referring to DLLs built from outdated code), and I couldn’t find a way to set up a compilation order (if there is a way, please let me know).
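For context, this is the kind of direct reference I mean; as far as I understand, the build order is supposed to be derived from these ProjectReference entries (project names are made up):

    <!-- App/App.csproj (hypothetical): referencing a library project directly -->
    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <TargetFramework>net6.0</TargetFramework>
      </PropertyGroup>
      <ItemGroup>
        <!-- MSBuild should build Core before App because of this reference -->
        <ProjectReference Include="..\Core\Core.csproj" />
      </ItemGroup>
    </Project>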

That said, it was nice being able to work and debug across different projects from a single project, and to have the build detect code changes and rebuild only the dependencies that actually changed. A significant time saver (when it works).

Now, NuGet seems to mitigate the previous issue, but the problem I see with it is that I’d have to build the packages individually (before building the project I’m currently working on) every time I change code, and manually update the dependency versions in every dependent project each time I change something, which would greatly slow down my workflow (unless, again, it can detect changes in dependencies, rebuild them, and update the dependent projects in external solutions automatically, but I haven’t figured out how; please let me know if that’s the case).
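To illustrate the workflow I’m describing (project names, paths and versions are made up): the library packs itself into the local feed on every build, and the consuming project has to be edited by hand whenever the version changes:

    <!-- Core/Core.csproj (hypothetical): produce a .nupkg on every build -->
    <PropertyGroup>
      <Version>1.2.3</Version>
      <GeneratePackageOnBuild>true</GeneratePackageOnBuild>
      <PackageOutputPath>C:\NuGet\LocalPackages</PackageOutputPath>
    </PropertyGroup>

    <!-- App/App.csproj (hypothetical): must be updated whenever Core's version changes -->
    <ItemGroup>
      <PackageReference Include="Core" Version="1.2.3" />
    </ItemGroup>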

I also haven’t figured out a clean and speedy way to update all NuGet package versions across the whole code base (multiple solutions and projects).

I’ve also read about some folks referencing the DLLs directly, but that apparently is an even worse option than the two above: it won’t rebuild the dependencies when their code changes, and it doesn’t play nicely with the debugger (symbols/PDBs and stepping into source).

2 Answers


  1. The way I’m used to doing it is:

    1. One solution per ‘product’, i.e. something that is released as a unit. Usually one solution per repository.
    2. Projects within the same repository use regular project references.
    3. First-party libraries that are shared between repositories are placed in one or more separate repositories and deployed from the continuous integration server as NuGet packages (see the sketch after this list).
    4. NuGet libraries are updated on a per-solution basis; it is expected that different solutions use different versions of the NuGet libraries.
    5. Third-party libraries are referenced through NuGet whenever possible. Referring directly to .dll files is done only when there is no other option.
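    As a rough sketch of step 3 (the package id, version, and feed are placeholders, not from the answer), the shared library carries its NuGet metadata in its project file, and the CI server packs and pushes the result to the internal feed:

        <!-- SharedLib/SharedLib.csproj (hypothetical) -->
        <PropertyGroup>
          <TargetFramework>net6.0</TargetFramework>
          <PackageId>MyCompany.SharedLib</PackageId>
          <Version>2.0.1</Version>
        </PropertyGroup>
        <!-- CI would then run dotnet pack and dotnet nuget push against the internal feed -->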

    A downside of this structure is that it is difficult to test your changes to a NuGet-deployed library without actually deploying the library. I have not found any great solution to this, but unit testing might help.

    I’m not aware of the problems you describe when just using project references. The project dependency graph should be a directed acyclic graph, and Visual Studio should know which projects contain changes and rebuild any project that needs rebuilding, in the correct order. Granted, I have mostly used classic Visual Studio, but any competent build tool should be able to handle the dependency graph correctly.

  2. I also haven’t figured out a clean and speedy way to update all NuGet package versions across the whole code base, multiple solutions and projects.

    Have you tried using MSBuild with one or more "Directory.Build.props" files and PackageReferences? By default, if you include one, every project in your solution will have the package added with the specified (minimum) version. You can even use multiple files and merge them if necessary, include/exclude projects based on conditions, and much more to organize solutions and projects at a large scale.
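    A minimal sketch of what that looks like (the package and version are just examples):

        <!-- Directory.Build.props placed next to (or above) the .csproj files -->
        <Project>
          <ItemGroup>
            <!-- every project underneath this folder gets the package reference automatically -->
            <PackageReference Include="Newtonsoft.Json" Version="13.0.3" />
          </ItemGroup>
        </Project>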

    "Directory.Build.props" is part of MSBuild, which is shipped with Visual Studio (I think 2019+), but there are also MSBuild Tools extensions for Visual Studio Code.

    How it works is that MSBuild looks in the folder where your .csproj file(s) are stored and, if it finds a file with that name, imports it. Otherwise it moves up to the parent folder, and so on until it hits the root. So depending on your directory/file structure (i.e. where you place the Directory.Build.props file) it’s possible to reuse a single file for different solutions or to use one per solution. (It’s also possible to create a single file and add it as a link to the different solutions for MSBuild to pick up.)
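    For example (a sketch based on the MSBuild documentation), a Directory.Build.props deeper in the tree can explicitly merge in the one from a parent folder, since MSBuild stops at the first file it finds:

        <!-- src/SomeSolution/Directory.Build.props (hypothetical location) -->
        <Project>
          <!-- pull in the Directory.Build.props found further up the tree, then add to it -->
          <Import Project="$([MSBuild]::GetPathOfFileAbove('Directory.Build.props', '$(MSBuildThisFileDirectory)../'))" />
          <PropertyGroup>
            <LangVersion>latest</LangVersion>
          </PropertyGroup>
        </Project>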

    However, what I’m describing only scratches the surface of what you can do with MSBuild, Directory.Build.props and Directory.Build.targets.

    Here is a small example from the documentation: MSBuild – customize your build

    EDIT: I personally like to use it to make sure all projects in all my solutions have a reference to the StyleCop.Analyzers package and a link to the StyleCop.json and StyleCop.ruleset files without having to manually add them every time a new project is created.
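    Roughly like this (the file locations relative to the props file are placeholders):

        <!-- Directory.Build.props: shared analyzer setup for every project beneath it -->
        <Project>
          <PropertyGroup>
            <CodeAnalysisRuleSet>$(MSBuildThisFileDirectory)StyleCop.ruleset</CodeAnalysisRuleSet>
          </PropertyGroup>
          <ItemGroup>
            <PackageReference Include="StyleCop.Analyzers" Version="1.1.118" PrivateAssets="all" />
            <!-- linked rather than copied, so every project shares the same settings file -->
            <AdditionalFiles Include="$(MSBuildThisFileDirectory)StyleCop.json" Link="StyleCop.json" />
          </ItemGroup>
        </Project>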
