When we started work on NuKeeper, things were different: there were automated update tools for other ecosystems,
but not for .NET, and I was working with a lot of services, each with a lot of dependencies.
Most (but not all) of the solutions at my work were in .NET Full Framework, version 4.6.2 or thereabouts.
I tried to find the .NET support in "Greenkeeper" before realising what the docs didn't even feel the need to spell out: it was for Node.js
and NPM only. So there was a gap, and a need for .NET package automation.
Well, things changed. Most notably, Dependabot got .NET Core support
and was acquired by GitHub, which was in turn acquired by Microsoft.
Automated package update management is a win for developers, so this is good for people working in .NET.
NuKeeper grew and stalled. My priorities and job changed. It is clear that, with the resources Dependabot has - full-time developers and backing from Microsoft - NuKeeper won't have the same level of polish, integration and general support. I also now find the sprawling NuKeeper project hard to reason about.
And I think that suggests a path forward.
What if parts were extracted into separate libraries (NuGet packages, even) that could be tested and matured separately, and consumed by more than one command-line or other interface? They could also have different maintainers, making life easier for those people, and better-defined contracts between them.
For instance, a library for reading dependencies out of a solution would depend on little other than file IO and XML parsing. That can be made stable. Combine it with a library for querying NuGet package servers, and you can have lists of outdated packages, libyear metrics and other reports - none of which has to worry about how LibGit2Sharp or the GitHub API works.
If there are issues with transitive dependencies or server credentials, they can be addressed at that point.
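As a rough sketch of what that first library's surface might look like (the type and method names here are hypothetical, and it assumes SDK-style projects using PackageReference; packages.config would need its own reader):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

// Hypothetical sketch, not NuKeeper's actual code: pull package dependencies
// out of a .csproj file using nothing but file IO and XML parsing.
public class PackageDependency
{
    public PackageDependency(string id, string version)
    {
        Id = id;
        Version = version;
    }

    public string Id { get; }
    public string Version { get; }
}

public static class ProjectReader
{
    public static IReadOnlyList<PackageDependency> ReadPackages(string projectFilePath)
    {
        var doc = XDocument.Load(projectFilePath);

        // Match by local name so this works whether or not the project file
        // declares the old MSBuild XML namespace.
        return doc.Descendants()
            .Where(el => el.Name.LocalName == "PackageReference")
            .Select(el => new PackageDependency(
                el.Attribute("Include")?.Value ?? "",
                el.Attribute("Version")?.Value ?? ""))
            .Where(p => p.Id.Length > 0)
            .ToList();
    }
}
```

A second library could then ask the NuGet server APIs for the latest version and publication date of each of those packages, which is all you need for outdated-package lists and a libyear figure.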
GitHub integration can be separated from Azure DevOps integration. New command-line tools can be built.
What if the part that applies package updates by running command lines knew nothing about the other things, but was a separate tool that just accepted command-line arguments, Unix style: "update Foo.Lib to v3.4.5 in these projects"? That might make it easier to run in different scenarios.
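A minimal sketch of such a tool, assuming it simply wraps the real `dotnet add package` command - the argument format and names here are invented for illustration:

```csharp
using System;
using System.Diagnostics;

// Hypothetical "apply one update" tool. Invented usage:
//   updatepackage Foo.Lib 3.4.5 src/App/App.csproj src/Lib/Lib.csproj
public static class Program
{
    public static int Main(string[] args)
    {
        if (args.Length < 3)
        {
            Console.Error.WriteLine("Usage: updatepackage <packageId> <version> <project> [<project> ...]");
            return 1;
        }

        var packageId = args[0];
        var version = args[1];

        for (var i = 2; i < args.Length; i++)
        {
            // 'dotnet add <project> package <id> --version <version>' is the real CLI;
            // the wrapper around it is the sketch.
            var startInfo = new ProcessStartInfo(
                "dotnet", $"add \"{args[i]}\" package {packageId} --version {version}");

            using (var process = Process.Start(startInfo))
            {
                process.WaitForExit();
                if (process.ExitCode != 0)
                {
                    Console.Error.WriteLine($"Update of {packageId} failed in {args[i]}");
                    return process.ExitCode;
                }
            }
        }

        return 0;
    }
}
```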
Essentially I'm looking at how to decompose the problem and make the code useful to other people in novel contexts, as these seem to come up - I had never even thought of Docker containers or GitHub Actions, and now these are common.
The current big .NET tool would still be possible, but perhaps extracting parts would feel more relevant. More, smaller tools might benefit developers in ways that Dependabot does not, and might be easier to combine or plug into new scenarios.
This also allows some new ideas to be tried out. The ones on my mind are: reworking tests to sit far more at the boundary and rely less on mocks, allowing easier refactoring; and considering whether it's finally time to embrace netstandard2.1, C# 8 and non-nullable reference types.
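On that last point, a small illustration (not NuKeeper code) of what C# 8's nullable reference types buy you - references are non-nullable unless you say otherwise, and the compiler warns when a possible null slips through:

```csharp
#nullable enable

public class PackageLookup
{
    // Non-nullable by default: the compiler warns if this could be left null.
    public string PackageId { get; }

    // Explicitly nullable: callers are pushed to check it before use.
    public string? LatestVersion { get; }

    public PackageLookup(string packageId, string? latestVersion)
    {
        PackageId = packageId;
        LatestVersion = latestVersion;
    }

    public bool HasKnownLatestVersion => LatestVersion != null;
}
```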
There is also the question of which version of .NET NuKeeper needs to target, and it's not simple. Almost any version can run the code that reads XML files, queries the various APIs and launches command lines, so the NuKeeper code should be able to run on whichever version it wants - ideally the latest, .NET Core 3.1, now. But when NuKeeper is updating a different codebase, you might only have that codebase's target SDK version available.
Specifically, an issue that we ran into: we compiled NuKeeper for .NET Core 2.1, as that version was still the most widespread when 3.x was new, and we made a Docker image for NuKeeper that contains .NET Core 2.1. That image then doesn't work for .NET Core 3.x solutions, as it doesn't have the required version of the dotnet command line.
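One way to at least fail early (a sketch of a possible mitigation, not what NuKeeper does today) is to ask the real `dotnet --list-sdks` command which SDKs the machine or container actually has before attempting an update:

```csharp
using System;
using System.Diagnostics;

// Hypothetical sketch: probe the installed SDK versions so a tool can report
// "this image has no 3.x SDK" up front instead of failing part-way through.
public static class SdkProbe
{
    public static string[] ListInstalledSdkVersions()
    {
        var startInfo = new ProcessStartInfo("dotnet", "--list-sdks")
        {
            RedirectStandardOutput = true,
            UseShellExecute = false
        };

        using (var process = Process.Start(startInfo))
        {
            var output = process.StandardOutput.ReadToEnd();
            process.WaitForExit();

            // Each output line looks like "3.1.201 [/usr/share/dotnet/sdk]";
            // the version number is the first token on the line.
            var lines = output.Split(
                new[] { '\r', '\n' }, StringSplitOptions.RemoveEmptyEntries);

            var versions = new string[lines.Length];
            for (var i = 0; i < lines.Length; i++)
            {
                versions[i] = lines[i].Split(' ')[0];
            }

            return versions;
        }
    }
}
```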
I have free time now, and I want to keep practising my .NET skills, so I am keen to spend some time on this. Although, like many people, I find the future unclear, and my availability could change quickly.