@raphw
Last active December 8, 2024 12:29
Maven POM resolution (standalone)
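MavenPomResolver.java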
package codes.rafael.mavenpom;
import org.w3c.dom.Document;
import org.w3c.dom.Node;
import org.w3c.dom.NodeList;
import org.xml.sax.SAXException;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Path;
import java.util.*;
import java.util.function.Supplier;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.IntStream;
import java.util.stream.Stream;
public class MavenPomResolver {
private static final String NAMESPACE_4_0_0 = "http://maven.apache.org/POM/4.0.0";
private static final Set<String> IMPLICITS = Set.of("groupId", "artifactId", "version", "packaging");
private static final Pattern PROPERTY = Pattern.compile("(\\$\\{([\\w.]+)})");
private final MavenRepository repository;
private final DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
public MavenPomResolver(MavenRepository repository) {
this.repository = repository;
factory.setNamespaceAware(true);
}
public List<MavenDependency> dependencies(String groupId,
String artifactId,
String version,
MavenDependencyScope scope) throws IOException {
SequencedMap<DependencyKey, DependencyInclusion> dependencies = new LinkedHashMap<>();
Map<DependencyKey, MavenDependencyScope> overrides = new HashMap<>();
Map<DependencyCoordinates, UnresolvedPom> poms = new HashMap<>();
Queue<ContextualPom> queue = new ArrayDeque<>(Set.of(new ContextualPom(
resolve(assembleOrCached(groupId, artifactId, version, new HashSet<>(), poms), poms, null),
scope,
Set.of(),
Map.of())));
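// Breadth-first traversal over the POMs of all (transitive) dependencies: the first
// occurrence of a dependency key wins ("nearest wins"); later occurrences only
// contribute scope overrides, which are propagated to transitives below.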
do {
ContextualPom current = queue.remove();
Map<DependencyKey, DependencyValue> managedDependencies = new HashMap<>(current.pom().managedDependencies());
managedDependencies.putAll(current.managedDependencies());
for (Map.Entry<DependencyKey, DependencyValue> entry : current.pom().dependencies().entrySet()) {
if (!current.exclusions().contains(new DependencyExclusion(
entry.getKey().groupId(),
entry.getKey().artifactId()))) {
DependencyValue primary = current.managedDependencies().get(entry.getKey()), value = primary == null
? entry.getValue().with(current.pom().managedDependencies().get(entry.getKey()))
: primary.with(entry.getValue());
boolean optional = switch (value.optional()) {
case "true" -> true;
case "false" -> false;
case null -> false;
default -> throw new IllegalStateException("Unexpected value: " + value);
};
if (optional && current.pom().origin() != null) {
continue;
}
MavenDependencyScope actual = toScope(value.scope()), derived = switch (current.scope()) {
case null -> actual == MavenDependencyScope.IMPORT ? null : actual;
case COMPILE -> switch (actual) {
case COMPILE, RUNTIME -> actual;
default -> null;
};
case PROVIDED, RUNTIME, TEST -> switch (actual) {
case COMPILE, RUNTIME -> current.scope();
default -> null;
};
case SYSTEM, IMPORT -> null;
};
if (derived == null) {
continue;
}
DependencyInclusion previous = dependencies.get(entry.getKey());
if (previous != null) {
if (previous.scope().ordinal() > overrides.getOrDefault(entry.getKey(), derived).ordinal()) {
overrides.put(entry.getKey(), derived);
}
continue;
}
dependencies.put(entry.getKey(), new DependencyInclusion(value.version(),
optional,
derived,
value.systemPath() == null ? null : Path.of(value.systemPath()),
new HashSet<>()));
if (current.pom().origin() != null) {
dependencies.get(current.pom().origin()).transitives().add(entry.getKey());
}
Set<DependencyExclusion> exclusions;
if (value.exclusions() == null || value.exclusions().isEmpty()) {
exclusions = current.exclusions();
} else {
exclusions = new HashSet<>(current.exclusions());
exclusions.addAll(value.exclusions());
}
queue.add(new ContextualPom(resolve(assembleOrCached(entry.getKey().groupId(),
entry.getKey().artifactId(),
value.version(),
new HashSet<>(),
poms), poms, entry.getKey()), derived, exclusions, managedDependencies));
}
}
} while (!queue.isEmpty());
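// Propagate the recorded scope overrides through the transitive dependencies of each overridden key.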
Queue<DependencyKey> keys = new ArrayDeque<>(overrides.keySet());
while (!keys.isEmpty()) {
DependencyKey key = keys.remove();
Set<DependencyKey> transitives = dependencies.get(key).transitives();
transitives.forEach(transitive -> {
MavenDependencyScope current = overrides.get(transitive), candidate = overrides.get(key);
if (current == null || candidate.ordinal() < current.ordinal()) {
overrides.put(transitive, candidate);
}
});
keys.addAll(transitives);
}
return dependencies.entrySet().stream().map(entry -> new MavenDependency(entry.getKey().groupId(),
entry.getKey().artifactId(),
entry.getValue().version(),
entry.getKey().type(),
entry.getKey().classifier(),
overrides.getOrDefault(entry.getKey(), entry.getValue().scope()),
entry.getValue().path(),
entry.getValue().optional())).toList();
}
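// Parses a POM document and merges the parent POM chain (guarding against cycles) into
// a single view of properties, managed dependencies and dependencies, without yet
// resolving property placeholders.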
private UnresolvedPom assemble(InputStream inputStream,
Set<DependencyCoordinates> children,
Map<DependencyCoordinates, UnresolvedPom> poms) throws IOException,
SAXException,
ParserConfigurationException {
Document document;
try (inputStream) {
document = factory.newDocumentBuilder().parse(inputStream);
}
return switch (document.getDocumentElement().getNamespaceURI()) {
case NAMESPACE_4_0_0 -> {
DependencyCoordinates parent = toChildren400(document.getDocumentElement(), "parent")
.findFirst()
.map(node -> new DependencyCoordinates(
toTextChild400(node, "groupId").orElseThrow(missing("parent.groupId")),
toTextChild400(node, "artifactId").orElseThrow(missing("parent.artifactId")),
toTextChild400(node, "version").orElseThrow(missing("parent.version"))))
.orElse(null);
Map<String, String> properties = new HashMap<>();
Map<DependencyKey, DependencyValue> managedDependencies = new HashMap<>();
SequencedMap<DependencyKey, DependencyValue> dependencies = new LinkedHashMap<>();
if (parent != null) {
if (!children.add(parent)) {
throw new IllegalStateException("Circular dependency to "
+ parent.groupId() + ":" + parent.artifactId() + ":" + parent.version());
}
UnresolvedPom resolution = assembleOrCached(parent.groupId(),
parent.artifactId(),
parent.version(),
children,
poms);
properties.putAll(resolution.properties());
IMPLICITS.forEach(property -> {
String value = resolution.properties().get(property);
if (value != null) {
properties.put("parent." + property, value);
properties.put("project.parent." + property, value);
}
});
managedDependencies.putAll(resolution.managedDependencies());
dependencies.putAll(resolution.dependencies());
}
IMPLICITS.forEach(property -> toChildren400(document.getDocumentElement(), property)
.findFirst()
.ifPresent(node -> {
properties.put(property, node.getTextContent());
properties.put("project." + property, node.getTextContent());
}));
toChildren400(document.getDocumentElement(), "properties")
.limit(1)
.flatMap(MavenPomResolver::toChildren)
.filter(node -> node.getNodeType() == Node.ELEMENT_NODE)
.forEach(node -> properties.put(node.getLocalName(), node.getTextContent()));
toChildren400(document.getDocumentElement(), "dependencyManagement")
.limit(1)
.flatMap(node -> toChildren400(node, "dependencies"))
.limit(1)
.flatMap(node -> toChildren400(node, "dependency"))
.map(MavenPomResolver::toDependency400)
.forEach(entry -> managedDependencies.put(entry.getKey(), entry.getValue()));
toChildren400(document.getDocumentElement(), "dependencies")
.limit(1)
.flatMap(node -> toChildren400(node, "dependency"))
.map(MavenPomResolver::toDependency400)
.forEach(entry -> dependencies.putLast(entry.getKey(), entry.getValue()));
yield new UnresolvedPom(properties, managedDependencies, dependencies);
}
case null, default -> throw new IllegalArgumentException(
"Unknown namespace: " + document.getDocumentElement().getNamespaceURI());
};
}
private UnresolvedPom assembleOrCached(String groupId,
String artifactId,
String version,
Set<DependencyCoordinates> children,
Map<DependencyCoordinates, UnresolvedPom> poms) throws IOException {
if (version == null) {
throw new IllegalArgumentException("No version specified for " + groupId + ":" + artifactId);
}
DependencyCoordinates coordinates = new DependencyCoordinates(groupId, artifactId, version);
UnresolvedPom pom = poms.get(coordinates);
if (pom == null) {
try {
pom = assemble(repository.fetch(groupId,
artifactId,
version,
"pom",
null,
null).toInputStream(), children, poms);
} catch (RuntimeException | SAXException | ParserConfigurationException e) {
throw new IllegalStateException("Failed to resolve " + groupId + ":" + artifactId + ":" + version, e);
}
poms.put(coordinates, pom);
}
return pom;
}
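// Resolves property placeholders in all dependency keys and values, and inlines
// managed dependencies from BOMs that are referenced with scope "import" and type "pom".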
private ResolvedPom resolve(UnresolvedPom pom,
Map<DependencyCoordinates, UnresolvedPom> poms,
DependencyKey origin) throws IOException {
Map<DependencyKey, DependencyValue> managedDependencies = new HashMap<>();
for (Map.Entry<DependencyKey, DependencyValue> entry : pom.managedDependencies().entrySet()) {
DependencyKey key = entry.getKey().resolve(pom.properties());
DependencyValue value = entry.getValue().resolve(pom.properties());
if (Objects.equals(value.scope(), "import") && Objects.equals(key.type(), "pom")) {
UnresolvedPom imported = assembleOrCached(key.groupId(),
key.artifactId(),
value.version(),
new HashSet<>(),
poms);
imported.managedDependencies().forEach((importedKey, importedValue) -> managedDependencies.putIfAbsent(
importedKey.resolve(imported.properties()),
importedValue.resolve(imported.properties())));
} else {
managedDependencies.put(key, value);
}
}
SequencedMap<DependencyKey, DependencyValue> dependencies = new LinkedHashMap<>();
pom.dependencies().forEach((key, value) -> dependencies.putLast(
key.resolve(pom.properties()),
value.resolve(pom.properties())));
return new ResolvedPom(managedDependencies, dependencies, origin);
}
private static Stream<Node> toChildren(Node node) {
NodeList children = node.getChildNodes();
return IntStream.range(0, children.getLength()).mapToObj(children::item);
}
private static Stream<Node> toChildren400(Node node, String localName) {
return toChildren(node).filter(child -> Objects.equals(child.getLocalName(), localName)
&& Objects.equals(child.getNamespaceURI(), NAMESPACE_4_0_0));
}
private static Optional<String> toTextChild400(Node node, String localName) {
return toChildren400(node, localName).map(Node::getTextContent).findFirst();
}
private static Map.Entry<DependencyKey, DependencyValue> toDependency400(Node node) {
return Map.entry(
new DependencyKey(
toTextChild400(node, "groupId").orElseThrow(missing("groupId")),
toTextChild400(node, "artifactId").orElseThrow(missing("artifactId")),
toTextChild400(node, "type").orElse("jar"),
toTextChild400(node, "classifier").orElse(null)),
new DependencyValue(
toTextChild400(node, "version").orElse(null),
toTextChild400(node, "scope").orElse(null),
toTextChild400(node, "systemPath").orElse(null),
toChildren400(node, "exclusions")
.findFirst()
.map(exclusions -> toChildren400(exclusions, "exclusion")
.map(child -> new DependencyExclusion(
toTextChild400(child, "groupId").orElseThrow(missing("exclusion.groupId")),
toTextChild400(child, "artifactId").orElseThrow(missing("exclusion.artifactId"))))
.toList())
.orElse(null),
toTextChild400(node, "optional").orElse(null)));
}
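// Recursively interpolates ${...} placeholders from POM properties, falling back to
// system properties, and fails on undefined or circularly defined properties.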
private static String property(String text, Map<String, String> properties) {
return property(text, properties, Set.of());
}
private static String property(String text, Map<String, String> properties, Set<String> previous) {
if (text != null && text.contains("$")) {
Matcher matcher = PROPERTY.matcher(text);
StringBuilder sb = new StringBuilder();
while (matcher.find()) {
String property = matcher.group(2);
String replacement = properties.get(property);
if (replacement == null) {
replacement = System.getProperty(property);
}
if (replacement == null) {
throw new IllegalStateException("Property not defined: " + property);
} else {
HashSet<String> duplicates = new HashSet<>(previous);
if (!duplicates.add(property)) {
throw new IllegalStateException("Circular property definition of: " + property);
}
// Quote the replacement so '$' and '\' in resolved values are inserted literally.
matcher.appendReplacement(sb, Matcher.quoteReplacement(property(replacement, properties, duplicates)));
}
}
return matcher.appendTail(sb).toString();
} else {
return text;
}
}
private static MavenDependencyScope toScope(String scope) {
return switch (scope) {
case "compile" -> MavenDependencyScope.COMPILE;
case "provided" -> MavenDependencyScope.PROVIDED;
case "runtime" -> MavenDependencyScope.RUNTIME;
case "test" -> MavenDependencyScope.TEST;
case "system" -> MavenDependencyScope.SYSTEM;
case "import" -> MavenDependencyScope.IMPORT;
case null -> MavenDependencyScope.COMPILE;
default -> throw new IllegalArgumentException("");
};
}
private static Supplier<IllegalStateException> missing(String name) {
return () -> new IllegalStateException("Missing POM element: " + name);
}
private record DependencyKey(String groupId,
String artifactId,
String type,
String classifier) {
private DependencyKey resolve(Map<String, String> properties) {
return new DependencyKey(property(groupId, properties),
property(artifactId, properties),
property(type, properties),
property(classifier, properties));
}
}
private record DependencyValue(String version,
String scope,
String systemPath,
List<DependencyExclusion> exclusions,
String optional) {
private DependencyValue resolve(Map<String, String> properties) {
return new DependencyValue(property(version, properties),
property(scope, properties),
property(systemPath, properties),
exclusions == null ? null : exclusions.stream().map(exclusion -> new DependencyExclusion(
property(exclusion.groupId(), properties),
property(exclusion.artifactId(), properties))).toList(),
property(optional, properties)
);
}
private DependencyValue with(DependencyValue supplement) {
if (supplement == null) {
return this;
}
return new DependencyValue(version == null ? supplement.version() : version,
scope == null ? supplement.scope() : scope,
systemPath == null ? supplement.systemPath() : systemPath,
exclusions == null ? supplement.exclusions() : exclusions,
optional == null ? supplement.optional() : optional);
}
}
private record DependencyInclusion(String version,
boolean optional,
MavenDependencyScope scope,
Path path,
Set<DependencyKey> transitives) {
}
private record DependencyExclusion(String groupId, String artifactId) {
}
private record DependencyCoordinates(String groupId, String artifactId, String version) {
}
private record UnresolvedPom(Map<String, String> properties,
Map<DependencyKey, DependencyValue> managedDependencies,
SequencedMap<DependencyKey, DependencyValue> dependencies) {
}
private record ResolvedPom(Map<DependencyKey, DependencyValue> managedDependencies,
SequencedMap<DependencyKey, DependencyValue> dependencies,
DependencyKey origin) {
}
private record ContextualPom(ResolvedPom pom,
MavenDependencyScope scope,
Set<DependencyExclusion> exclusions,
Map<DependencyKey, DependencyValue> managedDependencies) {
}
}
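MavenRepository.java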
package codes.rafael.mavenpom;
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import java.nio.channels.FileChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.DigestInputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Arrays;
import java.util.HexFormat;
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
public class MavenRepository implements Repository {
private final URI repository;
private final Path local;
private final Map<String, URI> validations;
public MavenRepository() {
String environment = System.getenv("MAVEN_REPOSITORY_URI");
if (environment != null && !environment.endsWith("/")) {
environment += "/";
}
repository = URI.create(environment == null ? "https://repo1.maven.org/maven2/" : environment);
Path local = Path.of(System.getProperty("user.home"), ".m2", "repository");
this.local = Files.isDirectory(local) ? local : null;
validations = Map.of("SHA1", repository);
}
public MavenRepository(URI repository, Path local, Map<String, URI> validations) {
this.repository = repository;
this.local = local;
this.validations = validations;
}
@Override
public InputStreamSource fetch(String coordinate) throws IOException {
String[] elements = coordinate.split(":");
return switch (elements.length) {
case 4 -> fetch(elements[0], elements[1], elements[2], "jar", null, null);
case 5 -> fetch(elements[0], elements[1], elements[2], elements[3], null, null);
case 6 -> fetch(elements[0], elements[1], elements[2], elements[4], elements[3], null);
default -> throw new IllegalArgumentException("Insufficient Maven coordinate: " + coordinate);
};
}
public InputStreamSource fetch(String groupId,
String artifactId,
String version,
String type,
String classifier,
String checksum) throws IOException {
return fetch(repository, groupId, artifactId, version, type, classifier, checksum).materialize();
}
private LazyInputStreamSource fetch(URI repository,
String groupId,
String artifactId,
String version,
String type,
String classifier,
String checksum) throws IOException {
String path = groupId.replace('.', '/')
+ "/" + artifactId
+ "/" + version
+ "/" + artifactId + "-" + version + (classifier == null ? "" : "-" + classifier)
+ "." + type + (checksum == null ? "" : ("." + checksum));
Path cached = local == null ? null : local.resolve(path);
if (cached != null) {
if (Files.exists(cached)) {
boolean valid = true;
if (checksum == null) {
Map<LazyInputStreamSource, byte[]> results = new HashMap<>();
for (Map.Entry<String, URI> entry : validations.entrySet()) {
LazyInputStreamSource source = fetch(entry.getValue(),
groupId,
artifactId,
version,
type,
classifier,
entry.getKey().toLowerCase());
if (valid) {
MessageDigest digest;
try {
digest = MessageDigest.getInstance(entry.getKey());
} catch (NoSuchAlgorithmException e) {
throw new IllegalStateException(e);
}
try (FileChannel channel = FileChannel.open(cached)) {
digest.update(channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size()));
}
byte[] expected;
try (InputStream inputStream = source.toInputStream()) {
expected = inputStream.readAllBytes();
}
results.put(source, expected);
// Maven checksum files contain the hex-encoded digest, possibly with trailing whitespace.
valid = Arrays.equals(HexFormat.of().parseHex(new String(expected).trim()), digest.digest());
} else {
results.put(source, null);
}
}
if (valid) {
for (Map.Entry<LazyInputStreamSource, byte[]> entry : results.entrySet()) {
entry.getKey().storeIfNotPresent(entry.getValue());
}
} else {
Files.delete(cached);
for (LazyInputStreamSource source : results.keySet()) {
source.deleteIfPresent();
}
}
}
if (valid) {
return new StoredInputStreamSource(cached);
}
} else {
Files.createDirectories(cached.getParent());
}
}
Map<LazyInputStreamSource, MessageDigest> digests = new HashMap<>();
if (checksum == null) {
for (Map.Entry<String, URI> entry : validations.entrySet()) {
LazyInputStreamSource source = fetch(entry.getValue(),
groupId,
artifactId,
version,
type,
classifier,
entry.getKey().toLowerCase());
MessageDigest digest;
try {
digest = MessageDigest.getInstance(entry.getKey());
} catch (NoSuchAlgorithmException e) {
throw new IllegalStateException(e);
}
digests.put(source, digest);
}
}
URI uri = repository.resolve(path);
if (cached == null) {
return () -> ValidatingInputStream.of(uri.toURL().openStream(), digests);
} else {
return new LatentInputStreamSource(cached,
uri,
digests,
artifactId + "-" + version + (classifier == null ? "" : "-" + classifier),
type + (checksum == null ? "" : ("." + checksum)));
}
}
@FunctionalInterface
private interface LazyInputStreamSource extends InputStreamSource {
default void deleteIfPresent() throws IOException {
}
default void storeIfNotPresent(byte[] bytes) throws IOException {
}
default InputStreamSource materialize() throws IOException {
return this;
}
}
record StoredInputStreamSource(Path path) implements LazyInputStreamSource {
@Override
public void deleteIfPresent() throws IOException {
Files.delete(path);
}
@Override
public InputStream toInputStream() throws IOException {
return Files.newInputStream(path);
}
@Override
public Optional<Path> getPath() {
return Optional.of(path);
}
}
record LatentInputStreamSource(Path path,
URI uri,
Map<LazyInputStreamSource, MessageDigest> digests,
String prefix,
String suffix) implements LazyInputStreamSource {
@Override
public InputStream toInputStream() throws IOException {
return ValidatingInputStream.of(uri.toURL().openStream(), digests);
}
@Override
public void storeIfNotPresent(byte[] bytes) throws IOException {
Path temporary = Files.createTempFile(prefix, suffix);
try (OutputStream outputStream = Files.newOutputStream(temporary)) {
outputStream.write(bytes);
} catch (Throwable t) {
Files.delete(temporary);
throw t;
}
Files.move(temporary, path);
}
@Override
public InputStreamSource materialize() throws IOException {
Path temporary = Files.createTempFile(prefix, suffix);
try (InputStream inputStream = toInputStream();
OutputStream outputStream = Files.newOutputStream(temporary)) {
inputStream.transferTo(outputStream);
} catch (Throwable t) {
Files.delete(temporary);
throw t;
}
return new StoredInputStreamSource(Files.move(temporary, path));
}
}
private static class ValidatingInputStream extends FilterInputStream {
private final Map<LazyInputStreamSource, MessageDigest> digests;
private ValidatingInputStream(InputStream inputStream, Map<LazyInputStreamSource, MessageDigest> digests) {
super(inputStream);
this.digests = digests;
}
private static InputStream of(InputStream inputStream, Map<LazyInputStreamSource, MessageDigest> digests) {
if (digests.isEmpty()) {
return inputStream;
}
for (MessageDigest digest : digests.values()) {
inputStream = new DigestInputStream(inputStream, digest);
}
return new ValidatingInputStream(inputStream, digests);
}
@Override
public void close() throws IOException {
super.close();
boolean valid = true;
Map<LazyInputStreamSource, byte[]> results = new HashMap<>();
for (Map.Entry<LazyInputStreamSource, MessageDigest> entry : digests.entrySet()) {
byte[] expected;
try (InputStream inputStream = entry.getKey().toInputStream()) {
expected = inputStream.readAllBytes();
}
results.put(entry.getKey(), expected);
if (!(valid = Arrays.equals(HexFormat.of().parseHex(new String(expected).trim()), entry.getValue().digest()))) {
break;
}
}
if (valid) {
for (Map.Entry<LazyInputStreamSource, byte[]> entry : results.entrySet()) {
entry.getKey().storeIfNotPresent(entry.getValue());
}
} else {
for (LazyInputStreamSource source : digests.keySet()) {
source.deleteIfPresent();
}
throw new IOException("Failed checksum validation");
}
}
}
}
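For reference, a minimal usage sketch of the two classes above (Example is a hypothetical driver; MavenDependency, MavenDependencyScope, Repository and InputStreamSource are referenced by the code but not shown in this excerpt):

package codes.rafael.mavenpom;

import java.io.IOException;
import java.util.List;

public class Example {
    public static void main(String[] args) throws IOException {
        // Resolves against Maven Central (or MAVEN_REPOSITORY_URI) and the local ~/.m2 cache, if present.
        MavenRepository repository = new MavenRepository();
        MavenPomResolver resolver = new MavenPomResolver(repository);
        // Compile-scope dependency tree of junit:junit:4.13.2, as discussed in the comments below.
        List<MavenDependency> dependencies = resolver.dependencies("junit", "junit", "4.13.2", MavenDependencyScope.COMPILE);
        dependencies.forEach(System.out::println);
    }
}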
@Geolykt commented Dec 5, 2024

For reference, https://github.com/stianloader/PicoResolve is my own implementation of a Maven artifact resolver.

Right now you are still missing quite a few essential features, such as async resource fetching (I don't see any code responsible for that, but it's the number one thing to have, so maybe I'm just blind), resource locking, version negotiation, and resolver metadata. So don't be surprised if you encounter even more pain implementing this: it took me over two years to get my own resolver to its current state, and I'm still not too happy with it (though that's mainly because I've been putting off a minor refactor for a while now).

@raphw commented Dec 5, 2024

I'll add async if needed; it would not be a big deal to add futures to the dependency loop. As for version negotiation, do you mean ranges? Does Maven not always pick the version with the "shortest" distance?

I'd like this to stay as small as possible for now, but thanks for the reference, I'll have a look.

@Geolykt commented Dec 5, 2024

Not entirely sure what you mean by "shortest", but it'll pick the newest version that is within the range, as far as I can tell.
I suppose the catch is that the version of an artifact is determined breadth-first, meaning that transitive artifacts have no impact on the selected version.

@raphw commented Dec 6, 2024

I never really used version ranges myself, but when trying them out, it seems that version ranges always supersede specific versions, even if the ranges are found further down the tree. Can that be true? Is there any documentation on version ranges that you are aware of? It seems to be left out of the official docs.

@Geolykt commented Dec 6, 2024

Representing transitive artifacts as a tree is the wrong way of approaching the issue - at least internally (in the API, you might want to pretend that it's a tree for the sake of the API consumer). See the "Dependency layers" section of the picoresolve readme for how it handles the dependency tree.

Version ranges are indeed sloppily documented, but https://maven.apache.org/enforcer/enforcer-rules/versionRanges.html might give you a good idea of the syntax (the article I linked actually describes a bit of a superset of the version ranges accepted by Maven, but that should be fine). As for the behaviour of version negotiation when multiple ranges and versions are defined for the same layer, I'm not exactly sure on the details - you'll have to do some empirical testing on that one. But I can guarantee you that child layers cannot impact the version used in their parent layers.

Oh, and don't forget RELEASE and LATEST. Same goes with snapshot versions - those are handled specially.
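To make the syntax concrete, here is a minimal sketch using the VersionRange model from org.apache.maven:maven-artifact (an assumed extra dependency, not used by the resolver above):

import org.apache.maven.artifact.versioning.DefaultArtifactVersion;
import org.apache.maven.artifact.versioning.InvalidVersionSpecificationException;
import org.apache.maven.artifact.versioning.VersionRange;

public class RangeSyntaxDemo {
    public static void main(String[] args) throws InvalidVersionSpecificationException {
        // [1.2.0,1.4) is a hard constraint: 1.2.0 inclusive up to, but excluding, 1.4.
        VersionRange range = VersionRange.createFromVersionSpec("[1.2.0,1.4)");
        System.out.println(range.containsVersion(new DefaultArtifactVersion("1.3.5"))); // true
        System.out.println(range.containsVersion(new DefaultArtifactVersion("1.4")));   // false
        // A bare version such as 1.2.3 parses as a soft "recommendation", not a constraint.
        VersionRange soft = VersionRange.createFromVersionSpec("1.2.3");
        System.out.println(soft.getRecommendedVersion()); // 1.2.3
    }
}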

@raphw commented Dec 7, 2024

I am looking into version negotiation now and was about to look at your library. However, I cannot find an example of which API to use. If I just wanted the dependencies of junit:junit:4.13.2, what method would I call?

@raphw commented Dec 7, 2024

As for version ranges, I found that Maven prefers any range over a specific version, which defies the intuition about Maven dependency resolution that I had held for years. If any of your dependencies (or their transitives) declares a range, it will beat anything you have defined. That even feels like a security issue, as any dependency can effectively dictate your version tree, even if you defined a specific version.

@Geolykt commented Dec 7, 2024

resolveChildLayer would be the simplest way of achieving that - though the API currently lacks helper methods (one of the minor refactors it is missing right now), so the setup for it is a bit troublesome.

It should be something around

GAV gav = new GAV("junit", "junit", MavenVersion.parse("4.13.2"));

VersionRange version = VersionRange.parse(gav.version().getOriginText());
DependencyEdge edge = new DependencyEdge(gav.group(), gav.artifact(), null, null, version, Scope.COMPILE, ExclusionContainer.empty());
GAV dummyGAV = new GAV("dummy", "dummy", MavenVersion.parse("1.0.0"));
List<DependencyEdge> edges = Collections.singletonList(edge);
DependencyLayerElement dummyElement = new DependencyLayerElement(dummyGAV, null, null, ExclusionContainer.empty(), edges);
DependencyLayer layer = new DependencyLayer(null, Collections.singletonList(dummyElement));
resolver.resolveAllChildren(layer, ForkJoinPool.commonPool()).join();

Although the above example would resolve all transitive dependencies - for only the direct ones, resolveChildLayer would be the more appropriate call.

@Geolykt commented Dec 7, 2024

As far as I understand, dependencies declared higher up take precedence over dependencies declared in deeper layers - thus it's incorrect that any dependency can dictate your version tree. As a rule of thumb, transitive dependencies have less say during resolution than direct dependencies, so I don't quite get the security-issue aspect. At the end of the day, any kind of version range is an issue of some kind regardless of how it is handled - be it ABI breakages or otherwise. But the same goes for all other kinds of version negotiation - and packing multiple versions of a dependency into the final executable is a bit of overkill in general.

@raphw commented Dec 7, 2024

That's what I thought, but in the absence of a specification, I started experimenting: transitive dependencies override their parents' version if the parent does not specify a range. In this sense, the breadth-first approach is broken, and you will have to do multiple iterations through the tree to negotiate.

@Geolykt commented Dec 7, 2024

You're absolutely right there, and I now admit that I'm wrong. However, the fact that mvn dependency:tree does not print the tree in the state I'd expect is peculiar - though I guess unpinned versions are, after all, more of a suggestion than a requirement, so that might be expected.

Ugh, that will require a revised API to fix within my project. Thankfully, the way the issue presents itself means that this behaviour should only occur when the currently selected version conflicts with the specified range - so that's our explanation for the underlying issue. Still a headache to grasp, though.

@raphw commented Dec 7, 2024

I found some Sonatype documentation which states that a version number like 1.2.3 is merely a suggestion, and that you need to write [1.2.3] if you want to force a version, or use dependency management. I find it rather frustrating that this is how dependencies are resolved on the JVM; it does not appear as if anybody thought this through.

@Geolykt commented Dec 7, 2024

Well yeah, but pinning versions has its own set of issues. Also, just to clarify - if the version suggestion is within the version range, then the suggestion takes precedence.

So the meaning of [1.2.0,1.4) should be interpreted more as "I am incompatible with versions below 1.2.0 and versions above or equal to 1.4" rather than "use a version between 1.2.0 and 1.4". If any other kind of behaviour were used, the resolution process would either be very quick to fail due to incompatible versions (when treating all versions as pinned by default), or it would take the version ranges as suggestions, too - completely ridding them of their purpose.

I think this choice was made with a fair amount of consideration, just with different weights than what you and I might prioritize at first.

@raphw commented Dec 8, 2024

If any of my transitive dependencies chooses the range [3,7] and I choose version 5, Maven will choose 7. That cannot be the intent.

Even worse, if my transitive dependency chose [3,7), I might still end up with 7-alpha, as a prerelease is ordered before 7. This does not appear to be thought through.
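Maven's own version comparator confirms that ordering; a minimal sketch, assuming org.apache.maven:maven-artifact is available (it is not used by the resolver above):

import org.apache.maven.artifact.versioning.ComparableVersion;

public class PrereleaseOrderingDemo {
    public static void main(String[] args) {
        // Maven orders prereleases before the release itself, so 7-alpha < 7,
        // which means 7-alpha still satisfies the exclusive upper bound of [3,7).
        System.out.println(new ComparableVersion("7-alpha").compareTo(new ComparableVersion("7")) < 0); // true
    }
}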

@raphw commented Dec 8, 2024

Playing with this, I encountered a number of other peculiarities. Say POM A declares a dependency on B with version 3 and a dependency on C, which has a dependency on B with range 1-2. Given that B declares a transitive dependency on D which was upgraded from 1-2 to 3, the tree placement for breadth-first resolution is not changed. That means the lower version of D will be considered before any subsequent versions at the same depth, despite the version being overridden by a range much further down the tree.

@raphw commented Dec 8, 2024

Another observation that blew my mind: if your test dependencies come first, their transitive versions will override those of your non-test dependencies. If you execute a program, your test dependencies will thus be reflected in your runtime and compile environments.

That in itself is not terribly surprising, but if you now create a module that extends the above declaration, it will not consider the test dependencies during resolution, thus yielding a completely different dependency tree, even if the first module is the only dependency of the second one.

@Geolykt commented Dec 8, 2024

Speaking of the test scope and its unintuitive behaviour: the test dependencies of a test dependency are transitively included, which is probably the most irritating behaviour I've seen so far.
