Related article: Exploring the Adaptation of Aether on Android

The Aether version used in this article is 1.1.0.

Get Started

Aether is designed around Dependency Injection: its core, the RepositorySystem, only accepts interface implementations of the components it depends on, so before calling it we have to initialize the required component classes ourselves (or register the classes and let Aether instantiate them).

Aether ships the utility class org.apache.maven.repository.internal.MavenRepositorySystemUtils for quick configuration; even so, we still need to write a Factory that encapsulates the complete initialization logic.

import org.apache.maven.repository.internal.MavenRepositorySystemUtils;
import org.eclipse.aether.DefaultRepositorySystemSession;
import org.eclipse.aether.RepositorySystem;
import org.eclipse.aether.RepositorySystemSession;
import org.eclipse.aether.connector.basic.BasicRepositoryConnectorFactory;
import org.eclipse.aether.impl.DefaultServiceLocator;
import org.eclipse.aether.repository.LocalRepository;
import org.eclipse.aether.spi.connector.RepositoryConnectorFactory;
import org.eclipse.aether.spi.connector.transport.TransporterFactory;
import org.eclipse.aether.transport.file.FileTransporterFactory;
import org.eclipse.aether.transport.http.HttpTransporterFactory;

...
private static RepositorySystem newRepositorySystem()
{
    DefaultServiceLocator locator = MavenRepositorySystemUtils.newServiceLocator();
    locator.addService( RepositoryConnectorFactory.class, BasicRepositoryConnectorFactory.class );
    locator.addService( TransporterFactory.class, FileTransporterFactory.class );
    locator.addService( TransporterFactory.class, HttpTransporterFactory.class );

    return locator.getService( RepositorySystem.class );
}

private static RepositorySystemSession newSession( RepositorySystem system )
{
    DefaultRepositorySystemSession session = MavenRepositorySystemUtils.newSession();

    // Directory used as the local repository (where downloaded artifacts are cached)
    LocalRepository localRepo = new LocalRepository( "path/to/local-repo" );
    session.setLocalRepositoryManager( system.newLocalRepositoryManager( session, localRepo ) );

    return session;
}
...

With the initialization logic in place, downloading a remote dependency (for example, org.apache.maven:maven-profile:2.2.1) only takes this:

public static void main( String[] args )
    throws Exception
{
    RepositorySystem repoSystem = newRepositorySystem();

    RepositorySystemSession session = newSession( repoSystem );

    Dependency dependency =
        new Dependency( new DefaultArtifact( "org.apache.maven:maven-profile:2.2.1" ), "compile" );
    RemoteRepository central =
        new RemoteRepository.Builder( "central", "default", "https://repo1.maven.org/maven2/" ).build();

    CollectRequest collectRequest = new CollectRequest();
    collectRequest.setRoot( dependency );
    collectRequest.addRepository( central );
    DependencyNode node = repoSystem.collectDependencies( session, collectRequest ).getRoot();

    DependencyRequest dependencyRequest = new DependencyRequest();
    dependencyRequest.setRoot( node );

    repoSystem.resolveDependencies( session, dependencyRequest );

    PreorderNodeListGenerator nlg = new PreorderNodeListGenerator();
    node.accept( nlg );
    System.out.println( nlg.getClassPath() );
}

After running it, the downloaded dependency files can be found under the configured local repository folder, laid out in the standard Maven repository structure (e.g. org/apache/maven/maven-profile/2.2.1/maven-profile-2.2.1.jar).

The code itself is easy to follow:
First, the factory configures the repository connector, the transports, and the local repository cache;

Inside main(), we initialize the system (the main entry point to the repository system and its functionality) and the session (which defines the settings and components controlling the repository system), then create a Dependency (the dependency we want to download);

A dependency always has to be looked up in a Maven repository, so we also need a RemoteRepository as the remote repository to query (note: multiple RemoteRepository instances can be added at once; they are searched in order);

Before downloading the Dependency we must build its dependency tree, so that the dependency and all of its transitive dependencies get downloaded. Building the tree starts with a query, so the next step is a CollectRequest: set its root dependency to the dependency we need (which is necessarily the root of the tree) and add the RemoteRepository to search;

The step after that is the actual tree construction: repoSystem.collectDependencies() returns a CollectResult, and calling CollectResult.getRoot() yields the root node, i.e. the dependency tree;

A DependencyRequest initializes a download request for a Dependency: set its root to the tree root just obtained, then call RepositorySystem.resolveDependencies(session, dependencyRequest) to perform the download. You may wonder where everything ends up; recall that in newSession() we already configured a LocalRepository as the local repository, which is exactly where the cached files are stored.

Tips: the trailing PreorderNodeListGenerator has no influence on the download itself; it is just the official example of traversing the dependency tree.
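
Incidentally, the same generator can also hand back the resolved artifacts or their local files directly. A small sketch using Aether's utility methods, continuing right after node.accept( nlg ) in the example above:

List<Artifact> artifacts = nlg.getArtifacts( false ); // resolved artifacts only
List<File> files = nlg.getFiles();                    // their files in the local repository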

How Aether is designed to be used

For the API itself, it is best to browse the Aether source code; once you understand the basic development model, the class names and Javadoc make everything easy to pick up.

  • Aether uses the Dependency Injection design pattern: before RepositorySystem can be used, it has to be initialized from a DefaultServiceLocator. The DefaultServiceLocator already comes configured with the default classes it needs; we only add whatever required classes are still missing. Once configuration is done, Aether obtains the interfaces it needs (already instantiated, of course) via DefaultServiceLocator.getService(), so through addService() we can change parts of the core logic without modifying Aether's code and achieve a high degree of customization; it is also worth stressing that we should obtain any interfaces we need through DefaultServiceLocator.getService() as well.
  • Essentially every Aether operation follows the same pattern: create an XxxRequest, have the System execute the corresponding logic, and get an XxxResult back. Examples are the CollectRequest and DependencyRequest inside main() in the Get Started code (see the sketch after this list). The XxxRequest classes are final, so just instantiate them yourself.
  • Note in particular that RepositorySystem has to be obtained through the DefaultServiceLocator; via dependency injection, RepositorySystem automatically acquires the classes it needs from the DefaultServiceLocator. A RepositorySystemSession, by contrast, is the carrier of configuration, and its settings have to be configured in code by you.
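
A minimal sketch of that XxxRequest → XxxResult pattern, reusing the repoSystem, session and central objects from the Get Started example (the coordinates are only an example): resolving a single artifact file instead of a whole dependency tree.

ArtifactRequest artifactRequest = new ArtifactRequest();
artifactRequest.setArtifact( new DefaultArtifact( "org.apache.maven:maven-profile:2.2.1" ) );
artifactRequest.addRepository( central );

// The System executes the request and returns the matching result object
ArtifactResult artifactResult = repoSystem.resolveArtifact( session, artifactRequest );
System.out.println( artifactResult.getArtifact().getFile() );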

Common tasks

Downloading a Dependency

Already covered in Get Started; not repeated here.

Getting the local cache path of an Artifact

  1. Obtain the LocalRepositoryManager via RepositorySystemSession.getLocalRepositoryManager();
  2. Call LocalRepositoryManager.getPathForLocalArtifact() (see the sketch below).
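
A short sketch, assuming session comes from the factory shown earlier; getPathForLocalArtifact() returns a path relative to the local repository's base directory:

Artifact artifact = new DefaultArtifact( "org.apache.maven:maven-profile:2.2.1" );
LocalRepositoryManager lrm = session.getLocalRepositoryManager();
String relativePath = lrm.getPathForLocalArtifact( artifact );
// Resolve it against the base directory to get the absolute location of the cached file
File file = new File( lrm.getRepository().getBasedir(), relativePath );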

Initializing common model classes

  1. Artifact
    Call a DefaultArtifact constructor and pass the coords (coordinates in the org.apache.maven:maven-profile:2.2.1 format); other constructors exist as well, see the API for details.
  2. Dependency
    Call the Dependency constructor and pass the Artifact, the scope (Maven scope) and any other parameters (a short sketch follows this list).
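
For illustration (the coordinates below are arbitrary examples):

Artifact byCoords = new DefaultArtifact( "org.apache.maven:maven-profile:2.2.1" );
// The field-by-field constructor lets you state the extension explicitly, e.g. "aar"
Artifact withExtension = new DefaultArtifact( "androidx.appcompat", "appcompat", "aar", "1.6.1" );
Dependency dependency = new Dependency( byCoords, "compile" );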

Common problems (solved)

Artifacts with the AAR file type are not recognized automatically

Since my goal from the start has been to use this in an Android IDE, downloading aar files is unavoidable. But as the constructor of org.eclipse.aether.artifact.DefaultArtifact shows (see the code below), the default extension is jar, and the extension field is declared final.

...
public DefaultArtifact(String coords, Map<String, String> properties) {
    Matcher m = COORDINATE_PATTERN.matcher(coords);
    if (!m.matches()) {
        throw new IllegalArgumentException("Bad artifact coordinates " + coords
            + ", expected format is <groupId>:<artifactId>[:<extension>[:<classifier>]]:<version>");
    }
    groupId = m.group(1);
    artifactId = m.group(2);
    extension = get(m.group(4), "jar"); // this line
    classifier = get(m.group(6), "");
    version = m.group(7);
    file = null;
    this.properties = copyProperties(properties);
}
...

At this point you might think: just specify the extension in the coords, then. Unfortunately, after testing you will find that the root dependency does get downloaded, but its child dependencies still fail to have their file type recognized correctly.

Let's analyze this at the source level:

First, as discussed above, the Collect phase builds the dependency tree, so we use it as the entry point and trace how child dependencies are constructed. Note that the default implementation class of an interface is DefaultXxx; for example, the default implementation of RepositorySystem is DefaultRepositorySystem.

package org.eclipse.aether.internal.impl;

public class DefaultRepositorySystem implements RepositorySystem, Service {
    ...
    @Override
    public CollectResult collectDependencies(RepositorySystemSession session, CollectRequest request)
            throws DependencyCollectionException {
        validateSession(session);
        requireNonNull(request, "request cannot be null");

        return dependencyCollector.collectDependencies(session, request);
    }
    ...
}

We can see that dependencyCollector.collectDependencies() does the real work. From the initialization code of dependencyCollector and the related DefaultServiceLocator code, the default implementation is org.eclipse.aether.internal.impl.collect.DefaultDependencyCollector, so let's keep digging.

public class DefaultDependencyCollector implements DependencyCollector, Service {
    ...
    @Override
    public CollectResult collectDependencies(RepositorySystemSession session, CollectRequest request)
            throws DependencyCollectionException {
        String delegateName = ConfigUtils.getString(session, DEFAULT_COLLECTOR_IMPL, CONFIG_PROP_COLLECTOR_IMPL);
        DependencyCollectorDelegate delegate = delegates.get(delegateName);
        if (delegate == null) {
            throw new IllegalArgumentException(
                    "Unknown collector impl: '" + delegateName + "', known implementations are " + delegates.keySet());
        }
        return delegate.collectDependencies(session, request);
    }
    ...
}

Right, another layer of delegation. The code shows that the default is the depth-first org.eclipse.aether.internal.impl.collect.df.DfDependencyCollector, and that DependencyCollectorDelegate's collectDependencies() wraps the abstract doCollectDependencies() method.

public abstract class DependencyCollectorDelegate implements DependencyCollector {
...
@Override
public final CollectResult collectDependencies(RepositorySystemSession session, CollectRequest request)
throws DependencyCollectionException {
requireNonNull(session, "session cannot be null");
requireNonNull(request, "request cannot be null");
session = optimizeSession(session);

RequestTrace trace = RequestTrace.newChild(request.getTrace(), request);

CollectResult result = new CollectResult(request);

DependencyTraverser depTraverser = session.getDependencyTraverser();
VersionFilter verFilter = session.getVersionFilter();

Dependency root = request.getRoot();
List<RemoteRepository> repositories = request.getRepositories();
List<Dependency> dependencies = request.getDependencies();
List<Dependency> managedDependencies = request.getManagedDependencies();

Map<String, Object> stats = new LinkedHashMap<>();
long time1 = System.nanoTime();

DefaultDependencyNode node;
if (root != null) {
List<? extends Version> versions;
VersionRangeResult rangeResult;
try {
VersionRangeRequest rangeRequest = new VersionRangeRequest(
root.getArtifact(), request.getRepositories(), request.getRequestContext());
rangeRequest.setTrace(trace);
rangeResult = versionRangeResolver.resolveVersionRange(session, rangeRequest);
versions = filterVersions(root, rangeResult, verFilter, new DefaultVersionFilterContext(session));
} catch (VersionRangeResolutionException e) {
result.addException(e);
throw new DependencyCollectionException(result, e.getMessage());
}

Version version = versions.get(versions.size() - 1);
root = root.setArtifact(root.getArtifact().setVersion(version.toString()));

ArtifactDescriptorResult descriptorResult; // key code
try {
ArtifactDescriptorRequest descriptorRequest = new ArtifactDescriptorRequest();
descriptorRequest.setArtifact(root.getArtifact());
descriptorRequest.setRepositories(request.getRepositories());
descriptorRequest.setRequestContext(request.getRequestContext());
descriptorRequest.setTrace(trace);
if (isLackingDescriptor(root.getArtifact())) {
descriptorResult = new ArtifactDescriptorResult(descriptorRequest);
} else {
descriptorResult = descriptorReader.readArtifactDescriptor(session, descriptorRequest);
}
} catch (ArtifactDescriptorException e) {
result.addException(e);
throw new DependencyCollectionException(result, e.getMessage());
}

root = root.setArtifact(descriptorResult.getArtifact());

if (!session.isIgnoreArtifactDescriptorRepositories()) {
repositories = remoteRepositoryManager.aggregateRepositories(
session, repositories, descriptorResult.getRepositories(), true);
}
dependencies = mergeDeps(dependencies, descriptorResult.getDependencies());
managedDependencies = mergeDeps(managedDependencies, descriptorResult.getManagedDependencies());

node = new DefaultDependencyNode(root);
node.setRequestContext(request.getRequestContext());
node.setRelocations(descriptorResult.getRelocations());
node.setVersionConstraint(rangeResult.getVersionConstraint());
node.setVersion(version);
node.setAliases(descriptorResult.getAliases());
node.setRepositories(request.getRepositories());
} else {
node = new DefaultDependencyNode(request.getRootArtifact());
node.setRequestContext(request.getRequestContext());
node.setRepositories(request.getRepositories());
}

result.setRoot(node);

boolean traverse = root == null || depTraverser == null || depTraverser.traverseDependency(root);
String errorPath = null;
if (traverse && !dependencies.isEmpty()) {
DataPool pool = new DataPool(session);

DefaultDependencyCollectionContext context = new DefaultDependencyCollectionContext(
session, request.getRootArtifact(), root, managedDependencies);

DefaultVersionFilterContext versionContext = new DefaultVersionFilterContext(session);

Results results = new Results(result, session);

doCollectDependencies(
session,
trace,
pool,
context,
versionContext,
request,
node,
repositories,
dependencies,
managedDependencies,
results);

errorPath = results.getErrorPath();
}

long time2 = System.nanoTime();

DependencyGraphTransformer transformer = session.getDependencyGraphTransformer();
if (transformer != null) {
try {
DefaultDependencyGraphTransformationContext context =
new DefaultDependencyGraphTransformationContext(session);
context.put(TransformationContextKeys.STATS, stats);
result.setRoot(transformer.transformGraph(node, context));
} catch (RepositoryException e) {
result.addException(e);
}
}

long time3 = System.nanoTime();
if (logger.isDebugEnabled()) {
stats.put(getClass().getSimpleName() + ".collectTime", time2 - time1);
stats.put(getClass().getSimpleName() + ".transformTime", time3 - time2);
logger.debug("Dependency collection stats {}", stats);
}

if (errorPath != null) {
throw new DependencyCollectionException(result, "Failed to collect dependencies at " + errorPath);
}
if (!result.getExceptions().isEmpty()) {
throw new DependencyCollectionException(result);
}

return result;
}
...
}
public class DfDependencyCollector extends DependencyCollectorDelegate implements Service {
...
@Override
protected void doCollectDependencies(
RepositorySystemSession session,
RequestTrace trace,
DataPool pool,
DefaultDependencyCollectionContext context,
DefaultVersionFilterContext versionContext,
CollectRequest request,
DependencyNode node,
List<RemoteRepository> repositories,
List<Dependency> dependencies,
List<Dependency> managedDependencies,
Results results) {
NodeStack nodes = new NodeStack();
nodes.push(node);

Args args = new Args(session, pool, nodes, context, versionContext, request);

process(
args,
trace,
results,
dependencies,
repositories,
session.getDependencySelector() != null
? session.getDependencySelector().deriveChildSelector(context)
: null,
session.getDependencyManager() != null
? session.getDependencyManager().deriveChildManager(context)
: null,
session.getDependencyTraverser() != null
? session.getDependencyTraverser().deriveChildTraverser(context)
: null,
session.getVersionFilter() != null ? session.getVersionFilter().deriveChildFilter(context) : null);
}

private void process(
final Args args,
RequestTrace trace,
Results results,
List<Dependency> dependencies,
List<RemoteRepository> repositories,
DependencySelector depSelector,
DependencyManager depManager,
DependencyTraverser depTraverser,
VersionFilter verFilter) {
for (Dependency dependency : dependencies) {
processDependency(
args, trace, results, repositories, depSelector, depManager, depTraverser, verFilter, dependency);
}
}

private void processDependency(
Args args,
RequestTrace trace,
Results results,
List<RemoteRepository> repositories,
DependencySelector depSelector,
DependencyManager depManager,
DependencyTraverser depTraverser,
VersionFilter verFilter,
Dependency dependency) {

List<Artifact> relocations = Collections.emptyList();
processDependency(
args,
trace,
results,
repositories,
depSelector,
depManager,
depTraverser,
verFilter,
dependency,
relocations,
false);
}

private void processDependency(
Args args,
RequestTrace parent,
Results results,
List<RemoteRepository> repositories,
DependencySelector depSelector,
DependencyManager depManager,
DependencyTraverser depTraverser,
VersionFilter verFilter,
Dependency dependency,
List<Artifact> relocations,
boolean disableVersionManagement) {
if (depSelector != null && !depSelector.selectDependency(dependency)) {
return;
}

RequestTrace trace = collectStepTrace(parent, args.request.getRequestContext(), args.nodes.nodes, dependency);
PremanagedDependency preManaged =
PremanagedDependency.create(depManager, dependency, disableVersionManagement, args.premanagedState);
dependency = preManaged.getManagedDependency();

boolean noDescriptor = isLackingDescriptor(dependency.getArtifact());

boolean traverse = !noDescriptor && (depTraverser == null || depTraverser.traverseDependency(dependency));

List<? extends Version> versions;
VersionRangeResult rangeResult;
try {
VersionRangeRequest rangeRequest =
createVersionRangeRequest(args.request.getRequestContext(), trace, repositories, dependency);

rangeResult = cachedResolveRangeResult(rangeRequest, args.pool, args.session);

versions = filterVersions(dependency, rangeResult, verFilter, args.versionContext);
} catch (VersionRangeResolutionException e) {
results.addException(dependency, e, args.nodes.nodes);
return;
}

for (Version version : versions) {
Artifact originalArtifact = dependency.getArtifact().setVersion(version.toString());
Dependency d = dependency.setArtifact(originalArtifact);

ArtifactDescriptorRequest descriptorRequest =
createArtifactDescriptorRequest(args.request.getRequestContext(), trace, repositories, d); // key code

final ArtifactDescriptorResult descriptorResult =
getArtifactDescriptorResult(args, results, noDescriptor, d, descriptorRequest); // key code
if (descriptorResult != null) {
d = d.setArtifact(descriptorResult.getArtifact());

DependencyNode node = args.nodes.top();

int cycleEntry = DefaultDependencyCycle.find(args.nodes.nodes, d.getArtifact());
if (cycleEntry >= 0) {
results.addCycle(args.nodes.nodes, cycleEntry, d);
DependencyNode cycleNode = args.nodes.get(cycleEntry);
if (cycleNode.getDependency() != null) {
DefaultDependencyNode child = createDependencyNode(
relocations, preManaged, rangeResult, version, d, descriptorResult, cycleNode);
node.getChildren().add(child);
continue;
}
}

if (!descriptorResult.getRelocations().isEmpty()) {
boolean disableVersionManagementSubsequently =
originalArtifact.getGroupId().equals(d.getArtifact().getGroupId())
&& originalArtifact
.getArtifactId()
.equals(d.getArtifact().getArtifactId());

processDependency(
args,
parent,
results,
repositories,
depSelector,
depManager,
depTraverser,
verFilter,
d,
descriptorResult.getRelocations(),
disableVersionManagementSubsequently);
return;
} else {
d = args.pool.intern(d.setArtifact(args.pool.intern(d.getArtifact())));

List<RemoteRepository> repos =
getRemoteRepositories(rangeResult.getRepository(version), repositories);

DefaultDependencyNode child = createDependencyNode(
relocations,
preManaged,
rangeResult,
version,
d,
descriptorResult.getAliases(),
repos,
args.request.getRequestContext());

node.getChildren().add(child);

boolean recurse =
traverse && !descriptorResult.getDependencies().isEmpty();
if (recurse) {
doRecurse(
args,
parent,
results,
repositories,
depSelector,
depManager,
depTraverser,
verFilter,
d,
descriptorResult,
child);
}
}
} else {
DependencyNode node = args.nodes.top();
List<RemoteRepository> repos = getRemoteRepositories(rangeResult.getRepository(version), repositories);
DefaultDependencyNode child = createDependencyNode(
relocations,
preManaged,
rangeResult,
version,
d,
null,
repos,
args.request.getRequestContext());
node.getChildren().add(child);
}
}
}
...
}

The key locations are marked with comments in the code above. We can see that during the Collect step for the root dependency, the ArtifactDescriptor (that is, the dependency's POM file) is parsed to obtain its direct dependencies; this is done by the ArtifactDescriptorReader. But given that the POM is parsed anyway, why not correct the Artifact's file type from the packaging declared in the POM? Simply because the extension field of Artifact is final. I have no idea why Aether does not include such an automatic extension correction; it is an Apache Foundation project after all, so I didn't dare ask.

Solution

As a recap: Aether uses Dependency Injection internally, so if we want to change its internal logic, the simplest route is of course to go through the DefaultServiceLocator.

We only need to change the code as follows, replacing the default ArtifactDescriptorReader with a custom one that performs the correction automatically:

...
public static RepositorySystem newRepositorySystem() {
    DefaultServiceLocator locator = new DefaultServiceLocator();
    // locator.addService(ArtifactDescriptorReader.class, DefaultArtifactDescriptorReader.class); // the default ArtifactDescriptorReader
    locator.addService(ArtifactDescriptorReader.class, CompactAARArtifactDescriptorReader.class); // AAR-aware CompactAARArtifactDescriptorReader
    locator.addService(VersionResolver.class, DefaultVersionResolver.class);
    locator.addService(VersionRangeResolver.class, DefaultVersionRangeResolver.class);
    locator.addService(MetadataGeneratorFactory.class, SnapshotMetadataGeneratorFactory.class);
    locator.addService(MetadataGeneratorFactory.class, VersionsMetadataGeneratorFactory.class);
    locator.addService(RepositoryConnectorFactory.class, BasicRepositoryConnectorFactory.class);
    locator.addService(TransporterFactory.class, FileTransporterFactory.class);
    locator.addService(TransporterFactory.class, HttpTransporterFactory.class);
    return locator.getService(RepositorySystem.class);
}
...

The approach is clearly right, but then a new problem: how do we perform the correction automatically when the extension field is final? Reflection.

So how do we obtain the correct extension in the first place? Let's look at the ArtifactDescriptorReader code and see how it parses the descriptor.

An excerpt from org.apache.maven.repository.internal.DefaultArtifactDescriptorReader (I didn't bother trimming it down further; this one method will do for now):

public class DefaultArtifactDescriptorReader
implements ArtifactDescriptorReader, Service
{
...
private Model loadPom( RepositorySystemSession session, ArtifactDescriptorRequest request,
ArtifactDescriptorResult result )
throws ArtifactDescriptorException
{
RequestTrace trace = RequestTrace.newChild( request.getTrace(), request );

Set<String> visited = new LinkedHashSet<String>();
for ( Artifact artifact = request.getArtifact();; )
{
try
{
VersionRequest versionRequest =
new VersionRequest( artifact, request.getRepositories(), request.getRequestContext() );
versionRequest.setTrace( trace );
VersionResult versionResult = versionResolver.resolveVersion( session, versionRequest );

artifact = artifact.setVersion( versionResult.getVersion() );
}
catch ( VersionResolutionException e )
{
result.addException( e );
throw new ArtifactDescriptorException( result );
}

if ( !visited.add( artifact.getGroupId() + ':' + artifact.getArtifactId() + ':' + artifact.getBaseVersion() ) )
{
RepositoryException exception =
new RepositoryException( "Artifact relocations form a cycle: " + visited );
invalidDescriptor( session, trace, artifact, exception );
if ( ( getPolicy( session, artifact, request ) & ArtifactDescriptorPolicy.IGNORE_INVALID ) != 0 )
{
return null;
}
result.addException( exception );
throw new ArtifactDescriptorException( result );
}

Artifact pomArtifact = ArtifactDescriptorUtils.toPomArtifact( artifact );

ArtifactResult resolveResult;
try
{
ArtifactRequest resolveRequest =
new ArtifactRequest( pomArtifact, request.getRepositories(), request.getRequestContext() );
resolveRequest.setTrace( trace );
resolveResult = artifactResolver.resolveArtifact( session, resolveRequest );
pomArtifact = resolveResult.getArtifact();
result.setRepository( resolveResult.getRepository() );
}
catch ( ArtifactResolutionException e )
{
if ( e.getCause() instanceof ArtifactNotFoundException )
{
missingDescriptor( session, trace, artifact, (Exception) e.getCause() );
if ( ( getPolicy( session, artifact, request ) & ArtifactDescriptorPolicy.IGNORE_MISSING ) != 0 )
{
return null;
}
}
result.addException( e );
throw new ArtifactDescriptorException( result );
}

Model model; // Model is the model class corresponding to the POM file
try
{
ModelBuildingRequest modelRequest = new DefaultModelBuildingRequest();
modelRequest.setValidationLevel( ModelBuildingRequest.VALIDATION_LEVEL_MINIMAL );
modelRequest.setProcessPlugins( false );
modelRequest.setTwoPhaseBuilding( false );
modelRequest.setSystemProperties( toProperties( session.getUserProperties(),
session.getSystemProperties() ) );
modelRequest.setModelCache( DefaultModelCache.newInstance( session ) );
modelRequest.setModelResolver( new DefaultModelResolver( session, trace.newChild( modelRequest ),
request.getRequestContext(), artifactResolver,
remoteRepositoryManager,
request.getRepositories() ) );
if ( resolveResult.getRepository() instanceof WorkspaceRepository )
{
modelRequest.setPomFile( pomArtifact.getFile() );
}
else
{
modelRequest.setModelSource( new FileModelSource( pomArtifact.getFile() ) );
}

model = modelBuilder.build( modelRequest ).getEffectiveModel();
}
catch ( ModelBuildingException e )
{
for ( ModelProblem problem : e.getProblems() )
{
if ( problem.getException() instanceof UnresolvableModelException )
{
result.addException( problem.getException() );
throw new ArtifactDescriptorException( result );
}
}
invalidDescriptor( session, trace, artifact, e );
if ( ( getPolicy( session, artifact, request ) & ArtifactDescriptorPolicy.IGNORE_INVALID ) != 0 )
{
return null;
}
result.addException( e );
throw new ArtifactDescriptorException( result );
}

Relocation relocation = getRelocation( model );

if ( relocation != null )
{
result.addRelocation( artifact );
artifact =
new RelocatedArtifact( artifact, relocation.getGroupId(), relocation.getArtifactId(),
relocation.getVersion() );
result.setArtifact( artifact );
}
else
{
return model;
}
}
}
...
}

The loadPom() method above is called from DefaultArtifactDescriptorReader.readArtifactDescriptor(). The org.apache.maven.model.Model class in the code is the model class for POM files, so we can call Model.getPackaging() to obtain the extension; and since loadPom() receives an ArtifactDescriptorResult parameter, we can fetch the Artifact currently being processed via ArtifactDescriptorResult.getArtifact() and correct its extension field.

Q: Why don't I use setArtifact() to replace the Artifact instead?
A: Because I was too lazy to run more experiments, and the reflection-based version already works fine (escaping now). Feel free to try it yourself if you are interested.
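
For anyone who does want to try it, a rough, untested sketch of that setArtifact() route (placed at the point in loadPom() where model.getPackaging() is available; treat it as an idea, not verified code):

// Untested alternative: rebuild the artifact with the POM's packaging as its extension
Artifact original = result.getArtifact();
result.setArtifact( new DefaultArtifact(
        original.getGroupId(), original.getArtifactId(), original.getClassifier(),
        model.getPackaging(), original.getVersion() ) );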

Note: the code references internal classes that we cannot call directly from our own project, so one extra layer of reflection is added to replace the calls that touch those internal classes.
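
To make that extra reflection layer concrete, here is a minimal plain java.lang.reflect sketch of the DefaultModelCache.newInstance(session) call from the excerpt above (the class and method names come from that snippet; everything else is an assumption, and the Reflect helper used in the code below does essentially the same thing):

Class<?> cacheClass = Class.forName( "org.apache.maven.repository.internal.DefaultModelCache" );
Method newInstance = cacheClass.getDeclaredMethod( "newInstance", RepositorySystemSession.class );
newInstance.setAccessible( true ); // needed because the class itself is not public
modelRequest.setModelCache( (org.apache.maven.model.building.ModelCache) newInstance.invoke( null, session ) );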

The code of CompactAARArtifactDescriptorReader follows (written in a hurry, so anyone using it should read it carefully first; also, Reflect is our team's internal reflection helper, simply replace it with ordinary reflection code):

import org.apache.maven.model.DependencyManagement;
import org.apache.maven.model.DistributionManagement;
import org.apache.maven.model.License;
import org.apache.maven.model.Model;
import org.apache.maven.model.Prerequisites;
import org.apache.maven.model.Relocation;
import org.apache.maven.model.Repository;
import org.apache.maven.model.building.DefaultModelBuilderFactory;
import org.apache.maven.model.building.DefaultModelBuildingRequest;
import org.apache.maven.model.building.FileModelSource;
import org.apache.maven.model.building.ModelBuilder;
import org.apache.maven.model.building.ModelBuildingException;
import org.apache.maven.model.building.ModelBuildingRequest;
import org.apache.maven.model.building.ModelProblem;
import org.apache.maven.model.resolution.UnresolvableModelException;
import org.apache.maven.repository.internal.ArtifactDescriptorUtils;
import org.codehaus.plexus.component.annotations.Component;
import org.codehaus.plexus.component.annotations.Requirement;
import org.eclipse.aether.RepositoryEvent;
import org.eclipse.aether.RepositoryEvent.EventType;
import org.eclipse.aether.RepositoryException;
import org.eclipse.aether.RepositorySystemSession;
import org.eclipse.aether.RequestTrace;
import org.eclipse.aether.artifact.Artifact;
import org.eclipse.aether.artifact.ArtifactProperties;
import org.eclipse.aether.artifact.ArtifactType;
import org.eclipse.aether.artifact.ArtifactTypeRegistry;
import org.eclipse.aether.artifact.DefaultArtifact;
import org.eclipse.aether.artifact.DefaultArtifactType;
import org.eclipse.aether.graph.Dependency;
import org.eclipse.aether.graph.Exclusion;
import org.eclipse.aether.impl.ArtifactDescriptorReader;
import org.eclipse.aether.impl.ArtifactResolver;
import org.eclipse.aether.impl.RemoteRepositoryManager;
import org.eclipse.aether.impl.RepositoryEventDispatcher;
import org.eclipse.aether.impl.VersionResolver;
import org.eclipse.aether.repository.WorkspaceRepository;
import org.eclipse.aether.resolution.ArtifactDescriptorException;
import org.eclipse.aether.resolution.ArtifactDescriptorPolicy;
import org.eclipse.aether.resolution.ArtifactDescriptorPolicyRequest;
import org.eclipse.aether.resolution.ArtifactDescriptorRequest;
import org.eclipse.aether.resolution.ArtifactDescriptorResult;
import org.eclipse.aether.resolution.ArtifactRequest;
import org.eclipse.aether.resolution.ArtifactResolutionException;
import org.eclipse.aether.resolution.ArtifactResult;
import org.eclipse.aether.resolution.VersionRequest;
import org.eclipse.aether.resolution.VersionResolutionException;
import org.eclipse.aether.resolution.VersionResult;
import org.eclipse.aether.spi.locator.Service;
import org.eclipse.aether.spi.locator.ServiceLocator;
import org.eclipse.aether.spi.log.Logger;
import org.eclipse.aether.spi.log.LoggerFactory;
import org.eclipse.aether.spi.log.NullLoggerFactory;
import org.eclipse.aether.transfer.ArtifactNotFoundException;

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.Set;

import javax.inject.Inject;
import javax.inject.Named;

@Named
@Component(role = ArtifactDescriptorReader.class)
public class CompactAARArtifactDescriptorReader
implements ArtifactDescriptorReader, Service {

@SuppressWarnings("unused")
@Requirement(role = LoggerFactory.class)
private Logger logger = NullLoggerFactory.LOGGER;

@Requirement
private RemoteRepositoryManager remoteRepositoryManager;

@Requirement
private VersionResolver versionResolver;

@Requirement
private ArtifactResolver artifactResolver;

@Requirement
private RepositoryEventDispatcher repositoryEventDispatcher;

@Requirement
private ModelBuilder modelBuilder;

public CompactAARArtifactDescriptorReader() {
// enable no-arg constructor
}

@Inject
CompactAARArtifactDescriptorReader(RemoteRepositoryManager remoteRepositoryManager, VersionResolver versionResolver,
ArtifactResolver artifactResolver, ModelBuilder modelBuilder,
RepositoryEventDispatcher repositoryEventDispatcher, LoggerFactory loggerFactory) {
setRemoteRepositoryManager(remoteRepositoryManager);
setVersionResolver(versionResolver);
setArtifactResolver(artifactResolver);
setModelBuilder(modelBuilder);
setLoggerFactory(loggerFactory);
setRepositoryEventDispatcher(repositoryEventDispatcher);
}

public void initService(ServiceLocator locator) {
setLoggerFactory(locator.getService(LoggerFactory.class));
setRemoteRepositoryManager(locator.getService(RemoteRepositoryManager.class));
setVersionResolver(locator.getService(VersionResolver.class));
setArtifactResolver(locator.getService(ArtifactResolver.class));
setRepositoryEventDispatcher(locator.getService(RepositoryEventDispatcher.class));
modelBuilder = locator.getService(ModelBuilder.class);
if (modelBuilder == null) {
setModelBuilder(new DefaultModelBuilderFactory().newInstance());
}
}

public CompactAARArtifactDescriptorReader setLoggerFactory(LoggerFactory loggerFactory) {
this.logger = NullLoggerFactory.getSafeLogger(loggerFactory, getClass());
return this;
}

void setLogger(LoggerFactory loggerFactory) {
// plexus support
setLoggerFactory(loggerFactory);
}

public CompactAARArtifactDescriptorReader setRemoteRepositoryManager(RemoteRepositoryManager remoteRepositoryManager) {
if (remoteRepositoryManager == null) {
throw new IllegalArgumentException("remote repository manager has not been specified");
}
this.remoteRepositoryManager = remoteRepositoryManager;
return this;
}

public CompactAARArtifactDescriptorReader setVersionResolver(VersionResolver versionResolver) {
if (versionResolver == null) {
throw new IllegalArgumentException("version resolver has not been specified");
}
this.versionResolver = versionResolver;
return this;
}

public CompactAARArtifactDescriptorReader setArtifactResolver(ArtifactResolver artifactResolver) {
if (artifactResolver == null) {
throw new IllegalArgumentException("artifact resolver has not been specified");
}
this.artifactResolver = artifactResolver;
return this;
}

public CompactAARArtifactDescriptorReader setRepositoryEventDispatcher(RepositoryEventDispatcher repositoryEventDispatcher) {
if (repositoryEventDispatcher == null) {
throw new IllegalArgumentException("repository event dispatcher has not been specified");
}
this.repositoryEventDispatcher = repositoryEventDispatcher;
return this;
}

public CompactAARArtifactDescriptorReader setModelBuilder(ModelBuilder modelBuilder) {
if (modelBuilder == null) {
throw new IllegalArgumentException("model builder has not been specified");
}
this.modelBuilder = modelBuilder;
return this;
}

public ArtifactDescriptorResult readArtifactDescriptor(RepositorySystemSession session,
ArtifactDescriptorRequest request)
throws ArtifactDescriptorException {
ArtifactDescriptorResult result = new ArtifactDescriptorResult(request);

Model model = loadPom(session, request, result);

if (model != null) {
ArtifactTypeRegistry stereotypes = session.getArtifactTypeRegistry();

for (Repository r : model.getRepositories()) {
result.addRepository(ArtifactDescriptorUtils.toRemoteRepository(r));
}

for (org.apache.maven.model.Dependency dependency : model.getDependencies()) {
result.addDependency(convert(dependency, stereotypes));
}

DependencyManagement mngt = model.getDependencyManagement();
if (mngt != null) {
for (org.apache.maven.model.Dependency dependency : mngt.getDependencies()) {
result.addManagedDependency(convert(dependency, stereotypes));
}
}

Map<String, Object> properties = new LinkedHashMap<String, Object>();

Prerequisites prerequisites = model.getPrerequisites();
if (prerequisites != null) {
properties.put("prerequisites.maven", prerequisites.getMaven());
}

List<License> licenses = model.getLicenses();
properties.put("license.count", licenses.size());
for (int i = 0; i < licenses.size(); i++) {
License license = licenses.get(i);
properties.put("license." + i + ".name", license.getName());
properties.put("license." + i + ".url", license.getUrl());
properties.put("license." + i + ".comments", license.getComments());
properties.put("license." + i + ".distribution", license.getDistribution());
}

result.setProperties(properties);

setArtifactProperties(result, model);
}

return result;
}

private Model loadPom(RepositorySystemSession session, ArtifactDescriptorRequest request,
ArtifactDescriptorResult result)
throws ArtifactDescriptorException {
RequestTrace trace = RequestTrace.newChild(request.getTrace(), request);

Set<String> visited = new LinkedHashSet<String>();
for (Artifact artifact = request.getArtifact(); ; ) {
try {
VersionRequest versionRequest =
new VersionRequest(artifact, request.getRepositories(), request.getRequestContext());
versionRequest.setTrace(trace);
VersionResult versionResult = versionResolver.resolveVersion(session, versionRequest);

artifact = artifact.setVersion(versionResult.getVersion());
} catch (VersionResolutionException e) {
result.addException(e);
throw new ArtifactDescriptorException(result);
}

if (!visited.add(artifact.getGroupId() + ':' + artifact.getArtifactId() + ':' + artifact.getBaseVersion())) {
RepositoryException exception =
new RepositoryException("Artifact relocations form a cycle: " + visited);
invalidDescriptor(session, trace, artifact, exception);
if ((getPolicy(session, artifact, request) & ArtifactDescriptorPolicy.IGNORE_INVALID) != 0) {
return null;
}
result.addException(exception);
throw new ArtifactDescriptorException(result);
}

Artifact pomArtifact = ArtifactDescriptorUtils.toPomArtifact(artifact);

ArtifactResult resolveResult;
try {
ArtifactRequest resolveRequest =
new ArtifactRequest(pomArtifact, request.getRepositories(), request.getRequestContext());
resolveRequest.setTrace(trace);
resolveResult = artifactResolver.resolveArtifact(session, resolveRequest);
pomArtifact = resolveResult.getArtifact();
result.setRepository(resolveResult.getRepository());
} catch (ArtifactResolutionException e) {
if (e.getCause() instanceof ArtifactNotFoundException) {
missingDescriptor(session, trace, artifact, (Exception) e.getCause());
if ((getPolicy(session, artifact, request) & ArtifactDescriptorPolicy.IGNORE_MISSING) != 0) {
return null;
}
}
result.addException(e);
throw new ArtifactDescriptorException(result);
}

Model model;
try {
ModelBuildingRequest modelRequest = new DefaultModelBuildingRequest();
modelRequest.setValidationLevel(ModelBuildingRequest.VALIDATION_LEVEL_MINIMAL);
modelRequest.setProcessPlugins(false);
modelRequest.setTwoPhaseBuilding(false);
modelRequest.setSystemProperties(toProperties(session.getUserProperties(),
session.getSystemProperties()));
modelRequest.setModelCache(
Reflect.on("org.apache.maven.repository.internal.DefaultModelCache")
.call("newInstance", session).get());
//modelRequest.setModelCache( DefaultModelCache.newInstance( session ) );
modelRequest.setModelResolver(
Reflect.on("org.apache.maven.repository.internal.DefaultModelResolver")
.create(session, trace.newChild(modelRequest),
request.getRequestContext(), artifactResolver,
remoteRepositoryManager,
request.getRepositories())
.get());
// modelRequest.setModelResolver(new DefaultModelResolver(session, trace.newChild(modelRequest),
// request.getRequestContext(), artifactResolver,
// remoteRepositoryManager,
// request.getRepositories()));
if (resolveResult.getRepository() instanceof WorkspaceRepository) {
modelRequest.setPomFile(pomArtifact.getFile());
} else {
modelRequest.setModelSource(new FileModelSource(pomArtifact.getFile()));
}

model = modelBuilder.build(modelRequest).getEffectiveModel();
} catch (ModelBuildingException e) {
for (ModelProblem problem : e.getProblems()) {
if (problem.getException() instanceof UnresolvableModelException) {
result.addException(problem.getException());
throw new ArtifactDescriptorException(result);
}
}
invalidDescriptor(session, trace, artifact, e);
if ((getPolicy(session, artifact, request) & ArtifactDescriptorPolicy.IGNORE_INVALID) != 0) {
return null;
}
result.addException(e);
throw new ArtifactDescriptorException(result);
}

Relocation relocation = getRelocation(model);

String packaging = model.getPackaging();
if ("aar".equals(packaging)) { // handle the aar packaging type
// reset the artifact's extension according to the POM's packaging
try {
Class<DefaultArtifact> artifactCls = DefaultArtifact.class;
Field extension = artifactCls.getDeclaredField("extension");
extension.setAccessible(true);
extension.set(result.getArtifact(), model.getPackaging());
} catch (Exception e) {
e.printStackTrace();
}
}

if (relocation != null) {
result.addRelocation(artifact);
artifact = Reflect.on("org.apache.maven.repository.internal.RelocatedArtifact")
.create(artifact, relocation.getGroupId(), relocation.getArtifactId(),
relocation.getVersion()).get();

// artifact =
// new RelocatedArtifact(artifact, relocation.getGroupId(), relocation.getArtifactId(),
// relocation.getVersion());
result.setArtifact(artifact);
} else {
return model;
}
}
}

private Properties toProperties(Map<String, String> dominant, Map<String, String> recessive) {
Properties props = new Properties();
if (recessive != null) {
props.putAll(recessive);
}
if (dominant != null) {
props.putAll(dominant);
}
return props;
}

private Relocation getRelocation(Model model) {
Relocation relocation = null;
DistributionManagement distMngt = model.getDistributionManagement();
if (distMngt != null) {
relocation = distMngt.getRelocation();
}
return relocation;
}

private void setArtifactProperties(ArtifactDescriptorResult result, Model model) {
String downloadUrl = null;
DistributionManagement distMngt = model.getDistributionManagement();
if (distMngt != null) {
downloadUrl = distMngt.getDownloadUrl();
}
if (downloadUrl != null && downloadUrl.length() > 0) {
Artifact artifact = result.getArtifact();
Map<String, String> props = new HashMap<String, String>(artifact.getProperties());
props.put(ArtifactProperties.DOWNLOAD_URL, downloadUrl);
result.setArtifact(artifact.setProperties(props));
}
}

private Dependency convert(org.apache.maven.model.Dependency dependency, ArtifactTypeRegistry stereotypes) {
ArtifactType stereotype = stereotypes.get(dependency.getType());
if (stereotype == null) {
stereotype = new DefaultArtifactType(dependency.getType());
}

boolean system = dependency.getSystemPath() != null && dependency.getSystemPath().length() > 0;

Map<String, String> props = null;
if (system) {
props = Collections.singletonMap(ArtifactProperties.LOCAL_PATH, dependency.getSystemPath());
}

Artifact artifact =
new DefaultArtifact(dependency.getGroupId(), dependency.getArtifactId(), dependency.getClassifier(), null,
dependency.getVersion(), props, stereotype);

List<Exclusion> exclusions = new ArrayList<Exclusion>(dependency.getExclusions().size());
for (org.apache.maven.model.Exclusion exclusion : dependency.getExclusions()) {
exclusions.add(convert(exclusion));
}

Dependency result = new Dependency(artifact, dependency.getScope(), dependency.isOptional(), exclusions);

return result;
}

private Exclusion convert(org.apache.maven.model.Exclusion exclusion) {
return new Exclusion(exclusion.getGroupId(), exclusion.getArtifactId(), "*", "*");
}

private void missingDescriptor(RepositorySystemSession session, RequestTrace trace, Artifact artifact,
Exception exception) {
RepositoryEvent.Builder event = new RepositoryEvent.Builder(session, EventType.ARTIFACT_DESCRIPTOR_MISSING);
event.setTrace(trace);
event.setArtifact(artifact);
event.setException(exception);

repositoryEventDispatcher.dispatch(event.build());
}

private void invalidDescriptor(RepositorySystemSession session, RequestTrace trace, Artifact artifact,
Exception exception) {
RepositoryEvent.Builder event = new RepositoryEvent.Builder(session, EventType.ARTIFACT_DESCRIPTOR_INVALID);
event.setTrace(trace);
event.setArtifact(artifact);
event.setException(exception);

repositoryEventDispatcher.dispatch(event.build());
}

private int getPolicy(RepositorySystemSession session, Artifact artifact, ArtifactDescriptorRequest request) {
ArtifactDescriptorPolicy policy = session.getArtifactDescriptorPolicy();
if (policy == null) {
return ArtifactDescriptorPolicy.STRICT;
}
return policy.getPolicy(session, new ArtifactDescriptorPolicyRequest(artifact, request.getRequestContext()));
}

}

Tip: do not forget to replace DefaultArtifactDescriptorReader with CompactAARArtifactDescriptorReader in the factory.

Aether resolves version conflicts differently from Gradle

If you take an Aether setup that already supports AAR dependencies and use it to download some of Google's libraries, such as the androidx family, there is a high chance that the download fails because of dependency conflicts, or that even when it succeeds the project will not compile.

The reason is that Aether's and Gradle's default conflict resolution strategies differ. Aether follows the nearest-wins rule: the version at the shallower depth in the dependency tree has higher priority. Gradle follows nearest plus newest: declarations in the local build configuration take precedence, and among the transitive dependencies of remote dependencies the newest version is chosen. For example, if the root declares C:2.0 directly while a deeper transitive chain pulls in C:3.0, Aether keeps C:2.0 (it is nearer to the root), whereas Gradle ends up with C:3.0 (the newer one).

Solution

We only need to change the conflict resolution strategy, which is configured through the RepositorySystemSession.

public static RepositorySystemSession newSession(RepositorySystem system, LocalRepository localRepo) {
    DefaultRepositorySystemSession session = MavenRepositorySystemUtils.newSession();
    session.setConfigProperty(ConflictResolver.CONFIG_PROP_VERBOSE, true);
    session.setConfigProperty(DependencyManagerUtils.CONFIG_PROP_VERBOSE, true);
    session.setLocalRepositoryManager(system.newLocalRepositoryManager(session, localRepo));
    session.setDependencyGraphTransformer(
        // replace this with your custom DependencyGraphTransformer
    );
    return session;
}

Implement whatever tree-pruning strategy you need here. Being lazy, I simply wrote a LazyConflictResolver: it downloads every dependency on the tree and leaves picking the newest version to compile time.

import org.eclipse.aether.RepositoryException;
import org.eclipse.aether.collection.DependencyGraphTransformationContext;
import org.eclipse.aether.collection.DependencyGraphTransformer;
import org.eclipse.aether.graph.DependencyNode;

/**
 * A conflict resolver that simply gives up: when it meets a problem it does nothing
 * and leaves it for whoever comes next to solve.
 */
public class LazyConflictResolver implements DependencyGraphTransformer {
    @Override
    public DependencyNode transformGraph(DependencyNode node, DependencyGraphTransformationContext context) throws RepositoryException {
        // Return the graph untouched: keep every version on the tree
        return node;
    }
}
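
Wiring it in, the placeholder in the newSession() factory above then becomes simply (assuming LazyConflictResolver is on the classpath):

session.setDependencyGraphTransformer( new LazyConflictResolver() );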