Java Packaging HOWTO

Table of Contents

Authors and contributing

The authors of this document are:

  • Mikolaj Izdebski, Red Hat

  • Nicolas Mailhot, JPackage Project

  • Stanislav Ochotnicky, Red Hat

  • Ville Skyttä, JPackage Project

  • Michal Srb, Red Hat

  • Michael Simacek, Red Hat

  • Marián Konček, Red Hat

This document is free software; see the license for terms of use, distribution and / or modification.

The source code for this document is available in a git repository. Instructions for building it from sources are available in the README file.

This document is developed as part of the Javapackages community project, which welcomes new contributors. Requests and comments related to this document should be reported at Red Hat Bugzilla.

Before contributing, please read the README file, which among other things includes basic rules that should be followed when working on this document. You can send patches to Red Hat Bugzilla. They should be in git patch format and prepared against the master branch. Alternatively, you can also send pull requests to the GitHub repository.

Abstract

This document aims to help developers create and maintain Java packages in Fedora. It does not supersede or replace Java Packaging Guidelines, but rather aims to document tools and techniques used for packaging Java software on Fedora.

1. Introduction

Clean Java packaging has historically been a daunting task. The lack of any standard addressing the physical location of files on the system, combined with the common use of licensing terms that only allow free redistribution of key components as part of a greater ensemble, has led to the systematic release of self-sufficient applications with built-in copies of external components.

As a consequence applications are only tested with the versions of the components they bundle, a complete Java system suffers from endless duplication of the same modules, and integrating multiple parts can be a nightmare since they are bound to depend on the same elements - only with different and subtly incompatible versions (different requirements, different bugs). Any security or compatibility upgrade must be performed for each of those duplicated elements.

This problem is compounded by the current practice of folding extensions into the JVM itself after a while; an element that could safely be embedded in an application will suddenly conflict with a JVM component and cause subtle failures.

It is not surprising, then, that complex Java systems tend to fossilize very quickly: the cost of keeping dependencies current grows so high, so fast, that people basically give up on it.

This situation is incompatible with the typical fast-evolving Linux platform. To achieve user- and administrator-friendly RPM packaging of Java applications, a custom infrastructure and strict packaging rules had to be developed.

1.1. Basic introduction to packaging, reasons, problems, rationale

This section provides a basic introduction to the Java packaging world for people coming from different backgrounds. The goal is to understand the language of all the groups involved. If you are a Java developer coming into contact with RPM packaging for the first time, start with the Java developer section. On the other hand, if you are coming from an RPM packaging background, an introduction to the Java world is probably a better starting point.

It should be noted that especially in this section we might sacrifice correctness for simplicity.

1.2. For Packagers

Java is a programming language which is usually compiled into bytecode for JVM (Java Virtual Machine). For more details about the JVM and bytecode specification see JVM documentation.

1.2.1. Example Java Project

To better illustrate the various parts of Java packaging we will dissect a simple Java Hello World application. Java sources are usually organized in directory hierarchies. A shared directory hierarchy creates a namespace called a package in Java terminology. To understand the naming mechanisms of Java packages, see the Java package naming conventions.

Let’s create a simple hello world application that will execute the following steps when run:

  1. Ask for a name.

  2. Print out Hello World from followed by the name from the previous step.

To illustrate certain points we artificially complicate things by creating:

  • An Input class used only for reading text from the terminal.

  • An Output class used only for output to the terminal.

  • A HelloWorld class used as the main application.

Directory listing of example project
$ find .
.
./Makefile
./src
./src/org
./src/org/fedoraproject
./src/org/fedoraproject/helloworld
./src/org/fedoraproject/helloworld/output
./src/org/fedoraproject/helloworld/output/Output.java
./src/org/fedoraproject/helloworld/input
./src/org/fedoraproject/helloworld/input/Input.java
./src/org/fedoraproject/helloworld/HelloWorld.java

In this project all packages are under src/ directory hierarchy.

HelloWorld.java listing
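The listing below is an illustrative sketch of what the HelloWorld class might look like; the exact method names of the Input and Output helper classes are assumptions made for this example.

package org.fedoraproject.helloworld;

import org.fedoraproject.helloworld.input.Input;
import org.fedoraproject.helloworld.output.Output;

public class HelloWorld {
    public static void main(String[] args) {
        Input input = new Input();
        Output output = new Output();
        // Ask for a name
        output.print("What is your name?");
        String name = input.readLine();
        // Print the greeting followed by the name entered above
        output.print("Hello World from " + name + "!");
    }
}
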
Java packages
org/fedoraproject/helloworld/input/Input.java
org/fedoraproject/helloworld/output/Output.java
org/fedoraproject/helloworld/HelloWorld.java

Although the directory structure of our package is hierarchical, there is no real parent-child relationship between packages. Each package is therefore seen as independent. The above example makes use of three separate packages:

  • org.fedoraproject.helloworld.input

  • org.fedoraproject.helloworld.output

  • org.fedoraproject.helloworld

Environment setup consists of two main parts:

  • Telling the JVM which Java class contains the main() method.

  • Adding the required JAR files to the JVM classpath.

Compiling our project

The sample project can be compiled to bytecode by the Java compiler, which can typically be invoked from the command line as javac.

javac $(find -name '*.java')

For every .java file a corresponding .class file will be created. The .class files contain Java bytecode which is meant to be executed on the JVM.

One could put the javac invocation into a Makefile and simplify the compilation a bit. That might be sufficient for such a simple project, but more complex projects would quickly become hard to build with this approach. The Java world has several high-level build systems which greatly simplify building Java projects; probably the best known are Apache Maven and Apache Ant.

See also Maven and Ant sections.

JAR files

Having our application split across many .class files would not be very practical, so those .class files are assembled into ZIP files with a specific layout, called JAR files. Most commonly these special ZIP files have a .jar suffix, but other variations exist (.war, .ear). They contain:

  • Compiled bytecode of our project.

  • Additional metadata stored in META-INF/MANIFEST.MF file.

  • Resource files such as images or localisation data.

  • Optionally the source code of our project (such a JAR is then called a source JAR).

They can also contain additional bundled software, which is something we do not want to have in packages. You can inspect the contents of a given JAR file by extracting it with the following command:

jar -xf something.jar

The detailed description of JAR file format is in the JAR File Specification.

Classpath

The classpath is a way of telling the JVM where to look for user classes and third-party libraries. By default only the current directory is searched; all other locations need to be specified explicitly by setting the CLASSPATH environment variable or via the -cp (-classpath) option of the java command.

Setting the classpath
java -cp /usr/share/java/log4j.jar:/usr/share/java/junit.jar mypackage.MyClass
CLASSPATH=/usr/share/java/log4j.jar:/usr/share/java/junit.jar java mypackage.MyClass

Please note that the JAR files are separated by colons in a classpath definition.

See official documentation for more information about classpath.

Wrapper scripts

Classic compiled applications use the dynamic linker to find dependencies (linked libraries), whereas dynamic languages such as Python, Ruby, and Lua have predefined directories where they search for imported modules. The JVM itself has no embedded knowledge of installation paths and thus no automatic way to resolve dependencies of Java projects. This means that Java applications have to use wrapper shell scripts to set up the environment before invoking the JVM and running the application itself. Note that this is not necessary for libraries.
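
A minimal sketch of such a wrapper, assuming a hypothetical application whose main class is com.example.MainClass and which needs the log4j and junit JARs on its classpath:

#!/bin/sh
# Build a classpath from system JARs (build-classpath is described later)
CLASSPATH=$(build-classpath log4j junit) || exit 1
# Launch the JVM with the application's main class
exec java -cp "$CLASSPATH" com.example.MainClass "$@"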

1.2.2. Build System Identification

The build system used by upstream can usually be identified by looking at its configuration files, which reside in the project directory structure, usually in its root or in specialized directories with names such as build or make.

Maven

Build managed by Apache Maven is configured by an XML file that is by default named pom.xml. In its simpler form it usually looks like this:

<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.myproject</groupId>
  <artifactId>myproject</artifactId>
  <packaging>jar</packaging>
  <version>1.0</version>
  <name>myproject</name>
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.1</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>

It describes the project’s build process in a declarative way, without explicitly specifying the exact steps needed to compile the sources and assemble the pieces together. It also specifies the project’s dependencies, which are usually the main point of interest for packagers. Another important Maven feature that packagers should know about is plugins. Plugins extend Maven with particular functionality, but unfortunately some of them may get in the way of packaging and need to be altered or removed. RPM macros are provided to facilitate modifying Maven dependencies and plugins.
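
An illustrative sketch of how such macros might be used in the %prep section of a spec file (the plugin and dependency names here are only examples):

%prep
%setup -q
# Remove a plugin that is not needed for an RPM build
# (e.g. a plugin that only signs release artifacts)
%pom_remove_plugin :maven-gpg-plugin
# Remove a test-only dependency declared in the POM
%pom_remove_dep :junit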

Ant

Apache Ant is also configured by an XML file. It is by convention named build.xml and in its simple form it looks like this:

<project name="MyProject" default="dist" basedir=".">
  <property name="src" location="src"/>
  <property name="build" location="build"/>
  <property name="dist" location="dist"/>

  <target name="init" description="Create build directory">
    <mkdir dir="${build}"/>
  </target>

  <target name="compile" depends="init"
        description="Compile the source">
    <javac srcdir="${src}" destdir="${build}"/>
  </target>

  <target name="dist" depends="compile"
        description="Generate jar">
    <mkdir dir="${dist}/lib"/>

    <jar jarfile="${dist}/myproject.jar" basedir="${build}"/>
  </target>

  <target name="clean" description="Clean build files">
    <delete dir="${build}"/>
    <delete dir="${dist}"/>
  </target>
</project>

An Ant build file consists mostly of targets, which are collections of steps needed to accomplish the intended task. They usually depend on each other and are generally similar to Makefile targets. Available targets can be listed by invoking ant -p in the project directory containing the build.xml file. If the file is named differently than build.xml, you have to tell Ant which file to use with the -f option followed by the name of the actual build file.

Some projects that use Apache Ant also use Apache Ivy to simplify dependency handling. Ivy is capable of resolving and downloading artifacts from Maven repositories based on dependencies described declaratively in XML. The project usually contains one or more ivy.xml files specifying the module's Maven coordinates and its dependencies. Ivy can also be used directly from Ant build files. To detect whether the project you wish to package is using Apache Ivy, look for files named ivy.xml or for nodes in the ivy namespace in the project's build file.

Make

While unlikely, it is still possible that you will encounter a project whose build is managed by plain old Makefiles. They contain a list of targets which consist of commands (marked with a tab at the beginning of the line) and are invoked by make target, or simply make to run the default target.

1.2.3. Quiz for Packagers

At this point you should have enough knowledge about Java to start packaging. If you are not able to answer following questions return back to previous sections or ask experienced packagers for different explanations of given topics.

  1. What is the difference between JVM and Java?

  2. What is a CLASSPATH environment variable and how can you use it?

  3. Name two typical Java build systems and how you can identify which one is being used

  4. What is the difference between the java and javac commands?

  5. What are contents of a typical JAR file?

  6. What is a pom.xml file and what information does it contain?

  7. How would you handle packaging software that contains lib/junit4.jar inside source tarball?

  8. Name at least three methods for bundling code in Java projects

1.3. For Java Developers

Packaging Java software has specifics which we will try to cover in this section aimed at Java developers who are already familiar with Java language, JVM, classpath handling, Maven, pom.xml file structure and dependencies.

Instead we will focus on basic packaging tools and relationships between Java and RPM world. One of the most important questions is: What is the reason to package software in RPM (or other distribution-specific formats). There are several reasons for it, among others:

  • Unified way of installation of software for users of distribution regardless of upstream projects

  • Verification of authenticity of software packages by signing them

  • Simplified software updates

  • Automatic handling of dependencies for users

  • Common filesystem layout across distribution enforced by packaging standards

  • Ability to administer, monitor and query packages installed on several machines through unified interfaces

  • Distribution of additional metadata with the software itself such as licenses used, homepage for the project, changelogs and other information that users or administrators can find useful

1.3.1. Example RPM Project

RPM uses spec files as recipes for building software packages. We will use it to package example project created in previous section. If you did not read it you do not need to; the file listing is available here and the rest is not necessary for this section.

Directory listing
Makefile
src
src/org
src/org/fedoraproject
src/org/fedoraproject/helloworld
src/org/fedoraproject/helloworld/output
src/org/fedoraproject/helloworld/output/Output.java
src/org/fedoraproject/helloworld/input
src/org/fedoraproject/helloworld/input/Input.java
src/org/fedoraproject/helloworld/HelloWorld.java

We packed the project directory into file helloworld.tar.gz.

Example spec file
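The listing below is an illustrative sketch of such a spec file; details like the license tag and the exact build commands are assumptions made for this example.

Name:           helloworld
Version:        1.0
Release:        1%{?dist}
Summary:        Simple Java application which greets the user by name
# The license is an assumption made for this sketch
License:        GPLv2+
URL:            https://example.com/helloworld
Source0:        helloworld.tar.gz

BuildRequires:  java-devel
BuildRequires:  javapackages-tools
Requires:       java-headless
Requires:       javapackages-tools

%description
Simple Hello World application written in Java. It asks for a name and
prints a greeting containing it.

%prep
%setup -q -n helloworld

%build
# Compile the sources and assemble them into a JAR file
mkdir -p build
javac -d build $(find src -name '*.java')
jar cf helloworld.jar -C build .

%install
# Install the JAR into %{_javadir} and generate a wrapper script
install -d -m 755 %{buildroot}%{_javadir}
install -p -m 644 helloworld.jar %{buildroot}%{_javadir}/helloworld.jar
%jpackage_script org.fedoraproject.helloworld.HelloWorld "" "" helloworld helloworld true

%files
%{_javadir}/helloworld.jar
%attr(0755,root,root) %{_bindir}/helloworld

%changelog
* Tue Mar 04 2014 Jane Doe <jdoe@example.com> - 1.0-1
- Initial packaging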

RPM spec files contain several basic sections:

Header, which contains:
  • Package metadata such as its name, version, release, license, …​

  • A Summary with basic one-line summary of package contents.

  • Package source URLs denoted with Source0 to SourceN directives.

    • Source files can then be referenced by %SOURCE0 to %SOURCEn, which expand to actual paths to given files.

    • In practice, the source URL shouldn’t point to a file in our filesystem, but to an upstream release on the web.

  • Patches - using Patch0 to PatchN.

  • Project dependencies.

    • Build dependencies specified by BuildRequires, which need to be determined manually.

    • Runtime dependencies will be detected automatically. If that fails, you have to specify them with Requires.

    • More information on this topic can be found in the dependency handling section.

%description
  • Few sentences about the project and its uses. It will be displayed by package management software.

%prep section
  • Unpacks the sources using %setup -q, or manually if needed.

  • If a source file doesn’t need to be extracted, it can be copied to build directory by cp -p %SOURCE0 ..

  • Apply patches with %patch X, where X is the number of the patch you want to apply (you usually also need the -p1 option, so it would be: %patch 0 -p1).

%build section
  • Contains the project compilation instructions. Usually consists of calling the project's build system, such as Ant, Maven or Make.

Optional %check section
  • Runs the project's integration tests. Unit tests are usually run in the %build section, so if there are no integration tests available, this section is omitted.

%install section
  • Copies built files that are supposed to be installed into %{buildroot} directory, which represents target filesystem’s root.

%files section
  • Lists all files, that should be installed from %{buildroot} to target system.

  • Documentation files are prefixed by %doc and are taken from build directory instead of buildroot.

  • Directories that this package should own are prefixed with %dir.

%changelog
  • Contains changes to this spec file (not upstream).

  • Has a prescribed format. To prevent formatting mistakes, it is advised to use a tool such as rpmdev-bumpspec from the rpmdevtools package to append new changelog entries instead of editing the changelog by hand.
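
An illustrative example of appending an entry with rpmdev-bumpspec and of the resulting changelog format (the name, e-mail address and date are placeholders):

$ rpmdev-bumpspec -c "Fix wrapper script generation" helloworld.spec

%changelog
* Tue Mar 04 2014 Jane Doe <jdoe@example.com> - 1.0-2
- Fix wrapper script generation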

To build RPM from this spec file save it in your current directory and run rpmbuild:

$ rpmbuild -bb helloworld.spec

If everything worked OK, this should produce the RPM file ~/rpmbuild/RPMS/x86_64/helloworld-1.0-1.fc18.x86_64.rpm. You can use the rpm -i or dnf install commands to install this package; it will add /usr/share/java/helloworld.jar and the /usr/bin/helloworld wrapper script to your system. Please note that this spec file is simplified and lacks some additional parts, such as license installation.

Paths and filenames might be slightly different depending on your architecture and distribution. Output of the commands will tell you exact paths.

As you can see, RPM files are built with the rpmbuild command. It has several other options, some of which we will cover later on.

Other than building binary RPMs (-bb), rpmbuild can also be used to:

  • build only source RPMs (SRPMs), packages containing the source files, which can later be built into RPMs (option -bs)

  • build all, both binary and source RPMs (option -ba)
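
For example, using the helloworld.spec from above (exact output paths may differ):

$ rpmbuild -bs helloworld.spec   # source RPM under ~/rpmbuild/SRPMS/
$ rpmbuild -ba helloworld.spec   # source RPM plus binary RPM under ~/rpmbuild/RPMS/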

See rpmbuild's manual page for all available options.

1.3.2. Querying repositories

Fedora comes with several useful tools which can provide great assistance in getting information from RPM repositories.

dnf repoquery is a tool for querying information from RPM repositories. Maintainers of Java packages might typically query the repository for information like "which package contains the Maven artifact groupId:artifactId".

Find out which package provides given artifact
$ dnf repoquery --whatprovides 'mvn(commons-io:commons-io)'
apache-commons-io-1:2.4-9.fc19.noarch

The example above shows that one can get to commons-io:commons-io artifact by installing apache-commons-io package.

By default, dnf repoquery uses all repositories enabled in the DNF configuration, but it is possible to explicitly specify any other repository. For example, the following command will query only the Rawhide repository:

$ dnf repoquery --available --disablerepo \* --enablerepo rawhide --whatprovides 'mvn(commons-io:commons-io)'
apache-commons-io-1:2.4-10.fc20.noarch

Sometimes it may be useful to just list all the artifacts provided by given package:

$ dnf repoquery --provides apache-commons-io
apache-commons-io = 1:2.4-9.fc19
jakarta-commons-io = 1:2.4-9.fc19
mvn(commons-io:commons-io) = 2.4
mvn(org.apache.commons:commons-io) = 2.4
osgi(org.apache.commons.io) = 2.4.0

The output above means that the package apache-commons-io provides two Maven artifacts: the previously mentioned commons-io:commons-io and org.apache.commons:commons-io. In this case the second one is just an alias for the same JAR file. See the section about artifact aliases for more information on this topic.

Another useful tool is rpm. It can do a lot of stuff, but most importantly it can replace dnf repoquery if one only needs to query local RPM database. Only installed packages, or local .rpm files, can be queried with this tool.

A common use case is checking which Maven artifacts a locally built package provides.

$ rpm -qp --provides simplemaven-1.0-2.fc21.noarch.rpm
mvn(com.example:simplemaven) = 1.0
mvn(simplemaven:simplemaven) = 1.0
simplemaven = 1.0-2.fc21

1.3.3. Quiz for Java Developers

  1. How would you build a binary RPM if you were given a source RPM?

  2. What is most common content of Source0 spec file tag?

  3. What is the difference between Version and Release tags?

  4. How would you apply a patch in RPM?

  5. Where on filesystem should JAR files go?

  6. What is the format of RPM changelog or what tool would you use to produce it?

  7. How would you install an application that needs certain layout (think ANT_HOME) while honoring distribution filesystem layout guidelines?

  8. How would you generate a script for running an application with main class org.project.MainClass which depends on the commons-lang JAR?

2. Java Specifics in Fedora for Users and Developers

This section contains information about the default Java implementation in Fedora, about switching between different Java runtime environments, and about a few useful tools which can be used during packaging and development.

2.1. Java implementation in Fedora

Fedora ships with an open-source reference implementation of Java Standard Edition called OpenJDK. OpenJDK provides Java Runtime Environment for Java applications and set of development tools for Java developers.

From the user's point of view, the java command is probably the most interesting. It is a Java application launcher which spawns a Java Virtual Machine (JVM), loads the specified class and executes its main method.

Here is an example how to run sample Java project from section Example Java Project:

$ java -cp src org.fedoraproject.helloworld.HelloWorld

OpenJDK provides a lot of interesting tools for Java developers:

  • javac is a Java compiler which translates source files to Java bytecode, which can be later interpreted by JVM.

  • jdb is a simple command-line debugger for Java applications.

  • javadoc is a tool for generating Javadoc documentation.

  • javap can be used for disassembling Java class files.

2.1.1. Switching between different Java implementations

Users and developers may want to have multiple Java environments installed at the same time. This is possible in Fedora, but only one of them can be the default Java environment on the system. Fedora uses alternatives for switching between the different installed JREs/JDKs.

# alternatives --config java

There are 3 programs which provide 'java'.

  Selection    Command
  -----------------------------------------------
   1           java-17-openjdk.x86_64 (/usr/lib/jvm/java-17-openjdk-17.0.2.0.8-1.fc35.x86_64/bin/java)
*+ 2           java-11-openjdk.x86_64 (/usr/lib/jvm/java-11-openjdk-11.0.14.1.1-5.fc35.x86_64/bin/java)
   3           java-latest-openjdk.x86_64 (/usr/lib/jvm/java-18-openjdk-18.0.1.0.10-1.rolling.fc35.x86_64/bin/java)

Enter to keep the current selection[+], or type selection number:

The example above shows how to choose the default Java environment. The java command will then point to the Java implementation provided by the given JRE.

See man alternatives for more information on how to use alternatives.

Developers may want to use Java compiler from different JDK. This can be achieved with alternatives --config javac.

2.2. Building classpath with build-classpath

Most Java applications need to specify a classpath in order to work correctly. Fedora contains several tools which make working with classpaths easier.

build-classpath - this tool takes JAR filenames or artifact coordinates as arguments and translates them to classpath-like string. See the following example:

$ build-classpath log4j junit org.ow2.asm:asm
/usr/share/java/log4j.jar:/usr/share/java/junit.jar:/usr/share/java/objectweb-asm4/asm.jar

log4j corresponds to log4j.jar stored in %{_javadir}. If the JAR file is stored in a subdirectory under %{_javadir}, it is necessary to pass subdirectory/jarname as an argument to build-classpath. Example:

$ build-classpath httpcomponents/httpclient.jar
/usr/share/java/httpcomponents/httpclient.jar

2.3. Building JAR repository with build-jar-repository

Another tool is build-jar-repository. It can fill a specified directory with symbolic or hard links to specified JAR files. Similarly to build-classpath, JARs can be identified by their names or artifact coordinates.

$ build-jar-repository my-repo log4j httpcomponents/httpclient junit:junit
$ ls -l my-repo/
total 0
lrwxrwxrwx. 1 msrb msrb 45 Oct 29 10:39 [httpcomponents][httpclient].jar -> /usr/share/java/httpcomponents/httpclient.jar
lrwxrwxrwx. 1 msrb msrb 25 Oct 29 10:39 [junit:junit].jar -> /usr/share/java/junit.jar
lrwxrwxrwx. 1 msrb msrb 25 Oct 29 10:39 [log4j].jar -> /usr/share/java/log4j.jar

Similar command rebuild-jar-repository can be used to rebuild JAR repository previously built by build-jar-repository. See man rebuild-jar-repository for more information.

build-classpath-directory is a small tool which can be used to build classpath string from specified directory.

$ build-classpath-directory /usr/share/java/xstream
/usr/share/java/xstream/xstream-benchmark.jar:/usr/share/java/xstream/xstream.jar
:/usr/share/java/xstream/xstream-hibernate.jar

3. Java Specifics in Fedora for Packagers

3.1. Directory Layout

This section describes most of the directories used for Java packaging. Each directory is named in RPM macro form, which shows how it should be used in RPM spec files. The symbolic name is followed by the usual macro expansion (i.e. the physical directory location in the file system) and a short description.

Directories commonly used by regular packages
%{_javadir} — /usr/share/java

Directory that holds all JAR files that do not contain or use native code and do not depend on a particular Java standard version. JAR files can either be placed directly in this directory or one of its subdirectories. Often packages create their own subdirectories there, in this case subdirectory name should match package name.

%{_jnidir} — /usr/lib/java

Directory where architecture-specific JAR files are installed. In particular, JAR files containing or using native code (Java Native Interface, JNI) should be installed there.

%{_javadocdir} — /usr/share/javadoc

Root directory where all Java API documentation (Javadoc) is installed. Each source package usually creates a single subdirectory containing aggregated Javadocs for all binary packages it produces.

%{_mavenpomdir} — /usr/share/maven-poms

Directory where Project Object Model (POM) files used by Apache Maven are installed. Each POM must have name that strictly corresponds to JAR file in %{_javadir} or %{_jnidir}.

%{_ivyxmldir} — /usr/share/ivy-xmls

Directory where ivy.xml files used by Apache Ivy are installed. Each XML must have name that strictly corresponds to JAR file in %{_javadir} or %{_jnidir}.

Other directories
%{_jvmdir} — /usr/lib/jvm

Root directory where different Java Virtual Machines (JVM) are installed. Each JVM creates a subdirectory, possibly with several alternative names implemented with symbolic links. Directories prefixed with java contain a Java Development Kit (JDK), while directories whose names start with jre hold a Java Runtime Environment (JRE).

%{_jvmsysconfdir} — /etc/jvm
%{_jvmcommonsysconfdir} — /etc/jvm-common

Directories containing configuration files for Java Virtual Machines (JVM).

%{_jvmprivdir} — /usr/lib/jvm-private
%{_jvmlibdir} — /usr/lib/jvm
%{_jvmdatadir} — /usr/share/jvm
%{_jvmcommonlibdir} — /usr/lib/jvm-common
%{_jvmcommondatadir} — /usr/share/jvm-common

Directories containing implementation files of Java Virtual Machines (JVM). Describing them in detail is out of scope for this document. Purpose of each directory is commented briefly in macros.jpackage file in /etc/rpm. More detailed description can be found in JPackage policy.

%{_javaconfdir} — /etc/java

Directory containing Java configuration files. In particular it contains main Java configuration file — java.conf.

3.2. JAR File Identification

Complex Java applications usually consist of multiple components. Each component can have multiple implementations, called artifacts. Artifacts in Java context are usually JAR files, but can also be WAR files or any other kind of file.

There are multiple incompatible ways of identifying (naming) Java artifacts and each build system often encourages usage of a specific naming scheme. This means that Linux distributions also need to allow each artifact to be located using several different identifiers, possibly using different schemes. On the other hand it is virtually impossible to support every naming scheme, so there are some simplifications.

This chapter describes different ways to identify and locate artifacts in the system repository.

3.2.1. Relative paths

JAR artifacts are installed in one of standard directory trees. Usually this is either %{_javadir} (/usr/share/java) or %{_jnidir} (/usr/lib/java).

The simplest way of identifying artifacts is by their relative path from one of the standard locations. All artifacts can be identified this way because each artifact has a unique file name. A path identifying an artifact will be called an artifact path in this document.

To keep artifact paths simpler and more readable, the extension can be omitted if it is equal to jar. For non-JAR artifacts the extension cannot be omitted and must be retained.
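
For example (the paths correspond to the build-classpath examples earlier in this document):

junit                       -> /usr/share/java/junit.jar
httpcomponents/httpclient   -> /usr/share/java/httpcomponents/httpclient.jar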

Additionally, if an artifact path points to a directory then it represents all artifacts contained in that directory. This allows a whole set of related artifacts to be referenced easily by specifying the name of the directory containing all of them.

If the same artifact path has valid expansions in two different root directories then it is unspecified which artifacts will be located.

3.2.2. Artifact specification

As noted in previous section, every artifact can be uniquely identified by its file path. However this is not always the preferred way of artifact identification.

Modern Java build systems provide a way of identifying artifacts with an abstract identifier or, more often, a pair of identifiers. The first is usually called the group ID or organization ID, while the second is just the artifact ID. This pair of identifiers will be called artifact coordinates in this document. Besides the group ID and artifact ID, artifact coordinates may also include other optional information about the artifact, such as extension, classifier and version.

In Linux distributions it is important to stay close to the upstreams providing the software being packaged, so the ability to identify artifacts in the same way as upstream does is very important from the packaging point of view. Every artifact can optionally be identified by artifact coordinates assigned during the package build. Packages built with Maven automatically use this feature, but all other packages, even those built with pure javac, can use this feature too (see the description of the %mvn_artifact and %add_maven_depmap macros).
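
For illustration, both of the following coordinate forms appear elsewhere in this document:

commons-io:commons-io     # group ID and artifact ID only
webapp:webapp:war:3.1     # group ID, artifact ID, extension and version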

3.2.3. Aliases

Aliases work in two ways:

  • Symlinks for paths

  • Additional mappings for artifact specifications

In the real world the same project can appear under different names as it evolves or is released differently. Therefore other projects may refer to those alternative names instead of using the name currently preferred by upstream.

Artifact aliases

XMvn provides a way to attach multiple artifact coordinates to a single artifact. Dependent projects that use alternative coordinates can then be built without the need to patch their POMs or alter the build by other means. XMvn will also generate virtual provides for the alias, so it can also be used in Requires and BuildRequires. Creating an alias is achieved with the %mvn_alias macro.

Example invocation
# com.example.foo:bar (the actual artifact existing in the project) will also
# be available as com.example.foo:bar-all
%mvn_alias com.example.foo:bar com.example.foo:bar-all

# You don't need to repeat the part of coordinates that stays the same
# (groupID in this case)
%mvn_alias com.example.foo:bar :bar-all

# You can specify multiple aliases at once
%mvn_alias com.example.foo:bar :bar-all :bar-lib

# The macro supports several shortcuts to generate multiple aliases.
# Braces - {} - capture their content, which can then be referenced in the
# alias part with @N, where N is the index of the capture group.
# * acts as a wildcard (matching anything)
# The following generates aliases ending with shaded for all artifacts in the
# project
%mvn_alias 'com.example.foo:{*}' :@1-shaded

3.2.4. Compatibility versions

Handling of compatibility packages, versioned jars etc.

In Fedora we prefer to always have only the latest version of a given project. Unfortunately, this is not always possible as some projects change too much and it would be too hard to port dependent packages to the current version. It is not possible to just update the package and keep the old version around, as the names, file paths and dependency provides would clash. The recommended practice is to update the current package to the new version and create a new package representing the old version (called a compat package). The compat package needs to have the version number (usually only the major number, unless further distinction is necessary) appended to its name, thus effectively having a different name from RPM’s point of view. Such a compat package needs to perform some additional steps to ensure that it can be installed and used alongside the non-compat one.

You should always evaluate whether creating a compat package is really necessary. Porting dependent projects to new versions of dependencies may be a complicated task, but your effort would be appreciated and it is likely that the patch will be accepted upstream at some point in time. If the upstream is already inactive and the package is not required by anything, you should also consider retiring it.

Maven Compat Versions

XMvn supports marking a particular artifact as compat, performing the necessary steps to avoid clashes with the non-compat version. An artifact can be marked as compat with %mvn_compat_version. It accepts an artifact argument which determines which artifact will be compat. The format for specifying artifact coordinates is the same as with %mvn_alias. In the common case you will want to mark all artifacts as compat. You can specify multiple compat versions at a time.

Dependency resolution of compat artifacts

When XMvn performs dependency resolution for a dependency artifact in a project, it checks the dependency version and compares it against all versions of the artifact installed in the buildroot. If none of the compat artifacts matches it will resolve the artifact to the non-compat one. This has a few implications:

  • The versions are compared for exact match. The compat package should provide all applicable versions that are present in packages that are supposed to be used with this version.

  • The dependent packages need to have correct BuildRequires on the compat package as the virtual provides is also different (see below).

File names and virtual provides

In order to prevent file name clashes, compat artifacts have the first specified compat version appended to the filename. Virtual provides for compat artifacts also contain the version as the last part of the coordinates. There are multiple provides for each specified compat version. Non-compat artifacts do not have any version in the virtual provides.

Example invocation of %mvn_compat_version
# Assuming the package has name bar and version 3
# Sets the compat version of foo:bar artifact to 3
%mvn_compat_version foo:bar 3
# The installed artifact file (assuming it's jar and there were no
# %mvn_file calls) will be at %{_javadir}/bar/bar-3.jar
# The generated provides for foo:bar will be
# mvn(foo:bar:3) = 3
# mvn(foo:bar:pom:3) = 3

# Sets the compat versions of all artifacts in the build to 3 and 3.2
%mvn_compat_version : 3 3.2

3.3. Dependency Handling

RPM has multiple types of metadata to describe dependency relationships between packages. The two basic types are Requires and Provides. Requires denote that a package needs something to be present at runtime to work correctly and the package manager is supposed to ensure that requires are met. A single Requires item can specify a package or a virtual provide. RPM Provides are a way to express that a package provides certain capability that other packages might need. In case of Maven packages, the Provides are used to denote that a package contains certain Maven artifact. They add more flexibility to the dependency management as single package can have any number of provides, and they can be moved across different packages without breaking other packages' requires. Provides are usually generated by automatic tools based on the information from the built binaries or package source.

Dependency handling for Maven packages

The Java packaging tooling on Fedora provides automatic Requires and Provides generation for packages built using XMvn. The Provides are based on the Maven artifact coordinates of the artifacts installed by the package currently being built. They are generated for each subpackage separately. They follow a general format mvn(groupId:artifactId:extension:classifier:version), where the extension is omitted if it is jar and the classifier is omitted if it is empty. The version is present only for compat artifacts, but the trailing colon has to be present unless it is a JAR artifact with no classifier.

# Example provide for Jar artifact
mvn(org.eclipse.jetty:jetty-server)
# Example provide for POM artifact
mvn(org.eclipse.jetty:jetty-parent:pom:)
# Example provide for Jar artifact with classifier
mvn(org.sonatype.sisu:sisu-guice::no_aop:)

The generated Requires are based on dependencies specified in Maven POMs in the project. Only dependencies with scope set to either compile, runtime or not set at all are used for Requires generation. Requires do not rely on package names and instead always use virtual provides that were described above, in exactly the same format, in order to be satisfiable by the already existing provides. For packages consisting of multiple subpackages, Requires are generated separately for each subpackage. Additionally, Requires that point to an artifact in a different subpackage of the same source package are generated with exact versions to prevent version mismatches between artifacts belonging to the same project.

The requires generator also always generates Requires on java-headless and javapackages-tools.

Dependency handling for non-Maven packages that ship POM files

If the package is built using a different tool than Apache Maven but still ships Maven POM(s), you will still get automatic provides generation if you install the POM using %mvn_artifact and %mvn_install. The requires generation will also be executed, but the outcome largely depends on whether the POM contains accurate dependency information. If it contains dependency information, you should double-check that it is correct and up-to-date. Otherwise you need to add Requires tags manually as described in the next section.

Dependency handling for non-Maven packages that don’t ship POM files

For packages without POMs it is necessary to specify Requires tags manually. In order to build the package you needed to specify BuildRequires tags; your Requires will therefore likely be a subset of your BuildRequires, excluding build tools and test-only dependencies.
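
An illustrative sketch (the artifact names are only examples):

# Runtime dependencies specified manually, typically a subset of the
# BuildRequires minus build tools and test-only libraries such as junit
Requires:       mvn(commons-io:commons-io)
Requires:       mvn(log4j:log4j)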

Querying Requires and Provides of built packages

The generated Requires and Provides of built packages can be queried using rpm:

rpm -qp --provides path/to/example-1.0.noarch.rpm
rpm -qp --requires path/to/example-1.0.noarch.rpm
Generating BuildRequires

While Requires and Provides generation is automated for Maven projects, BuildRequires still remains a manual task. However, there are tools to simplify it to some extent. XMvn ships a script xmvn-builddep that takes a build.log output from mock and prints Maven-style BuildRequires on artifacts that were actually used during the build. It does not help you to figure out what the BuildRequires are before you actually build it, but it may help you to have a minimal set of BuildRequires that are less likely to break, as they do not rely on transitive dependencies.
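
An illustrative invocation (the path to build.log and the exact output depend on your mock configuration):

$ xmvn-builddep /var/lib/mock/fedora-rawhide-x86_64/result/build.log
BuildRequires:  mvn(commons-io:commons-io)
BuildRequires:  mvn(junit:junit)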

3.4. Javadoc packages

Javadoc subpackages in Fedora provide automatically generated API documentation for Java libraries and applications. The Java Development Kit comes with a tool called javadoc. This tool can be used for generating documentation from specially formatted comments in Java source files. The output of this tool, together with license files, usually represents the only content of javadoc subpackages. Note that the javadoc invocation is typically handled by the build system and the package maintainer does not need to deal with it directly.

A javadoc subpackage shouldn’t depend on its base package and vice versa. The rationale behind this rule is that documentation can usually be used independently of the application or library, and therefore the base package does not always need to be installed. Users are given the option to install the application and its documentation separately.

You can learn more about javadoc from official documentation.

3.5. Core Java packages

3.5.1. JVM

Fedora allows multiple Java Virtual Machines (JVMs) to be packaged independently. Java packages should not directly depend on any particular JVM, but instead require one of three virtual JVM packages, depending on what Java functionality is required.

java-headless

This package provides a working Java Runtime Environment (JRE) with some functionality disabled. Graphics and audio support may be unavailable in this case. java-headless provides functionality that is enough for most of packages and avoids pulling in a number of graphics and audio libraries as dependencies. Requirement on java-headless is appropriate for most of Java packages.

java

Includes the same base functionality as java-headless, but also implements audio and graphics subsystems. Packages should require java if they need some functionality from these subsystems, for example creating GUI using AWT library.

java-devel

Provides full Java Development Kit (JDK). In most cases only packages related to Java development should have runtime dependencies on java-devel. Runtime packages should require java-headless or java. Some packages not strictly related to java development need access to libraries included with JDK, but not with JRE (for example tools.jar). That is one of few cases where requiring java-devel may be necessary.

Packages that require a minimal Java standard version can add versioned dependencies on one of the virtual packages providing the Java environment. For example, packages depending on functionality of JDK 8 can require java-headless >= 1:1.8.0.

Epoch in versions of JVM packages

For compatibility with the JPackage project, packages providing Java 1.6.0 or later use an epoch equal to 1. This was necessary because the package java-1.5.0-ibm from the JPackage project had epoch 1 for some reason. Therefore packages providing other implementations of the JVM also had to use a non-zero epoch in order to keep version ordering correct.

3.5.2. Java Packages Tools

Java Packages Tools are packaged as several binary RPM packages:

maven-local

This package provides a complete environment which is required to build Java packages using the Apache Maven build system. This includes a default system version of the Java Development Kit (JDK), Maven, a number of Maven plugins commonly used to build packages, various macros and utility tools. maven-local is usually declared as a build dependency of Maven packages.

ivy-local

Analogously to maven-local, this package provides an environment required to build Java packages using Apache Ivy as dependency manager.

javapackages-local

Package providing the basic environment necessary to generate and install metadata for the system artifact repository.

javapackages-tools

Package owning basic Java directories and providing runtime support for Java packages. The great majority of Java packages depend on javapackages-tools.

4. Packaging Best Practices

Packaging Java has certain specifics, which will be covered in this section together with basic packaging principles:

  • No bundling

  • Working with upstreams

  • Commenting workarounds

  • Single library version

  • Links to other appropriate documents

  • …​

5. Generic Java Builds

This chapter talks about basic build steps in Java such as invoking javac and using spec macros like build-classpath and build-jar-repository.
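
A minimal sketch of a %build section for such a generic build, assuming a hypothetical project that compiles against the commons-lang and log4j JARs:

%build
# Resolve system JARs into a colon-separated classpath
export CLASSPATH=$(build-classpath commons-lang log4j)
# Compile the sources and assemble the resulting classes into a JAR
mkdir -p build
javac -d build $(find src -name '*.java')
jar cf %{name}.jar -C build .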

5.1. Generating Application Shell Scripts

As mentioned in the section about Java packaging basics, all Java applications need wrapper shell scripts to set up the environment before running the JVM and the associated Java code.

The javapackages-tools package contains a convenience %jpackage_script macro that can be used to create scripts that work for the majority of packages. See its definition and documentation in /usr/lib/rpm/macros.d/macros.jpackage. One thing to pay attention to is the 6th argument to it - whether to prefer a JRE over a full SDK when looking up a JVM to invoke. Most packages that do not require the full Java SDK will want to set that to true to avoid unexpected results when looking up a JVM when some of the installed JREs do not have the corresponding SDK (*-devel package) installed.

%install
...
%jpackage_script msv.textui.Driver "" "" msv-msv:msv-xsdlib:relaxngDatatype:isorelax msv true
...

The previous example installs the msv script (5th argument) with the main class being msv.textui.Driver (1st argument). No optional flags (2nd argument) or options (3rd argument) are used. The generated script will add several libraries to the classpath before executing the main class (4th argument, JAR files separated with :). build-classpath is run on every part of the 4th argument to create the full classpath.

Sometimes it may be necessary to replace all JAR files in the current directory with symlinks to the system JARs located in %{_javadir}. This task can be achieved using a tool called xmvn-subst.

$ ls -l
-rw-r--r--. 1 msrb msrb  40817 Oct 22 09:16 cli.jar
-rw-r--r--. 1 msrb msrb 289983 Oct 22 09:17 junit4.jar
-rw-r--r--. 1 msrb msrb 474276 Oct 22 09:14 log4j.jar
$ xmvn-subst .
[INFO] Linked ./cli.jar to /usr/share/java/commons-cli.jar
[INFO] Linked ./log4j.jar to /usr/share/java/log4j.jar
[INFO] Linked ./junit4.jar to /usr/share/java/junit.jar
$ ls -la
lrwxrwxrwx. 1 msrb msrb   22 Oct 22 10:08 cli.jar -> /usr/share/java/commons-cli.jar
lrwxrwxrwx. 1 msrb msrb   22 Oct 22 10:08 junit4.jar -> /usr/share/java/junit.jar
lrwxrwxrwx. 1 msrb msrb   22 Oct 22 10:08 log4j.jar -> /usr/share/java/log4j.jar

The example above shows how easy the symlinking can be. However, there are some limitations. The original JAR files need to carry metadata which tells xmvn-subst which artifact a given file should be substituted with. Otherwise xmvn-subst won’t be able to identify the Maven artifact from the JAR file.

See xmvn-subst -h for all available options.

6. Ant

Apache Ant is a Java library and command-line tool whose mission is to drive processes described in build files as targets and extension points dependent upon each other.

— https://ant.apache.org/

Apache Ant is one of the most popular Java build tools after Apache Maven. The main difference between these two tools is that Ant is procedural and Maven is declarative. When using Ant, it is necessary to describe exactly the process which leads to the result. This means that one needs to specify where the source files are, what needs to be done and when it needs to be done. On the other hand, Maven relies on conventions and doesn’t require specifying most of the process unless you need to override the defaults.

If upstream ships a Maven POM file, it must be installed even if you don’t build with Maven. If not, you should try to search Maven Central Repository for it, ship it as another source and install it.

Common spec file
BuildRequires: ant
BuildRequires: javapackages-local
...
%build
ant test

%install
%mvn_artifact pom.xml lib/%{name}.jar

%mvn_install -J api/

%files -f .mfiles
%files javadoc -f .mfiles-javadoc
Details
  • The %build section uses the ant command to build the project and run the tests. The target(s) used may vary depending on the build.xml file. You can use the ant -p command to list the project info, or manually look for <target> nodes in the build.xml file.

  • %mvn_artifact macro is used to request installation of an artifact that was not built using Maven. It expects a POM file and a JAR file. For POM only artifacts, the JAR part is omitted. See Installing additional artifacts for more information.

  • %mvn_install performs the actual installation. Optional -J parameter requests installation of generated Javadoc from given directory.

  • This method of artifact installation allows using other XMvn macros such as %mvn_alias or %mvn_package.

  • %mvn_install generates a .mfiles file which should be used to populate the %files section with the -f switch. For each subpackage a separate file named .mfiles-subpackage-name is generated.

  • All packages are required to own directories which they create (and which are not owned by other packages). JAR files are by default installed into subdirectory of %{_javadir}. To override this behavior, use %mvn_file.

6.1. Apache Ivy

Apache Ivy provides an automatic dependency management for Ant managed builds. It uses Maven repositories for retrieving artifacts and supports many declarative features of Maven such as handling transitive dependencies.

XMvn supports local resolution of Ivy artifacts, their installation and requires generation.

Spec file
BuildRequires: ivy-local
...
%build
ant -Divy.mode=local test

%install
%mvn_artifact ivy.xml lib/sample.jar

%mvn_install -J api/

%files -f .mfiles
%files javadoc -f .mfiles-javadoc
Details
  • -Divy.mode=local tells Ivy to use XMvn local artifact resolution instead of downloading from the Internet.

  • If there is an ivy-settings.xml or similar file, which specifies remote repositories, it needs to be disabled, otherwise it would override local resolution.

  • %mvn_artifact supports installing artifacts described by Ivy configuration files.

  • %mvn_install performs the actual installation. Optional -J parameter requests installation of generated Javadoc from given directory.

Ivy files manipulation

A subset of the macros used to modify Maven POMs also works with ivy.xml files, allowing the maintainer to add, remove or change dependencies without the need to create patches and rebase them with each change. You can use the dependency handling macros %pom_add_dep, %pom_remove_dep, %pom_change_dep and the generic %pom_xpath_* macros. For more details, see the corresponding manpages.

# Remove dependency on artifact with org="com.example" and
# name="java-project" from ivy.xml file in current directory
%pom_remove_dep com.example:java-project

# Add dependency on artifact with org="com.example" and
# name="foobar" to ./submodule/ivy.xml
%pom_add_dep com.example:foobar submodule
Using the ivy:publish task

Ivy supports publishing built artifacts with the ivy:publish task. If your build.xml file already contains a target that calls ivy:publish, you can set the resolver attribute of the ivy:publish element to xmvn. This can be done with a simple %pom_xpath_set call. Then, when the task is run, XMvn will pick up the published artifacts and install them during the run of %mvn_install, without you needing to specify them manually with %mvn_artifact.

Spec file using the ivy:publish task
BuildRequires: ivy-local
...
%prep
%pom_xpath_set ivy:publish/@resolver xmvn build.xml

%build
ant -Divy.mode=local test publish-local

%install
%mvn_install -J api/

%files -f .mfiles
%files javadoc -f .mfiles-javadoc
Details
  • The publish target may be named differently. Search the build.xml for occurrences of ivy:publish.

  • %mvn_install will install all the published artifacts.

7. Maven

Apache Maven is a software project management and comprehension tool. Based on the concept of a project object model (POM), Maven can manage a project’s build, reporting and documentation from a central piece of information.

— https://maven.apache.org

Maven is by far the most consistent Java build system, allowing a large amount of automation. In the most common situations only the following steps are necessary:

  1. In %build section of the spec file use %mvn_build macro.

  2. In %install section, use %mvn_install macro.

  3. Use the generated .mfiles file lists to populate the %files section with the -f switch.

Common spec file sections
BuildRequires:  maven-local
...
%build
%mvn_build
...

%install
%mvn_install
...

%files -f .mfiles
%dir %{_javadir}/%{name}

%files javadoc -f .mfiles-javadoc

The macros %mvn_build and %mvn_install automatically handle building of the JAR files and their subsequent installation to the correct directory. The corresponding POM and metadata files are also installed.

7.1. Packaging Maven project

This step-by-step guide will show you how to package a Maven project. Let’s start with probably the simplest spec file possible.

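The listing below is an illustrative sketch of such a spec file; the URL, license and description are assumptions made for this example.

Name:           simplemaven
Version:        1.0
Release:        1%{?dist}
Summary:        Simple Maven project
# The license is an assumption made for this sketch
License:        ASL 2.0
URL:            https://example.com/simplemaven
Source0:        https://example.com/simplemaven/simplemaven-1.0.tar.gz
BuildArch:      noarch

BuildRequires:  maven-local

%description
Simple example project built with Apache Maven.

%package javadoc
Summary:        API documentation for %{name}

%description javadoc
This package provides %{summary}.

%prep
%setup -q

%build
%mvn_build

%install
%mvn_install

%files -f .mfiles
%dir %{_javadir}/%{name}

%files javadoc -f .mfiles-javadoc

%changelog
* Tue Mar 04 2014 Jane Doe <jdoe@example.com> - 1.0-1
- Initial packaging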

The spec file above is an example of how a spec file for a simple Maven project may look. Both the %build and %install sections consist of only one line.

Another interesting line:

BuildRequires:  maven-local

All Maven projects need to have BuildRequires on maven-local. They also need Requires and BuildRequires on the basic Java packaging tooling (javapackages-tools), but the build system adds these automatically. The package maintainer does not need to list them explicitly.

%dir %{_javadir}/%{name}

By default, resulting JAR files will be installed in %{_javadir}/%{name}, therefore the package needs to own this directory.

The build could fail for many reasons, but probably the most common one is a build failure due to missing dependencies.

We can try to remove these missing dependencies from pom.xml and make Maven stop complaining about them. However, these removed dependencies may be crucial for building the project and therefore it may be necessary to package them later. Let’s remove the dependencies from pom.xml.

Remove dependencies from pom.xml
...
%prep
%setup -q

# Add following lines to %prep section of a spec file
%pom_remove_dep :commons-io
%pom_remove_dep :junit

The package maintainer can use a wide variety of “pom_” macros for modifying pom.xml files. See the Macros for POM modification section for more information.

Now try to build the project again. The build will fail with a compilation failure.

Oops, another problem. This time Maven thought it had all the necessary dependencies, but Java compiler found otherwise.

Now it is possible to either patch the source code not to depend on the missing libraries, or to package them. The second approach is usually the correct one. It is not necessary to package every dependency right away. The maintainer could package compile-time dependencies first and keep the rest for later (test dependencies, etc.). But Maven needs to know that it should not try to run the tests now. This can be achieved by passing the -f option to the %mvn_build macro. Maven will stop complaining about missing test-scoped dependencies from then on.

Another reason to disable the test phase is to speed up the local build process. This can also be achieved by specifying an additional switch --without=tests to the fedpkg or the mock tool instead of adding a switch to %mvn_build.

Another switch --without=javadoc causes the build to skip Javadoc generation.

It is always recommended to run all available test suites during build. It greatly improves quality of the package.

We already have package which provides commons-io:commons-io artifact, let’s add it to the BuildRequires. Also disable tests for now.

BuildRequires:  maven-local
BuildRequires:  apache-commons-io
...
%prep
%setup -q

# Comment out following lines in %prep section
#%%pom_remove_dep :commons-io
#%%pom_remove_dep :junit

%build
# Skip tests for now, missing dependency junit:junit:4.11
%mvn_build -f

One can easily search for package which provides the desired artifact. Try dnf repoquery --whatprovides 'mvn(commons-io:commons-io)', or see how to query repositories.

Now try to build the project one more time. The build should succeed now. Congrats, you managed to create an RPM from Maven project!

There is plenty of other things maintainer may want to do. For example, they may want to provide symbolic links to the JAR file in %{_javadir}.

This can be easily achieved with %mvn_file macro:

%prep
%setup -q

%mvn_file : %{name}/%{name} %{name}

See Alternative JAR File Names section for more information.

Another quite common thing to do is adding aliases to a Maven artifact. Try running rpm -qp --provides on your locally built RPM package:

$ rpm -qp --provides simplemaven-1.0-1.fc21.noarch.rpm
mvn(com.example:simplemaven) = 1.0
simplemaven = 1.0-1.fc21

The output above tells us that the RPM package provides the Maven artifact com.example:simplemaven:1.0. Upstream may change the groupId:artifactId with any new release, and it happens. For example, org.apache.commons:commons-io changed to commons-io:commons-io some time ago. It is not a big deal for the package itself, but it is a huge problem for other packages that depend on that particular package: some packages may still have dependencies on the old groupId:artifactId, which is suddenly unavailable. Luckily, there is an easy way to solve problems like these. The package maintainer can add aliases to the actually provided Maven artifact.

Add alias to Maven artifact
%mvn_alias com.example:simplemaven simplemaven:simplemaven

See Additional Mappings for more information on %mvn_alias.

Rebuild the package and check the rpm -qp --provides output again:

$ rpm -qp --provides simplemaven-1.0-2.fc21.noarch.rpm
mvn(com.example:simplemaven) = 1.0
mvn(simplemaven:simplemaven) = 1.0
simplemaven = 1.0-2.fc21

Now it does not matter if some other package depends on either of the listed artifacts. Both dependencies will always be satisfied by your package.
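
For example, a hypothetical dependent package could use either of the following lines and both would be satisfied by simplemaven:

BuildRequires:  mvn(com.example:simplemaven)
BuildRequires:  mvn(simplemaven:simplemaven)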

One could try to fix the dependencies in all the dependent packages instead of adding an alias to a single package. That is almost always the wrong thing to do.

7.2. Macros for Maven build configuration

Maven builds can be configured to produce an alternative layout, include additional aliases in package metadata, or create separate subpackages for certain artifacts.

7.2.1. Installing additional artifacts

It is possible to explicitly request installation of any Maven artifact (JAR / POM file). The %mvn_install macro only knows about Maven artifacts that were created during the execution of %mvn_build; normally, artifacts built by some other method would need to be installed manually (the %mvn_build macro does not even need to be used at all). Luckily, all artifacts created outside of %mvn_build can be marked for installation with the %mvn_artifact macro, which creates the configuration for %mvn_install.

Requesting installation of Maven artifact
%prep
...
# Request installation of POM and JAR file
%mvn_artifact subpackage/pom.xml target/artifact.jar
# Request installation of POM artifact (no JAR file)
%mvn_artifact pom.xml
# Request installation for JAR file specified by artifact coordinates
%mvn_artifact webapp:webapp:war:3.1 webapp.war

7.2.2. Additional Mappings

The macro %mvn_alias can be used to add additional mappings for a given POM / JAR file. For example, if the POM file indicates that it contains groupId commons-lang and artifactId commons-lang, this macro can ensure that a mapping between groupId org.apache.commons, artifactId commons-lang and the installed JAR / POM files is also added. This is necessary in cases where the groupId or artifactId may have changed and other packages might require different IDs than those reflected in the installed POM.

Adding more mappings for JAR/POM files example
%prep
...
%mvn_alias "commons-lang:commons-lang" "org.apache.commons:commons-lang"

7.2.3. Alternative JAR File Names

In some cases, it may be important to be able to provide symbolic links to the actual JAR files. This can be achieved with the %mvn_file macro. This macro allows the packager to specify the names of the JAR files and their location in the %{_javadir} directory, and it can also create symbolic links to the JAR files. These links may even be located outside of the %{_javadir} directory.

Adding file symlinks for compatibility
%prep
...
%mvn_file :guice google/guice guice

This means that the JAR file for the artifact with artifactId "guice" (and any groupId) will be installed as %{_javadir}/google/guice.jar, and there will also be a symbolic link to it at %{_javadir}/guice.jar. Note that the macro adds the .jar extension automatically.

7.2.4. Single Artifact Per Package

If the project consists of multiple artifacts, it is recommended to install each artifact into a separate subpackage. The macro %mvn_build -s will generate a separate .mfiles file for every artifact in the project. This file contains a list of files related to the specific artifact (typically the JAR file, POM file and metadata) and can later be used in the %files section of the spec file.

Creating one subpackage for each generated artifact
...
%description
The Maven Plugin Tools contains...

%package -n maven-plugin-annotations
Summary:        Maven Plugin Java 5 Annotations

%description -n maven-plugin-annotations
This package contains Java 5 annotations to use in Mojos.

%package -n maven-plugin-plugin
Summary:        Maven Plugin Plugin

%description -n maven-plugin-plugin
The Plugin Plugin is used to...
...

%build
%mvn_build -s

%install
%mvn_install

%files -f .mfiles-maven-plugin-tools
%doc LICENSE NOTICE
%files -n maven-plugin-annotations -f .mfiles-maven-plugin-annotations
%files -n maven-plugin-plugin      -f .mfiles-maven-plugin-plugin
%files -f .mfiles-javadoc
...

7.2.5. Assignment of the Maven Artifacts to the Subpackages

The macro %mvn_package allows the maintainer to specify exactly which package the selected artifact will end up in. It is something between singleton packaging, where each artifact has its own subpackage, and default packaging, where all artifacts end up in the same package.

Assigning multiple artifacts to single subpackage
...
%prep
%mvn_package ":plexus-compiler-jikes"   plexus-compiler-extras
%mvn_package ":plexus-compiler-eclipse" plexus-compiler-extras
%mvn_package ":plexus-compiler-csharp"  plexus-compiler-extras

%build
%mvn_build

%install
%mvn_install

%files -f .mfiles
%files -f .mfiles-plexus-compiler-extras
%files -f .mfiles-javadoc

In the above example, the artifacts plexus-compiler-jikes, plexus-compiler-eclipse and plexus-compiler-csharp will end up in a package named plexus-compiler-extras. If there are other artifacts besides the three mentioned (e.g. some parent POMs), these will all end up in the package named %{name}.

The %mvn_package macro supports wildcards and brace expansion, so the whole %prep section from the previous example can be replaced with a single line: %mvn_package ":plexus-compiler-{jikes,eclipse,csharp}" plexus-compiler-extras.

It is possible to assign artifacts to a package called __noinstall. This package name has a special meaning: as you can guess, artifacts assigned to it will not be installed anywhere and the package itself will not be created.

Skipping installation of an artifact
%prep
...
%mvn_package groupId:artifactId __noinstall

7.2.6. Modifying XMvn configuration from within spec file

Some packages might need to modify XMvn’s configuration in order to build successfully, or for other reasons. This can be achieved with the %mvn_config macro. For example, an old package may use enum as an identifier, but enum has been a keyword since Java 1.5, so such a package will probably fail to build on current systems. This problem can be easily solved by passing -source 1.4 to the compiler, so one could add the following line to the spec file:

Overriding default XMvn configuration
%prep
...
%mvn_config buildSettings/compilerSource 1.4

XMvn’s configuration is quite complex, but it is well documented on the project’s official website. The website should always be used as the primary source of information about XMvn configuration.

Read about XMvn’s configuration basics and see the full configuration reference.

All %mvn_* macros have their own manual pages, which contain details on how to use them; all possible options should be documented there. These manual pages should be considered the most up-to-date documentation right after the source code. Try, for example, man mvn_file. These pages are also included in the Appendix.

7.3. Macros for POM modification

Sometimes Maven pom.xml files need to be patched before they are used to build packages. One could use traditional patches to maintain the changes, but package maintainers should use the %pom_* macros, developed specifically to ease this task. Using %pom_* macros not only increases the readability of the spec file, but also improves the maintainability of the package, as there are no patches that would need to be rebased with each upstream release.

There are two categories of macros:

  • POM-specific macros - used to manipulate dependencies, modules, etc. Some of them also work on ivy.xml files.

  • Generic XML manipulation macros - used to add / remove / replace XML nodes.

The macros are designed to be called from the %prep section of spec files. All the macros also have their own manual pages. This document provides an overview of how they are used; for the technical details, refer to their respective manpages.

File specification

By default, a macro acts on the pom.xml file (or ivy.xml file) in the current directory. A different path can be explicitly specified via an argument (the last one, unless stated otherwise). Multiple paths can be specified as multiple arguments. If a path is a directory, the macro looks for a pom.xml file in that directory. For example:

# The following works on pom.xml file in the current directory
%pom_remove_parent

# The following works on submodule/pom.xml
%pom_remove_parent submodule

# The following works on submodule/pom.xml as well
%pom_remove_parent submodule/pom.xml

# The following works on submodule1/pom.xml and submodule2/pom.xml
%pom_remove_parent submodule1 submodule2
Recursive mode

Most macros also support a recursive mode, where the change is applied to the pom.xml and all its modules recursively. This can be used, for example, to remove a dependency from the whole project. It is activated by the -r switch.

7.3.1. Dependency manipulation macros

Removing dependencies

Often dependencies specified in Maven pom.xml files need to be removed for various reasons. The %pom_remove_dep macro can be used to ease this task:

# Removes dependency with groupId "foo" and artifactId "bar" from pom.xml
%pom_remove_dep foo:bar

# Removes dependency on all artifacts with groupId "foo" from pom.xml
%pom_remove_dep foo:

# Removes dependency on all artifacts with artifactId "bar" from pom.xml
%pom_remove_dep :bar

# Removes dependency on all artifacts with artifactId "bar" from submodule1/pom.xml
%pom_remove_dep :bar submodule1

# Removes dependency on all artifacts with artifactId "bar" from pom.xml
# and all its submodules
%pom_remove_dep -r :bar

# Removes all dependencies from pom.xml
%pom_remove_dep :
Adding dependencies

Dependencies can also be added to pom.xml with the %pom_add_dep macro. Its usage is very similar to %pom_remove_dep; see man pom_add_dep for more information.
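
For instance, assuming the project needs an explicit dependency on commons-io (the coordinates below are illustrative):

# Add a dependency on commons-io:commons-io to pom.xml
%pom_add_dep commons-io:commons-io

# Add a dependency with an explicit version to submodule/pom.xml
%pom_add_dep commons-io:commons-io:2.4 submodule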

Changing dependencies

Sometimes the artifact coordinates used in the upstream pom.xml do not correspond to the ones used in Fedora and you need to modify them. The %pom_change_dep macro will modify all dependencies matching the first argument to the artifact coordinates specified by the second argument. Note that this macro also works in recursive mode.

# For all artifacts in pom.xml that have groupId 'example' change it to
# 'com.example' while leaving artifactId and other parts intact
%pom_change_dep example: com.example:

7.3.2. Adding / removing plugins

The %pom_remove_plugin macro works exactly like %pom_remove_dep, except that it removes Maven plugin invocations. Some examples:

Removing Maven plugins from pom.xml files
# Disables maven-jar-plugin so that classpath isn't included in manifests
%pom_remove_plugin :maven-jar-plugin

# Disable a proprietary plugin that isn't packaged for Fedora
%pom_remove_plugin com.example.mammon:useless-proprietary-plugin submodule

As in the previous case, there is also a macro for adding plugins to pom.xml (%pom_add_plugin); see its manual page for more information.
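
For example, assuming a plugin invocation needs to be added to the main pom.xml (the plugin chosen here is purely illustrative):

# Add an invocation of the build-helper-maven-plugin to pom.xml
%pom_add_plugin org.codehaus.mojo:build-helper-maven-plugin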

7.3.3. Disabling unneeded modules

Sometimes some submodules of the upstream project cannot be built for various reasons and need to be disabled. This can be achieved by using %pom_disable_module, for example:

Disabling specific project modules
# Disables child-module-1, a submodule of the main pom.xml file
%pom_disable_module child-module-1

# Disables grandchild-module, a submodule of child-module-2/pom.xml
%pom_disable_module grandchild-module child-module-2

7.3.4. Working with parent POM references

The macro %pom_remove_parent removes the reference to a parent POM from Maven POM files. This can be useful when the parent POM is not yet packaged (e.g. because of licensing issues) and at the same time is not really needed for building the project. There are also macros for adding a parent POM reference (%pom_add_parent) and for replacing an existing reference with a new one (%pom_set_parent).

Manipulating parent POM references
# Remove reference to a parent POM from ./pom.xml
%pom_remove_parent

# Remove reference to a parent POM from ./submodule/pom.xml
%pom_remove_parent submodule

# Add parent POM reference to ./pom.xml
%pom_add_parent groupId:artifactId

# Replace existing parent POM reference in ./pom.xml
%pom_set_parent groupId:artifactId:version

7.3.5. Macros for performing generic modifications

The above macros cover the most common cases of modifying pom.xml files; however, if there is a need to apply some less common patches, there are also four generic macros for modifying pom.xml files. These generic macros can also be applied to other XML files, such as Ant’s build.xml files.

They all take an XPath 1.0 expression that selects the XML nodes to be acted on (removed, replaced, etc.).

Handling XML namespaces

POM files use a specific namespace - http://maven.apache.org/POM/4.0.0. The easiest way to respect this namespace in XPath expressions is to prefix all node names with pom:. For example, pom:environment/pom:os will work because it selects nodes from the pom namespace, but environment/os won’t find anything because it looks for nodes that do not belong to any XML namespace. The prefix is needed even if the original POM file didn’t contain the proper POM namespace, since it will be added automatically. Note that this requirement is due to a limitation of XPath 1.0 and we cannot work around it.

Removing nodes

%pom_xpath_remove can be used to remove arbitrary XML nodes.

# Removes extensions from the build
%pom_xpath_remove "pom:build/pom:extensions" module/pom.xml
Injecting nodes

The %pom_xpath_inject macro is capable of injecting arbitrary XML code into any pom.xml file. The injected code is the last argument; optional file paths go before it (unlike in most other macros). To pass a multiline snippet, quote the argument as in the following example.

# Add additional exclusion into maven-wagon dependency
%pom_xpath_inject "pom:dependency[pom:artifactId='maven-wagon']/pom:exclusions" "
<exclusion>
  <groupId>antlr</groupId>
  <artifactId>antlr</artifactId>
</exclusion>"
# The same thing, but with explicit file path
%pom_xpath_inject "pom:dependency[pom:artifactId='maven-wagon']/pom:exclusions" pom.xml "
<exclusion>
  <groupId>antlr</groupId>
  <artifactId>antlr</artifactId>
</exclusion>"
Changing nodes' content

%pom_xpath_set replaces the content of arbitrary XML nodes with the specified value (which can itself contain XML nodes).

# Change groupId of a parent
%pom_xpath_set "pom:parent/pom:groupId" "org.apache"
Replacing nodes

%pom_xpath_replace replaces an XML node with the specified XML code.

# Change groupId of a parent (note the difference from %pom_xpath_set)
%pom_xpath_replace "pom:parent/pom:groupId" "<groupId>org.apache</groupId>"

8. Common Errors

This section contains explanations and solutions/workarounds for common errors which can be encountered during packaging.

8.1. Missing dependency

[ERROR] Failed to execute goal on project simplemaven:
Could not resolve dependencies for project com.example:simplemaven:jar:1.0:
The following artifacts could not be resolved: commons-io:commons-io:jar:2.4, junit:junit:jar:4.11:
Cannot access central (http://repo.maven.apache.org/maven2) in offline mode and the artifact commons-io:commons-io:jar:2.4 has not been downloaded from it before. -> [Help 1]

Maven wasn’t able to build the project com.example:simplemaven because it couldn’t find some dependencies (in this case commons-io:commons-io:jar:2.4 and junit:junit:jar:4.11).

You have multiple options here:

  • If you suspect that a dependency is not necessary, you can remove it from the pom.xml file and Maven will stop complaining about it. You can use a wide variety of macros for modifying POM files; the one for removing dependencies is called %pom_remove_dep.

  • There is a mock plugin that can automate the installation of missing dependencies. When you’re using mock, pass the additional --enable-plugin pm_request argument and the build process will be able to install missing dependencies by itself. You still need to add the BuildRequires later, because you need to build the package in Koji, where the plugin is not allowed. You should do so using xmvn-builddep build.log, where build.log is the path to mock’s build log. It will print a list of BuildRequires lines which you can paste directly into the specfile (see the sketch after this list). To verify that the BuildRequires you just added are correct, you can rebuild the package once more without the plugin enabled.

  • Add the artifacts to BuildRequires manually. Maven packages have virtual provides in the format mvn(artifact coordinates), where the artifact coordinates are in the format which Maven used in the error message, but without the version for non-compat packages (most of the packages you will encounter). Virtual provides can be used directly in BuildRequires, so in this case it would be:

BuildRequires:  mvn(commons-io:commons-io)
BuildRequires:  mvn(junit:junit)
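
As mentioned in the second option above, xmvn-builddep can generate these lines from mock’s build log; an illustrative run could look like this:

$ xmvn-builddep build.log
BuildRequires:  mvn(commons-io:commons-io)
BuildRequires:  mvn(junit:junit)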

8.2. Compilation failure

[ERROR] Failed to execute goal
        org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile)
        on project simplemaven: Compilation failure: Compilation failure:
[ERROR] /builddir/build/BUILD/simplemaven-1.0/src/main/java/com/example/Main.java:[3,29] package org.apache.commons.io does not exist
[ERROR] /builddir/build/BUILD/simplemaven-1.0/src/main/java/com/example/Main.java:[8,9] cannot find symbol
[ERROR] symbol:   class FileUtils
[ERROR] location: class com.example.Main
[ERROR] -> [Help 1]

The Java compiler couldn’t find a given class on the classpath, or an incompatible version was present. This could be caused by the following reasons:

  • pom.xml requires a different version of the Maven artifact than the local repository provides

  • pom.xml is missing a necessary dependency

Different versions of the same library may provide slightly different APIs. This means that the project may not be buildable if a different version is provided. If the library in the local repository is older than the one required by the project, the library could be updated. If the project requires an older version, the project should be ported to the latest stable version of the library (this may require cooperation with the project’s upstream). If none of these is possible for some reason, it is still possible to introduce a new compat package. See the compat packages section for more information on this topic.

Sometimes pom.xml doesn’t list all the necessary dependencies, even though it should. Dependencies can themselves depend on other artifacts, and typically all of these will be available to the project being built. The problem is that the local repository may contain different versions of these dependencies, and even if those versions are fully compatible with the project, they may require a slightly different set of dependencies. This can lead to a build failure if pom.xml doesn’t specify all necessary dependencies and relies on transitive dependencies instead. Such a missing dependency may be considered a bug in the project. The solution is to explicitly add the missing dependency to pom.xml, which can easily be done using the %pom_add_dep macro. See the section about macros for POM modification for more information.
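
For example, if a class comes from a library that the project previously picked up only transitively, the dependency can be declared explicitly (the coordinates below are illustrative):

# Declare the dependency that was previously only available transitively
%pom_add_dep org.apache.commons:commons-lang3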

8.3. Requires cannot be generated

Following dependencies were not resolved and requires cannot be generated.
Either remove the dependency from pom.xml or add proper packages to
BuildRequires: org.apache.maven.doxia:doxia-core::tests:UNKNOWN

Most often this error happens when one part of the package depends on an attached artifact which is not being installed. The automatic RPM requires generator then tries to generate requires on an artifact which is not being installed. This would most likely result in a broken RPM package, so the generator halts the build.

There are usually two possible solutions for this problem:

  • Install the attached artifact in question. For the above error, the following macro would install artifacts with the tests classifier into a tests subpackage.

%mvn_package :::tests: %{name}-tests
  • Remove the dependency on the problematic artifact. This can involve pom.xml modifications, disabling tests or even code changes, so it is usually easier to install the dependency.

8.4. Dependencies with scope system

[ERROR] Failed to execute goal org.fedoraproject.xmvn:xmvn-mojo:1.2.0:install (default-cli) on project pom: Some reactor artifacts have dependencies with scope "system".
Such dependencies are not supported by XMvn installer.
You should either remove any dependencies with scope "system" before the build or not run XMvn instaler. -> [Help 1]

Some Maven artifacts try to depend on exact system paths. Most usually the dependency is on either com.sun:tools or sun.jdk:jconsole. Dependencies with system scope cause issues with our tooling and requires generators, so they are not supported.

The easiest way to solve this for the above two dependencies is to remove the dependency and add it back without the <scope> or <systemPath> nodes:

%pom_remove_dep com.sun:tools
%pom_add_dep com.sun:tools

9. Migration from older tools

This section describes how to migrate packages that use older deprecated tools to current ones.

9.1. %add_maven_depmap macro

The %add_maven_depmap macro was used to manually install Maven artifacts that were built with Apache Ant or mvn-rpmbuild. It is now deprecated and its invocations should be replaced with %mvn_artifact and %mvn_install.

Artifact files, Maven POM files and their installation directories no longer need to be installed manually, since that is done during the run of %mvn_install. The installed files also don’t need to be explicitly enumerated in the %files section; the generated .mfiles file should be used instead.

Relevant parts of specfile using %add_maven_depmap:

BuildRequires:  javapackages-tools

Requires:       some-library
...

%build
ant test

%install
install -d -m 755 $RPM_BUILD_ROOT%{_javadir}
install -m 644 target/%{name}.jar $RPM_BUILD_ROOT%{_javadir}/%{name}.jar

install -d -m 755 $RPM_BUILD_ROOT%{_mavenpomdir}
install -m 644 %{name}.pom $RPM_BUILD_ROOT/%{_mavenpomdir}/JPP-%{name}.pom

# Note that the following call is equivalent to invoking the macro
# without any parameters
%add_maven_depmap JPP-%{name}.pom %{name}.jar

# javadoc
install -d -m 755 $RPM_BUILD_ROOT%{_javadocdir}/%{name}
cp -pr api/* $RPM_BUILD_ROOT%{_javadocdir}/%{name}

%files
%{_javadir}/*
%{_mavenpomdir}/*
%{_mavendepmapfragdir}/*

%files javadoc
%doc %{_javadocdir}/%{name}

The same specfile migrated to %mvn_artifact and %mvn_install:

# mvn_* macros are located in javapackages-local package
BuildRequires:  javapackages-local

# Since XMvn generates requires automatically, it is no longer needed
# nor recommended to specify manual Requires tags, unless the dependency
# information in the POM is incomplete or you need to depend on non-java
# packages
...

%prep
# The default location for installing JAR files is %{_javadir}/%{name}/
# Because our original specfile put the JAR directly to %{_javadir}, we
# want to override this behavior. The following call tells XMvn to
# install the groupId:artifactId artifact as %{_javadir}/%{name}.jar
%mvn_file groupId:artifactId %{name}

%build
ant test

# Tell XMvn which artifact belongs to which POM
%mvn_artifact %{name}.pom target/%{name}.jar

%install
# It is not necessary to install directories and artifacts manually,
# mvn_install will take care of it

# Optionally use -J parameter to specify path to directory with built
# javadoc
%mvn_install -J api

# Use autogenerated .mfiles file instead of specifying individual files
%files -f .mfiles
%files javadoc -f .mfiles-javadoc
Aliases and subpackages

%add_maven_depmap had an -a switch to specify artifact aliases and an -f switch to support splitting artifacts across multiple subpackages. To achieve the same things with the %mvn_* macros, see Additional Mappings and Assignment of the Maven Artifacts to the Subpackages.

If the project consists of multiple artifacts and parent POMs are among them, call %mvn_artifact on these parent POMs first.
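
For example (the module layout and file names are illustrative):

# Install the parent POM before the artifacts of its modules
%mvn_artifact pom.xml
%mvn_artifact submodule/pom.xml submodule/target/submodule.jar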

Appendix: manual pages for the individual macros are included from manpages.adoc in the document sources.