Automatic javadoc subpackage generation

November 3, 2011 in fedora, java, macro, packaging, rpm

Do you hate repeating the same thing over and over again? I know I do…
Java packaging guidelines state that we have to include javadocs with all Java packages. This means we have to repeat the following code in almost all packages (except pom and resource projects):

...
%package javadoc
Summary: API documentation for %{name}
Group: Documentation
Requires: jpackage-utils

%description javadoc
%{summary}.
...

%install
...
# javadoc
install -d -m 755 %{buildroot}%{_javadocdir}/%{name}
cp -pr target/site/apidocs/* %{buildroot}%{_javadocdir}/%{name}

...

%files javadoc
%doc LICENSE
%doc %{_javadocdir}/%{name}
...
The code is practically the same in all packages, so why not automate it? Well, there were two main reasons why this wasn’t done before:
  • Copying of files needs to be done during the %install phase
  • If the package contains a license file, the javadoc subpackage has to include it too

We solved both of these in a fairly reasonable way. The resulting macro help looks like this:


# %create_javadoc_subpackage can be used to completely create
# javadoc subpackage for java projects.
# !!! Needs to be used at the end of %install section
# There are these variables that change its behaviour:
#
# %__javadoc_license - set this if the license is in a non-standard place,
# to prevent a Requires on the main package
# %__apidocs_dir - set custom path to directory with javadocs
# (defaults to target/site/apidocs)
# %__javadoc_skip_requires - if defined javadoc subpackage will not
# require main package under any circumstances (useful
# if upstream doesn't provide separate license file)
#

Is it understandable enough? If you need to generate javadocs, just make them build and then add a %create_javadoc_subpackage macro call at the end of the %install section. Normally you shouldn’t have to change anything. We search a few standard places for licenses; more specifically, we look for LICENSE* COPYING* doc/LICENSE* doc/COPYING* license*. Do you have more ideas where to look? It’s easy to add. If we don’t find a license we automatically add a Requires on the main package and assume you put the license there. If upstream doesn’t provide a separate license file you can do %global __javadoc_skip_requires t and we will ignore licensing completely.
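A minimal sketch of the intended usage (the macro and variable names come from the help text above; the javadoc path is hypothetical):

# only needed when javadocs end up somewhere non-standard
%global __apidocs_dir build/docs/javadoc

%install
...
# has to come last in %install; generates the whole javadoc subpackage
%create_javadoc_subpackage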

I’d like this added to our packaging guidelines so we can start using it. My testing shows it works fairly well. I’d love to improve it so you could place it anywhere in the spec, not just the %install section, but rpm macros are… complicated.
*Note*: For the gory details head over to our git repository. For now it’s in a separate feature branch.

Understood and agreed with

October 16, 2011 in en, lyrics, personal, reply, song

Dear OSS/Fedora/whatever reader. Stop right here.

Oh nothing’s going to change my love for you
I wanna spend my life with you
So we make love on the grass under the moon
No one can tell, damned if I do
Forever journey on golden avenues
I drift in your eyes since I love you
I got that beat in my veins for only rule
Love is to share, mine is for you

Making packaging Maven projects easier

September 12, 2011 in en, fedora, packaging

There are two recent changes to our Java guidelines in Fedora and to the use of Maven when packaging that I’d like to mention today.

Maven dependency mapping macros

One thing I haven’t blogged about yet, but which is pretty important: we have new macros for Maven depmaps in Fedora. In the past, when you wanted to map a certain groupId:artifactId to a file in %{_javadir}, you had to include a snippet like this in your spec:


%add_to_maven_depmap com.google.guava guava 05 JPP guava
%add_to_maven_depmap com.google.collections google-collections 05 JPP guava

This tells our Maven that com.google.guava:guava and com.google.collections:google-collections can be found in one of the repositories as JPP/guava.jar. It meant you had to know the groupId:artifactId and other information, plus it was extremely easy to make a mistake here, causing all sorts of trouble. The current code doing the same thing:


%add_maven_depmap JPP-guava.pom guava.jar -a "com.google.collections:google-collections"

We parse the pom file and get groupId:artifactId from it, plus we do additional sanity checks such as:

  • the naming of the pom and jar files has to be consistent
  • the jar file has to exist if the packaging type is not pom

If you need additional mappings, you can easily add them. There are a few other options for this new macro that are useful in certain situations.
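For example, following from the sanity checks above, a pom with packaging type “pom” (a parent pom, say) is mapped without any jar argument at all. A sketch, with a hypothetical pom name:

# packaging type is "pom", so no jar file argument is needed
%add_maven_depmap JPP-%{name}-parent.pom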

Maven test deps skipping

Long story short: when you use -Dmaven.test.skip=true in Fedora packages, you no longer need to patch test dependencies out of pom.xml.

We’ve had Apache Maven in Fedora for quite some time, and packaging with Maven has been getting easier thanks to small tweaks to our packaging macros and changes to the guidelines. However, there has been one problem bugging all Java packagers, and it was especially confusing for those starting to package software built with Maven: Maven creates a tree of dependencies before it starts building the project, and it includes test dependencies even when tests are being skipped.

Skipping tests is sometimes necessary due to problems with koji or with dependencies, and up until now we had to either patch the test dependencies out of pom.xml or use custom dependency mappings (an ugly concept in itself).

Last week I decided it’s about time someone did something about this, so I dug into the Maven code and created a solution (more of a hack, really) that is already included in Fedora. If you want the gory details, you can read the patch itself (I advise against it). I’ll try to make the patch work properly so that it can be included in mainstream code.
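In practice, the %build section of an affected spec now needs nothing beyond the skip flag itself. A sketch, assuming the mvn-rpmbuild wrapper our current specs use:

%build
# some test-only deps are not packaged in Fedora, so skip tests;
# their entries in pom.xml no longer need to be patched out
mvn-rpmbuild install javadoc:aggregate -Dmaven.test.skip=true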

I can only hope that packagers will find these changes helpful; the general feedback so far has been positive.

Fixing fedpkg clog output to be git-friendly

August 26, 2011 in fedora, git, fedpkg, patch, bug

I have been a proxy maintainer for a few people for a while now, and I found a strange problem with fedpkg clog in relation to git format-patch and git am.

If you have a changelog message like this:


* Mon Feb 28 2011 Stanislav Ochotnicky - 2.1.1-1
- Update to 2.1.1
- Update patch
- Disable guice-eclipse for now

fedpkg commit -c would create a git commit message like this:


commit 22b5306036b6ef1022498b63e40324370ff7159b (HEAD, f15)
Author: Stanislav Ochotnicky
AuthorDate: Fri Aug 26 11:45:54 2011 +0200

Update to 2.1.1
Update patch
Disable guice-eclipse for now

This works fine and mighty as long as you don’t try to produce a patch from this commit. Let’s see what happens with git format-patch HEAD~1.


$ head 0001-Update-to-2.1.1.patch
From 22b5306036b6ef1022498b63e40324370ff7159b Mon Sep 17 00:00:00 2001
From: Stanislav Ochotnicky
Date: Fri, 26 Aug 2011 11:45:54 +0200
Subject: [PATCH] Update to 2.1.1 Update patch Disable guice-eclipse for now

After applying this patch to a repository using git am, the line breaks would disappear. This is because git expects an empty line after the subject, with the description of the commit coming afterwards.

I decided to try and fix fedpkg clog a bit. Given the previous changelog, it now creates a git message like this:


commit 768964ce2145ef2b472fc5ef8781fb036d586b0e (HEAD, f15)
Author: Stanislav Ochotnicky
AuthorDate: Fri Aug 26 11:57:20 2011 +0200

Update to 2.1.1

- Update patch
- Disable guice-eclipse for now

This means that git format-patch can do the right thing. I filed a bug report against fedora-packager, so hopefully we can have this fixed sometime.

Addition of fedpkg rpmlint

July 27, 2011 in fedora, packaging, rpmlint

*Edit*: Yes, there is fedpkg lint, but it is somewhat limited, so read on. Instead of adding a separate “rpmlint” command, Pingou will improve the current lint command.

Recently I was trying to help the OpenSuSE guys with some updates to their Java stack, and I was sent a link to their build system. I noticed a file called jpackage-utils-rpmlintrc, and this got me thinking…

What if we added an rpmlint command to fedpkg with per-package rpmlint ignore settings? It turns out Pingou took my idea and implemented it in under an hour :-)

An example run:


$ fedpkg rpmlint
plexus-interpolation.spec: W: invalid-url Source0: plexus-interpolation-1.14.tar.xz
0 packages and 1 specfiles checked; 0 errors, 1 warnings.

plexus-interpolation.spec: W: invalid-url Source0: plexus-interpolation-1.14.tar.xz
plexus-interpolation.src: W: spelling-error %description -l en_US interpolator -> interpolate, interpolation, interrogator
plexus-interpolation.src: W: invalid-url Source0: plexus-interpolation-1.14.tar.xz
1 packages and 1 specfiles checked; 0 errors, 3 warnings.
2 packages run
rpmlint has not been run on rpm files but should

OK, so we can run rpmlint on the spec, srpm and binary rpms with a single command. But I don’t like seeing the same warnings all the time, because that means I will probably miss real problems when they appear. For this, fedpkg rpmlint uses a .rpmlint file as an additional rpmlint config. So after creating:


$ cat > .rpmlint << EOF
# we have scm checkout with comment in spec
addFilter('invalid-url')
# false positive
addFilter('spelling-error.*interpolator')
EOF
$ fedpkg rpmlint
0 packages and 1 specfiles checked; 0 errors, 0 warnings.

1 packages and 1 specfiles checked; 0 errors, 0 warnings.
2 packages run
rpmlint has not been run on rpm files but should

Cool, right? Pierre sent a patch with this feature to the fedpkg developers, so hopefully we’ll see this addition soon. I then plan to add custom .rpmlint configurations to all my packages so that they will be warning-free.

Print expanded SourceX: urls from spec files

July 26, 2011 in fedora, packaging, python, rpm, script

I’ve noticed quite a few times that people add comments above their Source0: urls with the url spelled out without macros, seemingly to simplify manual downloading. It looks like this:

Name: jsoup
Version: 1.6.1
...
# http://jsoup.org/packages/jsoup-1.6.1-sources.jar
Source0: http://%{name}.org/packages/%{name}-%{version}-sources.jar

This creates a burden on maintainers, who must keep those urls up-to-date as the version changes, so I created a simple Python script for printing out Source urls from spec files:


#!/usr/bin/python

import rpm
import sys

# parse the spec file given as the first argument
ts = rpm.TransactionSet()
spec_obj = ts.parseSpec(sys.argv[1])

# sources is a list of (url, index, flags) tuples
sources = spec_obj.sources

for url, num, flags in sources:
    print url

Make it executable (chmod +x), put it in your PATH, and enjoy by giving it the path to a spec file.
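For example, with the jsoup spec from above (I call the script print-sources here; name it whatever you like):

$ print-sources jsoup.spec
http://jsoup.org/packages/jsoup-1.6.1-sources.jar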
*Edit*: A probably much nicer way to do the same thing is already present on your system (courtesy of Alexander Kurtakov):

spectool X.spec

I knew there was something like this, but forgot what it was. Oh well…2 minutes lost.

FOSDEM 2011: Java Packaging for Developers – Video

May 17, 2011 in video, fosdem2011, java, packaging, guide

I’ve mentioned before that I attended FOSDEM this year. It was more than 3 months ago, and I finally got my hands on the video from my presentation. Courtesy of Andrew John Hughes, licensed under CC-BY-ND. As a refresher, the slides are available here.
You can play or download the rest of the videos by going to Andrew’s page.

Getting your Java Application in Linux: Guide for Developers (Part 2)

April 20, 2011 in fedora, howto, java, packaging

Ant and Maven

Last time I wrote about the general rules of engagement for Java developers who want to make the lives of packagers easier. Today I’ll focus on the specifics of the two main build systems in use today, Ant and Maven, though more so on Maven, for reasons I’ll state in a while.

Ant

Ant is (or at least used to be) the most widely deployed build system in the Java ecosystem. There are probably multiple reasons for this, but generally it’s because Ant is relatively simple. In the *NIX world, Ant is the equivalent of pure make (and build.xml of a Makefile). build.xml is just that: an XML file, with additional extensions to simplify common tasks (calling javac, javadoc, etc.). So the question is:

I am starting a new Java project. How can I use Ant properly to make life easier for you?

The most simple answer? DON’T! It might seem harsh and ignorant of the bigger picture, and it probably is. But I believe it’s also true that Ant is generally harder to package than Maven. Ant build.xml files are almost always unique pieces of art in themselves, and as such can be a pain to package. I am always reminded of the following quote when I have to dig through some smart build.xml system:

Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it.

  –Brian Kernighan

And I have a feeling some people try to be really clever when writing their build.xml files. That said, I understand there are times when using Ant is just too tempting so I’ll include a few tips for it anyway.

Use apache-ivy extension for dependencies

One of the main problems with Ant is the handling of various dependencies. Usually they are in some subdirectory of the main tree, some jars versioned, some not, some patched without any note about it… in other words, a nightmare in itself. The apache-ivy extension helps here because it works with dependency metadata that packagers can use to figure out the real build dependencies, including versions. We can also be sure that no dependencies are patched in one way or the other.

Ivy is nice for developers as well. It will make your source tarballs much smaller (you do have source tarballs, right?!) and your build.xml nicer. I won’t include any examples here because I believe that the Ivy documentation is indeed very good.

One lib/ to rule them all

In case you really don’t want to use Ivy, make sure you place all your dependencies in one directory at the top level of your project (don’t scatter your dependencies, even if you are using multiple sub-projects). This directory should ideally be called lib/. It should contain your dependencies named as ${name}-${version}.jar. Most of the time you should include license files for every dependency you bundle, because you are becoming a distributor, and for most licenses this means you have to provide the full text of the license. For the licenses, use names identical to the jar filenames, but with a “.license” suffix. All in all, make it easy to figure out your build dependencies and play with them.
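A layout like this is what a packager hopes to see (names purely illustrative):

lib/
    commons-io-2.0.1.jar
    commons-io-2.0.1.license
    velocity-1.7.jar
    velocity-1.7.license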

Don’t be too clever

I can’t stress this enough. Try to keep your build.xml files to the bare minimum. Understanding ten 30 KiB build.xml files with a multi-phase build and tests spread through 10 directories is no fun. Please think of the poor packager when you write your build.xml files. I don’t mind getting grey hair that much, but I’d rather it came later than sooner.
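For contrast, a bare-minimum build.xml can be perfectly readable. A hypothetical example (project name and paths made up):

<?xml version="1.0"?>
<!-- minimal build: compile the sources, then jar them -->
<project name="myapp" default="jar">
    <target name="compile">
        <mkdir dir="build/classes"/>
        <javac srcdir="src" destdir="build/classes"/>
    </target>
    <target name="jar" depends="compile">
        <jar destfile="build/myapp.jar" basedir="build/classes"/>
    </target>
</project>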

Maven

And now we are coming to my favourite part. Maven is a build and project management tool with extensive plugin support, able to do almost anything a developer might ask for. And all that while providing a formal project structure, so that once you learn how Maven works in one project, you can re-use your knowledge in other projects.

Maven goodies

Maven provides several good things for packagers, such as clear dependencies and preventing simply patched dependencies from sneaking in. The most important advantage for packagers is the fact that problems are the same in all projects: once you understand how a certain Maven plugin works, you will know what to expect and what to look for. But Maven is nice not just for packagers, but also for developers.

Declarative instead of descriptive

You don’t tell Maven:

Add jar A and jar B to the classpath, then use this properties file to set up test resources. Then compile the tests (have you compiled the sources yet?) and then… run them with X.

Instead, you place test files and resources into the appropriate directories and Maven will take care of everything. You just need to specify your test dependencies in a nice and tidy pom.xml.
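For instance, a single test-scoped entry in pom.xml is all the set-up a unit test framework needs (artifact and version chosen purely for illustration):

<dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.8.2</version>
    <scope>test</scope>
</dependency>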

Project metadata in one place

With Maven you have all project information in one place:

  • Developer contact information
  • Homepage
  • SCM URLs
  • Mailinglists
  • Issue tracker URL
  • Project reports/site generation
  • Dependencies
  • Ability to modify behaviour according to architecture, OS or other properties

Need I say more? Fill it out, keep it up-to-date and we will all be happy.

Great integration with other tools

The ecosystem around Maven has been growing over the past years, and now you will find good support for handling your pom.xml files in any major Java IDE. But that is just the tip of the iceberg. There are Maven plugins adding all kinds of additional tool support: running checkstyle on your code, helping with licensing, integration with gpg, ssh, jflex, making releases. There are plugins for all that and more.

Support for Ant

If you are in the process of migrating your build system from Ant to Maven, you can do it in phases. For parts of your builds you can easily run Ant via the maven-antrun-plugin. A good example of such a migration in progress is checkstyle: in version 5.2 they introduced a Maven build while preserving their old layout and running Ant for tests.

Maven’s messier side

A.K.A. what you need to be aware of. It’s generally quite hard to do something bad in Maven, because it won’t easily let you. That said, there are plugins that can make it hard for us to package your software.

maven-dependency-plugin:copy-dependencies

This specific goal can potentially cause problems because it allows copying classes from dependencies into the resulting jar files. As I wrote last time, this is unacceptable because it creates possible licensing, security and maintenance nightmares. If you need even just one class from another project, rather than copying it, add it as a dependency in pom.xml.
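For the record, the pattern to watch out for looks roughly like this. A hedged sketch, since exact configurations vary; the output directory is what makes the dependencies end up inside the project’s own jar:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <executions>
        <execution>
            <phase>prepare-package</phase>
            <goals>
                <goal>copy-dependencies</goal>
            </goals>
            <configuration>
                <!-- copied into target/classes, so they get jarred up with the project -->
                <outputDirectory>${project.build.directory}/classes</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>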

maven-shade-plugin

The shade plugin is a very shady plugin (pun intended). It can be used to weave dependencies inside your jars while changing their package names and doing all kinds of modifications in the process. I’ll give you a small test now :-) Let’s say you have a jar file with the following contents:


META-INF/
META-INF/MANIFEST.MF
META-INF/maven/
META-INF/maven/org.packager/
META-INF/maven/org.packager/Pack/
META-INF/maven/org.packager/Pack/pom.properties
META-INF/maven/org.packager/Pack/pom.xml
org/
org/packager/
org/packager/signature/
org/packager/signature/SignatureReader.class
org/packager/signature/SignatureVisitor.class
org/packager/signature/SignatureWriter.class
org/packager/Pack.class

Can you tell, just by looking at the jar contents, where the org.packager.signature subpackage is coming from? Take your time, think about it. Nothing? Well, here’s a hint:



<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>org.objectweb.asm</pattern>
        <shadedPattern>org.packager</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>




I believe this demonstrates why usage of the shade plugin is evil (in 99% of cases at least). It is especially problematic if the shaded packages are part of the public API of your project, because then we won’t be able to simply fix this in one package; it will cascade up the dependency chain.

maven-bundle-plugin

The bundle plugin is one of the more controversial plugins, because it can be used both for good and bad :-) One of the most important good use cases for the bundle plugin is generating OSGi bundles. Every project can easily make their jar files OSGi-compatible by doing something like this:


...
<packaging>bundle</packaging>
...

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
</plugin>
...

Easy, right? Now to the darker side of the bundle plugin. I have another example to test your skills. This one should be easier than the shade plugin:


META-INF/MANIFEST.MF
META-INF/
META-INF/maven/
META-INF/maven/org.packager/
META-INF/maven/org.packager/Pack/
META-INF/maven/org.packager/Pack/pom.properties
META-INF/maven/org.packager/Pack/pom.xml
org/
org/objectweb/
org/objectweb/asm/
org/objectweb/asm/signature/
org/objectweb/asm/signature/SignatureReader.class
org/objectweb/asm/signature/SignatureVisitor.class
org/objectweb/asm/signature/SignatureWriter.class
org/packager/
org/packager/Pack.class

The problem is the same as with the shade plugin (bundling of dependencies), but at least here it’s more visible in the contents of the jar, and it will not poison the API of the jar. Just for the record, this is how it was created:



<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Private-Package>org.objectweb.asm.signature</Private-Package>
    </instructions>
  </configuration>
</plugin>



Summary

Today I wrote about:

  • Ant and why you shouldn’t use it (that much)
  • Ant and how to use it if you have to
  • Maven and why it rocks for packagers and developers
  • Maven and its plugins and why they suck for packagers sometimes

There are a lot more things that can cause problems, but these are the most obvious and easily fixed. I’ll try to gather more information about things we (packagers) can do to help you (developers) a bit more and perhaps include one final part for this guide.

Getting your Java Application in Linux: Guide for Developers (Part 1)

April 8, 2011 in fedora, howto, java, packaging

Introduction to packaging Java

Packaging Java libraries and applications in Fedora has been my daily bread for almost a year now. I realized now is the time to share some of my thoughts on the matter, and perhaps a few ideas that upstream developers might find useful when dealing with Linux distributions.

This endeavour is going to be split into several posts, because there are more sub-topics I want to write about. Most of it is going to be based on the talk I gave at FOSDEM 2011. Originally I was hoping to just post the video, but that seems to be taking more time than I expected :-)

If you are not entirely familiar with the status of Java on Linux systems, it would be a good idea to first read a great article by Thierry Carrez called The real problem with Java in Linux distros. A short quote from that blog:

The problem is that Java open source upstream projects do not really release code. Their main artifact is a complete binary distribution, a bundle including their compiled code and a set of third-party libraries they rely on.

There is no simple solution, and my suggestions are only mid-term workarounds and ways to make each other’s (upstream ↔ downstream) lives easier. Sometimes I am quite terse in my suggestions, but if need be I’ll expand on them later.

Part 1: General rules of engagement

Today I am going to focus on general rules that apply to all Java projects wishing to be packaged in Linux distributions:

  • Making source releases
  • Handling Dependencies
  • Bugfix releases

For full understanding, here is a short summary of the general requirements for packages to be added to most Linux distributions:

  • All packages have to be built from source
  • No bundled dependencies used for building/running
  • Have a single version of each library that all packages use

There are a lot of reasons for these rules and they have been flogged to death multiple times in various places. It mostly boils down to severe maintenance and security problems when these rules are not followed.

Making source releases

As I mentioned previously, most Linux distributions rebuild packages from source even when there is an upstream release that is binary compatible. To do this we obviously need sources :-) Unfortunately quite a few (mostly Maven) projects don’t do source release tarballs, and some provide source releases without build scripts (build.xml or pom.xml files); the most notable examples are the Apache Maven plugins. For each and every update of one of these plugins we have to check out the source from the upstream repository and generate the tarball ourselves.
All projects using the Maven build system can simply make packagers’ lives easier by having the following snippet in their pom.xml files:




...
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>project</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
...


This will create -project.zip/tar.gz files containing all the files needed to rebuild the package from source. I have no real advice for projects using Ant for now, but I’ll get to Ant specifics next time.
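If I remember the assembly defaults correctly, the project descriptor produces several archive formats; with a hypothetical myapp 1.0 you would see something like:

$ mvn package
...
$ ls target/ | grep project
myapp-1.0-project.tar.bz2
myapp-1.0-project.tar.gz
myapp-1.0-project.zip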

Handling dependencies

I have a feeling that most Java projects don’t spend much time thinking about dependencies. This should change, so here are a few things to think about when adding new dependencies to your project.

Verify that the dependency isn’t provided by the JVM

Often packages contain unnecessary dependencies that are provided by all recent JVMs. Think twice about whether you really need another XML parser.

Try to pick dependencies from major projects

Major projects (the apache-commons libraries, eclipse, etc.) are much more likely to be packaged and supported properly in Linux distributions. If you use some unknown small library, packagers will have to package that first, and this can sometimes lead to such frustrating dependency chains that they will give up before packaging your software.

Do NOT patch your dependencies

Sometimes project A does almost exactly what you want, but not quite… so you patch it and ship it with your project B as a dependency. This will cause problems for Linux distributions, because you have basically forked the original project A. What you should do instead is work with the developers of project A to add the features you need or fix those pesky bugs.

Bugfix releases

Every software project has bugs, so sooner or later you will have to do a bugfix release. As always there are certain rules you should try to uphold when doing bugfix releases.

Use correct version numbers

This depends on your versioning scheme; I’ll assume you are using standard X.Y.Z versions for your releases. Changes in Z are the smallest released changes of your project: they should contain mostly bugfixes and, if necessary, unobtrusive and simple feature additions. If you want to add bigger features, you should change the Y part of the version.

Backward compatible

Bugfix releases have to be backwards compatible at all times. No API changes are allowed.

No changes in dependencies

You should not change dependencies or add new ones in bugfix releases. Even updating a dependency to a new version can cause a massive recursive need for updates or new dependencies. The only time it’s acceptable to change or add a dependency in a bugfix release is when the new dependency is required to fix the bug.

An excellent example of how NOT to do things was the Apache Maven update from 3.0 to 3.0.1. This update changed the requirement from Aether 1.7 to Aether 1.8. Aether 1.8 had a new dependency on async-http-client, which in turn depends on netty, jetty 7.x and more libraries. So what should have been a simple bugfix update turned into a major update of 1 package and 2 new package additions. If this update had contained security fixes, it would have caused serious problems to resolve in a timely manner.

Summary

  • Create source releases containing build scripts
  • Think about your dependencies carefully
  • Handle micro releases gracefully

Next time I’ll look into some Ant and Maven specifics that are causing problems for packagers and how to resolve them in your projects.

Accessing WebDAV calendar from commandline

March 11, 2011 in open source, programming, projects, python, software

Have you tried to access Zimbra or Google Calendar from the command line? I have. And I couldn’t find any decent command line client able to read and write these calendars, display alerts, etc. Well, there is the googlecl project, but it’s specific to Google Calendar and does not use the standard WebDAV iCal access methods.

Thus I set out to create a console application that would fulfil my needs. What are my requirements?

  • Read/write access to Google Calendar and Zimbra (at least)
  • Multiple remote calendars
  • Working alerts
  • Nice ncurses UI (but also ability to just display some info and quit)
  • Correct handling of timezones
  • Integration with mail client (open ics files received by email)
  • I guess a lot more :-)

I had a look at existing Python modules that work with iCalendar, WebDAV and a combination of both. There are quite a few of them, but I just didn’t like their APIs. They were usually complex and required knowledge of the iCal specification. So I decided to create a simplified module that would be easy to understand (even if not as powerful).

I named the project pywebcal (yes, unimaginative) and it’s now on github. I would LOVE some input. I know it’s far from perfect (or complete), but let’s see. For now it offers read-only support for Zimbra (Google should work too, but I haven’t tested it in a while).
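To give a flavour of the simplicity I am after, here is a purely hypothetical usage sketch; the names below are illustrative, not the actual pywebcal API:

# hypothetical sketch, not the real interface; see the example directory for that
from pywebcal import WebCal

cal = WebCal("https://zimbra.example.com/dav/user/Calendar/")
for event in cal.get_events():
    print event.summary, event.start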

You can have a look at the example directory, which contains one simple example you can run in place to see if it works :-) I did my best to create proper test cases covering problems with timezones and whatnot, and this helped me quite a lot with a recent refactoring. I am now using the vobject library as my backend and it is rather nice to use. The plan is to allow access to vobject components so that my simplified API does not prevent more advanced modifications.

The next step is obviously to start working on the ncurses application itself. Anyone want to help?