Archive for the ‘Java’ category

My UPNP Stack

2 March 2014 · No comments


I have already written that I am ripping my whole CD collection (see discogs for the already ripped part). I have written a tool for tagging my collection, and this is working great.

To play music on my stereo, I thought that UPNP would be a great protocol for it. The stack I am currently using involves the following toolset:

Please note that parts of this stack can be replaced by other components, but this stack is currently the one that works best (at least for me). All components use the UPNP enhancements from OpenHome, which is an open-source project of Linn, IIRC.

The MediaRenderer could be replaced by UPMPDCli, a nice UPNP frontend for MPD (my favourite music player). But then you should also use BubbleUPNPServer to enjoy all the benefits of the OpenHome extensions.

The MediaPlayer uses MPD or MPlayer to play the music. MPD offers quite some extensions, which can still be used with the above-mentioned environment.

One extension is LCD4Linux, which allows showing some information about the currently played song on a small LCD. This is working on my Raspberry, but unfortunately it also seems to have some problems, in that the display just freezes and the whole box needs to be restarted. Since the used display is also very small (see Pearl Display), I decided to invest some more time and money into something slightly larger (see TaoTronics Display, Power Supply, HDMI to Component Adapter, as well as a couple of additionally needed cables (MiniHDMI, Component…)). I do hope that this is going to work out. For this stack, LCD4Linux is not needed anymore, since this is a “normal” screen. Therefore I plan to integrate a full-screen display component into the MediaPlayer. As soon as this is finished, I will report back; right now I am still waiting for all the above-mentioned components to arrive.

At the beginning of my UPNP discoveries, I stumbled across the X10, which is also a nice toy, but unfortunately does not support gapless UPNP playback (see the X10 forum (German)). Unfortunately I needed to buy this device to discover that ;-( It is still a nice toy to play with, but right now it is just used for internet radio streaming, since even the tagging I did with DiscogsTagger is totally screwed up on this device, and the X10 shows me the albums in a totally different format than e.g. minimserver does.

So, you could buy yourself some expensive devices from Linn, Naim or others, or you spend your money on some decent hardware like a Raspberry Pi (uh, the sound of this device is not really good without the addition of a good DAC like HifiBerry) or a Cubox-i and invest some time in installing the above-mentioned stack; then all should be fine, without spending too much money or time.

Categories: Common, Java, Linux, Projects

What do I want from my personal NAS

22 July 2013 · No comments

In this post I would like to gather some personal requirements for the NAS system I am going to build.

Right now, I am in the process of ripping all my CDs (around 950 unique releases; more than half of them are already finished). The target is to store all these releases on a personal NAS with the ability to stream them to my stereo. For this I have already selected minimserver as the UPNP server. This server requires a JDK to run. Therefore the NAS I am going to build must be able to run Java.

Since I am already using Linux quite heavily, I do not want to run FreeNAS or NAS4Free on this NAS, although I am very interested in the underlying file system (ZFS) and/or FreeBSD.

Since Linux offers a “similar” file system (btrfs), I would like to use this one for the NAS.

The services which I would like to run on the NAS are then the following:

There are some other options which would be nice, but are not as “necessary”. There is e.g. Ajenti, which provides a nice WebGUI for the administration of the NAS, but this does not really correspond to the way Arch Linux works ;-) A possibility would be to use e.g. CentOS or Ubuntu as a distro, but I am unsure if this is really going to work out just for a nice GUI?

The above-mentioned requirements are not really tough for today's hardware, and therefore I would like to stick to the stack proposed in the nas-portal forum (see here).

Since I am going to use a file system which seems to be picky about power outages, I am in need of a UPS, and I am currently thinking about this one.

I am going to explain some more interesting stuff about the tagging of my FLACs for minimserver and about the used tool (discogstagger) in some future posts. Stay seated, so that you can see how an absolute hardware noob tries to build his own NAS ;-)

Categories: Arch Linux, Common, Java, Linux, Projects, Ubuntu

dbUnit in a Spring Environment with MSSQL

20 October 2011 · No comments

Today I needed to export and import some data from a database for testing purposes. I already knew dbUnit, and this seemed to be the tool of choice. I stumbled over a couple of problems, because the example I found of using dbUnit with the Spring Framework was using Oracle and, furthermore, the FlatXmlDataSet. I wanted to use the XmlDataSet, because it seems to be easier to maintain manually.

In the following I will try to show how to integrate dbUnit into the project. First of all, I needed to put the Maven plugin into place:

                <!--jar file that has the jdbc driver -->
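
Only the comment above survived from the original plugin configuration. A hedged reconstruction of a typical dbunit-maven-plugin setup looks roughly like this (coordinates, versions and connection details are placeholders, not the original values):

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>dbunit-maven-plugin</artifactId>
  <version>1.0-beta-3</version>
  <configuration>
    <driver>com.microsoft.sqlserver.jdbc.SQLServerDriver</driver>
    <url>jdbc:sqlserver://localhost:1433;databaseName=testdb</url>
    <username>test</username>
    <password>test</password>
    <dataTypeFactoryName>org.dbunit.ext.mssql.MsSqlDataTypeFactory</dataTypeFactoryName>
    <format>xml</format>
  </configuration>
  <dependencies>
    <!-- jar file that has the jdbc driver -->
    <dependency>
      <groupId>com.microsoft.sqlserver</groupId>
      <artifactId>mssql-jdbc</artifactId>
      <version>6.4.0.jre8</version>
    </dependency>
  </dependencies>
</plugin>
```

The important bit is the dataTypeFactoryName, which has to match the MSSQL factory used later in the test code.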


After this I was able to call mvn dbunit:export, which gave me an export of the current data inside the database. It generates a file target/dbunit/export.xml, which can then be used in the test cases (we are using JUnit).

The test cases now look something like this:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:/META-INF/spring/testAC.xml"})
public class ServiceTest {

	@Autowired
	private DataSource dataSource;

Here we are autowiring the dataSource from the application context; this is needed to extract all necessary information for the database connection of dbUnit.

	// the following methods make sure that the database is set up correctly with
	// dbUnit
	private IDatabaseConnection getConnection() throws Exception {
		// get the plain JDBC connection from the Spring-managed DataSource
		Connection con = dataSource.getConnection();
		IDatabaseConnection connection = new DatabaseConnection(con);
		DatabaseConfig config = connection.getConfig();
		config.setProperty(DatabaseConfig.PROPERTY_DATATYPE_FACTORY, new MsSqlDataTypeFactory());
		return connection;
	}

Here we are making sure that the connection is set up correctly and that the right DataTypeFactory is used. This should be the same factory as used in the pom.xml (see above).

	private IDataSet getDataSet() throws IOException, DataSetException {
		File file = new File("src/test/resources/dbexport.xml");
		Reader reader = new FileReader(file);
		return new XmlDataSet(reader);
	}

Fetching the dataset from a file and using it in the DatabaseOperation of dbUnit.
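
For reference, the XmlDataSet format that the export produces (and that the test reads back) looks roughly like this; table and column names below are illustrative, not from the original project:

```xml
<?xml version='1.0' encoding='UTF-8'?>
<dataset>
  <table name="ENTITY">
    <column>id</column>
    <column>version</column>
    <row>
      <value>9a5a8eb1f02b4e06ba9117a771f2b69c</value>
      <value>2</value>
    </row>
  </table>
</dataset>
```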

	@Test
	public void testDelete() throws Exception {
		DatabaseOperation.CLEAN_INSERT.execute(getConnection(), getDataSet());
		EntityVersionId id = new EntityVersionId("9a5a8eb1f02b4e06ba9117a771f2b69c", 2L);
		Entity entity = this.entityService.find(id);
		// ... delete the entity and verify the result
	}

Please note that we are using a special EntityVersionId, which is part of our framework and combines two values: the usual ID, a UUID (String), and a “version” of type long. I guess you will most probably not use something like this in your project.

That's it, now everything works as expected ;-)

Categories: Java, Projects

Getting into CXF

9 October 2011 · No comments

For my new job (oh, I did not blog about this one, but more information will follow soon) I am currently investigating a couple of EAI frameworks. One of them is Apache CXF. For this investigation, I am implementing a very easy task and using an easy tutorial (Creating a REST service with CXF and Spring in 10 minutes). Well, to be totally honest, it took me more than 10 minutes to get this up and running. Of course, this is mainly due to the fact that I wanted to gather some more information about CXF and used my own services.
The first issue I stumbled upon was the error message “no resource classes found”. The problem here was that my resource service (call it controller or whatever) implemented an interface, and the JAX-RS annotations were defined on the interface instead of the concrete class. This was not working correctly. Now I define all the annotations on the concrete class, and everything works fine.
Another issue, which I got on Tomcat but not on Jetty, was a problem with the @Path annotation. I had defined @Path(“/folder”) on the class, and on the concrete method I defined another @Path(“/{id}”). This threw an exception on starting up Tomcat. After removing the “/” from the second @Path (so: @Path(“{id}”)), everything started up fine. This was another step in the right direction.
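
Both fixes can be sketched in one resource class (class, path and method names are illustrative, not from the original project):

```java
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;

// The JAX-RS annotations go on the concrete class, not on an implemented
// interface; otherwise CXF reports "no resource classes found".
@Path("/folder")
public class FolderResource {

    // Note: no leading "/" on the method-level @Path.
    // @Path("/{id}") here made Tomcat fail on startup.
    @GET
    @Path("{id}")
    @Produces("text/html")
    public String find(@PathParam("id") String id) {
        return "folder " + id;
    }
}
```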

Categories: Java, Projects

Java 7 – minor classloading difficulties

13 August 2011 · No comments

Since I am using Arch Linux, I am more than accustomed to using the latest and greatest versions of everything. Unfortunately, this is not always a good thing. During the last couple of days I experienced a couple of class loading issues with Java 7 (as opposed to Java 6).
I am currently testing Broadleaf Commerce and had to report an issue to these guys because of some problems I received during compilation and while running the application. Something similar happened to me on my project at work as well. I call this “class loading issues”, but it is probably slightly more than that: I have problems loading configuration data correctly (see issue 96 on the Broadleaf JIRA).
To work around this issue, I just installed Java 6 again. Now it is working like a charm.

Categories: Arch Linux, Common, Java, Linux

JAXB Experiences II

13 August 2011 · 1 comment

As already stated, we have a large object tree which we are exporting to XML using JAXB. Now we are also importing this tree again. Because of several bi-/uni-directional dependencies, this object tree is kind of hard to reflect in XML. We have one root element and several objects depending on this element uni-directionally. Since the export is started from the root element, we are calling the corresponding service to find all uni-directional dependencies. These dependencies are exported into separate XML files.
Since the child holds a reference to the root, we need to reflect this via a reference on the property; otherwise the XML file would become quite large and would contain the whole objects more than once, which leads to problems as well.
The problem is now that @XmlIDREF does not really work here, because the @XmlIDREF would not reference another object in the same file (the root element is defined in the original file). To work around this, we are using @XmlJavaTypeAdapter on the root property in the child object. This adapter needs to be initialized and assigned to the unmarshaller. Please take a look at the example below:

	private Object unmarshall(Class clazz, Map<Class, XmlAdapter> adapters, String fileName) throws Exception {
		// Create a JAXB context passing in the class of the object we want to marshal/unmarshal
		final JAXBContext context = JAXBContext.newInstance(clazz);

		// Create the unmarshaller, this is the nifty little thing that will actually transform the XML back into objects
		final Unmarshaller unmarshaller = context.createUnmarshaller();
		unmarshaller.setEventHandler(new javax.xml.bind.helpers.DefaultValidationEventHandler());
		for (Class adapterClass : adapters.keySet()) {
			unmarshaller.setAdapter(adapterClass, adapters.get(adapterClass));
		}

		logger.info("Starting unmarshaller");
		Object unmarshalledObject = unmarshaller.unmarshal(new FileInputStream(fileName));
		logger.info("Finished unmarshaller");

		return unmarshalledObject;
	}

The concrete adapters are added in the calling method:

	public Child unmarshallEntityWrapper(RootElement element, String fileName) throws Exception {
		Map<Class, XmlAdapter> adapters = new HashMap<Class, XmlAdapter>();

		determineAdapter(element, adapters);
		return (Child) this.unmarshall(Child.class, adapters, fileName);
	}

	private void determineAdapter(RootElement element, Map<Class, XmlAdapter> adapters) {
		RootElementAdapter adapter = new RootElementAdapter();
		for (SubElement subElement : element.getSubElements()) {
			adapter.put(subElement.getId(), subElement);
		}

		adapters.put(RootElementAdapter.class, adapter);
	}

The adapter itself is pretty straightforward:

public class RootElementAdapter extends XmlAdapter<String, RootElement> {

	private Map<String, RootElement> rootElements = new HashMap<String, RootElement>();

	public Map<String, RootElement> getRootElements() {
		return rootElements;
	}

	public void put(String id, RootElement element) {
		rootElements.put(id, element);
	}

	@Override
	public RootElement unmarshal(String id) throws Exception {
		return rootElements.get(id);
	}

	@Override
	public String marshal(RootElement rootElement) throws Exception {
		return rootElement.getId();
	}
}
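
The resolve-by-id behaviour of such an adapter can be exercised in isolation. Below is a minimal, runnable sketch in which the real XmlAdapter base class is replaced by a simplified stand-in (XmlAdapterStub), so no JAXB dependency is needed; all class names here are illustrative:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-in for javax.xml.bind.annotation.adapters.XmlAdapter,
// so the lookup logic can be run without JAXB on the classpath.
abstract class XmlAdapterStub<ValueType, BoundType> {
    public abstract BoundType unmarshal(ValueType value) throws Exception;
    public abstract ValueType marshal(BoundType bound) throws Exception;
}

class RootElement {
    private final String id;

    RootElement(String id) { this.id = id; }

    String getId() { return id; }
}

class RootElementAdapter extends XmlAdapterStub<String, RootElement> {

    // id -> already loaded RootElement, filled before unmarshalling starts
    private final Map<String, RootElement> rootElements = new HashMap<String, RootElement>();

    void put(String id, RootElement element) {
        rootElements.put(id, element);
    }

    @Override
    public RootElement unmarshal(String id) {
        return rootElements.get(id);
    }

    @Override
    public String marshal(RootElement rootElement) {
        return rootElement.getId();
    }
}

public class AdapterDemo {
    public static void main(String[] args) throws Exception {
        RootElementAdapter adapter = new RootElementAdapter();
        RootElement root = new RootElement("root-1");
        adapter.put(root.getId(), root);

        // Marshalling writes only the id; unmarshalling resolves it back
        // to the very same instance that was registered beforehand.
        System.out.println(adapter.marshal(root));               // root-1
        System.out.println(adapter.unmarshal("root-1") == root); // true
    }
}
```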

The classes themselves are pretty straightforward as well. Please notice that the ChildElement is stored in another XML file, and because of this we need the adapter, as already stated above. The RootElement does not have any relationship to the child elements:

@XmlRootElement(name = "RootElement")
public class RootElement implements Serializable {
}

public class ChildElement implements Serializable {

	private RootElement rootElement;

	@XmlJavaTypeAdapter(RootElementAdapter.class)
	public RootElement getRootElement() {
		return rootElement;
	}
}
In the service which handles the marshalling, we now fetch all ChildElements belonging to the RootElement via a service method like so:

public List<ChildElement> findChildsByRootElement(RootElement rootElement) {

Because we would like to marshall these elements into their own XML file, we have to create a wrapper object, which basically wraps all elements:

@XmlSeeAlso({ChildElement.class, ChildA.class, ChildB.class})
public class ChildElementWrapper {

	private Collection<ChildElement> childElements;

	public ChildElementWrapper() {
	}

	public ChildElementWrapper(Collection<ChildElement> childElements) {
		this.childElements = childElements;
	}

	@XmlElementWrapper(name = "childElements")
	@XmlElement(name = "childElement")
	public Collection<ChildElement> getChildElements() {
		return childElements;
	}

	public void setChildElements(Collection<ChildElement> childElements) {
		this.childElements = childElements;
	}
}
Another important thing we learned during the implementation of this import/export layer was that the usual inheritance from Hibernate using discriminators is not handled really nicely. The wrapper class has to use the @XmlSeeAlso annotation, as stated above. Now all the concrete implementations of the ChildElement class are marked with the type attribute in the XML file. Since we are using Hibernate's replicate and we need the discriminator value, we are using the following solution:

			switch (childElement.getElementType()) {
				case A:
					ChildA childA = (ChildA) childElement;
					this.jcDao.updateDiscriminator(childA, "DiscriminatorA");
					break;
				case B:
					ChildB childB = (ChildB) childElement;
					this.jcDao.updateDiscriminator(childB, "DiscriminatorB");
					break;
			}

	public Object store(Object object) {
		if (this.entityManager == null) throw new IllegalStateException("Entity manager has not been injected");

		Session session = (Session) this.entityManager.getDelegate();

		session.replicate(object, ReplicationMode.OVERWRITE);

		return object;
	}

	public void updateDiscriminator(Object object, String discriminator) {
		if (this.entityManager == null) throw new IllegalStateException("Entity manager has not been injected");

		Session session = (Session) this.entityManager.getDelegate();

		String hqlUpdate = "update ChildElement set type = :discriminator where id = :id";
		int updatedEntities = session.createQuery(hqlUpdate)
				.setString("id", object.toString())
				.setString("discriminator", discriminator)
				.executeUpdate();
	}

I hope that this explanation is easy enough to follow ;-)

Categories: Java

Experiences with JAXB for large data structures

29 July 2011 · 1 comment

We are implementing a data export/import for a Spring/Hibernate application with JAXB. During the development of this feature, I learned quite some lessons and would like to share them with you.

First of all, we have an entity structure using inheritance all over the place. All entities inherit from a common “BaseEntity”. Furthermore, we are using bi-directional relationships with Hibernate for most of the entities as well. Therefore the data export with JAXB grows quite large, and furthermore we received an “OutOfMemoryError: Java heap space” exception. While learning JAXB in a trial-and-error fashion, we also ran into the “Class has two properties of the same name” issue. How to solve all these issues?

First of all, we tried and implemented the CycleRecoverable interface on our BaseEntity. The BaseEntity then looks like the following:

public class BaseEntity implements Serializable, GenericEntity<String>, CycleRecoverable {

	@GenericGenerator(name = "system-uuid", strategy = "uuid")
	protected String id;

	protected Integer version;

	public Object onCycleDetected(Context context) {
		BaseEntity entity = new BaseEntity();
		return entity;
	}

	@XmlID
	public String getId() {
		return id;
	}

	// more getters and setters
}

Looking closely at the above code, you will recognize the method onCycleDetected. This method is defined in the interface “com.sun.xml.bind.CycleRecoverable”.

HINT: Do not use the sometimes suggested “com.sun.xml.internal.bind.CycleRecoverable” interface. If you do not have the “correct” interface, add the following dependency to your pom:
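
The interface ships with the JAXB reference implementation, so a dependency along these lines should pull it in (the version is just an example):

```xml
<dependency>
  <groupId>com.sun.xml.bind</groupId>
  <artifactId>jaxb-impl</artifactId>
  <version>2.2.4</version>
</dependency>
```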


Furthermore, you will recognize that the @XmlID annotation is defined on the method and not on the property. This is due to the fact that this method is already defined in the interface GenericEntity; otherwise you would receive the “Class has two properties of the same name” exception. This is one lesson learned: declare the @XmlID and @XmlIDREF annotations on the methods and not on the properties ;-)

One level above the mentioned BaseEntity we have another entity called BaseEntityParent, which defines a relationship to child objects containing some language-specific settings:

public abstract class BaseEntityParent<T extends BaseEntityDetail> extends
		BaseEntity {

	@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.EAGER,
			mappedBy = "parent", orphanRemoval = true)
	@MapKey(name = "isoLanguageCode")
	private Map<String, T> details;

	// more stuff in here
}

This object caused the “Class has two properties of the same name” exception in the first place, because the child object references the parent as well. Therefore we implemented the CycleRecoverable interface in the BaseEntity.

public class BaseEntityDetail<T extends BaseEntityParent> extends BaseEntity {

	@ManyToOne(cascade = {CascadeType.PERSIST, CascadeType.MERGE})
	private T parent;

	@Size(max = 10)
	@Column(nullable = false, length = 10)
	private String isoLanguageCode;
}

After the implementation of the CycleRecoverable interface, this problem was gone. We still received, as already stated, the out-of-memory exception on an object with a large set of dependent objects. Therefore we are now using the @XmlIDREF annotation in the related bi-directional objects. In the following, we have the object Module, which has some bi-directional relationships to other objects:

@XmlRootElement(name = Module.MODEL_NAME)
public class Module extends BaseEntityParent<ModuleDetail> implements
		Serializable {

	@OneToMany(fetch = FetchType.LAZY, mappedBy = "module", orphanRemoval = true)
	@XmlElementWrapper(name = Document.MODEL_NAME_PLURAL)
	@XmlElement(name = Document.MODEL_NAME)
	private Set<Document> documents = new HashSet<Document>();

	// more properties
}

@XmlRootElement(name = Document.MODEL_NAME)
public class Document extends BaseEntityParent<DocumentDetail> {

	@ManyToOne(cascade = {CascadeType.MERGE, CascadeType.PERSIST})
	private Module module;

	@XmlIDREF
	public Module getModule() {
		return module;
	}
}
The Document now defines the @XmlIDREF on the method, as stated above (otherwise we would receive the “Class has two properties of the same name” exception). This makes sure that the Module is marshalled just once, and all references to this object are marshalled only with the ID of the object.
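
This marshal-once behaviour can be reproduced with a small self-contained example. Class and element names here are illustrative, not our real entities, and it needs the javax.xml.bind API on the classpath (bundled with the JDK up to Java 8):

```java
import java.io.StringWriter;
import java.util.ArrayList;
import java.util.List;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlElementWrapper;
import javax.xml.bind.annotation.XmlID;
import javax.xml.bind.annotation.XmlIDREF;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "module")
@XmlAccessorType(XmlAccessType.FIELD)
class Module {
    @XmlID
    String id;

    @XmlElementWrapper(name = "documents")
    @XmlElement(name = "document")
    List<Document> documents = new ArrayList<Document>();
}

@XmlAccessorType(XmlAccessType.FIELD)
class Document {
    @XmlID
    String id;

    // marshalled as the module's id only, never as a nested copy;
    // this is what breaks the cycle and keeps the XML small
    @XmlIDREF
    Module module;
}

public class XmlIdRefDemo {
    public static void main(String[] args) throws Exception {
        Module m = new Module();
        m.id = "module-1";

        Document d = new Document();
        d.id = "doc-1";
        d.module = m;
        m.documents.add(d);

        StringWriter out = new StringWriter();
        Marshaller marshaller = JAXBContext.newInstance(Module.class).createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, Boolean.TRUE);
        marshaller.marshal(m, out);

        // the <document> element contains only an id reference to the module,
        // not the full Module object
        System.out.println(out);
    }
}
```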

Most probably there is still some room for improvement, but this approach (after being implemented on all related objects) saved enough space to get rid of the “Out of Memory” exception. By implementing this “pattern”, we lowered the size of the marshalled XML file quite a lot: in the first successful run, the XML file was around 33MB; now it is just around 2MB for the same set of data.

One more word about the implementation of the CycleRecoverable interface. You can remove this implementation as soon as you have put all @XmlID and @XmlIDREF annotations in place. For me it was really easy to find all missing parts, because I already had a marshalled object tree in place (with CycleRecoverable) and could easily find the missing parts (without CycleRecoverable) from the error messages, which look like:

MarshalException: A cycle is detected in the object graph. 
This will cause infinitely deep XML: 
ff80808131616f9b01316172b9840001 -> ff80808131616f9b013161797cda0019 -> 

I was able to search for the entities using the IDs in the already marshalled file ;-)

I hope you find this one helpful. Give it a try yourself, and report back (in the comments or via email) if you have problems.

So, basically you do not need to implement the CycleRecoverable interface if you put @XmlID and @XmlIDREF in place; that's definitely another lesson I learned during this implementation.

I think our entity structure is not that unusual, and we have implemented some other nice stuff (e.g. the language-specific details) which could be useful for you as well. So, if you have any questions about this, I am more than willing to help you out.

Categories: Java, Projects

Spring WebMVC Ajax Post-Requests

21 June 2011 · No comments

Today I tried to improve the performance of our newest webapp. We are using a Dojo tree for displaying the structure of a document on the left side. This tree can be quite big, and loading it takes quite some time. Therefore I decided to use AJAX to load only fragments of the page where possible.

This went quite well for most of the pages, because for GET requests you can easily use something like

     var xhrArgs = {
         url: url,
         content: { "fragments": "body" },
         headers: {"Accept": "text/html;type=ajax"},
         handleAs: "text",
         preventCache: false,
         sync: true,
         load: Spring.remoting.handleResponse,
         error: Spring.remoting.handleError
     };

     // Call xhrGet
     var foo = dojo.xhrGet(xhrArgs);

Unfortunately, this is not as easy for POST requests. For those you can use (I found it after quite some googling) this one. Please take a close look at the end of that page and you will see an example showing exactly this case. You have to use the formId parameter, otherwise it will not work.

Categories: Java

Hudson vs. Jenkins – Jenkins releasing often

9 March 2011 · No comments

This seems to be one of the main differences between the two projects: the release cycle. While Hudson is right now at 1.396, Jenkins is already at 1.400. I wonder if the bugfixes in Jenkins are all merged back into Hudson, since there are quite some “nice” fixes in there, especially some for Maven projects (see the Jenkins changelog). Both projects have still not fixed my personal highlight bug (Error Generating Documentation). I hope that one of them will fix it soon. I promise that I will then use the corresponding fork and will not look back ;-)

Let's see which fork is going to win this race.

Categories: Java, Projects

Hudson vs. Jenkins – Hudsons future

8 March 2011 · No comments

Yesterday Sonatype announced a free webinar about the future of Hudson. I am very interested in the presentation from Jason van Zyl and its outcome. So, I guess, up until then there will be no real news about this topic.

The presentation seems to be focused on Sonatype's (positive) influence on Hudson and the changes they would like to make. I guess that this will also contain some output from the survey Sonatype has taken on the “Future of Hudson”.

Since JSR-330 (Dependency Injection) is now already implemented in Hudson, I guess that the plugin API will change slightly to use this new concept (which is great, IMHO). Furthermore, there will be (IMHO) some changes to the UI of Hudson to make a clearer distinction between Hudson and Jenkins.

So, IMHO Hudson is going to be an integral part of Sonatype's offerings; let's see where this leads the project and the community.

Categories: Java, Projects