Thursday, June 29, 2006

Spring; Hibernate; A Super Simple DAO Layer

This note provides a Super Simple example of Spring and Hibernate usage. The main advantage of the technique that I discuss is that the same DAO layer can be used to create, read, update, and delete (CRUD) any Java object. So if your project doesn't need relationships between tables controlled by the DAO layer, you'll be able to use this technique. One caveat that I should mention: each action (create, update, etc.) happens inside its own transaction. So if you need to perform more than one database operation inside a transaction, you'll need to extend this technique slightly. I'll come back to this topic when database operations are discussed. We'll use the following database table because numeric, varchar, and timestamp probably cover 99% of the data that needs to be stored.
CREATE TABLE super_simple (
  id serial PRIMARY KEY
   ,name varchar(100)
   ,count numeric
   ,created timestamp
)
I'll use PostgreSQL for this example, but any database can be used. Now that the database table has been defined, the next step is to define a value object to hold the Java representation of the database record.
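Before moving on, a quick smoke test in psql confirms the DDL is usable (the values here are throwaway and assume you created the table in your working database):

```sql
INSERT INTO super_simple (name, count, created) VALUES ('smoke test', 1, now());
SELECT * FROM super_simple;
DELETE FROM super_simple;
```

Now, the value object: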
package com.codebits.vo;

/** This class serves as a template for Java objects saved via Hibernate. It
* mainly ensures that all records in the database have the same primary key
* semantics.
*
* @author medined
*/
public abstract class HibernateValueObject {

  private int id = -1;

  public int getId() {
      return this.id;
  }

  public void setId(int _id) {
      this.id = _id;
  }
}
First I create an abstract class to hold behaviors that I want all my value objects to have. In this case, all of my objects will use a non-domain id so that all object parameters can be changed at will. This style of database design is not necessarily a best practice - but it is super easy!
package com.codebits.vo;

import java.util.Date;

public class SuperSimpleRecord extends HibernateValueObject {

  private int count;

  private Date created = null;

  private String name = null;

  // standard getters and setters.

}
Now the database record is defined in Java. Let's go ahead and define the record to Hibernate. Somewhere on the classpath, place a file called super_simple.hbm.xml.
<?xml version="1.0"?>

<!DOCTYPE hibernate-mapping PUBLIC
      "-//Hibernate/Hibernate Mapping DTD//EN"
      "http://hibernate.sourceforge.net/hibernate-mapping-3.0.dtd">

<hibernate-mapping>

  <class name="com.codebits.vo.SuperSimpleRecord" table="super_simple">
      <id name="id" column="id" type="int" unsaved-value="-1">
          <generator class="hilo" />
      </id>
      <property name="name" column="name" type="string" length="100" not-null="true" />
      <property name="count" column="count" type="integer" not-null="false" />
      <property name="created" column="created" type="timestamp" not-null="true" />
  </class>

</hibernate-mapping>
This XML file tells Hibernate that the ID value is the primary key. When the object is initially created, the ID value is -1. When the record is persisted, the ID value is updated with some value generated by Hibernate or the database - it doesn't matter where or how the value was generated. The key point to remember is that you never change it. What's next? We have defined the record to both Java and Hibernate. So let's get Spring involved. I call my Spring configuration file applicationContext.xml, but the name isn't important. Whatever it's called, just make sure that it's on the classpath.
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE beans PUBLIC "-//SPRING//DTD BEAN//EN" "http://www.springframework.org/dtd/spring-beans.dtd">
<beans>

  <!--
      This is the information needed to connect to the database.
  -->
  <bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
      <property name="driverClassName" value="org.postgresql.Driver"/>
      <property name="url" value="jdbc:postgresql://192.168.1.101:5432/play"/>
      <property name="username" value="play"/>
      <property name="password" value="play"/>
      <property name="defaultAutoCommit" value="false"/>
  </bean>

</beans>
Following the Super Simple philosophy, let's make sure you can talk to your database. Here is the SuperSimpleDriver class, which reads applicationContext.xml; that automatically creates the dataSource bean and tests the database connection:
package com.codebits.drivers;

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class SuperSimpleDriver {

  public static void main(String[] args) {
      ApplicationContext ctx = null;
   
      try {
          ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
      } catch (Exception e) {
          e.printStackTrace();
      } finally {
          System.out.println("Done.");         
      }

  }

}
Note: When Commons DBCP's BasicDataSource class can't connect to the database, the error is only shown in the log messages. Make sure that you skim through all of the log messages looking for errors. If the program can't connect to your database, you'll see that a SQLNestedException was thrown. For some reason that exception seems untrappable via the try-catch mechanism. I'm not happy about it and perhaps I have some incorrect settings. At some point, I'll grab the Spring source files and track down the problem. But that's not a super simple topic. So moving on! Run the SuperSimpleDriver program. If the applicationContext.xml file isn't found, make sure that its directory is on the classpath. When you have no errors, add the following bean to applicationContext.xml.
    <!--
      This is the information needed to configure Hibernate. Notice that
      the hbm.xml file is listed here. And there is a reference to the
      dataSource bean created above. The rest of the properties configure
      Hibernate's SQL logging, statistics, query cache, and connection
      pooling. (The c3p0 settings only take effect when Hibernate manages
      its own connection pool; with the dataSource bean above they are
      ignored.) Since we're being super simple, just use the supplied
      values - don't worry about what they actually do.
  -->
  <bean id="sessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
      <property name="dataSource" ref="dataSource" />
      <property name="mappingResources">
          <list>
              <value>super_simple.hbm.xml</value>
          </list>
      </property>
      <property name="hibernateProperties">
          <props>
              <prop key="hibernate.dialect">org.hibernate.dialect.PostgreSQLDialect</prop>
              <prop key="hibernate.show_sql">true</prop>
              <prop key="hibernate.c3p0.minPoolSize">5</prop>
              <prop key="hibernate.c3p0.maxPoolSize">200</prop>
              <prop key="hibernate.c3p0.timeout">1800</prop>
              <prop key="hibernate.c3p0.max_statements">50</prop>
              <prop key="hibernate.generate_statistics">true</prop>
              <prop key="hibernate.cache.use_query_cache">true</prop>
          </props>
      </property>
  </bean>
Run the SuperSimpleDriver program again. Hopefully, you'll still have no errors. The next topic is the Java DAO layer. The goal here is to provide a way to create, read, update, and delete objects - one at a time with each action being inside its own transaction. We'll start, as before, with an abstract class.
package com.codebits.dao.hibernate.actions;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public abstract class SimpleAction {

  private SessionFactory sessionFactory = null;

  abstract void doAction(final Session session, final Object o);

  public void execute(final Object o) {
      Session session = sessionFactory.openSession();
      Transaction tx = null;
      try {
          tx = session.beginTransaction();
          doAction(session, o);
          tx.commit();
      } catch (Exception e) {
          if (tx != null) {
              tx.rollback();
          }
          e.printStackTrace();
      } finally {
          if (session != null) {
              session.close();
          }
      }
  }

  public SessionFactory getSessionFactory() {
      return this.sessionFactory;
  }

  public void setSessionFactory(SessionFactory _sessionFactory) {
      this.sessionFactory = _sessionFactory;
  }

}
This abstract class holds Hibernate's SessionFactory object and provides the transaction environment that our concrete actions need. The doAction abstract method will be overridden by each concrete class to handle the 'real' action. Here are the concrete classes for Create, Update, and Delete. They are so super simple that I'll present all three in a row.
package com.codebits.dao.hibernate.actions;

import org.hibernate.Session;

public class Create extends SimpleAction {
  public void doAction(final Session session, final Object o) {
      session.save(o);
  }
}

package com.codebits.dao.hibernate.actions;

import org.hibernate.Session;

public class Update extends SimpleAction {
  public void doAction(final Session session, final Object o) {
      session.update(o);
  }
}

package com.codebits.dao.hibernate.actions;

import org.hibernate.Session;

public class Delete extends SimpleAction {
  public void doAction(final Session session, final Object o) {
      session.delete(o);
  }
}
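The shape of these classes is the classic template method: SimpleAction owns the begin/commit/rollback bracketing, and each subclass contributes a single line. Stripped of Hibernate, the same pattern looks like this (a self-contained sketch; every name in it is hypothetical):

```java
// Self-contained illustration of the template-method pattern used by
// SimpleAction: the base class brackets the work, the subclass supplies
// only doAction. No Hibernate involved.
abstract class TracingAction {

    public String execute(String input) {
        StringBuilder trace = new StringBuilder("begin;");
        try {
            // doAction is the one line each subclass provides.
            trace.append(doAction(input)).append(";commit");
        } catch (RuntimeException e) {
            trace.append("rollback");
        }
        return trace.toString();
    }

    abstract String doAction(String input);
}

public class TemplateMethodDemo {
    public static void main(String[] args) {
        TracingAction upper = new TracingAction() {
            String doAction(String input) { return input.toUpperCase(); }
        };
        System.out.println(upper.execute("save"));  // prints begin;SAVE;commit
    }
}
```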
NOTE: Hibernate has a saveOrUpdate method which should allow us to combine the Create and Update classes into one Merge class. However, I could not get it to work properly. So, in true Super Simple fashion, I ignored the problem and continued on my way. Before seeing how the above classes are defined in applicationContext.xml, we'll look at the read part of CRUD. The general consensus seems to be that read classes should be referred to as finders. Here is my abstract finder.
package com.codebits.dao.hibernate.finders;

import java.util.List;

import org.hibernate.Session;
import org.hibernate.SessionFactory;

public abstract class FindAll {

  private SessionFactory sessionFactory = null;

  private String fullyQualifiedClassname = null;

  public List execute() {
      List rv = null;
      Session session = sessionFactory.openSession();
      try {
          rv = session.createQuery("from " + getFullyQualifiedClassname()).list();
      } finally {
          if (session != null) {
              session.close();
          }
      }
      return rv;
  }

  public SessionFactory getSessionFactory() {
      return this.sessionFactory;
  }

  public void setSessionFactory(SessionFactory _sessionFactory) {
      this.sessionFactory = _sessionFactory;
  }

  public String getFullyQualifiedClassname() {
      return this.fullyQualifiedClassname;
  }

  public void setFullyQualifiedClassname(String _fullyQualifiedClassname) {
      this.fullyQualifiedClassname = _fullyQualifiedClassname;
  }

}
Unfortunately, this class is only simple ... not super simple. But it's the simplest we can do. The class has only two parameters - the SessionFactory and the name of the class that we want to persist. Explaining the execute method in depth would not serve the purposes of this blog entry; there are plenty of web sites that explain how to write a Hibernate query. I'll stop at saying that the execute method returns a list of all objects of a given type. The abstract class needs to be made concrete in order to use it, as shown below.
package com.codebits.dao.hibernate.finders;

import com.codebits.vo.SuperSimpleRecord;

public class FindAllSuperSimpleRecords extends FindAll {

  public FindAllSuperSimpleRecords() {
      super();
      setFullyQualifiedClassname(SuperSimpleRecord.class.getName());
  }

}
For those of you unfamiliar with SuperSimpleRecord.class.getName(), it will return "com.codebits.vo.SuperSimpleRecord". Using the getName() technique ensures that the class name is spelled correctly since the compiler will complain if it is incorrect. Now that the Java actions are known, we need to define them to Spring. Add the following definitions to applicationContext.xml.
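As a quick, framework-free aside, here is getName() by itself (the class name here is arbitrary):

```java
public class GetNameDemo {
    public static void main(String[] args) {
        // getName() returns the fully qualified class name as a String.
        System.out.println(String.class.getName());  // prints java.lang.String
    }
}
```

Back to applicationContext.xml: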
    <bean id="create" class="com.codebits.dao.hibernate.actions.Create">
      <property name="sessionFactory" ref="sessionFactory" />
  </bean>

  <bean id="update" class="com.codebits.dao.hibernate.actions.Update">
      <property name="sessionFactory" ref="sessionFactory" />
  </bean>

  <bean id="delete" class="com.codebits.dao.hibernate.actions.Delete">
      <property name="sessionFactory" ref="sessionFactory" />
  </bean>

  <bean id="findAllSuperSimpleRecords" class="com.codebits.dao.hibernate.finders.FindAllSuperSimpleRecords">
      <property name="sessionFactory" ref="sessionFactory" />
  </bean>
Everything is now defined and we just need to expand the driver program to do some DAO work. Add the following code inside the try-catch of SuperSimpleDriver.java, after the ctx initialization. (You'll also need imports for java.util.Date, java.util.List, and the com.codebits value object and DAO classes.)
            /* Define the object that we want to persist. */
          SuperSimpleRecord o = new SuperSimpleRecord();
          o.setName("David Medinets");
          o.setCount(10);
          o.setCreated(new Date());

          // persist it.
          ((Create) ctx.getBean("create")).execute(o);

          // find all objects. Our object will be the only one in the list.
          List list = ((FindAllSuperSimpleRecords) ctx.getBean("findAllSuperSimpleRecords")).execute();
          System.out.println("Size: " + list.size());
Now execute the driver program. The size should be 1. Check the database. Hopefully, you'll be able to see the record. The other actions are called like this:
            // persist the change.         
          ((Update) ctx.getBean("update")).execute(o);

          ((Delete) ctx.getBean("delete")).execute(o);
We're done with the Super Simple example. You can create more complex actions by subclassing the SimpleAction class and doing whatever work you need inside the doAction method. For example, create a variation of SimpleAction.execute which takes a variable number of arguments and saves multiple objects at once. Since the SimpleAction abstract class handles the transaction, your concrete class doesn't need to deal with transactions; it can focus on persisting whatever objects it cares about. Additional database tables can easily be added to this Super Simple system. Here are the steps to follow:

1. Create the database table.
2. Create the value object Java class.
3. Create the hbm.xml file.
4. Update the sessionFactory definition to load the hbm.xml file.
5. Create a finder class.
6. Create a driver class.

A future blog entry of mine will show how to handle more sophisticated transactions. However, you might be surprised how complicated an application you can write just using this Super Simple DAO framework.

Wednesday, June 28, 2006

PostgreSQL; Connecting to PostgreSQL from a remote client.

I see many web pages that mention the pg_hba.conf file, which controls which hosts and users can connect to which database. However, there is a listen_addresses parameter in postgresql.conf which also needs to be set. By default, it is commented out so that no remote client connections can be made - certainly a reasonable security precaution.

In order to accept connections from remote clients you need to set the listen_addresses parameter to something other than localhost. For example, if your PostgreSQL server sits on 192.168.1.123 then that could be the value for listen_addresses. If your PostgreSQL server has multiple addresses, you could use * so that remote clients can connect to any of the IP addresses. Or use a comma-delimited list if only some of the IP addresses should be used when connecting to PostgreSQL.
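For example, with a server at 192.168.1.123 and a database and user both named play (hypothetical values), the relevant lines in the two files might look like this:

```
# postgresql.conf
listen_addresses = '192.168.1.123'     # or '*' to listen on every interface

# pg_hba.conf - let user play reach database play from the local subnet
host    play    play    192.168.1.0/24    md5
```

Note that changing listen_addresses requires a server restart; a reload is not enough for that setting.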

Ubuntu Disk Partition Sizes

Various documentation that I read said that the root partition of Ubuntu needed less than 200Mb. However, I've found that not to be true. In my situation, I ran out of disk space, so I've bumped the root partition to 500Mb. On my 80Gb drive, here are my partition sizes:

medined@thog:~$ df
Filesystem           1K-blocks      Used Available Use% Mounted on
/dev/hda1               481764    282766    173296  63% /
/dev/hda8             51675672    923812  48126844   2% /data
/dev/hda5              3099260   1593760   1348068  55% /usr
/dev/hda7             10317828    556548   9237164   6% /usr/local
/dev/hda6             10317828    337728   9455984   4% /var

I also have a 1,000Mb swap - mainly because when I tried to install Oracle it asked for a 750Mb swap file.

UPDATE: One mistake (there are probably others) that I made was not realizing that the /tmp path needs a lot of room because that is where installations place their files. So, three choices: give the root partition another couple of gigabytes just for software installation; give /tmp its own 2Gb partition; or, as I am doing, constantly tell each installation process to place temp files in a /data/tmp (or whatever) directory.

Spring; BeanDefinitionStoreException; Fix for 'Unexpected failure during bean definition parsing'

While working to get one of my projects, which was working under Windows, running under Ubuntu Linux, I ran into the following error:

org.springframework.beans.factory.BeanDefinitionStoreException: Error ' element 
for property 'location' is only allowed to contain either 'ref' attribute OR 'value' 
attribute OR sub-element' in resource'class path resource [applicationContext.xml]' 
at: Bean 'propertyConfigurer'

The error message was definitely misleading since the XML file hadn't changed between the two OSes.

The problem was that I was using GNU's Java v1.4 instead of Sun's Java v1.5. I don't know if it was GNU vs Sun or v1.4 vs v1.5 that was the real problem. But switching to Sun Java v1.5 made Spring work which is good enough for me.

Oh... one quick note. Don't simply copy your Windows JDK directory to Linux and expect it to work. Download the Linux version from Javasoft. I installed mine to /usr/local.

Sunday, June 25, 2006

Ubuntu, Oracle Installation: No response from /etc/init.d/oracle-xe configure

I've been working with virtual machines, which turned out to be a good idea while trying to install Oracle on Ubuntu Linux. I ran into a few problems.

  • Install the libaio1 package.
2008-Jan-07 Update: A reader mentioned that the package name ends in a one (1), not an el (l).

Oracle needs this package installed before you install the .deb package. After the package is installed with the dpkg -i command, you are supposed to run the /etc/init.d/oracle-xe configure command. However, this command produced no response. Nor did trying to stop and start Oracle. After a bit of playing, I did the following:

ORACLE_HOME=/usr/lib/oracle/xe/app/oracle/product/10.2.0/server
export ORACLE_HOME
$ORACLE_HOME/config/script/XE.sh
I still can't get Oracle to work but at least that XE script did something.

Monday, June 19, 2006

Spring; ApplicationContext, SqlUpdate: Configuring SqlUpdate Inside ApplicationContext.

Configuring SqlUpdate and derived classes within the applicationContext.xml file is not straightforward because the class's constructor requires an integer array when a prepared statement is used. I got around this issue by using a factory class and method.

We'll start with the object that we actually need. The communicationsUpdate bean:

<bean id="communicationsUpdate" factory-bean="communicationsUpdateFactory" factory-method="createInstance"/>

The Java class for this bean is nearly trivial:

import org.springframework.jdbc.object.SqlUpdate;

public class CommunicationsUpdate extends SqlUpdate {
  public int run(final String communication_type, final int source_organization_id,
      final int destination_organization_id, final int address_to_person_id,
      final int sent_by_person_id, final int order_id, final int dsid) {
    Object[] params = new Object[] {
        communication_type,
        new Integer(source_organization_id),
        new Integer(destination_organization_id),
        new Integer(address_to_person_id),
        new Integer(sent_by_person_id),
        new Integer(order_id),
        new Integer(dsid)
    };
    return update(params);
  }
}

The run method's parameter list includes every field that needs to be updated. It creates an object array and then passes the object array into the parent's update method (SqlUpdate is the parent class).

Next we turn our attention to the factory class. Here is the bean description:

<bean id="communicationsUpdateFactory" class="com.codebits.dao.CommunicationsUpdateFactory">
  <property name="dataSource" ref="dataSource" />
  <property name="sql" value="UPDATE communications SET communication_type=?,source_organization_id=?,destination_organization_id=?,address_to_person_id=?,sent_by_person_id=?,order_id=? WHERE dsid=?" />
  <property name="parameters">
    <list>
      <ref bean="communicationType_type"/>
      <ref bean="sourceOrganizationId_type"/>
      <ref bean="destinationOrganizationId_type"/>
      <ref bean="addresstoPersonId_type"/>
      <ref bean="sendByPersonId_type"/>
      <ref bean="orderId_type"/>
      <ref bean="dsId_type"/>
    </list>
  </property>
</bean>

You'll notice that the SQL for the update is specified right in the bean definition. As are the parameters for the prepared statement. The Java source behind the factory is quite generic. It looks like this:

import java.util.Iterator;
import java.util.List;

import javax.sql.DataSource;

import org.springframework.jdbc.core.SqlParameter;

public class CommunicationsUpdateFactory {

  private DataSource dataSource = null;

  private String sql = null;

  private List parameters = null;

  public CommunicationsUpdate createInstance() {
    CommunicationsUpdate action = new CommunicationsUpdate();
    action.setDataSource(getDataSource());
    action.setSql(getSql());
    for (Iterator iterator = parameters.iterator(); iterator.hasNext(); ) {
      action.declareParameter((SqlParameter) iterator.next());
    }
    action.compile();
    return action;
  }

  ... standard getters and setters ...
}

The SqlParameter beans are also defined in the applicationContext.xml file:

<bean id="communicationType_type" class="com.codebits.dao.sqlparameter.CommunicationType"/>
<bean id="sourceOrganizationId_type" class="com.codebits.dao.sqlparameter.SourceOrganizationId"/>
<bean id="destinationOrganizationId_type" class="com.codebits.dao.sqlparameter.DestinationOrganizationId"/>
<bean id="addresstoPersonId_type" class="com.codebits.dao.sqlparameter.AddressToPersonId"/>
<bean id="sendByPersonId_type" class="com.codebits.dao.sqlparameter.SentByPersonId"/>
<bean id="orderId_type" class="com.codebits.dao.sqlparameter.OrderId"/>
<bean id="dsId_type" class="com.codebits.dao.sqlparameter.DsId"/>

They all look the same except that the Types constant may change.

import java.sql.Types;

import org.springframework.jdbc.core.SqlParameter;

public class CommunicationType extends SqlParameter {

  public CommunicationType() { super(Types.VARCHAR); }

}
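The Types constants are just int codes from java.sql.Types, so a variant for a numeric column would pass Types.INTEGER (or Types.NUMERIC) instead of Types.VARCHAR. A quick self-contained peek at the constants:

```java
import java.sql.Types;

public class TypesDemo {
    public static void main(String[] args) {
        // JDBC type codes are plain ints defined by java.sql.Types.
        System.out.println(Types.VARCHAR);   // prints 12
        System.out.println(Types.INTEGER);   // prints 4
    }
}
```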

I posted this example as much to demonstrate how a factory method can be defined as to show how the SqlUpdate class is used. I'm not sure how popular SqlUpdate is in this era of Hibernate!

Thursday, June 01, 2006

Segmentation Fault Caused by /dev/zero Not Being Writable.

On Solaris, I am running a Java application inside a chroot, using sudo to change the userid to be non-root. However, I repeatedly ran into segmentation faults. The solution to the problem was making the /dev/zero file writable by everyone (user, group, and world).
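In shell terms, that amounted to the following (run as root; since the application is chrooted, presumably it is the dev/zero node visible inside the jail that matters):

```
chmod a+rw /dev/zero
```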